
Issue loading Phi 3.5 vision models #657

Open
andrzejwp opened this issue Jan 23, 2025 · 0 comments
Dear team,

I'm having an issue loading any of the Phi 3.5 vision models.

Tested models:

  • Phi-3.5-vision-instruct q0f16
  • Phi-3.5-vision-instruct q3f16_1
  • Phi-3.5-vision-instruct q4f16_1
  • Phi-3.5-vision-instruct q4f32_1

Tested platforms:

  • macOS Sequoia 15.3, MacBook Pro M2 Max
    • Edge Version 132.0.2957.115 (Official build) (arm64)
    • Arc Version 1.78.1 (57736)
  • Windows 11 24H2
    • Chrome 131.0.6778.265

Tested code

  • Version deployed on https://chat.webllm.ai
  • locally executed code from web-llm/examples/vision-model/ (commit sha 632d347 - current main branch)

The error

In all cases I get the same error when loading the Phi 3.5 vision models. After the model finishes downloading, this error appears in the browser console log:

sw.js:64 [FATAL] /Users/cfruan/Documents/tvm/web/../src/runtime/relax_vm/ndarray_cache_support.cc:333: ValueError: Cannot find parameter in cache: vision_embed_tokens.img_processor.vision_model.embeddings.position_embedding.q_weight

Can you please suggest how to address this?
