How to use sentence-transformers/static-similarity-mrl-multilingual-v1 model? #1160

Open
michalkvasnicak opened this issue Jan 19, 2025 · 1 comment
Labels
question Further information is requested

Comments

@michalkvasnicak

Question

If I try to use sentence-transformers/static-similarity-mrl-multilingual-v1, it fails because tokenizer.json is not found. Is it possible to somehow convert the model so it can be used? The ONNX runtime is already there.

@xenova
Collaborator

xenova commented Jan 19, 2025

Sure, you can do the following:

import { AutoModel, AutoTokenizer, matmul } from '@huggingface/transformers';

// Load model and tokenizer
const model = await AutoModel.from_pretrained('sentence-transformers/static-similarity-mrl-multilingual-v1', {
    config: { model_type: 'bert' },
});
const tokenizer = await AutoTokenizer.from_pretrained('Xenova/bert-base-multilingual-uncased');

const sentences = [
    'It is known for its dry red chili powder .',
    'It is popular for dry red chili powder .',
    'These monsters will move in large groups .',
];
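// Tokenize the sentences and compute their sentence embeddings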
const inputs = tokenizer(sentences, { padding: true, truncation: true });
const { sentence_embedding } = await model(inputs);

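// Normalize the embeddings, then compute pairwise cosine similarities via matrix multiplication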
const normalized = sentence_embedding.normalize();
const scores = await matmul(normalized, normalized.transpose(1, 0));
console.log(scores.tolist());
// [
//   [ 0.9999996423721313, 0.8800840973854065, 0.0017940629040822387 ],
//   [ 0.8800840973854065, 1.0000007152557373, 0.03064093366265297 ],
//   [ 0.0017940629040822387, 0.03064093366265297, 1.0000004768371582 ]
// ]

There are some warnings printed out when loading the model, but you can safely ignore those.
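
As a follow-up, if you want to rank candidate sentences against a single query instead of computing the full pairwise matrix, here is a minimal sketch along the same lines. It reuses only the APIs shown above; the query and candidate strings are just made-up examples.

import { AutoModel, AutoTokenizer, matmul } from '@huggingface/transformers';

// Same model/tokenizer setup as above
const model = await AutoModel.from_pretrained('sentence-transformers/static-similarity-mrl-multilingual-v1', {
    config: { model_type: 'bert' },
});
const tokenizer = await AutoTokenizer.from_pretrained('Xenova/bert-base-multilingual-uncased');

// Made-up query and candidates for illustration
const query = 'Where can I buy dry red chili powder?';
const candidates = [
    'It is known for its dry red chili powder .',
    'These monsters will move in large groups .',
];

// Embed the query together with the candidates in a single batch
const inputs = tokenizer([query, ...candidates], { padding: true, truncation: true });
const { sentence_embedding } = await model(inputs);
const normalized = sentence_embedding.normalize();

// Row 0 of the similarity matrix holds the query's cosine similarity to every sentence (including itself)
const scores = await matmul(normalized, normalized.transpose(1, 0));
const [queryRow] = scores.tolist();
console.log(queryRow.slice(1)); // similarity of the query to each candidate, in order

The first candidate should come out well ahead of the second here, since it overlaps with the query; the exact scores will differ from the numbers above.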
