Only get lower accuracy than reported in the paper #26

Open
justgogorunrun opened this issue Sep 24, 2024 · 1 comment

@justgogorunrun

Has anyone else gotten much lower accuracy than reported in the paper when evaluating the pre-trained models? What prompts are you using in your evaluations? When I set the model to take 8 frames, the accuracy was only 44, almost 4 points lower than in the paper.

@kcz358
Contributor

kcz358 commented Oct 2, 2024

May I ask which dataset you are evaluating on? I evaluated videomme recently and the score seemed fine. We do notice that the score changes a bit as lmms-eval is upgraded, but in most cases the scores increase.
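
For reference, below is a minimal sketch of how such an evaluation might be launched through the lmms-eval CLI. The model adapter name (`llava_vid`), the `max_frames_num` model argument, and the checkpoint path are assumptions for illustration, not details taken from this thread; check the repo's README for the exact command. Since scores can shift slightly across lmms-eval versions, as noted above, it is worth recording the version used when comparing against the paper.

```python
# Sketch of launching a Video-MME evaluation via the lmms-eval CLI.
# Assumptions: the model adapter name ("llava_vid"), the max_frames_num
# model argument, and the checkpoint path are placeholders; adjust them
# to match the repo's README.
import subprocess

checkpoint = "/path/to/pretrained/checkpoint"  # placeholder path

cmd = [
    "python", "-m", "lmms_eval",
    "--model", "llava_vid",  # assumed adapter name in lmms-eval
    "--model_args", f"pretrained={checkpoint},max_frames_num=8",  # 8 frames, as in the report above
    "--tasks", "videomme",   # Video-MME task name in lmms-eval
    "--batch_size", "1",
    "--log_samples",         # keep per-sample outputs for debugging score gaps
    "--output_path", "./logs/",
]

print("Running:", " ".join(cmd))
subprocess.run(cmd, check=True)
```

Comparing the per-sample logs written by `--log_samples` against the paper's setting (number of frames, prompt template) is usually the quickest way to find where a few points of accuracy are being lost.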
