
Fix max_tokens handling in vllm_vlms.py (#2637) #7

Triggered via push: January 22, 2025 18:51
Status: Success
Total duration: 20s
Artifacts: –

Workflow: new_tasks.yml
Trigger: on: push
Job: Scan for changed tasks (7s)

Annotations

1 warning — Scan for changed tasks:
ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
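The warning above can be resolved by pinning the runner image explicitly instead of tracking `ubuntu-latest`, so the `ubuntu-24.04` rollover does not change the build environment unexpectedly. A minimal sketch of what `new_tasks.yml` might look like with a pinned runner — the job id, checkout step, and scan command are hypothetical placeholders, since the run log does not show the workflow's contents:

```yaml
# new_tasks.yml — sketch with an explicitly pinned runner image.
name: Scan for changed tasks

on: push

jobs:
  scan:  # hypothetical job id; the log only shows the job name
    # Pin the image rather than using ubuntu-latest:
    # ubuntu-22.04 stays on the current image, ubuntu-24.04 opts in early.
    runs-on: ubuntu-24.04
    steps:
      - uses: actions/checkout@v4
      # Placeholder for the actual task-scanning step, not shown in the run log.
      - run: echo "scan for changed tasks"
```

Pinning trades automatic image updates for reproducibility; the pinned version then needs to be bumped deliberately when the older image is retired.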