Terminal stopped responding
ramalama serve tinyllama --host 0.0.0.0
main: server is listening on http://0.0.0.0:8080 - starting the main loop
srv  update_slots: all slots are idle
slot launch_slot_: id  0 | task 0 | processing task
slot update_slots: id  0 | task 0 | new prompt, n_ctx_slot = 2048, n_keep = 0, n_prompt_tokens = 7
slot update_slots: id  0 | task 0 | kv cache rm [0, end)
slot update_slots: id  0 | task 0 | prompt processing progress, n_past = 7, n_tokens = 7, progress = 1.000000
slot update_slots: id  0 | task 0 | prompt done, n_past = 7, n_tokens = 7
slot release: id  0 | task 0 | stop processing: n_past = 99, truncated = 0
slot print_timing: id  0 | task 0 |
prompt eval time =      47.81 ms /     7 tokens (    6.83 ms per token,   146.41 tokens per second)
       eval time =    1350.11 ms /    93 tokens (   14.52 ms per token,    68.88 tokens per second)
      total time =    1397.92 ms /   100 tokens
srv  update_slots: all slots are idle
^C^C^C^C^C^C^C^C^C
To get the shell back, I used:
ps aux | grep ramalama | awk '{print $2}' | xargs kill -9
ps aux | grep 8080 | awk '{print $2}' | xargs kill -9
Terminal macOS v2.14 (155)
You need to run ramalama stop in a separate window:
ramalama stop
Could you exec into the container and check whether llama-serve is ignoring SIGTERM?
podman exec -l kill -TERM 1
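One way to check the signal disposition without guessing is to read the SigIgn bitmask from /proc (Linux-only, so this works inside the container). A sketch, where the trap '' TERM child is a stand-in for a server that ignores SIGTERM:

```shell
#!/bin/sh
# Start a process that explicitly ignores SIGTERM (stand-in for the server).
sh -c 'trap "" TERM; sleep 300' &
pid=$!
sleep 1

# SigIgn in /proc/PID/status is a 64-bit hex mask of ignored signals;
# SIGTERM is signal 15, so its bit has value 2^14 = 0x4000.
mask=$(awk '/^SigIgn/ {print $2}' "/proc/$pid/status")
if [ $(( 0x$mask & 0x4000 )) -ne 0 ]; then
  echo "SIGTERM ignored"
else
  echo "SIGTERM handled/default"
fi

kill -KILL "$pid"
```

Against the real container, the equivalent spot check would be something like podman exec -l grep SigIgn /proc/1/status.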
Sorry, something went wrong.
I just checked; this is definitely a llama-serve issue.
ggerganov/llama.cpp#11742
c71a148