
ramalama fails to detect that I have a podman machine with libkrun #760

Open
benoitf opened this issue Feb 7, 2025 · 7 comments


benoitf commented Feb 7, 2025

It happens when manually setting the type of the podman machine.

Let's start a podman machine using the command CONTAINERS_MACHINE_PROVIDER=libkrun podman machine init --now

The podman machine list command is not seeing the machine:

podman_machine_list = ["podman", "machine", "list"]
conman_args = ["podman", "machine", "list", "--format", "{{ .VMType }}"]
try:
    output = run_cmd(podman_machine_list).stdout.decode("utf-8").strip()
    if "running" not in output:
        return None

so it won't use podman to launch the model.

That said, the machine is started:

podman machine list --all-providers
NAME                     VM TYPE     CREATED      LAST UP            CPUS        MEMORY      DISK SIZE
podman-machine-default*  libkrun     3 hours ago  Currently running  6           2GiB        100GiB

and if I use RAMALAMA_CONTAINER_ENGINE=podman ramalama serve instructlab/merlinite-7b-lab, it works
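One possible direction for a fix, sketched below. This is not ramalama's actual code: the names machine_is_running and detect_podman_machine are hypothetical. The idea is to list machines with --all-providers (shown above to work) so machines created under a non-default provider are seen, and to check each machine's Running field rather than grepping the whole table:

```python
import subprocess


def machine_is_running(list_output: str) -> bool:
    # Parse the output of:
    #   podman machine list --all-providers --format "{{ .Name }} {{ .Running }}"
    # One machine per line, e.g. "podman-machine-default true".
    for line in list_output.splitlines():
        fields = line.split()
        if len(fields) >= 2 and fields[-1] == "true":
            return True
    return False


def detect_podman_machine():
    # Hypothetical helper: list machines from every provider, not just the
    # default one, so a libkrun machine is found even when applehv is default.
    cmd = ["podman", "machine", "list", "--all-providers",
           "--format", "{{ .Name }} {{ .Running }}"]
    try:
        output = subprocess.run(cmd, capture_output=True, check=True)
        output = output.stdout.decode("utf-8")
    except (FileNotFoundError, subprocess.CalledProcessError):
        return None
    return "podman" if machine_is_running(output) else None
```

With this, a running libkrun machine would be detected even when applehv is the configured default provider.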


rhatdan commented Feb 7, 2025

Does the podman machine list command always fail, or only if there are two machines, one with vfkit and one with libkrun?


benoitf commented Feb 7, 2025

It fails if my machine is not using the default provider (applehv), or if libkrun is the default but my machine is using applehv.

In short: it fails to recognize a machine whose provider type is not the default.


rhatdan commented Feb 7, 2025

So we should just specify it in RamaLama, although @ericcurtin reported the option was not available on his laptop.

Did this option come in post-5.2?

@ericcurtin what version of podman do you have installed?


ericcurtin commented Feb 7, 2025

5.2.3

I think we need a bigger coding fix in general; closing for now. Also, the container_manager() function is called 4 times in a typical run, and that level of forking is not good for performance. It's too late in my timezone, closing.
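On the performance point, one common way to avoid repeated forking is to cache the result of the engine lookup. A minimal sketch, assuming memoization is an acceptable fix; only the container_manager name comes from the thread, the probe body is a simplified stand-in:

```python
import shutil
from functools import lru_cache


@lru_cache(maxsize=1)
def container_manager():
    # Probe for an engine binary once; subsequent calls hit the cache
    # instead of running the detection logic (and forking podman) again.
    for engine in ("podman", "docker"):
        if shutil.which(engine):
            return engine
    return None
```

A cached value would need invalidation (container_manager.cache_clear()) if the environment can change mid-run, but within a single ramalama invocation that is unlikely.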

ericcurtin reopened this Feb 7, 2025
@ericcurtin

Reopening; I meant to close the PR.

@ericcurtin

#762

@ericcurtin

Reopened in #763 and simplified the solution.
