Replies: 8 comments 14 replies
-
Let me see if I have this right: you are unable to point Ollama to ROCm because Ollama expects ROCm to be in the same directory?
-
That's a difficult choice. GPUs from NVIDIA are too expensive today, and AMD got really good with their 9000 series, which I will buy later. So I may have to use ROCm, unless the AI accelerators on the GPU can be used without it. Maybe you could add a prompt at Alpaca's startup asking which instance should be installed, or which one to use if multiple instances are installed. It's a huge thing, I know, but every AMD GPU user will be forced to move to a similar app if you stop supporting AMD's ROCm.
-
The main reason I use Alpaca is how easy it is to get going: just install Alpaca and the extension and you are good to go with your AMD GPU. So I think preserving Ollama support as an extension is worth doing to keep things accessible for everyone.
-
One question: Alpaca's ability to communicate with and control an external Ollama server (localhost or otherwise) won't be affected by the separation/removal of Ollama from Alpaca, right? I was looking forward to the integration of the two projects, but if it is infeasible as you described, I think it is best to manage Ollama separately, given that it is possible. I'd like to keep using my 7900 XTX for local LLMs, and at the same time I don't want to go back to the Ollama CLI; Alpaca is so nice to use.
-
My Personal Opinion

I wanted to wait a couple of hours to get some votes before sharing my personal opinion, so here it is.

Alpaca started as an interface for Ollama; it actually didn't include the instance at first. Then it became an easy way of just installing something and chatting with bots directly. It's been a fun year with a lot of changes for my little project, and I think this could probably be a big change.

With the introduction of the instance manager users can now connect to Gemini, ChatGPT, etc., making Alpaca more of an AI hub than just an Ollama client. To reflect that, I think the best approach would be to get rid of the Ollama instance... by default.

So, this is what I want to do: move the Ollama instance into a new extension and rework the existing ROCm extension to go with it. I'm not sure if I can make it so that Flathub detects conflicts if you try to install both extensions or something like that, but I hope I can figure it out.

This approach would also make it possible for people who just want a client for an existing Ollama instance or Gemini to have a minimal app without an instance that they won't use.

Of course, I would have to make it so that the welcome screen has an explanation, or at least a link to a wiki article in the repo or something.
-
I GOT IT OMG

So get this: Ollama originally didn't include a library called Numa, so I added it manually to the Flatpak. Apparently Ollama now ships it itself, and having both copies is what broke ROCm. I went to bed at like 4 am and couldn't figure this out; THE SOLUTION CAME TO ME IN MY FUCKING DREAM. I don't know if I should be impressed or scared. Anyways, this is what I want to do now:
So if you have an AMD GPU, you should install both extensions.
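For anyone curious how a duplicated library can quietly break things, here is a small diagnostic sketch (not part of Alpaca or Ollama; the directories scanned and the libnuma example are purely illustrative) that lists shared libraries appearing in more than one directory on the library search path:

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

func main() {
	// Illustrative only: in the Flatpak the interesting directories would be
	// the app's own library path plus the extension mount points.
	dirs := strings.Split(os.Getenv("LD_LIBRARY_PATH"), ":")
	seen := map[string]string{} // library basename -> directory it was first found in
	for _, dir := range dirs {
		if dir == "" {
			continue
		}
		entries, err := os.ReadDir(dir)
		if err != nil {
			continue
		}
		for _, entry := range entries {
			name := entry.Name()
			if !strings.Contains(name, ".so") {
				continue
			}
			if first, ok := seen[name]; ok && first != dir {
				// Two copies of e.g. libnuma on the path: the loader picks one,
				// and it may not be the one the ROCm runner was built against.
				fmt.Printf("duplicate %s: %s and %s\n", name, first, dir)
			} else if !ok {
				seen[name] = dir
			}
		}
	}
}
```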
-
It works

So I just built Alpaca with its new manifest and both extensions, and everything works. I'm so tired but overjoyed hahahaha
-
Update is Underway

I will take the weekend off, but I wanted to push the update! 5.1.0 is coming to Flathub; it is the first version of Alpaca not to include Ollama since the instance was introduced, and it has a little popup that explains everything. And yeah, the Ollama extension is already out; the nice people at Flathub were really fast and helped me polish it up! I hope everything works because I won't be able to do any work this weekend. Please report any issues and I will take a look at them Sunday night! Thanks again for all the patience!
-
Hello everyone, I'm the developer of Alpaca. I've heard you all: AMD ROCm stopped working with Alpaca's Ollama instance in the Flatpak.
Why is this happening?
To put it simply: when installing the ROCm extension, Alpaca moves ROCm into /app/plugins/AMD/... inside the Flatpak. Until now this was fine, because an env variable told the Ollama instance where to find the library.
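Roughly, that lookup can work like the sketch below. This is not Ollama's actual code; the ROCM_PATH variable name and the fallback directories are just my assumptions to show the idea of an env variable pointing at the relocated ROCm:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// findROCm returns the first directory containing librocblas.so,
// preferring the directory named by the (assumed) ROCM_PATH variable.
func findROCm() (string, error) {
	var candidates []string
	if p := os.Getenv("ROCM_PATH"); p != "" { // e.g. the Flatpak plugin directory
		candidates = append(candidates, filepath.Join(p, "lib"))
	}
	candidates = append(candidates, "/opt/rocm/lib", "/usr/lib64") // common system defaults
	for _, dir := range candidates {
		matches, _ := filepath.Glob(filepath.Join(dir, "librocblas.so*"))
		if len(matches) > 0 {
			return dir, nil
		}
	}
	return "", fmt.Errorf("no ROCm libraries found")
}

func main() {
	dir, err := findROCm()
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("using ROCm from", dir)
}
```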
Does this have to do with the instance manager?
Nope, just an unfortunate coincidence.
Ok so what's the problem then?
Well, apparently there's either something hardcoded into Ollama that expects the Ollama libraries to be in the same directory as the ROCm stuff, or I'm just stupid and can't figure this out.
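To show what I mean by hardcoded, here is the failure mode as a sketch. Again, this is not actual Ollama source and the lib/ollama layout is hypothetical; the point is that if the runner only looks next to its own binary, an env variable pointing somewhere else never gets a chance to help:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// rocmNextToRunner only looks in a path relative to the binary itself,
// ignoring any environment variable entirely.
func rocmNextToRunner() (string, bool) {
	exe, err := os.Executable()
	if err != nil {
		return "", false
	}
	// Hypothetical layout: <dir of the binary>/lib/ollama/librocblas.so*
	dir := filepath.Join(filepath.Dir(exe), "lib", "ollama")
	matches, _ := filepath.Glob(filepath.Join(dir, "librocblas.so*"))
	return dir, len(matches) > 0
}

func main() {
	if dir, ok := rocmNextToRunner(); ok {
		fmt.Println("found ROCm at", dir)
	} else {
		// This is the situation Alpaca hits: ROCm lives under /app/plugins/AMD/...,
		// but a hardcoded relative lookup never checks there.
		fmt.Println("no ROCm next to the runner, falling back to CPU")
	}
}
```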
Now then, why don't you just move the extension to the existing library directory?
Nope, can't do. Something actually really nice about Flatpak is that extensions can't modify existing directories. I did actually make a version of Alpaca internally that has both the Ollama instance and ROCm preinstalled.
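One way to see the constraint from inside the sandbox, as a minimal sketch: this assumes /app is mounted read-only at runtime and uses a hypothetical /app/lib/ollama path for the directory where Ollama keeps its bundled libraries, so nothing running in the Flatpak can copy ROCm into that existing directory after install:

```go
package main

import (
	"fmt"
	"os"
)

func main() {
	// Hypothetical target: the directory where Ollama keeps its bundled libraries.
	err := os.WriteFile("/app/lib/ollama/rocm-marker", []byte("test"), 0o644)
	if err != nil {
		// Expected inside the sandbox: "read-only file system".
		fmt.Println("cannot write into /app at runtime:", err)
	}
}
```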
So just preinstall ROCm?
Some people don't even like Ollama being included; adding a couple of gigabytes for people who don't use the instance or don't have an AMD card is not the best way of fixing this.
What do I want to do?
If it were up to me I would just stop including Ollama altogether; it's been a real pain in the ass to keep it updated and to fix stuff. To be fair, Ollama is a really great piece of software that's excellently packaged, I just don't know what I'm doing most of the time.
Even though I don't believe in free software being a democracy, I will make an exception here because this is a big change.
So, my lovely users, what do you guys want me to do about this?
Feel free to add other options, call me lazy or whatever, I love reading your comments!
26 votes