Must fight temptation to buy an overpriced raspberry pi
catty @lemmy.world · Posts 13 · Comments 260 · Joined 1 mo. ago
What can I use for an offline, self-hosted LLM client, preferably with images, charts, and Python code execution?
See, here's the thing: why would anyone want to host ALL of that on one Pi? That is not what they were designed for. Ollama on a Pi? Are you out of your mind? I'd run the biggest model I can on a modern GPU, not some crappy old computer or a Pi. Right tool, right job.

And why is dropping containers "less secure"? Do you mean "less cool"? Less easy to deploy? But you're not deploying it, you're installing it. You sound like a complete newbie, which is fine, but take a step back and get some more experience. A Pi is a tool for a purpose, not the be-all and end-all. Using an old laptop is not going to save the world, and arguing that it's just better than a Pi (or a similar alternative) is silly. Use a laptop for all I care; I'm not the boss of you.
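The "right tool, right job" point can be made concrete with a back-of-envelope check: raw model weights take roughly (parameters × bits per weight ÷ 8) bytes, before KV-cache and runtime overhead. A minimal sketch, assuming illustrative hardware figures (an 8 GB Pi and a 24 GB GPU are my own example numbers, not from the thread):

```python
def weights_gib(params_billions: float, bits_per_weight: int) -> float:
    """Approximate size of a model's weights in GiB at a given quantization.

    Rough rule of thumb only: real memory use adds KV-cache and
    runtime overhead on top of the raw weights.
    """
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

# Illustrative device capacities (assumed, not measured).
pi_ram_gib = 8.0      # e.g. an 8 GB Raspberry Pi
gpu_vram_gib = 24.0   # e.g. a 24 GB desktop GPU

small = weights_gib(7, 4)    # a 7B model at 4-bit quantization, ~3.3 GiB
large = weights_gib(70, 4)   # a 70B model at 4-bit, ~32.6 GiB

print(f"7B@4bit:  {small:.1f} GiB (fits Pi RAM: {small < pi_ram_gib})")
print(f"70B@4bit: {large:.1f} GiB (fits GPU VRAM: {large < gpu_vram_gib})")
```

The point of the arithmetic: a small quantized model technically fits in a Pi's RAM (it will just be slow on CPU), while a big model doesn't even fit a large consumer GPU on its own, so hardware choice has to follow the model, not the other way around.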
As for an arr stack, I'm really disappointed with the software and don't use it; those who do must have way too much time on their hands, both to set it up and then to actually make use of it!