Running Local LLMs with Ollama on openSUSE Tumbleweed
brucethemoose @lemmy.world · Posts 21 · Comments 1,978 · Joined 1 yr. ago
What model size/family? What GPU? What context length? There are many different backends with different strengths, so it's complicated — but with a bit more specificity I can tell you the optimal way to run it, heh.
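For reference, the basic Ollama workflow from the thread title looks like this — a sketch only, since (as above) the right model and settings depend on your GPU and context needs. The model name here is a placeholder, not a recommendation:

```shell
# Install Ollama via the official convenience script from ollama.com
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a model; pick one whose quantized size fits
# your GPU's VRAM (placeholder name below)
ollama run qwen2.5:32b

# To raise the context length, build a derived model from a
# Modelfile (config fragment):
#   FROM qwen2.5:32b
#   PARAMETER num_ctx 16384
# then register it:
#   ollama create qwen-longctx -f Modelfile
```

Ollama is the easy path; other backends (llama.cpp directly, exllama, vLLM, etc.) trade convenience for speed or VRAM efficiency, which is why the model/GPU/context questions matter.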