Any of you have a self-hosted AI "hub"? (e.g. for LLM, stable-diffusion, ...)
bobburger @bobburger@fedia.io
Llamafile is a great way to use an LLM locally. Inference is incredibly fast on my ARM MacBook and my RTX 4060 Ti; it's okay on my Intel laptop running Ubuntu.
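If it helps, here's a rough sketch of how you can talk to a llamafile from other programs once it's running: by default it serves an OpenAI-compatible API on localhost:8080, so something like this stdlib-only Python works (the model name is a placeholder; the bundled model gets used either way):

```python
# Minimal sketch, assuming a llamafile is already running its default
# local server on port 8080 with the OpenAI-compatible chat endpoint.
import json
import urllib.request

payload = {
    "model": "local-model",  # placeholder; llamafile serves its bundled model
    "messages": [
        {"role": "user", "content": "Say hello in one sentence."}
    ],
}

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    # Pull the assistant's reply out of the OpenAI-style response shape.
    print(body["choices"][0]["message"]["content"])
```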