What can I use for an offline, self-hosted LLM client, preferably with images, charts, and Python code execution?
catty @lemmy.world
Try the beta on the GitHub repo, and use a smaller model!
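For context, here's a minimal sketch of what "use a smaller model" can look like in practice, assuming the self-hosted client exposes an OpenAI-compatible local endpoint (as llama.cpp's server, Ollama, and similar offline backends do). The thread doesn't name the repo, so the URL, port, and model name below are placeholders, not the actual project:

```python
# Sketch only: talk to a locally running, OpenAI-compatible LLM server.
# Nothing here leaves the machine; the backend and model are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # placeholder local endpoint/port
    api_key="not-needed",                  # offline backends typically ignore the key
)

response = client.chat.completions.create(
    model="small-3b-instruct",  # placeholder: a smaller quantized model for modest hardware
    messages=[{"role": "user", "content": "Plot a sine wave with matplotlib."}],
)
print(response.choices[0].message.content)
```

A ~3B parameter quantized model keeps memory use and latency low enough for typical self-hosted setups; swap in whatever smaller model the client actually supports.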