I'm using AI and I feel terrible now
HumanPerson @HumanPerson@sh.itjust.works · Posts 35 · Comments 530 · Joined 2 yr. ago
Ollama can pull info from the web from multiple sites, but yes, local AIs are more prone to hallucination. Google did release Gemma 3, which includes a 27B model and is probably the most cost-effective way to get into local models that rival ChatGPT (if you can call about $2k cost effective). That's also why I recommended duck.ai, since it has access to GPT and llama3.3:70b, which will do a lot better.
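
For anyone curious what running that Gemma 3 27B model locally actually looks like, here's a minimal sketch using the ollama Python client. It assumes the Ollama server is running, you've already pulled the model (e.g. `ollama pull gemma3:27b`), and you have the hardware to load it:

```python
# Minimal sketch: chat with a locally hosted Gemma 3 27B model via Ollama.
# Assumes the Ollama server is running and `ollama pull gemma3:27b` has been done.
import ollama

response = ollama.chat(
    model="gemma3:27b",  # swap in a smaller tag if 27B doesn't fit in your VRAM
    messages=[
        {"role": "user", "content": "Summarize the tradeoffs of running LLMs locally."},
    ],
)

# The reply text lives under message -> content in the response.
print(response["message"]["content"])
```

The same call works with any other model tag you've pulled, so you can start with a smaller model and only step up to the 27B once you know your machine can handle it.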