ChatGPT appears to hallucinate or outright lie about everything
bungleofjoy @bungleofjoy@programming.dev
Funny guy indeed
LLMs don’t “feel”, “know”, or “understand” anything. They spit out the statistically most likely answer based on their training data; that’s all they do.
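Roughly: at each step the model scores every possible next token and emits from that distribution, over and over. Here’s a minimal toy sketch of that greedy next-token idea, with hard-coded scores standing in for what a real model would learn from its training data (no actual model or library API involved):

```python
import math

# Toy, hard-coded "learned" scores for the next token after a prompt like
# "The capital of France is". A real LLM produces scores like these for
# its entire vocabulary at every step.
next_token_scores = {
    "Paris": 4.2,
    "Lyon": 1.1,
    "banana": -3.0,
}

def softmax(scores):
    # Turn raw scores into a probability distribution.
    exps = {tok: math.exp(s) for tok, s in scores.items()}
    total = sum(exps.values())
    return {tok: v / total for tok, v in exps.items()}

probs = softmax(next_token_scores)

# Greedy decoding: emit whichever token is statistically most likely.
print(max(probs, key=probs.get))  # -> "Paris"
```

There’s no fact-checking step anywhere in that loop, which is why the output can be fluent and confidently wrong at the same time.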