That is because they don't have any baked-in concept of truth or falsehood. That would require labelling each statement as such, which doesn't scale well for petabytes of training data.
This. LLMs are great for information retrieval tasks; they are essentially search engines. Even then, they can only retrieve information they were trained on, so the data eventually goes stale and the model needs retraining on more recent data. They are also not very good at tasks that require reasoning, such as solving complex engineering problems.
I despise influencer culture. Turning everyone with even a tiny bit of a following into a walking, talking billboard is the peak of the shit the ad industry has subjected us to. That is why I block ads indiscriminately now, no exceptions. This enshittification on behalf of the ad industry needs to be stopped.
I have been a huge AC fan since AC 2. In fact, I am currently replaying AC 2 to recapture the good times. But Valhalla was the last straw. AC games have become too bloated for their own good, so I have given up on the series for good.
The only way AI is going to reach human-level intelligence is if we can actually figure out what happens to information in our brains. No one can really tell if or when that is going to happen.
Need a longer Carlos. This Carlos is too short for accurate measurement.