Jeez, I wear eye- and earpro when doing power tool stuff, but somehow never thought of this scenario... Thanks for the warning and glad nothing worse happened!
Ok, maybe slightly :) but it surprises me that the ability to emulate a basic human is dismissed as "just statistics", since until a year ago it seemed like an impossible task...
Agree, I have definitely fallen for the temptation to say what sounds better, rather than what's exactly true... Less so in writing, possibly because it's less of a linear stream.
Yeah, I was probably a bit too caustic, and there's more to (A)GI than an LLM can achieve on its own, but I do believe that some part, perhaps a large part, of human consciousness works in a similar manner.
I also think that LLMs can have models of concepts; otherwise they couldn't do what they do. Probably also of truth and falsity, though perhaps lacking external grounding?
And this tech community is being weirdly Luddite over it as well, saying stuff like "it's only a bunch of statistics predicting what's best to say next". Guess what, so are you, sunshine.