When I read about dialogues with AI, where people try to get life advice, support, and therapy from the algorithm, I'm reminded of this photograph.
ELIZA, the first chatbot, created in the 1960s, just parroted your responses back to you.
It was incredibly basic, and its inventor, Joseph Weizenbaum, didn't find it particularly interesting. But when he had his secretary try it, she became addicted, so much so that she asked him to leave the room while she "talked" to it.
She knew it was just repeating what she said back to her in the form of a question, but she still formed a genuine emotional bond with it.
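For the curious, the reflection trick is roughly this. A minimal toy sketch in Python, not Weizenbaum's actual script (which used ranked keyword rules and decomposition templates), just the pronoun-swap-and-bounce-back idea:

```python
# Toy ELIZA-style reflection (illustrative sketch, not the original program):
# swap first- and second-person words, then return the statement as a question.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

def reflect(statement: str) -> str:
    """Turn 'I am sad about my job' into 'Why do you say you are sad about your job?'"""
    words = statement.lower().strip(" .!?").split()
    swapped = [REFLECTIONS.get(word, word) for word in words]
    return "Why do you say " + " ".join(swapped) + "?"

print(reflect("I am sad about my job"))
# -> Why do you say you are sad about your job?
```

That's the whole illusion: a dictionary lookup and a question template, and people still poured their hearts out to it.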
Now that chatbots are far more sophisticated, it really highlights how our idiot brains just want something to talk to; whether we know it's real or not doesn't really matter.
One of the last posts I read on Reddit was from a student in a CompSci class where the professor put a pair of googly eyes on a pencil and said, "I'm Petie the Pencil! I'm not sentient, but you think I am because I can say full sentences." The professor then snapped the pencil in half, making the students gasp.
The point was that humans anthropomorphize things that seem human, assigning them characteristics that make us bond with things that aren't real.
That or the professor was stronger than everyone thought
that's just a bit from Community
Depends. I think I'm on the autistic spectrum; I just don't see them as equals, but as tools.
I'm not on the autistic spectrum. They aren't equals, and they're barely even tools.