Since the Turing Test seems NOT to be an accurate way to check for true AI, what is? What will be the indicator that tells us it has arrived? I can't imagine one; every test seems like it could be faked.
8ace40 @programming.dev
I was thinking... what if we do manage to make an AI as intelligent as a human, but we can't make it any better than that? A human-level AI wouldn't be able to improve itself, since it only has human intelligence, and humans can't improve it either.
Another thought: what if making an AI better gets exponentially harder with each improvement? Then progress would have to stop at some point, since a finite planet only has finite resources.
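As a toy back-of-the-envelope sketch of that scenario (in Python, with completely made-up numbers), suppose each improvement costs 10x the previous one. Even a huge resource budget then only buys about a dozen steps:

```python
# Toy model (an assumption for illustration, not established fact):
# each capability improvement costs exponentially more than the last.
BUDGET = 1e12      # hypothetical total resources available
BASE_COST = 1.0    # hypothetical cost of the first improvement
GROWTH = 10.0      # hypothetical cost multiplier per step

spent = 0.0
steps = 0
# Keep improving while the next step still fits in the budget.
while spent + BASE_COST * GROWTH**steps <= BUDGET:
    spent += BASE_COST * GROWTH**steps
    steps += 1

print(f"improvements achievable: {steps}")            # 12 with these numbers
print(f"resources spent: {spent:.3g} of {BUDGET:.3g}")
```

With exponential costs, multiplying the budget by 10 only buys one more step, so "more resources" barely helps.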
Or what if it takes super-human intelligence to create human-level AI in the first place? The singularity would be impossible in that case, too.
I don't think we will see the singularity, at least not in our lifetime.