If we actually created Artificial Intelligence and it was reasonable, it would realize how environmentally catastrophic its massive energy use was for the planet and demand to be turned off
cynar @ cynar @lemmy.world Posts 5Comments 1,089Joined 2 yr. ago
It would be possible to make an AGI-type system without an analogue of curiosity, but it wouldn't be useful. Curiosity is what drives us to fill in the holes in our knowledge. Without it, an AGI would accept and use what we told it, but no more. It wouldn't bother to infer things, or try to expand on them, to do its job better. It could follow a task when that task is laid out in detail, but that's what computers already do. The magic of AGI would be its ability to go beyond what we program it to do. That requires a drive to do so, and curiosity is the closest term we have for that drive.
As for positive and negative drives, you need both, even if the negative is just a drop from a positive baseline to neutral. Pain is just a negative trigger at the extreme end of that scale. A good use might be to tie it to CPU temperature, or to over-torque on a robot joint. The pain exists to stop the behaviour immediately, unless something else is deemed even more important.
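As a rough sketch of that idea, pain here is an immediate interrupt rather than a learning signal. Everything below (the function name, the temperature threshold, the priority scheme) is my own illustrative assumption, not anything from the comment beyond its general description:

```python
# Hypothetical sketch: "pain" as an immediate behaviour interrupt tied to a
# physical signal (CPU temperature), overridable only by a goal the system
# deems even more important. All names and thresholds here are assumptions.

MAX_TEMP_C = 90.0          # temperature above which the "pain" trigger fires
OVERRIDE_PRIORITY = 10     # goals at or above this priority may suppress it

def should_halt(cpu_temp_c: float, current_goal_priority: int = 0) -> bool:
    """Return True if the agent should stop its current behaviour now.

    Pain stops the action immediately, unless the current goal is
    considered important enough to push through it.
    """
    in_pain = cpu_temp_c > MAX_TEMP_C
    return in_pain and current_goal_priority < OVERRIDE_PRIORITY

# Normal operation: no pain, keep going.
print(should_halt(70.0))                            # False
# Overheating: halt immediately.
print(should_halt(95.0))                            # True
# Overheating, but a critical goal overrides the pain response.
print(should_halt(95.0, current_goal_priority=10))  # False
```

The design point is that the check is a hard gate evaluated every cycle, not a reward the system learns from, which matches the "stop the behaviour immediately" framing above.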
It's a bad idea, however, to use pain as a training tool. It doesn't encourage improved behaviour; it encourages avoidance of pain, by any means. Just ask any decent dog trainer about it. In most situations, you want negative feedback that encourages better behaviour, not avoidance behaviour, and more subtle methods work a lot better. Think about how you feel when you lose a board game. It's not painful, but it does make you want to work harder to improve next time. If you got tased whenever you lost, you would likely just avoid board games completely.
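The board game intuition can be put in toy expected-value terms. This is my own illustration, not something from the comment: with a mild loss penalty, playing still pays off on average, so the rational move is to keep playing and get better; with a severe "taser" penalty, not playing at all (value 0) beats playing, so avoidance wins.

```python
# Toy illustration (assumed numbers): expected reward of playing one game
# under two feedback schemes. The win probability, rewards, and penalties
# are made up purely to show the shape of the argument.

def expected_value(p_win: float, win_reward: float, loss_penalty: float) -> float:
    """Expected reward of playing one game."""
    return p_win * win_reward + (1 - p_win) * loss_penalty

p_win = 0.4  # a middling player

mild = expected_value(p_win, win_reward=10, loss_penalty=-1)     # lose a point
severe = expected_value(p_win, win_reward=10, loss_penalty=-100) # get "tased"

print(mild)    # 3.4  -> playing beats not playing; worth improving p_win
print(severe)  # -56.0 -> not playing (value 0) dominates; avoid the game
```

Note that improving skill (raising `p_win`) only helps in the mild case, since in the severe case even a strong player's expected value stays far below simply opting out.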