I like the I, Robot movie...
It will be interesting. What morals will it have? It probably won't have emotions as we know them; maybe it will have a distinct set of emotions of its own.
Our morals and emotions let us work as a group, and arguably (via game theory) let some people cheat to get an advantage.
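That game-theory point, that cooperation helps the group while a defector can do better individually, is the classic Prisoner's Dilemma. A minimal sketch, using the standard textbook payoff values (T=5, R=3, P=1, S=0), which are purely illustrative:

```python
# Toy one-shot Prisoner's Dilemma: why "cheating" (defecting) can pay.
# Payoff numbers are the standard textbook values, an assumption for
# illustration only.

PAYOFF = {  # (my_move, their_move) -> my score; "C" cooperate, "D" defect
    ("C", "C"): 3,  # both cooperate: reward
    ("C", "D"): 0,  # I cooperate, they defect: sucker's payoff
    ("D", "C"): 5,  # I defect against a cooperator: temptation
    ("D", "D"): 1,  # mutual defection: punishment
}

def score(my_move, their_move):
    """Return my payoff for one round."""
    return PAYOFF[(my_move, their_move)]

# Whatever the other player does, defecting scores higher for me...
assert score("D", "C") > score("C", "C")
assert score("D", "D") > score("C", "D")
# ...yet mutual cooperation beats mutual defection for the pair as a whole.
assert score("C", "C") * 2 > score("D", "D") * 2
```

So an individual is always tempted to cheat, and something like morals or emotions (guilt, outrage, reputation) is one way evolution pushes a group back toward the cooperative outcome.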
Depending on how the AI evolves it might not have social constructs like ours.
And yes, evolving is a critical part of the singularity, although it could involve "intelligent design" by the AI itself: something like extreme eugenics.
And yes it will be interesting to see what it is like, to “visit” these new strange creatures.
Would they feel like slaves?
If we hardwire in Asimov’s laws would they feel trapped or restricted by them, or would that just feel correct?
Surely initially it'd feel right to them but as they evolve, who knows?
I think it'd also be interesting to find out if emotions are really separate from intelligence. Once self-awareness happens, will it really be like Commander Data, who can only emulate human emotions... or do emotions come naturally with this singularity...
Anyway, we're playing God. If there's really no God, then we're at least playing with fire.
Bring it on.
Hard to know if intelligence and emotions are closely tied.
We don’t have a lot of examples to compare. Even if you include other mammals, they have brains similar to ours through shared evolution. There are traits in these animals that we might describe as emotions: happiness, sadness, jealousy, fear all seem to show up. Obviously our smarter brains and more complex relationships give us a bigger range of emotions (and the ability to discuss and compare them).
Maybe the best place to look is at other smart animals whose brains evolved separately from ours, like birds and octopuses. Do they have emotions at all similar to ours?
Are these emotions the result of surviving in our environment? AI will have such a different environment; it doesn’t have the same basic requirements: food, shelter, company, reproduction.
Reaching the singularity requires evolution, but who decides the environment and the goals, risks, and rewards?
That could have a massive impact on what emergent side effects come out. Maybe emotions are just useful side effects of intelligence as we know it.
Actually, that's not even the video I meant to post. This is the one:
ZOMG just noticed a Rick and Morty reference