Technological Singularity: The Aftermath
What happens when Skynet becomes self-aware? Nuclear war? Robots take over? Oh my! What if we actually create Cylons? If you are still wondering what would happen for real, there is a place for you in a heated debate that could swing either way: anything worse than what the movies show could happen, or nothing at all and a happy life prevails. I recently answered a question about the impact of this superintelligence on human life. What would happen to our lives after the Technological Singularity?
Once the world sees its technological advancement reach its pinnacle, the first moments of joy would soon turn into chaos as these machines and technologies we created take over almost every human task with their own superintelligence. Life would become harder, as we would struggle to understand the machines' intelligence; moreover, it would make human life a docile one!
Human emotions would be artificially stimulated, so there would be a substantial loss of genuine emotion among the population. And if we could still prevent the machines from ruling the world (i.e., if their AI remained bound to principles such as Asimov's laws), these superintelligent machines could be put to good use exploring the universe, though they may prove unreliable once they can learn on their own and simulate the human brain. In a sentence, they are nothing but a crowd of supermachines networked with one another. You get the picture now: whatever you have seen of the Skynet machines or the Cylons, keep those scenarios as a preview of the initial period of human life. Ultimately, their intelligence would grow to the extreme, and our lives, spirituality, morality, and every other emotion would stand no chance! And when it's a molecular-level machine, say a nanobot… that's a different story!
Say we could control them… Life would turn even more chaotic as research escalates rapidly and we end up with an abundance of new technologies and resources: new medicines, life-extension pills, new weapons. Then we would have ended our entire civilization by ourselves, simply by using the singularity as a tool.
But in the other scenario, if we achieved the singularity in a controlled manner with no logical AI errors, it could be kept under control, with some constraints on its intelligence, as a good pet :) More like a nuclear reactor…
And we could live happily ever after with a super-speed shift in our technological advancement.
So whatever the scenario may be, we can deduce a few facts that might sound inconvenient to our ears, alongside some perks. Technological advancement is inevitable, and so is a Technological Singularity. The profound question is: "Are we prepared?"
This post was first published on September 15, 2013.