Saturday 21 March 2015

Some Thoughts on Consciousness and Robots

So I have been quiet for a number of weeks. Perhaps that is a merciful thing; otherwise you might have been inundated with a load of nonsensical posts. As it is, you get to put up with my insanity in smaller doses. Hooray for you!

Anyway, as promised, I thought I'd share some thoughts I've had of late concerning robots, AI, and how we look at the idea of consciousness. Obviously, I went to see Ex Machina last month and that contributed to this way of thinking, but I also indulged myself by buying a book of Isaac Asimov's robot short stories. I've read quite a bit of his earlier stuff. I read I, Robot (quite different from the Will Smith film, I assure you) and The Rest of the Robots ages ago, so I'd read a lot of the stories before. That didn't mean that I didn't enjoy them all over again, as well as having the wonderful treat of a few new ones in there.

There is a point to this and I'm getting there if you give me a second.

Basically, Asimov set up three laws that the robots in his stories were built with. The First Law is that they can't cause harm to humans or allow harm to come to them. In other words, they can't kill humans, harm them in any way, or stand by if anyone else tries to do it. They have to save them from dangerous situations and accidents. The Second Law is that they have to follow all orders given to them by humans, unless those orders conflict with the First Law. The Third Law is that a robot should protect its own existence, unless that conflicts with the first two laws. If saving itself would cause harm to a human in some way or go against orders, then it can't protect itself from destruction.
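For the programmers among you, the three laws are really just a priority ordering, and you can sketch that ordering as a little toy check. Everything here (the Action class and its flags) is invented purely for illustration; Asimov never gave anything like an implementation:

```python
# A toy sketch of Asimov's Three Laws as a priority check.
# The Action class and its flags are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool = False      # would this action hurt a human?
    allows_harm: bool = False      # would it let a human come to harm?
    disobeys_order: bool = False   # does it go against a human's order?
    endangers_self: bool = False   # does it put the robot at risk?

def permitted(action: Action) -> bool:
    # First Law: a robot may not injure a human being or,
    # through inaction, allow a human being to come to harm.
    if action.harms_human or action.allows_harm:
        return False
    # Second Law: obey human orders, unless doing so would
    # conflict with the First Law (already ruled out above).
    if action.disobeys_order:
        return False
    # Third Law: self-preservation is fine only when it doesn't
    # conflict with the first two laws -- which is exactly what
    # the ordering of these checks encodes, so nothing more to do.
    return True
```

The point of the ordering is that an action which merely endangers the robot is still permitted, while anything that harms a human or disobeys an order is refused first.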

The point is coming now, provided I don't lose the train of thought in the meantime. Just bear with me here; it might turn out to be worth it. What I'm getting at is that because these robots are created with such restrictions, they should be predictable and incapable of developing consciousness.

What's consciousness? We consider ourselves conscious or sentient, but for the most part, we don't consider animals to be on a par with us. They might be intelligent, yes, but humanity would never consider them sentient, because then they'd somehow be our equals. We wouldn't go looking for sentient life on other planets if we thought we could find another sentient species at home. Yes, I'm rambling and making assumptions about things, but I think I'm right, or I hope I am, so try not to judge me.

So consciousness is what? During the Enlightenment, it was argued that humans possess Reason, something that animals apparently didn't. Unless you're looking at the Houyhnhnms in Gulliver's Travels, of course. Sentience for us seems to involve things like having a sense of self, possessing the capacity for reason or logic, creativity, and a grasp of abstract concepts. That's what I can think of, anyway.

So, robots. There was indeed a point to my mention of Asimov's writing, aside from expressing my love for it, and the clue should be in the title of this post: Consciousness and Robots. If we create a robot or AI, we create artificial consciousness. There's an idea that it's limited, unlike our own, the "true" consciousness. There's also an idea that this is not only all right but the way that things should be. We don't want to create a slave class that is truly equal to us, after all. If they're less than us, then that's perfectly okay. But if they might somehow wriggle out of the parameters we set for them and reach a level of consciousness equal to our own, then we get a bit panicky.

Consciousness + Robots = The doom of humanity

That's the way of thinking. What we create will destroy us, and we are afraid of this. There's a term for it: the Frankenstein Complex, the idea that your creation will turn upon you and destroy you and everything you love. Are we really as paranoid as all that? Damn right we are! We have Cylons in Battlestar Galactica, we have the robots from the Terminator films, V.I.K.I. in I, Robot (the Will Smith film this time), and the list goes on.

Are we justified in this way of thinking? If we create robots, will they inevitably turn upon us? I think if they were sentient, they might. I'm not saying "down with robots"; I'm far from a technophobe. I'm just saying that it's a real possibility. If you are a slave but you don't know any different, then would you bother to rebel? No, probably not. But if you're aware of the injustice, that you are no different from your masters and yet are being treated as inferior and forced to work, then might you not get a little bit annoyed and fight back?

I don't know why we have such a great fear of robots. I suppose it's the idea that they will be superior to us physically and therefore more capable of destroying us. We've treated other human beings in the same way and received the same retaliation. Some would argue that we didn't create those humans, but many colonies were set up and told that they'd been created by their colonisers; it's a similar situation. We've done the same to humans in the past, but robots are pseudo-humans, right? They aren't supposed to rise up. If you try to make something like yourself, don't be surprised if you get exactly what you wished for.

And I should probably stop now because this has become highly bizarre.