The inheritance


It seems that Mr Megasoft got over the whole atomic watch thing and had children with his girlfriend before vanishing into space on his space yacht. As the kids subsequently discover, their father had been intending to leave his money to a computer called Deep Thought.

The matter ends up in court, where the children’s lawyers argue that Mr Megasoft couldn’t leave his money to a computer because it isn’t alive. Mr Megasoft’s lawyers counter that, as a thinking machine, the computer should be given the same rights as organic thinking machines, and they even ask to be allowed to cross-examine the children to see whether they are thinking beings or merely producing programmed responses. They claim that Deep Thought can think and hold views of its own.

But can they prove it?

I don’t see how they could successfully prove such a claim. My laptop sits here on the desk and does what it’s programmed to do (well, at least as far as Vista works properly). It never offers any spontaneous advice; it’s unaware of the time of day or year (no, displaying such information doesn’t constitute self-awareness); it has no idea that it’s sunny outside at the moment or that I’m wearing a red T-shirt; it will neither know about nor perceive my absence while I go and buy something for lunch. Of course, no one reading this will notice that I was gone.

It can switch itself off, but it has no choice when that happens. It can’t switch itself on whenever it likes. It can’t reason or give opinions. It can’t lie (or, perhaps, produce any response contrary to whatever program it’s running unless that’s the result of a software bug). It has no emotional responses. If, say, I didn’t use it for a few days, it wouldn’t reproach me for neglecting it when I switched it on again or express pleasure that it was functioning again. It has no consciousness.


Enter the computer.

In the second part of the scenario, Deep Thought is brought into court and cross-examined. It explains how it would use Mr Megasoft’s legacy. The jury are impressed, but the children’s lawyers subject Deep Thought to more complicated questions, which it answers, even admitting when it is ignorant. (Hang on, doesn’t that make it as wise as Socrates? From which we must conclude that Socrates was a computer, and, since computers are inherently immune to poison, he could not have been executed successfully.)

So now the jury has to decide whether Deep Thought should be denied the money Mr Megasoft bequeathed it.

Does Deep Thought need something more than the appearance of consciousness to win this one?

As the discussion at the back of the book says, even the primitive computers of the late 1960s were able to produce responses which quite gulled the humans “conversing” with them. Nonetheless, this was partly a matter of programming and partly, it seems, some early social engineering, achieved by repeating key words back to the user, which, to a human, made it seem that the computer was on topic and responding in a sympathetic vein.
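That keyword-reflection trick is easy to sketch. The snippet below is only my own illustration of the general idea (the keywords, canned responses and the reply function are invented for the example, not anything described in the book): a program that spots a trigger word and echoes a canned, sympathetic-sounding prompt looks attentive without understanding a thing.

```python
import random

# Keyword-to-response templates, in the spirit of the 1960s chat programs:
# spotting a trigger word and returning a canned prompt creates the
# appearance of a conversation without any understanding at all.
TEMPLATES = {
    "mother": "Tell me more about your mother.",
    "money": "Why does the money matter so much to you?",
    "think": "What makes you believe that?",
}

FALLBACKS = [
    "I see. Please go on.",
    "How does that make you feel?",
    "Why do you say that?",
]

def reply(user_input: str) -> str:
    text = user_input.lower()
    for keyword, response in TEMPLATES.items():
        if keyword in text:
            return response
    # Nothing matched, so fall back on a vague prompt that keeps the human talking.
    return random.choice(FALLBACKS)

print(reply("I keep thinking about the money my father left."))
```

Anyone chatting with it for more than a minute would see through it, which is rather the point: the appearance of being on topic is cheap.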

To my mind, the appearance of consciousness isn’t sufficient for Deep Thought to be given de facto human rights and thus inherit the money. In fact, technology isn’t yet advanced enough for this to be an issue, though it may become significant in the future if it grows sophisticated enough for us to produce Cylons.

Tomorrow sees the beginning of a short series of picture posts featuring the art of M.C. Escher.
