The news was going around this summer that "A Supercomputer Has Passed the Turing Test for the First Time." That article has several issues, not the least of which is the misleading headline, and when you read through, you notice irregularities. The initial press release was quietly corrected to note that the program, "Eugene Goostman," was not a "supercomputer." The press release mentions that the program "pretends" to be a 13-year-old boy from Ukraine who is not fully fluent in English. Right away I see warning flags... but hey.

TURING TEST SUCCESS MARKS MILESTONE IN COMPUTING HISTORY

Let's set aside the fact that the "Eugene Goostman" claim is not the first of its kind. Let's set aside that the participants didn't even manage to "beat" the test by the biggest recorded margin, as they claim. Let's ignore that this program wasn't a "supercomputer" but a chatbot. Let's forgive that it passes the test by gaming the rules in a specific way. Set ALL of that aside. Let's say, just for the sake of argument, that by some miracle we have on our hands a truly sentient machine, and that this is not simply further evidence that the "Turing Test," as presently modified into a state Turing wouldn't recognize, is bunk.

If the machine is sentient, what then?

Someone once argued that a machine could not be sentient because anyone could walk by and pull the plug on it. Actually, I've seen this argument several times. I'd like to point out the core fallacy behind it.

Killing you won't prove that you're not sentient.

I can kill you. Egad! That wouldn't mean that you were never sentient. It would mean that I am now a murderer. You can't define thinking by "being able to shut it off."

Oh, but you can turn the computer back on, restore its program, and it will continue on as if nothing had changed! Ergo, NON-SENTIENCE!... No, again, sorry... Just because we understand the process well enough to repeat it doesn't mean we don't have a sentient machine.

So, I can kill you (man, is that scary, yikes), but we've already established in this scenario that doing so merely makes me a murderer. But let's say that in addition to my violent streak, I'm also a super-genius with an advanced understanding of human physiology. So, you're dead, but I rush your body into the lab, patch you up, and resuscitate you, no worse for wear. My behavior toward you is shocking and mystifying, and you have every right to be angry and alarmed. However, your death and resurrection don't somehow render you non-sentient. Nor does my killing you become "okay" just because I know how to bring you back again.

You can't define thinking by being unable to turn it back on, either.

Also, shutting a machine intelligence "off" and then turning it back "on" will be as shocking and mystifying a behavior to it as killing and reviving you would be to you.

So let us set aside me, the murderous super-genius (because damn! That guy is scary!). Instead, let's build the world's oddest ice arena: a large, flat arena, the floor of which is a sheet of four-inch-thick ice, and the walls of which are lined with enormous fans. Every morning a technician enters the arena, arranges thousands of curling stones on the ice, and turns the fans on them. At the end of the day, the technician records the positions of the stones and removes them. Weird.

Only by watching the daily records of this arena can we see that the stones move from day to day. They collide with each other. Each collision represents a logical operation. The arena and its surrounding fans are in fact a logic circuit, but only when the stones are present. Very slowly, over the course of decades, if we continue to monitor the daily records of this "device," we can see what it's calculating. This is a very slow computer.
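
If you want a feel for how collisions can stand in for logic, here's a toy sketch in Python. The lane names and the arena_gate function are invented purely for illustration; the trick is the one from the classic "billiard-ball computer" thought experiment: whether a stone is launched down a lane encodes a bit, and a collision in the shared crossing zone diverts stones into output lanes whose occupancy encodes the result.

```python
def arena_gate(stone_a: bool, stone_b: bool) -> dict:
    """One day's run of a single arena, read as a logic gate.

    Each input is "was a stone launched down this lane?"; each output is
    "was a stone found at this position at the end of the day?"
    """
    collision = stone_a and stone_b  # both stones present -> they collide mid-arena
    return {
        "deflected": collision,                   # reachable only via a collision: A AND B
        "straight_a": stone_a and not collision,  # A sailed through untouched: A AND NOT B
        "straight_b": stone_b and not collision,  # B sailed through untouched: B AND NOT A
    }

# The technician's end-of-day report is just these final positions:
for a in (False, True):
    for b in (False, True):
        print(a, b, arena_gate(a, b))
```

The point isn't the particular gate; it's that recording where the stones ended up is enough to read off the result of a computation.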

Time passes. The technician's job changes hands over the generations, but each technician performs his or her job flawlessly.

Now, at the end of every day, the technician takes that report and sends it off to corporate HQ, where it's combined with the reports of billions of other technicians running identical setups, differing only in small details of how the stones are arranged. This is a very slow neural network. It is spread across a multitude of planets in a multitude of galaxies. We have no idea what it's thinking, or even whether it's thinking at all. It works at too slow a speed for us to comprehend. However, its sheer complexity suggests, at a minimum, a staggering amount of computing power built from fans, ice, and curling stones.
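
To make that one-step-per-day pacing concrete, here's a deliberately crude sketch of the HQ side. Everything in it is an assumption made up for illustration: each arena's report is boiled down to a single bit ("did the marked stones collide?"), HQ feeds yesterday's reports back as today's stone arrangements, and the arena count and wiring table are invented. The only point is the clock rate.

```python
import random

NUM_ARENAS = 1_000  # a stand-in for "billions"
random.seed(0)

# Hypothetical wiring: each arena's next arrangement is determined by the
# previous-day reports of two other arenas.
wiring = [(random.randrange(NUM_ARENAS), random.randrange(NUM_ARENAS))
          for _ in range(NUM_ARENAS)]

# Day zero: the technicians' first reports, chosen arbitrarily here.
reports = [random.random() < 0.5 for _ in range(NUM_ARENAS)]

def next_day(reports):
    """HQ collates today's reports and issues tomorrow's arrangements.
    Here every arena simply ANDs the two reports routed to it."""
    return [reports[i] and reports[j] for i, j in wiring]

for day in range(1, 4):  # three ticks of a machine meant to run for decades
    reports = next_day(reports)
    print(f"day {day}: {sum(reports)} arenas reported a collision")
```

With nothing but AND gates the activity dies out within a few days, so this particular network isn't thinking anything interesting; what matters is that the whole assembly advances exactly one layer of computation per day, no matter how many arenas it spans.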

Each ice arena is a single logic gate. It isn't even a full-fledged Turing machine. It is impossible to call the ice arenas "sentient," as this property emerges only if and when they are considered as a whole. And that whole is vast and slow, and we cannot see it at our scale, let alone speak to it slowly enough, or listen to it patiently enough, to communicate.

Does the being composed of neural networks formed of ice arenas, fans, and curling stones know that among its components is a sentient technician who goes home at the end of the day to a spouse and a family? Are you aware that entire populations of interdependent living organisms live out their own lives within your body? You would die, or at a minimum become gravely ill, without them. They may not be sentient, but you have a microbial flora that you depend upon and that in turn depends upon you for its livelihood. This technician is the microbial flora of a being he or she interacts with but cannot ultimately comprehend or communicate with. And so a sentient being (the technician) does a day-to-day job, unaware that he or she is playing a role in the life of a vastly larger being. Likewise, that larger being has no knowledge of the individual technician or his or her work.

Then again, among the components of this being is also a massive corporate bureaucracy whose sole purpose is collating and tracking the state reports from each logic circuit. Is the sentience in the ice arenas, or have we created a thinking corporation? Or is the machinery irrelevant?

Are we capable of creating a sentient mechanism that we would nonetheless not recognize as a thinking being? If we somehow managed to fold the complete functioning of a business, or a government, into such a sentient mechanism, would we even know we had done so? Would the individuals who serve supporting roles in such a mechanism still be considered independent persons in their own right?

However, building bigger and slower machines isn't the technological direction we're headed in, is it? Instead, we're making machines that are ever smaller and ever faster.

So imagine: We build sentient machines the size of grains of dust and smaller.

Would they then not have the same issues with us as we would have with the ice-arena logic machine?