The Imitation Game

I propose to consider the question, "Can machines think?" This should begin with definitions of the meaning of the terms "machine" and "think." The definitions might be framed so as to reflect so far as possible the normal use of the words, but this attitude is dangerous. If the meaning of the words "machine" and "think" is to be found by examining how they are commonly used, it is difficult to escape the conclusion that the meaning and the answer to the question, "Can machines think?" is to be sought in a statistical survey such as a Gallup poll. But this is absurd. Instead of attempting such a definition I shall replace the question by another, which is closely related to it and is expressed in relatively unambiguous words.

The new form of the problem can be described in terms of a game which we call the "imitation game." It is played with three people, a man (A), a woman (B), and an interrogator (C) who may be of either sex. The interrogator stays in a room apart from the other two. The object of the game for the interrogator is to determine which of the other two is the man and which is the woman. He knows them by labels X and Y, and at the end of the game he says either "X is A and Y is B" or "X is B and Y is A." The interrogator is allowed to put questions to A and B thus:

C: Will X please tell me the length of his or her hair?

Now suppose X is actually A, then A must answer. It is A's object in the game to try and cause C to make the wrong identification. His answer might therefore be:

"My hair is shingled, and the longest strands are about nine inches long."

In order that tones of voice may not help the interrogator the answers should be written, or better still, typewritten. The ideal arrangement is to have a teleprinter communicating between the two rooms. Alternatively the question and answers can be repeated by an intermediary. The object of the game for the third player (B) is to help the interrogator. The best strategy for her is probably to give truthful answers. She can add such things as "I am the woman, don't listen to him!" to her answers, but it will avail nothing as the man can make similar remarks. 

We now ask the question, "What will happen when a machine takes the part of A in this game?" Will the interrogator decide wrongly as often when the game is played like this as he does when the game is played between a man and a woman? These questions replace our original, "Can machines think?" 
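
The game as described is, in effect, a protocol: two hidden respondents answer typed questions from an interrogator, who must then attach the labels X and Y to the right players. Below is a minimal sketch of that protocol in Python; the players truthful_b and deceptive_a and the random-guessing interrogator are illustrative stand-ins, not anything specified in the paper.

    import random

    # Hypothetical, simplified players: a respondent is just a function mapping a
    # question string to a typed answer string. Nothing below comes from Turing's
    # paper beyond the shape of the protocol.
    def truthful_b(question: str) -> str:
        """Player B helps the interrogator by answering truthfully."""
        answers = {"How long is your hair?": "It reaches well past my shoulders. I am the woman!"}
        return answers.get(question, "I can only tell you the truth.")

    def deceptive_a(question: str) -> str:
        """Player A (or the machine taking A's part) tries to cause a wrong identification."""
        answers = {"How long is your hair?": "My hair is shingled, and the longest strands are about nine inches long."}
        return answers.get(question, "I am the woman, don't listen to the other player!")

    def play_imitation_game(player_a, player_b, questions) -> bool:
        """One game: the interrogator questions X and Y over a text-only channel,
        then guesses which label hides A. Returns True if the guess is correct."""
        labels = {"X": player_a, "Y": player_b}
        if random.random() < 0.5:          # hide A behind X or Y at random
            labels = {"X": player_b, "Y": player_a}

        # Typed replies only, so nothing but the answers can inform the decision;
        # A's replies may be untruthful.
        transcript = [(label, q, labels[label](q))
                      for label in ("X", "Y") for q in questions]

        # Toy interrogator: both transcripts claim to be the woman, so guess at
        # random. A real interrogator would inspect the transcript.
        guess = random.choice(["X", "Y"])
        return labels[guess] is player_a

    if __name__ == "__main__":
        wins = sum(play_imitation_game(deceptive_a, truthful_b,
                                       ["How long is your hair?"])
                   for _ in range(1000))
        print(f"Interrogator identified A correctly in {wins}/1000 games")

Run over many games, a competent deceiver drives the interrogator's accuracy toward chance, which is exactly the benchmark Turing proposes for judging the machine that takes A's part.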

Notes:

Turing describes what would become known as the Turing test: a method for judging whether a machine's typed responses are indistinguishable from a human's, and hence whether it can be said to think.

Folksonomies: artificial life

Computing Machinery and Intelligence
Periodicals>Journal Article: Turing, Alan (October 1950), "Computing Machinery and Intelligence," Mind: A Quarterly Review of Psychology and Philosophy, 59(236): 433-460.


Triples

23 APR 2012

 Evolving Philosophy on Machines Imitating Humans

Why a Machine Cannot Fully Imitate a Man > Comparison > The Imitation Game
Descartes thought it impossible for a machine to replicate a human because, in his view, machines cannot learn. Turing lived in a world where machines could learn, so he invented the Imitation Game to determine how well a machine could think.