Computing Machinery and the Individual: The Personal Turing Test
Rollo Carpenter and Jonathan Freeman propose an advancement of the Turing Test - a challenge that calls for the identification of key features that define us not only as human, but as individuals.
(PRWEB) July 20, 2005
THE IMPERSONATION GAME
"Can machines be?" Can they be, for all intents and purposes, a specific human individual?
Alan Turing, in his 1950 paper, asked "Can machines think?" and proposed the Imitation Game, testing machine against human in the ability to converse. Turing wrote that "...the odds are weighted too heavily against the machine." In a new paper, Rollo Carpenter of Jabberwacky and Jonathan Freeman of i2 media research propose to weigh those odds more heavily still. We propose an Impersonation Game, and a "Personal Turing Test", in which the machine must convince the interrogator that it is a known human individual.
The new Game will be played, like the old, with a human, a machine and a remote interrogator. The human must be known to the interrogator, and the machine must impersonate that human. The interrogator may be remotely present via the web in a way that Turing is unlikely to have foreseen.
The Impersonation Game is not limited to text; it may involve whatever level of technologically mediated presence best supports the goal. The interface could be audiovisual, though the test may be passed without it. It could carry other subtle social communication cues, both those of face-to-face communication, such as gaze and expression, and representations of near-imperceptible information such as a person's internal emotional and physiological states.
To what degree should the human and the interrogator know each other? Should they have recently met, have known each other as colleagues, or socially? Should they be friends or family? To overcome this unknowable degree of acquaintance, a machine must, in order to pass the Personal Turing Test, play 100 Impersonation Games as 100 different people, each known to its interrogator to a different degree, and win 50% or more of them.
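As a rough illustration of that pass criterion only (the function name and example numbers below are ours, not taken from the paper), the aggregation across games could be sketched in Python as:

    # Hypothetical sketch of the proposed pass criterion: the machine plays
    # 100 Impersonation Games, each as a different known person, and passes
    # the Personal Turing Test if it wins at least half of them.
    def passes_personal_turing_test(game_results, required_rate=0.5):
        """game_results: one boolean per Impersonation Game, True where the
        interrogator judged the machine to be the known human."""
        wins = sum(1 for won in game_results if won)
        return len(game_results) >= 100 and wins / len(game_results) >= required_rate

    # Example: 53 wins out of 100 games would count as a pass.
    print(passes_personal_turing_test([True] * 53 + [False] * 47))  # True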
CHATBOTS & JABBERWACKY
Hundreds of software programs talk daily with the public over the web, and the term "chatbot" has been coined. Most such programs are variations on the work of a few originators, downloaded and edited. It is impossible for programmers to codify the infinity of language, and responses often take the form of avoidance and diversion. This is "light conversation", in which responses make sense at a surface level, yet are hard to confuse with those of a living human.
A possible exception is the Artificial Intelligence technology behind Jabberwacky.com. Despite clear descriptions, some visitors become convinced that they are talking to volunteers or other visitors - that there is no machine. This AI is different because it has learnt to talk by talking, from scratch, in any language, using the context of complete conversations. When you chat, it learns what you say and when, and may use your words on someone else. It's a positive feedback loop - an imitator of humanity at large.
Jabberwacky has a database of 5 million full sentences, which are replayed verbatim when selected. This huge simplification of language was chosen for performance, and to demonstrate the power of context and its relationship to human learning. It doesn't always make sense, especially in its current entertainment-centric guise. It can be illogical, inconsistent, contradictory, or plain silly - still too often "unexpected" to pass the Turing Test regularly. Yet that very unexpectedness is a spark of the chaotic that we humans all possess.
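To make the contextual-learning idea concrete, here is a deliberately minimal sketch, assuming a crude word-overlap measure of context similarity; the class and method names are illustrative, and this is not a description of Jabberwacky's actual selection mechanism:

    # Minimal, hypothetical illustration of learning-by-talking with verbatim
    # replay: every user utterance is stored with its context (what was said
    # to the user just before), and a reply is chosen by replaying the stored
    # sentence whose context best matches the current line.
    class ContextualChatbot:
        def __init__(self):
            self.memory = []  # (context, reply) pairs; replies are replayed verbatim

        def _similarity(self, a, b):
            # crude context match: word overlap between two utterances
            wa, wb = set(a.lower().split()), set(b.lower().split())
            return len(wa & wb) / (len(wa | wb) or 1)

        def respond(self, user_line, last_bot_line=""):
            # learn: the user's line is a plausible reply to what the bot just said
            self.memory.append((last_bot_line, user_line))
            if len(self.memory) < 2:
                return "Hello."
            # speak: replay the learnt sentence whose context best matches the user
            context, reply = max(self.memory[:-1],
                                 key=lambda pair: self._similarity(pair[0], user_line))
            return reply

In this toy form, whatever one visitor teaches may be replayed verbatim to the next - the positive feedback loop described above.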
We believe that with 10 million entries, Jabberwacky will pass as human most of the time for most people. If given the serving power of search engines, with billions of entries, it would become a new mode of human entertainment and communication.
PERSONALISATION
The AI that powers Jabberwacky is becoming an impersonator of individuals. People can teach it their name, age, sex, location, work, interests, favourite topics, languages, and word usage patterns. One may choose to talk to "people" that meet any criteria of one's choice, or to any individual who allows themselves to be "published".
The AI will take the Personal Turing Test. Passing it will be a considerably greater challenge. Teaching it 10 million entries would take one person a lifetime, but fortunately, humans are predictable. When selecting what to say, the AI will favour the teaching of the individual, dropping back as necessary to things said by those most similar. Around 200,000 responses will be needed - the far more manageable work of about one year.
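A hedged sketch of that fallback strategy, in the same illustrative spirit (the pool structure, the 0.5 cut-off and the matcher are our assumptions, not details from the paper):

    # Hypothetical sketch of the selection strategy described above: prefer
    # responses taught by the impersonated individual, falling back first to
    # the most similar people, then to the general pool, whenever the
    # individual's own entries do not cover the current context.
    def choose_response(context, person, pools, similar_people, matcher):
        """pools: dict mapping a person's name (or 'general') to a list of
        (stored_context, response) pairs; similar_people: names ordered from
        most to least similar; matcher(a, b) returns a similarity score."""
        for source in [person] + similar_people + ["general"]:
            scored = [(matcher(context, c), r) for c, r in pools.get(source, [])]
            good = [item for item in scored if item[0] > 0.5]  # arbitrary cut-off
            if good:
                return max(good)[1]  # best-matching response from this pool
        return None  # nothing suitable has been learnt yet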
NEW DOMAINS
A key feature of AI is the predictive power of experience. As Jabberwacky can learn how people in general, and particular people, respond verbally, a similar algorithm will learn how individuals "are", verbally and non-verbally.
This raises the exciting, if not disruptive, prospect of multiple, convincing, distributed selves, and the manifold applications such technological innovation promises - from satisfying remote communications, to partner vetting, to salespeople 'being with' many clients at once. The prize for the developers of the machines that will pass the Personal Turing Test is clear.
FUTURE CONCLUSION
In 1950 Alan Turing predicted that the Imitation Game would be won within 50 years. The year 2000 has sailed by, yet we believe that the Turing Test will be passed by 2015, and that contextual learning techniques are those most likely to succeed.
So... "Can machines think?" The machines that pass will not actually think, but will be superb imitators that have borrowed the interaction skills of millions of people, and achieved sufficiently high-fidelity playback. Yet a non-human form of understanding of language will have emerged. Believably 'human' machine communicators will entertain, accompany, educate and assist.
The Personal Turing Test is much more challenging, yet an accelerating pace of technological change will deliver a pass within 20 years, by 2025, with profound implications for our futures: our privacy, communication, education, work and location. It may even be a step towards virtual life extension.
So, again... "Can machines be?" If a machine can pass the Personal Turing Test, it too can be an individual in its own right. As we understand it, it won't "think" or have "emotions", yet it will appear to, and it will be complex beyond analysis, as a brain is.
What rights should and will it be afforded?
###