Tuesday, 22 December 2015

The imitation game

In my kitchen, my devices 'talk' to me. My American fridge lets me know if I have left the door open too long (which is often), and my microwave tells me if I have heated something up and then forgotten to remove it. I can communicate with my central heating from anywhere in the world via my phone. Siri and Cortana talk to us through our mobile phones (but don't let them speak to each other - if they do, all you get is nonsense), and many people now also have conversations with their in-car Sat Navs. Just a generation or two ago, such experiences would have been science fiction. Now they are commonplace, and we take our conversations with technology for granted.

I say conversations, but I really mean basic interaction.

With the best will in the world, you could never have an intelligent conversation with these tools - they merely alert you to what you need to know, or enable you to maximise your use of technology. You can have some fun though. Ask Siri what to wear for Halloween and you're likely to laugh out loud. It may respond with 'Go as an eclipse. Just dress in black and stand in front of people.' Or it may respond with 'Just go as yourself, pumpkin.' But Siri hasn't developed a sense of humour, nor is it about to embark on a lucrative career as a stand-up comedian. It's simply doing what its coders have told it to do.

Alan Turing
Such intelligent personal assistants blindly follow algorithms to function, and they have been doing so for a long time. In the 1980s I discovered a program called Basically Eliza, based on Joseph Weizenbaum's original Eliza, and obtained a version for the BBC Computer. It had been written to mimic a psychiatric consultation - a sort of early demonstration of artificial intelligence. The British computer scientist Alan Turing had argued that for a computer to be considered 'intelligent', it would need to imitate a human so well that people would believe they were conversing with a fellow human being.

Eliza responded to whatever you typed at the keyboard using a simple string matching algorithm. It would first ask you to state your problem. Mainly, it reflected your statements back to you as questions. Occasionally it went a little further: if, for example, you told it you had a family problem, Eliza would ask how many brothers or sisters you had, or it might ask about your marital status. But this wasn't a ghost in the machine or an emerging computer intelligence - it was simply Eliza following a routine in its code. When I showed it to my nursing students back in 1985, such a semblance of artificial intelligence made quite an impression on them.*
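For anyone curious about just how little machinery is needed, here is a minimal sketch, in Python rather than the BBC BASIC of the day, of the kind of keyword matching Eliza relied on. The rules, wording and reply() function below are my own illustrative guesses, not Weizenbaum's code or the BBC version:

```python
import random
import re

# Each rule pairs a keyword pattern with a few canned replies; anything
# that matches no rule at all is simply reflected back as a question.
RULES = [
    (re.compile(r"\b(family|brother|sister|mother|father)\b", re.I),
     ["How many brothers or sisters do you have?",
      "Tell me more about your family."]),
    (re.compile(r"\bI feel (.+)", re.I),
     ["Why do you feel {0}?",
      "How long have you felt {0}?"]),
    (re.compile(r"\bI am (.+)", re.I),
     ["Why do you say you are {0}?"]),
]

def reply(statement: str) -> str:
    """Return a canned response for the first rule that matches."""
    for pattern, responses in RULES:
        match = pattern.search(statement)
        if match:
            return random.choice(responses).format(*match.groups())
    # Nothing matched: reflect the statement back, Eliza-style.
    return "Why do you say that?"

if __name__ == "__main__":
    print("Please state your problem.")
    while True:
        line = input("> ")
        if line.strip().lower() in ("bye", "quit"):
            break
        print(reply(line))
```

That really is all there is to it: no understanding, just pattern matching and a pinch of random selection.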

So I rewrote the algorithm to be verbally abusive.

Instead of Basically Eliza, the psychiatric consultant, it became Dr Fraud, the psychiatric insultant. In itself, Dr Fraud was fairly meaningless. But my students loved it. They queued up to use it, and laughed as they were continually insulted by a machine. The longer they sat there, the worse the insults became. Everyone was intrigued by this demonstration of artificial intelligence, and in turn it switched them on to other, more educational programs on the menu. It was a gateway into computer-supported learning. Soon, the computer suite had become one of the most popular places in the entire nursing college.
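Looking back, the Dr Fraud rewrite involved no new intelligence whatsoever. In a sketch like the one above, the entire personality transplant amounts to swapping the response templates - something like the following (again my own hypothetical wording, not the original BBC BASIC):

```python
import re

# Hypothetical 'Dr Fraud' rules: the matching loop stays exactly the same,
# only the canned replies change from reflective questions to insults.
INSULTANT_RULES = [
    (re.compile(r"\bI feel (.+)", re.I),
     ["You feel {0}? How very predictable.",
      "Everyone feels {0} sometimes. You are not special."]),
    (re.compile(r"\b(family|brother|sister)\b", re.I),
     ["I pity your family."]),
]

# Fallback when nothing matches - still an insult rather than a question.
DEFAULT_RESPONSE = "I've heard more interesting problems from a vending machine."
```

Making the insults escalate the longer a student sat there would be nothing more than a counter in the same loop.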

Personal assistant software is a little more advanced than it was in the 1980s, but not that much. It leads me to wonder just how advanced it will need to be before it can convince us that we are talking to a human rather than a machine. But that's another discussion, another (Turing) test, and something to look out for in our future. How do you see personal assistants developing in the future? What are your thoughts? The comments box below awaits you (... and I will respond intelligently, I promise).

* The full story about my work programming BBC computers can be read here.

Photos by N3WJack and Parameter_bond on Flickr

The imitation game by Steve Wheeler was written in Plymouth, England and is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.
