We’ve always been afraid of the robots. Afraid that one day, somebody will invent the algorithm that will turn our particular life’s work into a machine-readable, endlessly repeatable process.
It’s inevitable. The robots always catch up. If your job is to follow a set of rules, you’re doing a robot’s job, and they want what’s theirs.
Humans automate. We’re good at simplifying, working out procedures to solve problems. What makes us human is being able to solve the problem that very first time. Then we refine our solution, making it neater and cleaner, and with each refinement, we get a little closer to perfecting the algorithm we can teach our robots. We get worried that once the robots learn how to do everything, there won’t be a place for us anymore. And yet, we keep teaching the robots. Teaching the robots is something else that makes us human.
In each round of the Loebner Prize, a judge engages in textual conversation for five minutes with a chatterbot and a human being. At the end of the round, the judge must make the call: which entity was human, and which was machine? We understand (if only vaguely) how a researcher might teach a machine to ‘speak’, and how a well-tuned algorithm could, conceivably, fool a judge into believing the machine they’re talking to is a real person. But what if you are the real person, chatting with the judge? How do you convince the judge that you’re not the robot? You need to make sure you do the human dance as well as you possibly can. You need to make sure you’ve out-danced the robot, because somebody else has been teaching the robot all your best moves.
Just as we teach the robots, technology teaches us what we’re not. We aren’t robots. We’re something more… but what are we, exactly? The only way we can tell for sure is by dancing the human dance, inventing new steps and elaborate moves, giving each twist and shake more nuance, giving it all more soul, recognising that the robots will soon be able to dance this version of our dance, by which time it won’t be our dance any longer, and we’ll have moved on to a version that’s even stranger, more intricate and more complex.