Chatbots faking it as they pretend to be us

The idea of a far-distant future where we can upload our very essence to some digital repository in the ether is both compelling and repulsive in equal measure, but a less sci-fi alternative may be closer than we think.

According to New Scientist, chatbots - the online programs that attempt to mimic human conversation - could soon learn to talk like us so well that they may as well be us.

Let's pretend

Existing services like MyCyberTwin can take the answers to a personality questionnaire and use them to masquerade as a real person in text chats on services like MSN Messenger.

MyCyberTwin uses 79 questions - which may seem like a lot - to get to the core of what makes each person tick. But there are chatbots in development that use many more: 20,000 in the case of this one, which you can download to keep you busy on these cold winter nights.

Better still, future chatbots could take years of written records of conversations we've had - whether in chat or email - and analyse the lot to extract a synthetic version of the way we communicate. Once that happens, would there be any real difference between the imposters and us?
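The report doesn't say how that analysis would work, but a crude version of the idea is easy to imagine: boil a person's chat history down to a word-frequency fingerprint, then check how closely a new message matches it. This is only an illustrative sketch - the function names and the approach are my own, not anything MyCyberTwin or the researchers describe:

```python
from collections import Counter
import math

def style_profile(messages):
    # Toy stylistic fingerprint: relative frequency of every word
    # across everything a person has written.
    counts = Counter()
    for msg in messages:
        counts.update(msg.lower().split())
    total = sum(counts.values())
    return {word: n / total for word, n in counts.items()}

def similarity(p1, p2):
    # Cosine similarity between two profiles; 1.0 means an identical
    # word-usage pattern, values near 0 mean very different "voices".
    words = set(p1) | set(p2)
    dot = sum(p1.get(w, 0.0) * p2.get(w, 0.0) for w in words)
    norm1 = math.sqrt(sum(v * v for v in p1.values()))
    norm2 = math.sqrt(sum(v * v for v in p2.values()))
    return dot / (norm1 * norm2)

alice = style_profile(["hey hey what's up", "hey you around"])
bob = style_profile(["good morning", "good afternoon to you"])
```

A real system would need far more than word counts - phrasing, timing, topics, tone - but even this toy version shows the principle: the more text you feed it, the sharper the fingerprint becomes.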