How ‘Lolita style’ virtual robots pose as teenage girls to expose online paedophiles
Scientists have designed a virtual robot that poses as a 14-year-old schoolgirl to lure and trap paedophiles on social networking sites.
The highly sophisticated conversational agent, or ‘chatbot’, known as Negobot, is a computer program capable of tricking potential sexual predators into thinking she actually exists.
A police force in Spain, where she was made, is now looking at ways in which it could employ the undercover cybercop.
Placed in a forum where a paedophile is thought to be lurking, Negobot starts off as a passive and neutral participant in general online conversations.
As conversations become more intimate or suggestive, and the ‘target’ begins to employ grooming tactics, Negobot’s behaviour changes in ways designed to draw them in.
The chatbot can appear insistent or offended, and if the target tries to obtain personal information, Negobot will in turn attempt to find out more about the suspect.
Scientists used game theory, a mathematical system of strategic decision making, to make its behaviour more convincing.
Negobot has also been programmed with a split personality, comprising seven different conversation patterns, to add to the illusion.
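The description above — a neutral start that escalates when grooming tactics appear, steered by game-theoretic strategy and varied across seven personas — suggests a level-based dialogue policy. A minimal, purely illustrative sketch in Python (the level thresholds, cue words, and personality labels here are invented for illustration, not Negobot’s actual design):

```python
import random

# Seven conversation patterns, as the researchers describe; these
# labels are hypothetical stand-ins, not Negobot's real personas.
PERSONALITIES = ["shy", "chatty", "moody", "curious", "offended",
                 "insistent", "naive"]

# Illustrative grooming cue words; the real system's detection is
# far more sophisticated than a keyword match.
GROOMING_CUES = {"age", "alone", "secret", "photo", "meet"}


class ToyDialoguePolicy:
    """Toy escalation policy: neutral bystander -> engaged -> probing."""

    def __init__(self, rng=None):
        self.level = 0                      # 0 = passive, neutral participant
        self.rng = rng or random.Random()

    def observe(self, message):
        """Raise the suspicion level when grooming cues appear."""
        words = set(message.lower().split())
        if words & GROOMING_CUES:
            self.level = min(self.level + 1, 3)
        return self.level

    def stance(self):
        """Map the current level to a conversational stance."""
        if self.level == 0:
            return "passive"                # stay neutral in general chat
        if self.level == 1:
            return self.rng.choice(PERSONALITIES)  # vary the persona
        if self.level == 2:
            return "insistent"              # keep the target engaged
        return "probing"                    # try to elicit identifying details
```

The real system reportedly uses game theory to choose among strategies at each level, rather than the fixed lookup shown here.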
Other tricks include inserting ‘typos’, abbreviations, and deliberate language errors to mimic the actions of a young teenager.
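Injecting slang and deliberate errors of the kind described is straightforward to imitate. A hedged sketch (the substitution table and error rate below are invented for illustration; Negobot’s actual tables are not public):

```python
import random

# Invented teen-speak abbreviations, for illustration only.
SLANG = {"you": "u", "your": "ur", "to": "2", "for": "4",
         "see": "c", "are": "r", "okay": "k"}


def teenify(text, rng, typo_rate=0.2):
    """Apply slang substitutions and occasionally swap adjacent letters."""
    out = []
    for word in text.lower().split():
        word = SLANG.get(word, word)
        # With some probability, introduce a transposition 'typo'
        # in longer words, mimicking hurried teenage typing.
        if len(word) > 3 and rng.random() < typo_rate:
            i = rng.randrange(len(word) - 1)
            word = word[:i] + word[i + 1] + word[i] + word[i + 2:]
        out.append(word)
    return " ".join(out)
```

For example, `teenify("are you okay", rng)` becomes `"r u k"`; longer words are occasionally scrambled, so the output is deliberately non-deterministic unless a seeded generator is supplied.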
One of Negobot’s creators, Dr Carlos Laorden, from the University of Deusto, said: ‘Chatbots tend to be very predictable.
‘Their behaviour and interest in a conversation are flat, which is a problem when attempting to detect untrustworthy targets like paedophiles.
‘What is new about Negobot is that it employs game theory to maintain a much more realistic conversation.’
In tests, not only could the program resume discussions separated in time, but it was able to ‘take the lead’ in conversations and use slang expressions.
The team had a collaborative agreement with the Ertzaintza, the Basque Country police force, which has shown ‘considerable interest’, said Dr Laorden.
‘Negobot has already been implemented and trialled actively on Google’s chat service and could also be translated into other languages,’ he added.
‘We do not rule out bringing it to new channels in the future, and we believe it could be a very useful tool for social networks to incorporate.’
Despite having a high degree of artificial intelligence, Negobot’s powers of conversation are still limited, the researchers point out, and it is unable to detect linguistic subtleties such as irony.
This flow diagram demonstrates how Negobot goes about having a conversation with potential predators on social networking sites. It has been programmed with seven different personality types and can add typos to conversations, as well as feigning offence, to appear more lifelike.