School of Systems Engineering
How the Good Life is Threatened in Cyberspace
Huma Shah | Kevin Warwick

Introduction

Deception-detection project: The extent to which the good life has been threatened in cyberspace by artificial dialogue systems (chatbots) designed to deceive and defraud is not yet known. Since January 2012, over a hundred human participants (75% male, 25% female; age range 13-65) have been involved as Turing test judges in Reading University's Alan Turing centenary project. The judges' aim: to detect deception via text-based conversation, distinguishing the machines (hidden from hearing and view) that imitate human conversation (Figure 4).

Figure 4: Turing Test

Chatbots in E-commerce

Virtual assistants: Chatbots are increasingly used in cyberspace as virtual humans. These systems provide conversational, natural-language information seeking to augment the keyword search function on e-commerce websites. The best-known virtual assistant is Ikea's Anna (Figure 1), which converses concurrently with thousands of users across the Internet in at least six languages. Anna has helped the Swedish furniture company increase sales from its online catalogue while driving down costs. Among many other companies, including Asda, SKY.com deploys a virtual assistant, Ella (Figure 2), which in the blink of an eye extracts answers from the website's Help & Support section to help customers resolve problems or get answers to queries about Sky's service.

Figure 1: Ikea's Anna
Figure 2: Sky's Ella

Chatbots used in deception: Hiding across the Internet in chatrooms are flirtbots such as CyberLover (Figure 3): computer programs developed to form relationships with multiple human users by mimicking humanlike conversation. The flirtbots' purpose is malfeasant:
• to draw in the susceptible and deceive them into believing they are engaging with another human,
• to steal identity, and
• to conduct financial fraud.

Figure 3: CyberLover

Internet immersion

Before participation, all judges completed a short questionnaire to ascertain their depth of Internet immersion. Table 1 displays the answers: 90% of participants used social media; 28% had experience of chatbots; 18% had experienced a stolen ID or misuse of their bank card; 16% used the same password across all media channels.

Participants: >100
Use social media?               90%
Experience of chatbots?         28%
Use same password?              16%
Stolen ID / bank card misused?  18%

Table 1: Participants' feedback

Results

One machine fooled almost 30% of the judges into thinking it was human (see Graph 1). The five machines' deception rates were 25.00%, 12.50%, 29.17%, 20.83% and 14.58%.

Graph 1: Machine deception

Conclusion

Some people are easily fooled. Education is needed to ensure that, as the sophistication of chatbot conversation increases, the good life remains with users and not with cybercriminals.

Acknowledgements

• European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 289092, RoboLaw – Regulating Emerging Robot Technologies in Europe: Robotics Facing Law and Ethics.

Contact information

• School of Systems Engineering, University of Reading, Whiteknights, RG6 6AH
• Email: h.shah@reading.ac.uk | www.reading.ac.uk/sse/about/staff/h-shah.aspx