Social robots are set to become a major trend in the coming decades. They may look humanoid and even endearing, but these "social" machines have a hidden face and can pose numerous privacy and security risks.
Would you allow a stranger into your building? What if that stranger was a robot? Would you let a stranger take your picture? What if a robot asks you?
The cybersecurity company Kaspersky and experts from the University of Ghent (Belgium) conducted a study and found that robots can effectively extract sensitive information from people who “trust” them.
They also verified that the presence of a robot can strongly influence people's decisions, inclining many, for example, to grant it access to a building.
Increasingly, industries and households rely on automation and on robotic systems capable of providing "social" services, and various studies suggest that these will be widespread by mid-century, although only among classes with greater purchasing power.
At the moment, most of these systems are in the academic research phase, but this study has delved into the social impact and potential dangers of robots in their interaction with people.
The work done at Ghent University focused on the impact of a robot designed and programmed to interact with people through human "channels" such as language and non-verbal communication. Tests were carried out with fifty people, and the experts verified how the robots were able to enter restricted areas and extract sensitive information from participants.

One of these "social" robots was positioned near the security entrance of a mixed-use (residential and office) building that can only be entered through doors with access readers. Although most people denied the machine entry, 40 percent complied with its request and allowed it to pass.
When the robot posed as a pizza delivery person, holding a box from a well-known food delivery brand, most people allowed it in and did not question its presence or its reasons for needing to enter the building.
The second part of the study focused on obtaining personal information through a robot that engaged in friendly conversation; the researchers found that it was able to extract personal details at a rate of roughly one piece of information per minute.
The researchers thus corroborated that "trust" in robots, and especially in "social" robots capable of interacting with humans, is real, and that these could therefore be used to persuade people to do something or to reveal sensitive information; the more "human" the robot, the more power it has to persuade and convince.
The British researcher David Emm, principal security researcher at Kaspersky, has stated that there is "indeed" a potential security problem related to the use of robots.
In statements to Efe, Emm has observed that fully equipped robots are still in the research phase “but there is already a growing number of smart devices deployed in the home.”
"People are very vulnerable when they are in a familiar environment; they tend to overlook the sensitivity of the information these devices hold and even go so far as to share data with them that they probably would not be willing to enter on a paper form or upload to a social network," the cybersecurity specialist pointed out.

In his opinion, this will be accentuated when that domestic assistant is a humanoid robot that ends up becoming a "friend", because the developer of the machine can design it to collect sensitive information, as is already the case, he warned, with smart speakers.
It will take, according to David Emm, much more research to establish conclusively that people trust robots more than other people, but the studies already known reveal a significant level of trust, "probably enough for the attackers of the future to feel that it is worth looking for vulnerabilities".
Like all technology, robots can become "double-edged swords": alongside the benefits they can bring to people, there is the possibility of accessing data that is very valuable to organizations and companies for commercial purposes, "and to criminals", Emm has corroborated.
He has also pointed out that all machines, robots included, are programmed by humans, and that this programming can always carry biases "unless positive measures are adopted to minimize those risks and their impact when they are deployed".
David Emm has warned that this is already happening today with machine learning systems (the ability of many machines and devices to learn from experience) and is convinced that it will also happen in the future with fully equipped robots.