<p class="title">Interactive robots can emotionally manipulate people, say scientists who found that we have a strong tendency to ascribe human-like attributes to machines.</p>
<p class="bodytext">Researchers from the University of Duisburg-Essen in Germany asked 89 volunteers to interact with a human-like robot under the guise of helping it become more intelligent.</p>
<p class="bodytext">At the end of the interaction, the volunteers were asked to turn off the robot. However, the robot was programmed to beg them not to do so.</p>
<p class="bodytext">The robot also displayed bodily actions meant to bolster its request, 'Tech Xplore' reported.</p>
<p class="bodytext">Some volunteers served as controls: they were asked to turn off the robot but did not experience begging from the humanoid.</p>
<p class="bodytext">As many as 43 of the volunteers faced a choice between complying with the researchers' request and the robot's.</p>
<p class="bodytext">Thirteen volunteers chose to heed the robot's wishes, and the rest took longer to turn off the robot than volunteers in the control group.</p>
<p class="bodytext">The findings indicate that humans have such a strong tendency to anthropomorphise robots that we can fall prey to emotional manipulation, the researchers said.</p>
<p class="bodytext">Each volunteer was interviewed after interacting with the robot; those who had refused to turn it off were asked why.</p>
<p class="bodytext">The researchers report that many of the volunteers refused simply because the robot asked. Others reported feeling sorry for the robot or were worried about doing something wrong.</p>