Want to get along with robots? Pretend they’re animals


WIRED: I wanted to talk about navigating relationships with domestic robots or companions, especially when it comes to empathy and developing fairly complex relationships. What can we learn from what we have been doing for thousands of years with pets?

KD: One of the things we’ve learned from examining the history of pets and other emotional relationships we’ve developed with animals is that there is nothing inherently wrong with them, which is where people often immediately jump with robots. They immediately say, “This is wrong. It’s fake. It will harm human relationships.” So I think comparing robots to animals immediately shifts the conversation, where people are like, “If it’s more like a pet rabbit, then maybe this isn’t going to destroy my child’s friendships.”

One of the other things we have learned is that animals, even in the companionship realm, are actually very useful for health and education. There are methods of therapy that have genuinely improved people’s lives through emotional connection with animals. And that shows there may be some potential for robots to help in a similar, but different, way: again, as some sort of new breed. It’s a new tool, something new that we might be able to harness and use to our advantage.

One of the things that was important for me to put in the book is that robots and animals are not the same. Unlike animals, robots can reveal your secrets to others. And robots are created by companies. There are a lot of issues that I think we tend to overlook, or forget, because we’re so focused on this question of human replacement. There are a lot of issues with putting this technology in place in the capitalist society we live in and letting corporations decide freely how they use these emotional connections.

WIRED: Let’s say you have a domestic robot for a child, and in order to unlock some kind of feature, you have to pay extra money. But the child has already developed a relationship with this robot, which, as you argue, harnesses emotions and exploits the bond the child has formed with the robot in order to make you pay more.

KD: It’s a bit like the whole in-app purchases scandal that happened a while ago, but it would be that on steroids. Because now you have that emotional connection, where it’s not just a kid who wants to play an iPad game; the kid actually has a relationship with the robot.

For children, I’m actually less worried, because we have so many watchdog organizations looking out for new technology that tries to exploit children, and there are laws that protect children in many countries. But what’s interesting to me is that it’s not just kids; you can exploit anyone that way. We know that adults are likely to reveal more personal information to a robot than they would willingly enter into a database. Or if your sex robot has compelling enough in-app purchases, that could be a way to really tap into consumers’ willingness to pay. And so I think there needs to be broad consumer protection, for privacy reasons and for reasons of emotional manipulation. I think it’s entirely plausible that people will spend money to keep a robot “alive,” for example, and that companies will try to exploit that.

WIRED: So, what does the relationship between robots and humans look like in the near future?

KD: The Roomba is one of the very simple examples, where you have a robot that is not very complex, but it’s in people’s homes and it moves around on its own. And people name their Roombas. There are so many other cases, like military robots. Soldiers worked with these demining units and began to treat them like pets. They gave them names, they gave them medals of honor, they would have funerals with gun salutes, and they really related to them in a way similar to how animals have been an emotional support for soldiers in intense situations throughout history.


