Why you should be nice to your robots

“it’s hard to bark orders at a machine without feeling like the kind of obnoxious person who barks orders at waiters”

[…]

“We love our Amazon Echo… but I fear it’s also turning our daughter into a raging asshole,”

[…]

“How’s a four-year-old supposed to learn that other household members aren’t simply there to do her bidding, when one (electronic) household member was designed to do exactly that?”

[…]

“Such worries will grow more urgent as we interact with more convincingly humanesque devices. As the tech writer John Markoff puts it: ‘What does it do to the human if we have a class of slaves which are not human, but that we treat as human?’ Most of us would agree with Immanuel Kant that it’s unethical to treat others as mere means to our own ends, instead of ends in themselves. That’s why slavery damages the slaveholder as well as the slave: to use a person as if they were an object erodes your own humanity. Yet Alexa (like Google Home, and Siri, and the rest) trains us to think of her as both human yet solely there to serve. Might we start thinking of real humans that way more frequently, too?”

(source accessed 8.7.2016)