Worried about the robo-dogs who can open doors?

While kids are usually scared not so much of the things they can see but of the things they can't (like a monster under the bed), adults seem only to be adequately frightened of the things we can confirm with our own two eyes -- and much too willing to ignore the things of which we have no visual evidence.

Perhaps that is why people are having a hard time digesting the video posted this week showing two robot "dogs" from Boston Dynamics cooperating to open a door and pass through it.
The reactions on social media have been exactly what one might imagine, especially from a culture that has been indoctrinated by movies like "Terminator": People are quick to declare the video frightening and almost as quick to rate it some kind of existential threat to human life as we know it.

But just because the reaction is common doesn't mean it's right. Sure, there's something a little unsettling about a machine that resembles a dog in so many ways (except, critically, having a head) doing something sophisticated that audiences were once terrified to watch a velociraptor pull off in "Jurassic Park". It's a case where fears programmed into us by popular culture overlap with the existence of something that falls into the "uncanny valley".

The fear, though, is misplaced. Robots, when programmed by conscientious technicians and scientists, have the potential to do amazing things -- like acting as reliable, hypo-allergenic, ultra-capable companion animals for disabled humans. After the recent kerfuffle over someone's attempt to travel with an emotional-support peacock, we really ought to be more open to the idea of people getting their help from predictable, programmable machines. Could robotic dogs be used for violence? Sure -- much of the research funding for robotics in general has come from DARPA, so obviously there's a military use for the technology. But virtually all of the applications that make these things attractive for use by and with soldiers also make them valuable for civilian use, from carrying heavy loads to navigating difficult terrain to, yes, opening doors.

But beyond the potential for robots to do a great deal of good for human beings who need the help, we're also misplacing our anxiety on things we can see instead of directing it toward truly frightening things that we can't.

For instance: More than two billion people are now regular users of Facebook. That's bigger than the population of any single country on the planet -- in fact, it's roughly the population of the two largest countries combined.

But have you ever read the terms and conditions of your use of that site? For example, term 2.1 of your relationship with Facebook says, "[Y]ou grant us a non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content that you post on or in connection with Facebook". Facebook undoubtedly would argue that those terms are necessary for them to be able to show your content on their applications everywhere. But a plain reading of that text also says "If you post a picture or a video here, we can do with it whatever we want, wherever we want, anytime, for free."

And it's not just your relationship with Facebook that's like this. Google's tools are constantly being refined by your very use of them -- from voice recognition to facial recognition. Snapchat can be set to track your location in real time. And if you don't think there are organizations everywhere trying to put artificial intelligence to work figuring out a deep profile of you and your behavior, your wants and needs, and even your insecurities, then you're simply putting your head in the sand.

The point here is that it's perfectly easy to find reasons to turn what will almost certainly be an innocuous and ultimately very helpful technology into something scary -- but, apparently, only if we can see it (as we can see the robot dogs). Meanwhile, we shouldn't paper over the truly frightening things that we choose blindly to accept just because we're too lazy, overwhelmed, or uninterested to read the terms and conditions. Sure, they're much harder to visualize -- but given the choice between privacy-encroaching terms and conditions that are almost impossible to avoid (at least, if you want to be an active participant in popular culture) and an assistive technology like a robotic dog that can open doors, I'll take the robo-pup any day.

Brian Gongol