Mark Loveless, aka Simple Nomad, is a researcher and hacker. He frequently speaks at security conferences around the globe, gets quoted in the press, and has a somewhat odd perspective on security in general.

Robot Death and Data Lessons

Photo by Franck V. on Unsplash

This year has so far seen the end of two small robots: Jibo and, more recently, Anki's Vector. For anyone following advances in social companion robots, this is a setback for the entire movement. The problem with their demise is that after owners paid good money for these products (they weren't cheap), the robots' main functions, such as facial and voice recognition, were cloud-based. Cloud providers don't give much away for free, so when the companies paying for those cloud instances stop paying their bills, the instances go away too. This means that Jibo and Vector will no longer work (or will perform only a fraction of what they could before), rendering them useless but expensive reminders of the dangers of early adoption.

Most of us who work in security regularly discuss protecting data and enforcing policies to protect user privacy. If someone were to come along and purchase Anki's assets, they could get Vector up and running again. Maybe they would use the code to build a better (or completely new) Vector. Or maybe they would take all of the user data and dream up evil advertising schemes to exploit it.

Just because a startup has a great data policy does not mean that the venture capital firm trying to recover some of its investment won't sell that user data to an advertising firm during the post-closing fire sale. There may be laws protecting some of that data, but they may apply only to the company itself - there may be nothing applicable to asset liquidation. My best guess, from what little investigation I did, is that if anything is out there, it is worded too vaguely and is awaiting some precedent-setting court case to interpret it.

So who gives two shits about a couple of things called "social robots" anyway? I do, but not for the reason you might think.

Imagine the product is not a social robot - imagine it is something else. Maybe a service that uses a camera and has taken a lot of images of you, or images of parts of you. Maybe it is a product that has gathered a lot of personal data on you to construct shopping lists, manage errands, track money, or whatever. Let's even forget about a hardware product - this could be an app on your phone. As someone who has reverse engineered apps and played with weird IoT hardware, I can tell you that they often gather and upload all kinds of personal data: location information, contacts, photos, you name it. And it is all protected by that Terms of Service written in Lawyer. Yes, the company in question may have a great approach to security and privacy, and you may feel safe sharing your data with them. But think about the lifetime of that company, and about where that data ends up.
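To make that concrete, here is a toy Python sketch of the kind of "telemetry" bundle a chatty app might assemble before uploading it to its vendor's cloud. Every field name and value here is invented for illustration - real apps vary widely, and this is not the payload of any particular product.

```python
import json

# Hypothetical example only: the sort of personal-data bundle a mobile or
# IoT companion app might assemble and upload. Field names are invented.
def build_telemetry_payload(device_id, location, contacts, photo_hashes):
    """Bundle personal data the way a data-hungry app might before uploading."""
    return {
        "device_id": device_id,
        "location": {"lat": location[0], "lon": location[1]},
        "contacts": contacts,          # the full address book, not just app friends
        "photo_hashes": photo_hashes,  # fingerprints of images found on the device
    }

payload = build_telemetry_payload(
    device_id="abc-123",
    location=(30.2672, -97.7431),
    contacts=["alice@example.com", "bob@example.com"],
    photo_hashes=["9f86d081", "60303ae2"],
)
print(json.dumps(payload, indent=2))
```

The point of the sketch is that once a bundle like this lands in a vendor's cloud, its fate is tied to the vendor's fate - which is exactly the problem when the vendor folds.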

I will pose this (mainly rhetorical) question to everyone: are you aware of a company that states, as part of its data retention plan, that in the event of business failure it will wipe your personal data? Does it state what will happen to your data if it is acquired, becomes the subject of a hostile takeover, or decides to change direction? If a company gives you the option to delete your account and wipe your own data, does a change in company status or ownership affect that option? Will you still be able to delete and wipe after the fact?

And what if you don't want the data wiped? Suppose that instead of your PII, it is your company's data sitting inside some WhateverAsAService cloud instance, and that cloud service goes under. People seem to have forgotten the Nirvanix shutdown, which impacted both cloud-based businesses and conventional businesses augmenting their infrastructure with Nirvanix cloud technology. A medical application storing PII on an instance at a cloud company that goes belly up could actually lose access to and control of that data.

This is certainly not a new concern; in fact, it is discussed a lot in security and privacy circles. But maybe we as security professionals can make a point by using these social robots as examples of data-collecting devices whose data's provenance is now in question, and raise some awareness outside of our echo-chamber security circles. As infosec people we often struggle to explain security concepts to a non-technical audience - I think the death of the modern social companion robot is a great example for explaining the dangers of data and where it ends up.

The Password