#deletefacebook won’t fix anything.
Ok, here we go again… Following the Cambridge Analytica ‘incident’, almost predictably we see #deletefacebook popping up across most social media channels, with several more or less familiar names out there stating that perhaps now it’s time to completely delete your account on the world’s biggest social media platform and move elsewhere.
It happened before and it will possibly happen again. And again, it will possibly remain a storm in a teacup and calm down just as fast as it started, just like people claiming to leave Instagram (when it was sold to Facebook) or to stop using WhatsApp in case its user data would ever be linked to Facebook user data. Still, all of these might be considered the world’s most crowded players in their segments, like it or not, and this is why most #deletefacebook attempts will end up without much effect. People will most likely choose to stay in the (people) networks they’re used to, and as long as there is no critical mass on any other network that would make evaluating it worthwhile, they will stay where they are. Overcoming this is pretty difficult, I guess…
… but even worse, at least to me, is that even if people actually did move, it possibly wouldn’t change much. Our problems aren’t merely technical. They’re about awareness, education and conscious decisions. Is it better to know your precious personal data is stored on hard drives in a locked-up Facebook data center in some part of the world? Or is it better to know this data is spread across a load of loosely-knit distributed services, hosted in environments ranging from large-scale cloud infrastructure to, in the worst case, someone’s “home server” connected to a single power line and only usable for these purposes because its uplink is fast enough? Deleting Facebook accounts and simply moving to another platform in an unreflected, mindless way doesn’t make these things any better. Maybe it doesn’t hurt to keep two things in mind:
- If you store data in some cloud infrastructure, you store data on someone else’s computers.
- If you provide “someone else” (other users on a platform such as Facebook, as well as, say, the operating system users on those platforms in charge of maintaining file and database stores) with unencrypted access to your data, you are responsible for knowing or controlling what that “someone else” does with your data.
Neither of these things is necessarily bad. They just should be kept in mind. Both are about giving up a load of control over how your data is used and for which purposes to some third party, and doing (or not doing) so is something one has to actively decide. This, too, involves nasty things such as thoroughly reading and understanding the Terms of Service when registering with a particular provider.
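The second point above can be made a bit more concrete: if you encrypt data on your own machine before handing it over, whoever operates the platform only ever sees opaque ciphertext. A minimal sketch in Python, assuming the third-party `cryptography` package is installed (the data and key handling here are purely illustrative):

```python
# Client-side encryption sketch: the key never leaves your machine,
# so the service storing `token` only ever sees opaque ciphertext.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # keep this secret and local, e.g. in a keyring
f = Fernet(key)

# This token is what you would hand to "someone else's computers".
token = f.encrypt(b"my precious personal data")
assert b"precious" not in token   # unreadable without the key

# Only the key holder can get the plaintext back.
restored = f.decrypt(token)
assert restored == b"my precious personal data"
```

Of course, this only shifts the problem to key management, and it breaks most of what a platform like Facebook is actually used for (searching, displaying, sharing with others) — which is exactly the trade-off the two points above are about.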
This, too, might require evaluating privacy requirements for each and every bit of information now more or less “unconsciously” shared via digital media. This, too, would require doing the same not just for Facebook or Twitter but for each and every other network and digital service being used. Who operates those? Is there some policy on how data is to be treated and protected? Is there any transparency on how data is stored, backed up, managed, possibly encrypted on disk? Is there any way of telling who technically has access to this data, be it through some sort of API or by working directly with the data stored on disk or in some database?
We would possibly need much more cryptography on all possible levels. Maybe we would also need something like DRM for user data, to allow end users more fine-grained control over how their (digital!) data is shared and used. But in the end, no technical measure protects you from having your data printed out and passed on on paper. What we possibly need is some ethics in how to work with digital systems and user data. And we need more awareness on the end-user side. As always: no technical solution to social problems…