Facebook is once again under fire for how it handled users’ personal data. Here is a quick summary: in 2013, a third-party developer acquired data from roughly 50 million users through an old platform capability (which Facebook itself removed a year later to prevent abuse); that data was later used to target US voters during the 2016 presidential election. The issue runs deep, and it highlights a bigger underlying problem: users’ privacy expectations are not aligned with the commitments most tech companies actually make.
Zuckerberg said in a recent interview with Wired, “early on […] we had this very idealistic vision around how data portability would allow all these different new experiences, and I think the feedback that we’ve gotten from our community and from the world is that privacy and having the data locked down is more important to people.”
Regardless, Facebook never committed to fully locking down users’ data; its business model was in fact built around the value that data holds for advertisers through interest relevance and demographic targeting. Google and Facebook accounted for 73% of all US digital ad revenue in the second quarter of FY18, up from 63% two years before.
I can nonetheless relate to that idealistic vision of the trade-off between privacy and technology. The more the Google Assistant knows about the music I like, the better it can personalize my listening experience. Richer actions become available too, like letting me control the Nest thermostat or the lights by voice. At the end of the day, I’m trusting Google with my music taste and the devices installed in my house, and I get the benefit of convenience in return.
The same applies to many other tech companies: you provide data, they provide a benefit. Data is one of the most valuable commodities of the 21st century, and some companies openly build their business model around it (for example: would you give up your privacy for unlimited movies?). When personality traits or personal habits can be inferred from that data, the stakes are much higher, and we should hold these companies accountable.
Accountability is especially difficult when there is no transparency about the data a company stores about us. That’s why laws like the European Union’s General Data Protection Regulation (GDPR) are critical. Many are calling for more regulation of Facebook, but regulation should apply equally to any tech company that holds private user information.
Every user should know what data is being collected when they use a product, so that they can make an informed decision about whether or not to proceed. Are you sure you want to answer that personality quiz? If so, do you really know who is keeping the answers and how they can be used in the future? Users also have a responsibility to protect their own privacy, and this is unfortunately a big challenge. When was the last time you saw a dashboard with all the data Facebook, Google, Microsoft or Amazon have about you? I wouldn’t be surprised if the answer was never.
Even if GDPR-like regulation were already enforced and users knew exactly what data they had shared with the world, there is one final problem that nobody has been able to solve, and it primarily affects tech products where human emotions are involved: algorithms and advertising tools are easy to exploit in order to manipulate users.
“When you have a huge artificial intelligence, the way that Facebook and Google have, and you marry that to a software product that captures every human action — and in the case of Facebook captures emotional states, across the day and all their interactions — you have massive power. You put that on a smartphone, which is available every waking moment, and the combination of those things creates filter bubbles, which is to say the ability for each person to live in their own context, their own world, with their own facts, and to surround themselves with like-minded people so that non-confirming facts don’t get through. When people are in that heightened state, and when the advertisers have the ability to take advantage of the outrage curve — which is to say pushing the buttons of fear and anger, the things that create maximum engagement — people are really vulnerable and you can implant ideas in their head that they will think are their own,” Roger McNamee, one of Facebook’s early investors, explains in an interview with ProMarket. Facebook wasn’t hacked, our minds were.
When creating a social product, we need to ensure that it is hard to exploit to distort the truth, and that it protects users’ personal data. A social product should not break us apart into smaller bubbles, but bring us together into a better society. Ultimately, this is the tech industry’s biggest challenge: to keep improving the world through cohesive services and convenience, while treating users as true living customers and not as the product.
Originally published at geekonrecord.com on March 26, 2018.