OPINION | How to make Google and Facebook care about privacy

Facebook has been looking into new ways of increasing video consumption on the platform. (Photo Illustration by Rafael Henrique/SOPA Images/LightRocket via Getty Images)
Take a lesson from the story of photo app Ever, says Cathy O'Neil. 

If you haven’t heard about the rise and fall of the photo app Ever, I’d suggest paying attention. Its story illustrates how the government, if it wanted to, could compel big tech companies such as Google and Facebook to respect people’s privacy.

Like many cloud services, Ever offered users a place to store their photos. It then went a step further, using those photos to train a facial-recognition algorithm, which it marketed to law enforcement agencies and other potential clients. Some Ever users felt that their privacy had been violated, and the Federal Trade Commission alleged that the company, Everalbum, had acted deceptively by employing face recognition without customers’ knowledge and by failing to delete their photos when they deactivated their accounts.

What's really interesting are the terms of the settlement reached this week. It doesn't just require Everalbum to delete the photos in question and obtain consumers' consent before using face recognition. The company must also delete any algorithms that it developed with the photos and videos it obtained through the app (which was shut down last year).

The FTC's focus on the algorithms could set a powerful precedent. In the world of artificial intelligence, people's data are just the raw material: for Google, search terms and ad clicks; for Facebook, the posts people read and how long they're engaged; for Amazon, what people buy and how they find it. The companies then use those data to update their algorithms, daily, hourly or even every minute, to attract and generate profit from ever more people. The algorithms are the core of the product. They contain the full accumulated knowledge, including the newest links, the latest viral videos and the hottest new products.

So when the FTC fines Facebook $5 billion for misusing user data, as it did in 2019, that may be expensive, but it is far from fatal. The most valuable assets, the algorithms that Facebook developed from the misappropriated data, remain intact. Like the bodies of euthanasia patients in the dystopian thriller "Soylent Green," people's information has already been processed into the final product, ready to be fed to the next in line.

But what if authorities required Facebook to delete the offending parts of the algorithm? What if the company had to revert to an earlier version, from before it started misusing the data? The AI would be completely out of touch: Imagine Facebook serving up articles from before the 2016 election. Retraining without the missing information would require a monumental effort, severely disrupting the business model for some time.

Therein lies a potent weapon. If authorities let it be known that they’ll be coming after the algorithms the next time they catch someone misusing data, tech companies will probably take privacy concerns a lot more seriously.
