If you needed confirmation that the internet makes you a global citizen, then the European Union's General Data Protection Regulation (GDPR) was it. While the regulation was aimed at protecting European users, it essentially extends protection to consumers the world over who consume information on websites or applications based within Europe's borders.
As an information consumer in South Africa, I found my inbox overflowing with GDPR updates from websites that were essentially admitting to me, for the first time, that they stored some form of information about me.
But what the GDPR has also done is present some interesting opportunities to examine how these companies engage with users when it comes to privacy settings.
A recent report by the Norwegian Consumer Council, titled Deceived by Design: How tech companies use dark patterns to discourage us from exercising our rights to privacy, found that the user privacy designs of Facebook, Google and Windows 10 were “deceptive”, “manipulative” and “unethical”.
The report questions whether the consent from users “can be said to be explicit, informed and freely given”, when the impact of the “dark patterns” of design is taken into account. It defines “dark patterns” as “features of interface design crafted to trick users into doing things that they might not want to do, but which benefit the business in question”.
The report was prepared in line with the GDPR, which came into effect in mid-May, and it analysed the user settings updates of three digital services. These updates, from Facebook, Google and Windows 10, sought to bring the services into compliance with the GDPR.
What the report finds is that privacy settings often default to the most intrusive option, that users are given an illusion of control, and that misleading language is often used to confuse the user.
Two examples the report highlights are Facebook’s targeted advertising and Google’s privacy dashboard.
“Firstly, Facebook gives the user an impression of control over use of third party data to show ads, while it turns out that the control is much more limited than it initially appears,” reads the report. “Secondly, Google’s privacy dashboard promises to let the user easily delete user data, but the dashboard turns out to be difficult to navigate, more resembling a maze than a tool for user control.”
The report finds that software companies like Facebook and Google have “privacy-intrusive defaults”, where users who seek privacy-friendly options have to go through a “significantly longer” process.
It also finds that often privacy-friendly settings are hidden away, or the website architectures are designed to make choosing the friendly option a lot more work.
“They even obscure some of these settings so that the user cannot know that the more privacy-intrusive option was preselected.
“The popups from Facebook, Google and Windows 10 have design, symbols and wording that nudge users away from the privacy-friendly choices,” reads the report. “Choices are worded to compel users to make certain choices, while key information is omitted or downplayed.”
The report argues that while users are presented with “granular choices” regarding the collection and use of personal data, service providers simultaneously employ numerous tactics to “nudge or push” consumers toward sharing as much data as possible.
“Instead, digital service providers should trust their users to make independent choices about what data they wish to share, and how their personal data is used,” argues the report. “After all, trust is the basis of any good customer relationship, and deceit and manipulation lead to the erosion of trust.”
The report argues that when digital services employ “dark patterns” to nudge users toward sharing more personal data, “the financial incentive has taken precedence over respecting users’ right to choose”.
The report makes it clear that while these technology companies have crossed a line when it comes to user data collection, they have little interest in retreating behind it again, even under the threat of greater regulation.
Profit trumps privacy, it seems.
South Africa’s Protection of Personal Information Act (PoPI) is expected to come into effect sometime this year. We already have an Information Regulator and the draft regulations for PoPI have been published.
For South African users who value their privacy, PoPI can’t come soon enough.
However, one has to ask: will we soon be in a place where user privacy settings are so complex that we need legal advice on our own settings before signing up?