OPINION | Abongile Mashele and Laurie Less: The need to regulate internet platforms


The question we should ask as the general public is: who has the power to decide what content is left up or taken down, write the Film and Publication Board's Abongile Mashele and Laurie Less.


In January 2021, then-US president Donald Trump's Twitter account was "permanently suspended due to the risk of further incitement of violence", Twitter stated.

This decision came after a big-tech purge of the online platforms used by Trump and his supporters, demonstrating the power social media platforms have over public discourse and the ability of tech giants to exercise public power with little or no public accountability.

Lawmakers and the public had been calling for his banning for years, and former first lady Michelle Obama tweeted that the Silicon Valley giants should stop enabling Trump's "monstrous behaviour" and permanently expel him.

Similarly, local South African celebrities and politicians declared their intention to deactivate their Twitter and other social media accounts. The action taken by these platforms may certainly be viewed as self-regulation correcting transgressions that occur on various platforms.

Is this a dangerous precedent, considering the legal and public instruments in place to safeguard and regulate freedom of expression? Should this superpower be relegated to private companies? These actions have amplified lingering questions regarding the regulation of social media content. 

Account enforcement within Twitter is a delicate balancing act, where its rules often flout, or insult, the notion of free speech, inviting racists and chauvinists to publicly present their toxic views on the platform. The question we should ask as the general public is: who has the power to decide what content is left up or taken down?


As campaigners with a keen interest in the ever-evolving discipline of internet governance, and as avid social media users, we regard debates on internet governance and the regulation of social media platforms as a critical narrative to follow.

The search for appropriate regulatory instruments to safeguard the public interest through the regulation of over-the-top (OTT) platforms, such as video-on-demand (VOD) services and, more controversially, social media platforms, is core to observing freedom of speech while curtailing hate speech, incitement to war, prejudice, violence and misogyny.

New platforms are added daily: dating sites, Amazon, Netflix, eBay, Facebook, YouTube and many more. These platforms are all in fact business models creating marketplaces that rely on what economists call indirect network effects, where a platform becomes more valuable to one group of users (buyers, viewers) as another group (sellers, content creators) grows, and vice versa.

Why regulate?

Having regard to our recent past of thought control, censorship and enforced conformity to governmental theories, freedom of expression, the free and open exchange of ideas, is no less important to us than it is in the US or globally.

It could be contended with even more force that the public interest in the open marketplace of ideas is all the more important to us as South Africans because our democracy is not yet deeply embedded. We should therefore be particularly astute in outlawing any form of thought control, however respectably dressed, should it fall outside the constitutional parameters that govern free speech.

However, technology has enabled easy sharing and access of inappropriate media, making this an extremely difficult aspect to regulate and mitigate. Thus, parents and caregivers need to be exposed to appropriate information to allow them to make informed decisions regarding the media content which they and the children in their care consume.

Our Constitution recognises that people must be able to hear, form and express opinions freely. For freedom of expression is the cornerstone of democracy. It is valuable both for its intrinsic importance and because it is instrumentally useful.

It is useful in protecting democracy, by informing citizens, encouraging debate and enabling folly and misgovernance to be exposed. It also helps the search for truth by both individuals and society generally. If society represses views it considers unacceptable they may never be exposed as wrong. Open debate enhances truth-finding and enables us to scrutinise political argument and deliberate social values.

What is more, being able to speak freely recognises and protects the moral agency of individuals in our society. We are entitled to speak out not just to be good citizens, but to fulfill our capacity to be individually human. Being able to speak out freely is closely connected to the right to vote and to stand for public office. That right lay at the core of the struggle for democracy in our country.

Shamefully, it was for centuries denied to the majority of our people. In celebrating the democracy we have created, we rejoice as much in the right to vote as in the freedom to speak that makes that right meaningful.

The right to freedom of expression is one of a web of mutually supporting rights the Constitution affords. Apart from its intense connection to the right to vote, it is closely related to freedom of religion, belief and opinion, the right to dignity, as well as the right to freedom of association and the right to assembly.

So, what is internet governance?

Social media platforms connect consumers with similar interests to one another (buyers and sellers) or, in the case of a dating site, to a potential future life partner.

Platforms do not generally create their own content, and rely on the broader public to upload content thus creating community content or what we refer to as user-generated content (UGC).

It is for this reason that platform owners argue they are not responsible for what users produce, and are thus exempt from the libel and defamation laws and regulations that govern traditional media like newspapers and television. In other words, social media provides a platform for free speech, and owners assume limited responsibility for the content their users generate.

Internet governance refers to protocols designed to curb violations in cyberspace while safeguarding a safer internet engagement experience. It includes data protection, cybersecurity, content regulation dealing with prohibited and harmful content; it involves measures to eliminate the distribution of child sexual abuse materials, fake news and extremist propaganda or any form of propaganda that could be harmful in modern democracies.

There are two key schools of thought, largely polarised, on how social media platforms should be governed: one view is that governments have no role to play in the governance of the internet, citing the risk of authoritarian governments curtailing free speech and other liberties enjoyed in democracies.

On the other hand, it is incorrect to claim that platform owners do not exercise editorial control over their content. Traditional television and newspapers are broadcasters, meaning that they provide the same content to a broad, general audience.

Social media platforms, by contrast, are narrowcasters. Given their ability to be directive, to pinpoint who you are, their algorithms choose content exclusively for what they think you want to hear and see, making frequent, personalised editorial decisions based on your browsing behaviour on their platforms and on other websites (e.g. when you use Facebook or Google to log in), as well as geolocation information taken from your cellphone.

Social media platforms are also referred to as natural monopolies. Explicit rules about what is or is not allowed on these platforms are implemented only when necessary, as they can constrain their expansion and are expensive to implement.

You may recall YouTube's much earlier period, when it allowed users to post any type of music, TV show or film content. Or, as recently as 2019, the gruesome live streaming of the New Zealand mosque massacre became an example of how extremists abuse online platforms if left unregulated. Social media sites, including Facebook, faced a massive backlash after having failed to remove the live stream of this attack.

"We cannot simply sit back and accept that these platforms just exist and what is said is not the responsibility of the place where they are published," said Prime Minister of New Zealand Jacinda Ardern. "They are the publisher, not just the postman. There cannot be a case of all profit and no responsibility. It is unacceptable to treat the internet as an ungoverned space."

Viewers and consumers were exposed to this video, and according to reports, Facebook's deputy general counsel, Chris Sonderby, said none of the approximately 200 people who watched the live video flagged it to moderators.

The first user report was filed 12 minutes after the broadcast ended. Only after significant legal threats from the media industry did the online video streaming platform begin to impose restrictions on copyrighted material.

Stickiness and the amplification effect

Platforms choose content based on maximising user time (i.e. the content's stickiness) on their site. In the PR and media world, the adage "if it bleeds, it leads" refers to the fact that sensationalist, violent or scandalous content provokes more emotion and sells more newspapers or advertising.

This stickiness economy encourages users to create and share content within their networks in exchange for likes and additional shares as a currency of self-affirmation. Some have termed the hyper-personalisation bias of the platforms' algorithms "filter bubbles" or "echo chambers", and the fact that users are more likely to like and share the more polarising topics has been called the amplification effect.

The rise of populism and recent activism such as the #BlackLivesMatter campaign, as well as divisive behaviour often perpetuated online, is a topic of concern for sociologists, sociolinguists and political scientists.

Hate speech is one example that concerns regulatory proponents, where it is argued that being subjected to constant online attacks on the basis of identity, e.g. cyber misogyny, can have seriously negative consequences as witnessed by the recent suicide of a young schoolgirl in Limpopo.

This could violate the rights of members of a certain race or place of origin, or those with a particular sexual orientation, among many other identities in our diverse societies. While diverse opinions are to be celebrated in our fragile democracies, when social media platforms are not held responsible for the accuracy of the content they present, there is no incentive, or algorithmic firewall, to stop them from showing you the most outrageous or deepfaked news.

Excessive social polarisation is undesirable as it erodes the democratic institutions that protect free speech and other basic rights.

Without some basic consensus on the common objectives of social welfare, democracies weaken and become dysfunctional or corrupt. Just as we are compelled to adhere to banking regulations to protect and safeguard our life savings, social media platforms should be regulated to mitigate their worst effects. The self-regulatory controls that are in place have been effective in certain circumstances, yet can only gain traction if consumers are made aware of them and encouraged to use them.

A relatively new space of debate is the collection and monetisation of big data. Yours and mine. These platform owners collect so much demographic and behavioural data from our online activities that they can create a precise digital footprint, or model of who we are, with significant predictive accuracy.

They then sell these profiles, our digital twins or avatars, to advertisers both in and outside their platforms. They do this with little explicit knowledge or consent from their users. Moreover, users have no rights over their meta-data.

It is a completely asymmetric relationship; a Faustian bargain where, in exchange for carrying out searches, networking and taking advantage of geolocation services, we as users allow these platforms into the most intimate corners of our lives with little understanding of how, or which of, our secrets they sell.

The consequences of social media platforms functioning on a quasi-monopolistic scale are only now being understood. Given their expansive service offering, we could not imagine a life without the internet today.

Whom do you hold responsible for posting your private nude selfies, and whom do you sue when revenge pornography lands on PornHub or a Google platform? This has led to a push for greater state-led interventions to ensure greater public accountability from the major tech giants, who have become global monopoly operators: a quasi fourth tier of government.

Ergo, the need for robust public discourse on the role of internet regulation. We cannot ignore the economic and social benefits of these platforms, especially in emerging markets like ours. As we consider greater public accountability of these platforms, we need to ensure that regulatory interventions do not hinder the agility and innovation that set them apart.

But, like many industries, there are undesirable consequences that work against the greater social and economic good. Serious conversations on how social media platforms should be regulated to minimise their social costs are critically needed.

Is there a middle ground?

David Kaye, the UN Special Rapporteur on freedom of opinion and expression, wrote a book advocating a middle ground that requires collaboration between private companies and state entities.

Critical to this debate is ensuring that public services are not outsourced to private companies with little or no accountability or recourse for members of the public. The global nature of the operations in these companies has also created a compliance nightmare for them as they learn to appreciate that varying jurisdictions have different legal instruments to regulate free speech as well as data protection.

The founder, chairman and CEO of Facebook, Mark Zuckerberg, has often articulated Facebook's ambition to create a global community, but as Africans, we know that there is no universality of global values and norms. It is these very values and norms that determine what is deemed harmful content and inform the laws that regulate free speech across various countries.

All major social media platforms have community standards with clear reporting mechanisms that are activated should the public feel their rights have been violated. So, an important question is: who informs those standards? What recourse do we as South Africans have should our rights under South African law be violated? What recourse is available to you should the platform's response to your complaint indicate that its community standards have not been violated, yet you strongly believe your personal rights have been? The video posted by Adam Catzavelos in his racist rant, which sparked Twitter outrage, is one example.

The strides that have been made by social media platforms to establish internal governance mechanisms that clearly outline the rules of engagement on their platforms should be commended.

They have community standards and terms and conditions outlining what is permissible on their platforms. In the case of Twitter, there is a trust and safety team of content moderators whose job is to moderate or curate content. They review a tweet that has been flagged to determine whether it violates Twitter's rules. If it does, moderators can usually enforce punishment at this stage, but Twitter requires a second layer of review for offenders considered public figures: in this case, a verified politician with more than 100 000 followers.

Social media platform owners claim their community standards are reviewed on a regular basis and inform an algorithm that can immediately identify prohibited content once it has been posted.

Where there is a universally clear standard on the prohibited content, as in the case of child sexual abuse material, the content is immediately blocked from these platforms. Some of them have developed an escalation system where human beings review and moderate when content has been flagged for violating the rules of the platform. 

Others use artificial intelligence and recognition tools. But are the tools in place sufficient when the rules themselves are not universally understood and applied by users across the globe? Will the content moderators always understand the context within which statements are made when they are not exposed to the social context, values and norms within a region where a post originates? Are we not handing over too much public power to private hands with limited checks and balances that the public can use to hold them accountable?

These are not by any means exhaustive questions, and they are certainly not questions we hold all the answers to. We do, however, need to host a coherent debate as a country and a continent on how to develop effective governance mechanisms for these platforms without limiting innovative models of tech companies and the convenience they provide society.

Africa is the new frontier for extensive growth of these companies as we have growth potential in connecting more people online. This means greater investment will be made to grow revenue in African markets.

As Africans, we should be exploring regulatory instruments that allow for greater public accountability by all stakeholders. A co-regulatory system, where both private and public players play a critical role in safeguarding the public interest, seems a viable option in the digital world.

Regulatory coherence is one possible avenue that may be pursued to adequately safeguard the public interest and to ensure coherence in the existing regulatory frameworks. Tech companies cannot find themselves chained to the current arrangements, which are simply far too bureaucratic.

The multiplicity of layers of red tape may prove far too burdensome for robust public discourse and the marketplace of free ideas. This administrative burden could stifle innovation and the agility of these tech companies.

Digital literacy - a critical path

Any regulatory system (self-regulatory/co-regulatory) that is not aligned to a clear digital literacy initiative will experience little success. All social partners need to embark on extensive digital literacy campaigns, educating users on the dos and don'ts of social media and focusing on the consequences delinquent users may face should the agreed-upon rules be transgressed. 

The Department of Basic Education plays an important role in reaching a critical mass of our youth and has an important role to play in ensuring young people are equipped with the tools necessary to safeguard themselves online. A number of social partners are already implementing various initiatives aimed at improved digital literacy targeting youth, parents and children. These efforts should be commended and better co-ordinated to ensure consistency and coherence in messaging.

For now, all we can rely on is platform self-regulation, with lawmakers across the globe starting to question this practice, given its inherent deficiencies.

The courts are a second option, where citizens can pursue civil or criminal cases, as we have seen with the recent cases of Penny Sparrow and Adam Catzavelos. This has proven to be an expensive and lengthy process. Poorer countries cannot be as litigious as those in the north, as the cost of going the legal route is prohibitive. Looking after Gogo Dlamini's interests, and those of her grandchildren, relies heavily on the public's ability and willingness to engage tech giants who might from time to time crush her rights.

Alternatively, we throw our hands in the air and retreat from public engagement, or face internet trolls head-on and continue to allow those who choose to hide behind the veil of anonymity on these platforms to polarise us further as a nation.

- Abongile Mashele is the acting chief executive officer of the Film and Publication Board

- Laurie Less is a shared services executive at the Film and Publication Board


*Want to respond to the columnist? Send your letter or article to opinions@news24.com with your name and town or province. You are welcome to also send a profile picture. We encourage a diversity of voices and views in our readers' submissions and reserve the right not to publish any and all submissions received.

Disclaimer: News24 encourages freedom of speech and the expression of diverse views. The views of columnists published on News24 are therefore their own and do not necessarily represent the views of News24.
