COVID-19 has shown us the power of technology to help people and solve problems. We’ve also seen people using our apps more than ever to stay connected with family and friends and to foster communities, even during challenging times.
This is how Facebook groups like RebuildSA emerged; the group is focused on rebuilding communities, small businesses, schools, clinics, homes and lives in riot-affected provinces like KwaZulu-Natal and Gauteng.
Of course, empowering people to connect with each other has also resulted in people sharing things like false news, rumours, misinformation and harmful content. This isn’t a problem just in South Africa, but it matters a lot here – especially as we enter the election period.
This municipal election is important to us at Facebook and we’re using the lessons we’ve learned from more than 200 elections around the world to help ensure the integrity of the election in the days leading up to polling day in South Africa.
Our approach has three main pillars: we’re taking steps to detect and remove harmful content, like hate speech or content inciting violence, in order to keep people safe; we’re combating misinformation; and we’re delivering better transparency about who is behind the political ads you may see on our platform.
Keeping People Safe
Let’s take these one at a time, starting with safety. The safety of everyone in South Africa is something we prioritize and care about deeply.
We have strict rules, called our Community Standards, that outline what is and isn’t allowed on Facebook and Instagram. This includes policies on hate speech, harassment and incitement of violence that are particularly important during election periods. We remove this content when we become aware of it.
We’re continuing to invest heavily in people and technology to find this content more quickly. We’ve invested more than $13 billion over the last few years to improve our workforce and technology in these areas, and company-wide we now have 40,000 people working on safety and security. This includes over 15,000 expert content reviewers based all over the globe — including reviewers fluent in local South African languages — to assess content that may violate our Community Standards.
Combating Misinformation
As long as there’s been information, there’s been misinformation. Rumour mills, sensationalist headlines and propaganda were an ugly part of human discourse for generations before the internet.
This may not be a new problem, but in recent years Facebook has come under scrutiny for not doing enough to stop the spread of misinformation on its platforms. The truth is that Facebook undertakes extensive efforts to stop false news from spreading.
Let’s start by focusing on what we don’t do. We don’t take something down just because someone says it isn’t true. It wouldn’t be appropriate for a tech company based in the United States to say what is true or what is false. Not only that but a policy that tried to do this would be impossible to enforce accurately.
Of course, we recognise we still have a responsibility to help keep people safe on the platform. So while we don’t have a policy that explicitly states that everything on Facebook must be true, we do have policies in place to address some of the most harmful types of false information.
These include misinformation intended to suppress voting during elections, and false information that could lead to imminent harm if believed – such as claims that COVID-19 is a hoax. During the pandemic, we have removed over 20 million pieces of false COVID-19 and vaccine content globally for violating these policies.
If a piece of misinformation doesn’t break one of these policies we work to reduce its distribution, so fewer people see it. Of course, getting this right is an immense challenge. That’s why we’ve built a global network of more than 80 independent fact-checkers, including AFP and AfricaCheck in South Africa.
They can review potentially false content in Zulu, English, Afrikaans, Sotho, Northern Sotho, Setswana and Northern Ndebele. When these fact-checkers rate something as false, we reduce its distribution so fewer people see it and add a warning label with more information for anyone who does.
Pages and Groups that repeatedly share false information that has been debunked will also see their distribution reduced and their ability to advertise and monetise restricted.
We know from experience that people on Facebook don’t want to see misinformation, false news and harmful content, so we have a big incentive to remove it. While we are making good progress, we know we have more work to do, but we can’t do it alone. The people of South Africa are our greatest ally in this fight.
If you see posts that violate our Community Standards, you can help our efforts by reporting them to us; reports remain anonymous. We also encourage people not to post or reshare misinformation without checking the facts. By working together, we’ll continue to make progress against this problem.
Making Political Ads More Transparent
When you see a political ad in your News Feed or in one of our apps, you deserve to know who promoted it.
In June, we took a critical step toward ensuring better transparency when it comes to political ads in South Africa. Anyone who wants to run political ads for this election must now go through a verification process, proving who they are and that they live in the country.
These ads are labelled with a disclaimer, so you can see who paid for them.
All of the political ads you’ll see regarding this election will be stored in our publicly accessible Ads Library for seven years. In the Ads Library, you can see which ads are running, what types of people saw them and how much was spent on each ad. To give people more control over their Facebook experience, we also offer a way to opt out of these verified ads.
In the lead-up to the elections, our team has trained political parties and the Independent Electoral Commission on our best practices, ads transparency, Community Standards and election integrity efforts.
We’ve also informed political advertisers on how to best reach and engage supporters throughout the election period. In addition, we’ve established an internal task force to help us work through issues that may come up — such as the spread of misinformation.