I was invited to a Community Standards workshop by Facebook Africa in Nairobi, Kenya, last week. It was the first time the company had put together an event like this: 40 journalists from about 18 African countries, including five representatives from South African media.
Initially it wasn’t clear what the event was going to be about, and when we arrived we were told no photos, no videos, etc. Not the correct approach to take with journalists, who are expected to report back on the event. Nevertheless, this was my experience.
WHY WAS I THERE?
Essentially, we were given a deeper understanding of what it’s like to be a content moderator and what they do daily, on either FB or Instagram. Over 1 billion photos alone are uploaded daily (excluding videos, status updates, comments and stories), so structure is needed. For anyone not aware, the Community Standards are publicly available for all to view: https://www.facebook.com/communitystandards/
Aside from the Facebook communications teams, the event included public policy team members from various regions: Emilar Gandhi, Fadzai Madzingira, Toby Parlett, Mercy Ndagwa, and Phil Odour, who provided insights and answered questions.
WHAT DID WE DO?
Once the specific departments had spoken about how they work, what they do daily, and what it takes to be a content moderator, we were put into groups with media from other countries. We were then given two fictitious scenarios and a thirty-minute time frame to discuss whether the content in each case should remain online or be removed. While I cannot go into detail about the scenarios, there was a fair amount of discussion taking place.
My understanding of the situation felt clear cut, yet both my decisions on what to do with the two pieces of content – I chose to keep one and remove the other – were in fact wrong. There was lots of debate within the teams, and loads of disagreement – naturally – because different factors came into play. We discussed how we arrived at certain decisions, and it turns out that something merely sounding dodgy (like what South Africans know to be a scam, such as asking for money on FB) isn’t automatically disallowed.
WHAT I LEARNT FROM THE WORKSHOP
Working as a content moderator is a difficult task. I couldn’t do it on the daily, and I’d probably get fired if it were a position I held. Emotions also cannot form part of the decision, judging from what we experienced within the teams, where there were plenty of opposing views. It’s not always a quick yes or no situation.
In the case of hate speech specifically, attacks are classified into three tiers: Tier 1 covers calls to violence and dehumanising speech; Tier 2 covers statements of inferiority, contempt and disgust; and Tier 3 covers calls for exclusion and segregation (the most difficult one – I don’t understand it fully, but it took seven months to write the policy). It’s complicated.
We were given a ten-minute explanation of the policy, versus the eighty hours of training employees receive. It was a condensed version for the workshop, and I think I understood it at a surface level, with the help of the context from the scenarios we were given. Ten minutes is no match for eighty hours of training.
Given that billions of people use Facebook and Instagram, I understand and can sympathise with the teams who have to do these difficult tasks, and the toll it takes on them. I hope their working conditions actually improve (more on this below, from the questions I posed), and that we see a follow-up story on how things have changed.
AS A FB OR IG USER
The team touched on the new “Restrict” feature on Instagram, which, to be honest, I wasn’t fully sure how it works. I think it can best be described as a “soft block” instead of blocking someone outright. If you put someone on Restrict, they can send you messages but they won’t know you are not receiving them, and if they leave a comment on your post, nobody will be able to see it besides them; LOL.
As someone who uses the block and report tools daily (no exaggeration), I have seen the turnaround time shift in the last year from months to weeks to days – the quickest response I received took less than an hour. I report weirdos who message me, especially ones with dodgy accounts (forex and bitcoin traders, anyone?), accounts that sell followers, and content that goes against the rules, and they get taken down.
You can appeal any decision Facebook and Instagram make with regard to keeping content up or taking it down. We were told it goes both ways, and that they are willing to learn from mistakes. However, when it comes to politics, religion, health, etc., things can get messy and appear very biased.
Facebook releases reports on what content gets taken down; you can find them here: https://transparency.facebook.com/community-standards-enforcement
MY QUESTIONS ANSWERED
On support for the moderators: Toby Parlett said they have specialist teams to help with complex issues like child safety or terrorism. They have 35 000 people working in safety and security, which includes 15 000 content moderators. The partners and contractors they use have extensive experience with this kind of work – which is why Facebook chooses to work with them – and they are held to a set of strict guidelines.
On how the environment for moderators has been improved in the last few months: AI is used to take content down, and while it is good and getting better, it’s not perfect, which is why humans still have to play a role, said Parlett. Moderators receive constant training as new policies come in. I’m told they also get psychological support, on site or at home, which can include private healthcare. Moderators are also allowed to leave their desks immediately for a time out when required.
On the difficulty of getting copyrighted content removed from IG: There’s a separate flow of questions one has to follow, which goes to a specialist team to deal with. FB also said it’s a separate team that works on this, and they’d have to get more info for me if I wanted it. (I know there isn’t a way to directly report someone who has used an image of yours without permission; it leads to tapping on “learn more”.)
LOOKING AT THE BIGGER PICTURE
There’s a lot going down with Facebook right now, and while I was in Kenya, Mark Zuckerberg was testifying before US Congress. It feels like I cannot finish off this blog post and ignore what is actually happening. You can read this piece by the Guardian. I’ll leave you with this: Does FB need to be broken up? Does it have too much power? Why won’t political ads get fact-checked? How do we feel about the CEO saying he cannot say when he found out about the CA scandal? There’s a lot going on, so I’d best leave it to the experts to cover.
Disclaimer: For those not aware, I quit Facebook before the CA scandal, but it obviously has no bearing on me writing about the company. I have a love/hate relationship with Instagram (love the beautiful content I choose to see; hate the spam I get from creeps, and randoms tagging me in comments & photos).
Nafisa Akabor