Written by Dr. Binoy Kampmark
Africa has been a continent exploited since the European scramble carved it out in lines of a draughtsman’s crude design. Its resources have been pilfered; its peoples enslaved for reasons of trade and profit; its political conditions manipulated to favour predatory companies.
A similar pattern is detectable in the digital world. The slavers have replaced their human product with data and information. The ubiquitous sharing of information on social media platforms has brought with it a fair share of dangerous ills. A $2 billion lawsuit against Facebook’s parent company Meta, which was filed in Kenya’s High Court this month, is a case in point.
The petitioners, the Kenyan rights group Katiba Institute and Ethiopian researchers Fisseha Tekle and Abrham Meareg, argue that Meta failed to employ sufficient safety measures on the Facebook platform that would have prevented the incitement of lethal conflict. Most notable were the deaths of Ethiopians arising from the Tigray War, a conflict that has claimed tens of thousands of lives and displaced 2.1 million Ethiopians.
Abrham Meareg’s case is particularly harrowing. His father, Meareg Amare Abrha, a chemistry professor and an ethnic Tigrayan, was singled out and harassed in a number of violent and racially inflammatory Facebook posts. Two posts screeching with slander (complicity in massacres; aiding military raids, corruption and theft) and death threats found their way onto a page named “BDU STAFF”, which sported over 50,000 followers at the time.
The posts also included the professor’s picture and home locality. Complaints to the platform by his son received no response. The posts remained up for four weeks. Meareg Amare was subsequently assassinated after leaving his work at Bahir Dar University. According to his son, the killing “was orchestrated by both state and non-state actors.”
Rosa Curling, Director of the non-profit campaign outfit Foxglove, an organisation supporting the petitioners, is convinced that the professor would still be alive had the posts been removed. She also makes a salient point. “Sadly, ‘engaging’ posts are often violent or shocking, because people react to them, share them, comment on them. All those reactions mean the Facebook algorithm promotes the post more, and can make hate posts and violence go viral, and spread even further.”
Meta, in response, has trotted out the standard, disingenuous deflection, giving us an insight into a parallel universe of compliance. “We have strict rules about what is and isn’t allowed on Facebook and Instagram,” declared Meta spokesperson Mike DelMoro. “Feedback from local civil society organizations and international institutions guides our security and integrity work in Ethiopia.”
Meta’s content moderation hub for Eastern and Southern Africa is located in Nairobi. But questions have been raised about how adequate its staffing and resourcing arrangements are. DelMoro claims there is nothing of interest on that score. “We employ staff with local knowledge and expertise, and continue to develop our skills to detect harmful content in the country’s most commonly spoken languages, including Amharic, Oromo, Somali and Tigrinya.”
The treatment of staff at Meta’s main subcontractor for content moderation in Africa, Sama, is also the subject of another lawsuit. That action alleges the use of forced labour and human trafficking, unequal labour relations, attacks on unions and a failure to provide sufficient mental health and psychosocial support to hired moderators.
Abrham Meareg and his fellow petitioners are demanding that Facebook halt the viral spread of hate, demote content inciting violence, and employ greater numbers of content moderators versed in a range of languages. The legal filing also demands that Meta issue an apology for the professor’s death and establish a restitution fund for victims of hate speech or misinformation posted on the company’s platforms, including Facebook and Instagram.
Such actions are becoming regular fare. All tend to follow a similar blueprint. In December last year, a class action complaint was lodged with the northern district court in San Francisco claiming that Facebook was “willing to trade the lives of the Rohingya people for better market penetration in a small country in south-east Asia.” The language proved instructive: a company, operating much in the traditional mercantilist mould, a plunderer of resources, its gold the product of surveillance capitalism.
Lawyers representing the petitioners also submitted a letter to Facebook’s UK office stating that their clients had been subjected to acts of “serious violence, murder and/or other grave human rights abuses” as part of a genocidal campaign waged by the military regime and aligned extremists in Myanmar.
As with the case lodged in the Kenyan High Court, the grounds against Facebook were that its algorithms amplified hate speech against the Rohingya populace; it failed to adequately invest in local moderators and diligent fact-checkers; it failed to remove posts inciting violence against the Rohingya; and it did not shut down or delete specific accounts, groups and pages that encouraged ethnic violence.
Despite such actions, there is nothing in the way Meta operates to suggest a change in approach. As long as the wallets stretch, platforms such as Facebook will continue to use devilish algorithms to boost bad behaviour. In the scheme of things, such behaviour, however hateful or misinformed, sells. The dragon of surveillance capitalism continues to thrive with fire-breathing menace.
Dr. Binoy Kampmark was a Commonwealth Scholar at Selwyn College, Cambridge. He currently lectures at RMIT University. Email: bkampmark@gmail.com