Why we Joined the Real Facebook Oversight Board

The Lawyers Hub | Oct. 12, 2020
Last week, on 30 September 2020, the Lawyers Hub deepened its commitment to defending digital rights and strengthening electoral democracy in Africa by joining the Real Facebook Oversight Board (RFOB). The Real Facebook Oversight Board consists of academics, experts and civil rights leaders committed to responding to immediate threats to democracy posed by Facebook's unchecked power. Current members of the Board include Rashad Robinson, President of Color Of Change, America's largest online racial justice organisation; Shoshana Zuboff, author of The Age of Surveillance Capitalism; and Roger McNamee, Mark Zuckerberg's former mentor and author of Zucked: Waking Up to the Facebook Catastrophe.

It is noteworthy that in May 2020, Facebook launched its own Oversight Board, designed to make final and binding decisions on what content should be allowed on Facebook and Instagram. That board serves an appellate function, making decisions after posts are published, but has to date not begun operating. Unlike Facebook's board, the RFOB responds in real time to issues concerning Facebook's role in elections. To date, the RFOB has made three demands of Facebook, calling on it to: enforce its own policies and remove posts inciting violence; ban all paid advertising that mentions the presidential election until one candidate is declared president-elect and the other concedes; and label premature claims of victory in organic posts as untrue until one candidate is declared president-elect and the other concedes. In a short-term win, on 7 October 2020 Facebook announced that it would temporarily suspend all political and issue-based advertising after the close of polls on 3 November, broadening earlier restrictions and helping to prevent disinformation.

While the RFOB was initially formed to give immediate responses to electoral integrity issues in the upcoming US elections, we believe countries in Africa and the larger Global South are not immune to the same challenges. The role of communication technologies during electioneering periods has garnered public attention since it was revealed that a British political consulting firm had obtained, and perhaps even successfully exploited, personal data from 87 million Facebook user accounts for political mileage in countries including the USA, the United Kingdom, Nigeria and Kenya. The firm exploited big data to profile voters and target certain users with specific content. In Kenya, targeted advertising on social media and search platforms was deployed by parties and candidates in a move to diversify away from traditional media strategies such as radio, television and physical rallies. In addition to data profiling, the emergence of these digital communication platforms has also created more avenues for the rapid dissemination of information. In practice, this has facilitated tactics such as negative campaigning and the creation and spread of viral content, which have in turn led to an increase in disinformation and polarisation.

Take the example of Ethiopia, where perpetrators of violence offline have leveraged platforms such as Facebook to spread hatred and incite violence online. Similarly, Facebook was used to spread anti-Rohingya sentiment, leading to polarisation and the eventual persecution of the Rohingya Muslims in Myanmar. In that case, Facebook admitted that it had not done enough to curtail disinformation on its platform. The negative impacts that content on digital communication platforms such as Facebook can have are now apparent to the global community. Such content can lead to physical violence, discrimination against minorities and other acts of hostility. Yet it often remains online and visible on the platform despite the risk it poses to these minority groups.

Undoubtedly, there is a need for content moderation on social media platforms. Organisations such as Twitter have banned all political ads, whereas others like Facebook have at times taken a more apathetic approach, neither restricting nor fact-checking them. While we appreciate the fundamental right to freedom of expression, content that meets the threshold of incitement to violence, hostility or discrimination is not protected speech under international human rights law. Furthermore, these platforms have an obligation under human rights law to prevent, mitigate and remedy any adverse human rights impacts that are directly linked to their services or to which they contribute.

While we appreciate the steps Facebook has taken to address these collective issues in Africa, we believe the company ought to do more to make its platforms safer and to safeguard democracy. This is especially true in volatile nations where profiling, disinformation and incitement may lead to offline violence. In this regard, the Lawyers Hub believes that joining the board strengthens the call for oversight and accountability of these platforms. In summary, we are calling on Facebook to do more to safeguard human rights and put people over profits.

