Kenya: Meta Sued for $1.6 Billion USD for Fuelling Ethiopia Ethnic Violence

By Amnesty International 

Meta must reform its business practices to ensure Facebook’s algorithms do not amplify hatred and fuel ethnic conflict, Amnesty International said today in the wake of a landmark legal action against Meta submitted in Kenya’s High Court.

The legal action claims that Meta promoted speech that led to ethnic violence and killings in Ethiopia by using an algorithm that prioritizes and recommends hateful and violent content on Facebook. The petitioners seek to stop Facebook’s algorithms from recommending such content to Facebook users and to compel Meta to create a 200 billion Kenyan shilling ($1.6 billion USD) victims’ fund. Amnesty International joins six other human rights and legal organizations as interested parties in the case.

“The spread of dangerous content on Facebook lies at the heart of Meta’s pursuit of profit, as its systems are designed to keep people engaged. This legal action is a significant step in holding Meta to account for its harmful business model,” said Flavia Mwangovya, Amnesty International’s Deputy Regional Director of East Africa, Horn, and Great Lakes Region.

One of Amnesty’s staff members in the region was targeted as a result of posts on the social media platform.

“In Ethiopia, people rely on social media for news and information. Because of the hate and disinformation on Facebook, human rights defenders have also become targets of threats and vitriol. I saw first-hand how the dynamics on Facebook harmed my own human rights work and hope this case will redress the imbalance,” said Fisseha Tekle, legal advisor at Amnesty International.

Fisseha Tekle is one of the petitioners bringing the case, after being subjected to a stream of hateful posts on Facebook for his work exposing human rights violations in Ethiopia. An Ethiopian national now living in Kenya, he fears for his life and dares not return to Ethiopia to see his family because of the vitriol directed at him on Facebook.

Fatal failings

The legal action is also being brought by Abraham Meareg, the son of Meareg Amare, a professor at Bahir Dar University in northern Ethiopia, who was hunted down and killed in November 2021, weeks after posts inciting hatred and violence against him spread on Facebook. The case claims that Facebook only removed the hateful posts eight days after Professor Meareg’s killing, more than three weeks after his family had first alerted the company.

The Court has been informed that Abraham Meareg fears for his safety and is seeking asylum in the United States. His mother, who fled to Addis Ababa, is severely traumatized and screams every night in her sleep after witnessing her husband’s killing. The family had their home in Bahir Dar seized by regional police.

The harmful posts targeting Meareg Amare and Fisseha Tekle were not isolated cases. The legal action alleges that Facebook is awash with hateful, inciting and dangerous posts in the context of the Ethiopia conflict.

Meta uses engagement-based algorithmic systems to power Facebook’s news feed, ranking, recommendations and groups features, shaping what is seen on the platform. Meta profits when Facebook users stay on the platform as long as possible, by selling more targeted advertising.

The display of inflammatory content – including content advocating hatred that constitutes incitement to violence, hostility and discrimination – is an effective way of keeping people on the platform longer. As such, the promotion and amplification of this type of content is key to Facebook’s surveillance-based business model.

Internal studies dating back to 2012 indicated that Meta knew its algorithms could result in serious real-world harms. In 2016, Meta’s own research clearly acknowledged that “our recommendation systems grow the problem” of extremism.

In September 2022, Amnesty International documented how Meta’s algorithms proactively amplified and promoted content that incited violence, hatred, and discrimination against the Rohingya in Myanmar, substantially increasing the risk of an outbreak of mass violence.

“From Ethiopia to Myanmar, Meta knew or should have known that its algorithmic systems were fuelling the spread of harmful content leading to serious real-world harms,” said Flavia Mwangovya.

“Meta has shown itself incapable of acting to stem this tsunami of hate. Governments need to step up and enforce effective legislation to rein in the surveillance-based business models of tech companies.”

Deadly double standards

The legal action also claims that there is a disparity in Meta’s approach to crisis situations in Africa compared to elsewhere in the world, particularly North America. The company has the capability to implement special adjustments to its algorithms to quickly remove inflammatory content during a crisis. But despite being deployed elsewhere in the world, according to the petitioners none of these adjustments were made during the conflict in Ethiopia, allowing harmful content to continue to proliferate.

Internal Meta documents disclosed by whistle-blower Frances Haugen, known as the Facebook Papers, showed that the US $300 billion company also did not have sufficient content moderators who speak local languages. A report by Meta’s Oversight Board also raised concerns that Meta had not invested sufficient resources in moderating content in languages other than English.

“Meta has failed to adequately invest in content moderation in the Global South, meaning that the spread of hate, violence, and discrimination disproportionally impacts the most marginalized and oppressed communities across the world, and particularly in the Global South.”


Telegram to Expose Users Who Use Platform For Criminal Activities, Share Data With Relevant Authorities 


In a bid to join the fight against cybercrime and other illicit activities on social media, the popular messaging app Telegram has revised its user privacy and protection policy, announcing its readiness to make the personal details of offending users available to relevant authorities for investigation.

Telegram Chief Executive Officer Pavel Durov, in a post seen by Investors King, disclosed that the reversal of the company’s privacy policy was in response to alleged criminal activities and other illicit events on the popular social messaging platform.

Durov declared that once Telegram receives valid legal requests for users’ IP addresses and phone numbers, the management would not hesitate to comply with the court order.

He said the move is to attempt to control criminal activity on the platform and prevent abuse.

Recall that Telegram’s policy, before it was changed, limited user information sharing to cases involving terror suspects.

However, Telegram revised the policy following the arrest of its CEO, Durov, in France over allegations that the company turned a blind eye to various crimes flourishing unchecked on the platform.

Investors King also gathered that the policy reversal is not unconnected with the recent decision of the Ukrainian government to ban the use of Telegram by government officials, military personnel, and other defense and critical infrastructure workers over national security concerns.

Meanwhile, after Durov was subsequently released on bail and ordered to stay in the country pending ongoing investigation, he made it clear that the IP addresses and phone numbers of those who violate Telegram’s rules would now be made available to relevant authorities subject to valid legal requests.

The company further stated that if it receives a valid order from the relevant judicial authorities confirming that a user is a suspect in a case involving criminal activities that violate the Telegram Terms of Service, it will perform a legal analysis of the request and may disclose the affected user’s IP address and phone number to the relevant authorities.

It added that such data disclosures will be included in its periodic transparency reports, noting that the service may collect metadata such as IP address, devices and Telegram apps used, and the history of username changes to tackle spam, abuse, and other violations.

The platform has already rolled out the policy changes in its app: its search feature now removes problematic content and provides a new mechanism for users to report illegal search terms and material through the @SearchReport bot for subsequent review and removal by a human moderation team.


Telegram Was Adding Nearly 500,000 Users Daily Before Durov’s Arrest


Ever since it launched in August 2013, Telegram has been an exceptionally popular social media platform and messaging app, thanks to its utility and focus on privacy.

Telegram’s strong growth continued well into 2024. Finbold’s research found that, between April 10 and July 22, the platform added more than 485,000 monthly active users (MAU) every day.

The growth ensured that, by the middle of the summer, Telegram’s user base stood at 950 million – meaning that approximately one-eighth of humanity was using the app.

While coming just 50 million shy of 1 billion users is a major milestone, it is interesting to note that the social media platform has, at times, boasted even stronger growth. For example, in July 2023, the CEO and founder, Pavel Durov, revealed that 2.5 million people signed up to Telegram daily.

EU’s shadow over Telegram

Despite Telegram’s popularity and momentum, the platform has been gaining a different kind of attention since August 24 when the French police arrested Durov at an airport near Paris.

Though President Emmanuel Macron and his government maintain that the arrest was not politically motivated, it has nonetheless sparked a strong backlash, with many interpreting it as a crackdown on privacy and free speech.

Indeed, even if the allegations of poor moderation and failure to prevent illicit activity are founded, they nonetheless raise important questions in the debate over the balance between privacy, surveillance, and national security.

As Andreja Stojanovic, a co-author of the research, noted: “Even if genuine and undisputable illicit activity on Telegram was detected, the arrest is still likely to make many question if, by the same logic, the entire police force of a nation should be prosecuted whenever any illegal activity takes place in a private home or a hotel room.”

Nonetheless, there is no guarantee the arrest will have a profound impact on Telegram itself; indeed, the platform has already shown significant resilience to government pressure, as during Russia’s 2018 ban.


Russia Questions Legitimacy of France’s Arrest Warrant for Telegram’s Pavel Durov


Russia has issued a stern rebuke to France over the recent arrest warrant issued for Pavel Durov, the CEO and founder of the popular messaging app Telegram.

The Kremlin has raised significant concerns about the validity of the charges against Durov, suggesting that the move could be politically motivated.

On Saturday, the Paris Public Prosecutor’s Office issued a warrant for Durov’s arrest, citing an ongoing investigation into organized crime, drug trafficking, fraud, and the distribution of pornographic images of minors on Telegram.

This development came as a shock to many, given Durov’s prominence as a leading technology entrepreneur and a vocal advocate for internet freedom.

Kremlin spokesman Dmitry Peskov responded sharply to the French authorities’ actions during a press briefing on Tuesday.

Peskov demanded that Paris provide concrete evidence to substantiate the serious allegations against Durov.

He warned that without a robust basis for these accusations, the arrest could be perceived as a direct assault on free speech and a potential act of political intimidation.

“The charges against Durov are gravely serious and must be supported by equally serious evidence,” Peskov told journalists. “Otherwise, it could be viewed as an attempt to stifle communication and suppress freedom of expression.”

Durov, who is Russian-born but holds French and UAE citizenship, has been a significant figure in the tech industry.

His company, Telegram, which he founded in 2013, is renowned for its commitment to privacy and has become a crucial platform for global communication, including in politically volatile regions.

The Russian government has emphasized its readiness to provide assistance to Durov, although it acknowledges the complexity of the situation.

Meanwhile, the United Arab Emirates, where Durov also holds citizenship, has requested urgent diplomatic support from French officials and is closely monitoring the case.

Durov’s arrest comes at a time of heightened geopolitical tension between Russia and France. Relations have been strained by France’s strong stance against Russia’s invasion of Ukraine and its support for Ukraine’s sovereignty.

This backdrop has fueled speculation that Durov’s detention might be more about political maneuvering than genuine legal concerns.

According to reports, Durov had traveled to Paris from Baku, Azerbaijan, where he was rumored to have had meetings, including speculative discussions with Russian President Vladimir Putin—claims that Kremlin officials have since denied.

Telegram, which boasts over 800 million users globally, including many government and military officials on both sides of the Russia-Ukraine conflict, has denied any wrongdoing.

The company has consistently defended its platform’s neutrality and commitment to user privacy.

As the situation unfolds, Russia’s challenge to the legitimacy of France’s legal actions underscores the broader tensions between the two nations and raises questions about the intersection of politics and international legal processes.

The outcome of this case may have significant implications not only for Durov but also for the broader landscape of digital freedom and diplomatic relations.
