Facebook employees reportedly raised alarms about how the social media giant has repeatedly enabled dangerous and dark behavior in developing countries and hasn’t fixed internal systems despite being aware of the problems.
Staff flagged various problematic behavior occurring on its platform related to Mexican drug cartels, human trafficking in the Middle East and eastern Africa, armed groups in Ethiopia, polarizing nationalist content and softcore pornography in India, and the suppression of Vietnamese political dissidents, according to internal Facebook documents reviewed by the Wall Street Journal.
The documents show that despite this awareness, in many instances, the company’s leadership responded to such issues with inadequate steps or did nothing at all.
“There is very rarely a significant, concerted effort to invest in fixing those areas,” said Brian Boland, a former Facebook vice president who oversaw the company’s partnerships with internet providers in Africa and Asia before resigning at the end of last year.
Although more than 90% of Facebook’s new monthly users and growth come from outside the United States, the company’s employees and contractors spent only 13% of their time searching for, labeling, or taking down content outside the U.S. that was false or misleading and often led to illegal behavior.
By comparison, employees spent almost three times as many hours working on “brand safety,” or measures to ensure ads don’t appear alongside content that advertisers would object to, a disparity that highlights how Facebook prioritizes profits and treats harms in developing countries as part of the cost of doing business in such nations, the Wall Street Journal reported.
Facebook uses far less manpower to stop harm occurring outside the U.S. than inside, the internal documents showed.
“Facebook continues to unleash products on the world that enable both good and bad behavior, and then, when Facebook finally is forced to recognize the bad behavior, they step in to try to fix. This is backward,” said Emily Dreyfuss with Harvard’s Shorenstein Center on Media, Politics, and Public Policy.
“Don’t unleash products you know can cause harm just because they can also enable some benefit. The benefits don’t cancel out the harms. The ‘intention’ of how the products ‘should’ be used in the opinion of the platform are irrelevant. What is relevant is how the product can be used.”
Facebook said it is committed to tackling such problems and already has teams dedicated to such efforts, although it admits there are some areas where it could invest more resources.
“In countries at risk for conflict and violence, we have a comprehensive strategy, including relying on global teams with native speakers covering over 50 languages, educational resources, and partnerships with local experts and third-party fact checkers to keep people safe,” Facebook spokesman Andy Stone said to the Wall Street Journal.

