Facebook's Algorithms Suspect Unusual Behavior in a Small English Town
Consider the residents of Coulsdon, England, who say they are being relentlessly censored by algorithmic moderation for no apparent reason other than the spelling of their town's name.
According to the local news outlet Inside Croydon, businesses and neighborhood associations in the town have had content deleted from their Facebook pages, reportedly because the social media giant's content moderation algorithms mistake the "lsd" in Coulsdon for a reference to the psychedelic drug.
The blog, citing unnamed local sources, reported that pages for local theaters, hardware stores, history groups, and residents' associations have all been affected by the censorship, with Facebook yet to address the issue despite numerous complaints.
One anonymous source told Inside Croydon, "As long as it has 'Coulsdon' in the title, you get the drug reference that there's no way around."
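If the reports are accurate, the behavior looks like a classic substring false positive: a keyword filter that matches "lsd" anywhere in a string, rather than as a standalone word, will trip on "Coulsdon" every time. The sketch below is a hypothetical illustration of that difference, not Facebook's actual moderation code; the banned-term list and function names are invented for the example.

```python
import re

# Hypothetical banned-term list for illustration only;
# Facebook's real filters and rules are not public.
BANNED_TERMS = ["lsd"]

def naive_flag(text: str) -> bool:
    """Flags text if a banned term appears anywhere, even inside another word."""
    lowered = text.lower()
    return any(term in lowered for term in BANNED_TERMS)

def word_boundary_flag(text: str) -> bool:
    """Flags text only when a banned term appears as a whole word."""
    lowered = text.lower()
    return any(re.search(rf"\b{re.escape(term)}\b", lowered) for term in BANNED_TERMS)

post = "Coulsdon Residents' Association meeting this Friday"
print(naive_flag(post))          # True  - "lsd" is a substring of "Coulsdon"
print(word_boundary_flag(post))  # False - no standalone "lsd" token
```

A whole-word match (or any tokenization step) avoids this particular mistake, which is why the complaint that there is "no way around" the town's name points to simple substring matching somewhere in the pipeline.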
In a brief statement, Dave Arnold, a spokesperson for Facebook's parent company Meta, acknowledged that "this was an error that has now been fixed."
It's not the first time Facebook's filters have blocked posts containing harmless, or even potentially lifesaving, information.
In 2021, Facebook apologized to some English users for censoring and banning people who posted about the Plymouth Hoe, a landmark in the coastal city of Plymouth.
Earlier this year, The Washington Post reported that as wildfires ravaged the West Coast, Facebook's algorithms suppressed posts about the blazes in local emergency management and fire safety groups. The newspaper documented dozens of instances where Facebook flagged the posts as "misleading" spam.
Facebook group administrators have also observed posts containing the word "men" being flagged as hate speech, as reported by Vice. This led to the creation of facebookjailed.com, where users documented unusual moderation decisions, including a picture of a chicken being labeled as nudity or sexual activity.
Facebook's own data reveals that its heavy reliance on algorithms to regulate content on the platform results in millions of errors each month.
According to its latest moderation data, between April and June this year, Facebook took 1.7 million enforcement actions on drug-related content. Nearly 98 percent of this content was detected by the company, while just 2 percent was reported by users. People appealed the penalties in 182,000 cases, and Facebook ultimately restored more than 40,000 pieces of content—11,700 without an appeal and 28,500 after an appeal.
The algorithms targeting other prohibited content, such as spam, result in even more mistakes. During the most recent three-month period, Facebook restored nearly 35 million posts it had mistakenly labeled as spam, representing more than 10 percent of the alleged spammy content it had previously removed.
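To put those figures in proportion, here is a quick back-of-the-envelope calculation using the rounded numbers reported above, so the results are approximate:

```python
# Rough rates derived from Facebook's published moderation figures cited above.
drug_actions = 1_700_000
drug_appeals = 182_000
drug_restored = 11_700 + 28_500          # restored without appeal + after appeal

spam_restored = 35_000_000               # "nearly 35 million"
spam_restored_share = 0.10               # "more than 10 percent" of removals
spam_removed_estimate = spam_restored / spam_restored_share

print(f"Drug-policy restorations: {drug_restored:,} "
      f"({drug_restored / drug_actions:.1%} of enforcement actions)")
print(f"Appeals that led to restoration: {28_500 / drug_appeals:.1%}")
print(f"Implied spam removals: at most ~{spam_removed_estimate:,.0f}")
```

In other words, roughly 2.4 percent of drug-related enforcement actions were reversed, about one in six appeals succeeded, and the spam figures imply on the order of a few hundred million removals in the quarter.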
Mistakes like these have fueled broader concern about how much control moderation algorithms exert over online speech. For the businesses and residents' groups of Coulsdon, the problem was as simple, and as stubborn, as the spelling of their town's name.