
Meta acknowledges and corrects issue that pushed violent content into Reels video suggestions

Meta is apologizing for a technical glitch after some Instagram Reels viewers reported encountering disturbing, violent video content in their feeds.

A handheld device displaying the emblem of Instagram's application.


An error on Instagram caused some users to see content in their Reels feed that should never have been recommended to them. Meta, Instagram's parent company, apologized for the slip-up in a statement.

On Tuesday, numerous Instagram users reported a seemingly endless stream of recommended videos in their Reels feed showing disturbing scenes of violence, such as people being beaten or killed. Even users who had set their "Sensitive Content Control" to the strictest level were not spared the unsettling content.

Meta did not say what specifically caused the technical glitch.

This incident comes at a time when Meta is putting in extra effort to boost short-form video engagement on its platforms. With TikTok's future in the US hanging by a thread, Meta is trying to win over users by incorporating features popular on its main competitor, such as extending the video time limit and allowing users to pause Reels with a simple tap.

Meta is also planning to launch a new video creation app, Edits, in the coming weeks. It resembles CapCut, the ByteDance-owned app widely used by creators to produce short-form videos.

Recently, Meta has made some controversial changes to its content moderation policies. In January, the company announced it would eliminate fact-checkers in favor of a user-generated "Community Notes" model for adding context to posts, and said it would focus enforcement only on the most extreme rule violations. The shift is likely to allow more speech on its platforms, but it may also mean less inappropriate content gets removed.

However, it's uncertain whether these changes played any role in the issues reported on Wednesday, which Meta described as an error. When Meta's CEO, Mark Zuckerberg, announced the content moderation updates, he acknowledged that the company would catch less of the "bad stuff" on its platforms, but emphasized the importance of fostering more free speech.

  1. Despite the controversy surrounding its recent content moderation policy changes, Meta continues to prioritize short-form video engagement on platforms like Instagram and is planning to launch a new app, Edits, similar to ByteDance's CapCut.
  2. Although Meta announced it would eliminate fact-checkers and shift its content moderation strategy to a user-generated "Community Notes" model, the company refused to comment on whether these changes contributed to the unsettling content that appeared in some Instagram users' Reels feeds.
  3. The tech giant apologized for the tech glitch that caused disturbing videos to appear in Instagram users' Reels feeds, prompting calls for improved moderation measures, especially in light of Meta's latest content moderation policy changes.
