Instagram has a setting that lets users choose how much fact-checked content should be demoted in their feeds.
In a blog update yesterday, Meta revealed that it is bringing a little-known Instagram option to Threads, letting users control how much fact-checked material they see in their feeds. Meta says its fact-checking is meant to combat misinformation, so the setting effectively lets users decide how much contested content they want to see on the platform.
There are three options: “Don’t Reduce,” “Reduce,” or “Reduce More.” Although none of these choices hides content completely, they affect the ranking of posts that “contain false or partially false information, changed content, or lack context.”
Users can access the setting by tapping the two lines at the top-right corner of the profile tab, then going to Account > Other account settings (which opens Instagram), then Content preferences > Reduced by fact-checking.
It’s an appealing concept. Who hasn’t wished for a “drama filter” in their lives? Meta told NBC News in a message that the options were meant to give users more control over the algorithm that ranks posts in their feed, and that it was responding to user demands to “have greater power to decide what is shown on our apps.”
NBC News pointed to a post with thousands of likes claiming the change was intended to censor content related to the Israel-Hamas war. Whether or not that’s the case, a tool that invites users’ complicity leaves plenty of room for censorship.
Meta currently relies on third-party fact-checkers to determine whether content on Instagram and Facebook is factual, and their determinations will apply to Threads indirectly. Meta says that although fact-checkers can’t directly rate Threads content, it will transfer ratings from Instagram and Facebook to “nearly identical content on Threads.”
Meta says Instagram has offered fact-checking ranking options for years but never properly announced them. According to The Economic Times, Meta added the same feature to Facebook in May; a spokesperson said the move was meant to “make user controls on Facebook consistent with those already present on Instagram.”
Moderation has never kept pace with the rapid growth of online communication, and social networks haven’t found a solution to the problem. In some cases, their efforts have drawn anger or raised questions about the federal government’s involvement.
Meta must moderate its platforms, and not just because of laws in the European Union or the US’s own regulations. X (formerly Twitter) is a good example of what happens when a company gives up on moderation: its revenue has reportedly dropped as a result of unmoderated and increasingly charged rhetoric.