Instagram’s ‘sensitive content’ controls will soon filter out all recommended content

Last year, Instagram added a way for users to filter some types of “sensitive” content from the Explore tab. Now Instagram is extending that setting to allow users to disable that content in app-wide recommendations.

Instagram doesn’t offer much transparency on how it defines sensitive content, or even what counts as sensitive. When it introduced the controls last year, the company framed sensitive content as “posts that don’t necessarily violate our rules, but could potentially be upsetting to some people — such as posts that could be sexually suggestive or violent.”

The expanded content controls will soon apply to search, Reels, hashtag pages, “accounts you might follow” and in-feed suggested posts. Instagram says the changes will roll out to all users in the coming weeks.

Rather than letting users mute specific content topics, Instagram’s controls offer only three settings: one that shows you less of this content, the default setting, and an option to see more sensitive content. Instagram users under the age of 18 cannot choose the latter setting.

In a Help Center post explaining the content controls in more detail, Instagram describes the category as content that “impedes our ability to promote a safe community.” For Instagram, that includes:

“Content that can depict violence, such as people fighting. (We remove graphically violent content.)

Content that may be sexually explicit or suggestive, such as photos of people in see-through clothing. (We remove content that contains adult nudity or sexual activity.)

Content that promotes the use of certain regulated products, such as tobacco or vapor products, adult products and services, or pharmaceutical drugs. (We remove content that attempts to sell or trade most regulated goods.)

Content that may promote or portray cosmetic procedures.

Content that may attempt to sell products or services based on health-related claims, such as promoting a supplement to help a person lose weight.”

In the images accompanying the blog post, Instagram notes that “some people don’t want to see content on topics like drugs or firearms.” As we noted when the option was introduced, Instagram’s lack of transparency about how it defines sensitive content, and its decision not to give users more granular control, is troubling, especially given its choice to lump sex and violence together as “sensitive.”

Instagram is a platform notorious for its hostility towards sex workers, sex educators, and even sexually suggestive emoji. The update is generally more bad news for accounts affected by Instagram’s aggressive sexual content parameters, but those communities are already used to bending over backwards to stay in the good graces of the platform.

From where we sit, it’s not at all intuitive that a user who doesn’t want to see posts pushing scams and diet culture would also be averse to pictures of people in see-through clothing, but Instagram is clearly painting in broad strokes here. The result is a tool that invites users to switch off an opaque blob of “mature” content rather than a meaningful way to easily avoid the things they’d rather not see while browsing Instagram’s algorithmic feeds.
