Public figures aren’t the only people who deal with unwanted comments on Instagram, but until today, only VIPs had access to tools that allowed them to fight back. Instagram is rolling out a new moderation feature that gives you more control over what people can say in your comments.
The tool lives behind the gear icon, where you adjust your settings. A new option called Comments lets you add words that people can’t post. If you don’t know where to start, Instagram offers some default keyword suggestions under “Hide Inappropriate Comments,” or you can create custom keywords. Those custom entries can include words, numbers, or emojis.
Turning on comment moderation applies your list of banned words to every photo you’ve ever posted to Instagram. Comments containing those words aren’t deleted; they’re just hidden from view. If you turn off comment moderation, the comments will reappear.
Why this matters: Instagram took some heat after people noticed that snake emojis posted in Taylor Swift’s Instagram comments were being deleted automatically—too quickly to be the work of human moderators. Instagram ’fessed up to a batch-delete tool designed for so-called “high-volume” accounts like Swift’s.
Now many of Instagram’s 500 million active users have access to the new moderation tool, though it doesn’t currently support languages that don’t use spaces between words. Even so, that’s better than Twitter’s efforts to combat harassment, which so far have done little to curb abuse. Twitter is reportedly working on a keyword-filtering tool similar to Instagram’s, but there’s no word on when that feature will roll out.