Instagram aims to end bullying on platform with new AI-powered tools
CAPE TOWN – Instagram announced that it has launched two new features designed not only to curb bullying on its platform but also to make users think twice before posting offensive comments online.
The first is a notification tool that uses artificial intelligence (AI) to detect potentially offensive or bullying comments. It then sends a pop-up warning asking users whether they are sure they want to post the comment.
(The AI tool. Photo: Instagram)
Adam Mosseri, Head of Instagram, said in a statement: “In the last few days, we started rolling out a new feature powered by AI that notifies people when their comment may be considered offensive before it’s posted."
"This intervention gives people a chance to reflect and undo their comment and prevents the recipient from receiving a harmful comment notification. From early tests of this feature, we have found that it encourages some people to undo their comment and share something less hurtful once they have had a chance to reflect.”
The second feature, called Restrict, will first be tested before being rolled out and is designed to put an end to unwanted interactions on your posts. Once you restrict someone, comments on your posts from that person will be visible only to that person.
"You can choose to make a restricted person’s comments visible to others by approving their comments. Restricted people won’t be able to see when you’re active on Instagram or when you’ve read their direct messages," said Mosseri.
(The restricted feature. Photo: Instagram.)
BUSINESS REPORT ONLINE