Instagram has said it is "committed to leading the industry in the fight against online bullying", and has rolled out a new AI-powered tool that notifies people when a comment they are about to post may be considered offensive. "Are you sure you want to post this?" the prompt asks, with a timer bar indicating how long you have to take the message back. If you press 'undo' in time, the recipient will never receive a notification of your comment. The worry, of course, is that people can simply ignore the prompt and send the offensive message anyway.

"From early tests of this feature, we have found that it encourages some people to undo their comment and share something less hurtful once they have had a chance to reflect," Instagram said in its statement. "This is especially crucial for teens since they are less likely to report online bullying even when they are the ones who experience it the most," it added.

We tested whether the feature was working on an Android phone in the UAE, and it was. We tried out some messages on a colleague's account (the colleague knew the experiment was happening). "You are so ugly and stupid" was considered offensive enough to trigger the prompt, but "I hate you" wasn't. "I hate your religion" also went through without a prompt, as did "you are fat": both terrible things to say to someone. Stronger suggestions of violence were rejected, however.

Instagram's community guidelines define removable content as that containing "credible threats or hate speech, content that targets private individuals to degrade or shame them, personal information meant to blackmail or harass someone, and repeated unwanted messages". "We do generally allow stronger conversation around people who are featured in the news or have a large public audience due to their profession or chosen activities," the guidelines state.
The guidelines go on to say: "It's never OK to encourage violence or attack anyone based on their race, ethnicity, national origin, sex, gender, gender identity, sexual orientation, religious affiliation, disabilities, or diseases. When hate speech is being shared to challenge it or to raise awareness, we may allow it. In those instances, we ask that you express your intent clearly. Serious threats of harm to public and personal safety aren't allowed. This includes specific threats of physical harm as well as threats of theft, vandalism, and other financial harm. We carefully review reports of threats and consider many things when determining whether a threat is credible."

The other new tool Instagram has rolled out this week is 'Restrict', which lets you select people who won't be able to see when you're online or when you've read their direct messages. Comments on your posts by a restricted person will be visible only to that person, and they will not know you have restricted them. "We've heard from young people in our community that they're reluctant to block, unfollow, or report their bully because it could escalate the situation, especially if they interact with their bully in real life," Instagram said of the rationale behind the feature.

You can find more Instagram tools that may help anyone being bullied <a href="https://wellbeing.instagram.com/safety">here</a>.