At a time when reports of suicides linked to the Blue Whale challenge internet game are sending shock waves through the country, Facebook on Friday said it is working with suicide prevention partners to collect phrases, hashtags and group names associated with online challenges encouraging self-harm or suicide.
"We offer resources to people who search for these terms on Facebook," the social media giant said.
The Blue Whale challenge is said to psychologically manipulate players into completing daring, self-destructive tasks over 50 days before the final "winning" step of killing themselves.
Facebook said it also removes content that violates its Community Standards, which do not allow the promotion of self-injury or suicide. Additional resources about suicide prevention and online wellbeing will also be added to its Safety Center, Facebook said.
With these resources, people can access tools to resolve conflict online, help a friend who is expressing suicidal thoughts or get resources if they are going through a difficult time.
Facebook's Safety Center also offers guidance for parents, teenagers, educators, and law enforcement officials to start a conversation about online safety, with localised resources and videos available.
People can also reach out to Facebook when they see something that makes them concerned about a friend's well-being.
"We have teams working around the world, 24/7, who review reports that come in and prioritise the most serious reports like suicide. For those who reach out to us, we provide suggested text to make it easier for people to start a conversation with their friend in need," Facebook said.
"We provide the friend who has expressed suicidal thoughts information about local help lines, along with other tips and resources," it added.