Facebook is an app used by billions of people worldwide. The company recently announced that it is making serious improvements to how it handles content related to suicide and self-harm. Facebook has been consulting health experts, who advised that such content can be harmful to a specific group of people.
To keep its users safe, the company has decided to tighten its policies. Rules surrounding these sensitive issues will be carefully reviewed, and strict reforms will be put in place.

These measures are intended to protect vulnerable users from being harmed mentally, emotionally, or socially. The company will also expand the resources available to those in need.
Facebook announced the changes on World Suicide Prevention Day, September 10. "As a global online community, keeping people safe on our apps is incredibly important to us," wrote Antigone Davis, Facebook's Global Head of Safety.
Under the reformed policies, users can no longer post graphic images of cutting. This step is meant to avoid triggering self-harm in other users. Such photos will be harder to find not only on Facebook but also on Instagram. Images of healed cuts may still be posted, but they will be covered by a sensitivity screen.
This should stop the sites from unintentionally promoting any kind of self-harm. Many social media platforms are also addressing the troubling ways eating disorders are presented online.
Facebook also says that users are prohibited from posting any content that could promote eating disorders. This includes instructions for drastic weight loss and depictions of collarbones, ribs, concave stomachs, thigh gaps, or protruding spines, which are usually shared alongside related keywords.

Moreover, the site will continue sending resources to users who post content related to self-harm, eating disorders, or suicide. Davis also said that Facebook took action on around 1.5 million pieces of suicide- and self-harm-related content between April and June of this year.
Approximately 94% of those posts were found by Facebook before they were seen and reported by other users. Instagram, meanwhile, placed sensitivity screens on about 800,000 posts during the same period, and 75% of that content was detected by the app itself before being reported by other users.
In addition, to ensure the safety of its users, Facebook has decided to hire health and well-being experts to help it improve further on safety. Facebook wants to create an environment free from triggering content, a safer space for those who are struggling with mental health issues.