Instagram's new ban on suicide content


Instagram has implemented a new ban on graphic self-harm content, extending its policies to cover fictional depictions of suicide and self-harm — such as cartoons, memes and drawings — as well as content describing methods of suicide or self-injury.

“Last month, we further expanded our policies to ban more types of self-harm and suicide content. We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery,” wrote Instagram head Adam Mosseri, explaining the latest policy change. “We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods.”

Earlier this year, Mosseri met with the UK Health Secretary to discuss the platform's approach to self-harm content. The company had come under considerable public pressure in the UK after the family of Molly Russell — a British student of just 14 who took her own life after viewing suicide-related material on Instagram — spoke publicly about the tragedy to the BBC.

In February, the Facebook Inc.-owned platform announced that it would ban graphic images depicting acts of self-harm, such as cutting, and would restrict access to non-graphic self-harm content, such as images of healed scars, by no longer recommending it in search.

At the time, Instagram said it was considering using sensitivity screens to obscure non-graphic self-harm content and was consulting with experts on the matter. It has now decided to go further, removing fictional content related to self-injury as well as content that describes methods of suicide or self-harm.

The company says it has already doubled the amount of self-harm content it takes action on since the policy change: Mosseri states that in the three months following the ban on graphic cutting images, Instagram “removed, reduced the visibility of, or added sensitivity screens to more than 834,000 pieces of content.”

He adds that more than 77% of that content was identified by the platform before being reported by users.

A further policy change is underway, as confirmed by an Instagram spokesperson.

It is not yet clear how long the new policy will take to implement and enforce. Mosseri told BBC News: “It will take time to fully implement,” adding that “it will not be the last step we take.”

In his blog post on the policy change, Mosseri writes that it is “based on expert advice from academics and mental health organizations such as the Samaritans in the United Kingdom and the National Suicide Prevention Line in the United States,” stating: “We aim to strike the difficult balance between allowing people to share their mental health experiences while protecting others from exposure to potentially harmful content.”

“Accounts that share this type of content will also not be recommended in search or in our discovery surfaces, such as Explore. And we will send more people resources with localized helplines like the Samaritans and PAPYRUS in the UK or the National Suicide Prevention Lifeline and The Trevor Project in the United States,” he adds.

He goes on to say that the issues involved are complex ones that “no single company or set of policies and practices alone can solve,” while defending Instagram's decision to continue allowing some suicide and self-harm content, saying “experts tell us that giving people a chance to share their most difficult moments and their stories of recovery can be a vital means of support” and that “preventing people from sharing this type of content could not only stigmatize these types of mental health issues, but might hinder loved ones from identifying and responding to a cry for help.”

“But getting the right approach requires more than a single change to our policies or a one-time update to our technology. Our work here is never done. Our policies and technology have to evolve as new trends emerge and behaviors change,” he adds.

Stay Tuned!
