Adam Mosseri, the head of Instagram, said “no graphic self-harm images” would in future be allowed on the platform.
In a comment released by the social media giant as it seeks to deal with the storm of criticism that has followed Molly’s death, Mosseri said: “Nothing is more important to me than the safety of the people who use Instagram. We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable in our community.
“We will not allow any graphic images of self-harm, such as cutting, on Instagram, even if it would previously have been allowed as admission. We will also make it harder for people to discover non-graphic, self-harm related content and we won’t be recommending it. We are not removing non-graphic self-harm related content from Instagram entirely, as we don’t want to stigmatise or isolate people who may be in distress.”
The news was first announced in an interview with the Daily Telegraph, with Mosseri saying that the social media company would alter how people searched for images so that self-harm related content was harder to find. This involves taking content off “Explore”, “Search”, hashtagged pages and account recommendations.
“If there is self-harm related content that stays on the platform even if it’s admission-orientated, maybe someone has a picture of a scar and says I am 30 days clean, it’s going to be much more difficult to find,” Mosseri told the Telegraph.
Instagram has come under fire from ministers and charities over its failure to tackle imagery on its site.
Asked why Instagram had taken so long to tackle the issue, Mosseri said: “We have not been as focused as we should have been on the effects of graphic imagery on anyone looking at the content.
“That is something that we are looking to correct and correct quickly. It’s unfortunate it took the last few weeks for us to realise that. It’s now our responsibility to address that issue as quickly as we can.”
The move follows significant public anger over Molly’s death. Her father, Ian Russell, said he believed Instagram had “helped kill my daughter”. When her family looked at her Instagram account after her death, they found distressing material about depression and suicide.
Speaking on BBC Radio 4’s PM programme, the digital minister, Margot James, said the government would “have to keep the situation very closely under review to make sure that these commitments are made real – and as swiftly as possible”.
At the moment graphic images of self-harm are reported to Instagram and then taken down, but Mosseri said the company was looking at how technology could be used to improve things.
He added: “Historically, we have allowed content related to self-harm that’s ‘admission’ because people sometimes need to tell their story, but we haven’t allowed anything that promoted self-harm.
“But, moving forward, we’re going to change our policy to not allow any graphic images of self-harm.”
Mosseri said some self-harm images would be allowed to remain on Instagram. “I might have an image of a scar and say, ‘I’m 30 days clean,’ and that’s an important way to tell my story,” he said.
“That kind of content can still live on the site but the next change is that it won’t show up in any recommendation services so it will be harder to find.”
Asked whether he would resign if the problem was not solved, Mosseri said he would have “a long thought” about how well he was doing in his role if self-harm content was still on the site in six months.
Instagram’s decision comes as large social media companies such as Facebook, which owns Instagram, prepare to battle with the British government over the future of internet regulation in the UK.
The government is considering imposing a mandatory code of conduct on tech companies, which could be accompanied by fines for non-compliance, prompting a substantial behind-the-scenes lobbying campaign by social media sites.
The culture secretary, Jeremy Wright, is due to unveil the government’s proposals at the end of this month, helping to spur Facebook into swift action.
Sarah Marsh and Jim Waterson