Instagram tackles self-harm and suicide with new reporting tools, support options

Posted on October 20, 2016 in Written Content

While Twitter continues to struggle with rampant bullying and abuse, Instagram is instead stepping up its game when it comes to keeping users safe from harm. The Facebook-owned photo-sharing network this week began rolling out a new reporting tool that lets its users anonymously flag friends' posts about self-harm. This, in turn, will prompt a message from Instagram to the user in question, offering support, including tips and access to a help line, all available directly in the app itself.

The system itself is fairly simple to use, but it addresses a serious need on a social network that continues to have a large teenage and young adult audience.

When you see a post where a friend is threatening self-harm or suicide, you may not always feel comfortable broaching the subject with them. Or you may not know what to say. In other cases, you may be following accounts of people you don't know that well (or at all), and don't feel it's your place to speak out.

Instagram is instead offering a different option. When a post is anonymously flagged, the friend will be sent a support message that reads: "Someone saw one of your posts and thinks you might be going through a difficult time. If you need support, we'd like to help."

The recipient can then click through to see a list of support options, which includes a suggestion to message or call a friend, access more general tips and support, or contact a help line, which varies based on the user's location. Forty organizations around the world are partners on the help line side of the system.
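
To make that flow concrete, here is a minimal sketch in Python of how an anonymous report might route to the support message and a localized help line. The User class, method names and help line mapping are hypothetical illustrations based on the description above, not Instagram's actual code or API.

```python
from dataclasses import dataclass

# Hypothetical stand-in for a user account; prints to the console
# instead of delivering a real in-app message.
@dataclass
class User:
    location: str

    def send_in_app_message(self, text: str, options: list[str]) -> None:
        print(text)
        for option in options:
            print(" -", option)

SUPPORT_MESSAGE = (
    "Someone saw one of your posts and thinks you might be going through "
    "a difficult time. If you need support, we'd like to help."
)

# Hypothetical subset of the roughly 40 partner help lines, keyed by location.
HELPLINES = {
    "US": "National Suicide Prevention Lifeline",
    "UK": "Samaritans",
    "AU": "beyondblue",
}

def handle_anonymous_report(reported_user: User) -> None:
    """Send the support message without revealing who flagged the post."""
    helpline = HELPLINES.get(reported_user.location, HELPLINES["US"])
    options = [
        "Message a friend",
        "Call a friend",
        "See general tips and support",
        f"Contact a help line: {helpline}",
    ]
    reported_user.send_in_app_message(SUPPORT_MESSAGE, options=options)

handle_anonymous_report(User(location="UK"))
```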

The company also says it worked on the new tools with the National Eating Disorders Association; Dr. Nancy Zucker, Associate Professor of Psychology and Neuroscience at Duke; and Forefront, which is led by academic researchers at the University of Washington. Other organizations, including the National Suicide Prevention Lifeline and Save.org (U.S.), Samaritans (U.K.), and beyondblue and headspace (Australia), provided input, too.

[Image: updated self-injury reporting flow]

What's interesting about Instagram's tool is that it isn't only triggered by anonymous reporting. Instagram's app will also direct users to the support message when they search the service for certain hashtags, such as the banned search term #thinspo, which is associated with eating disorders.
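
A hedged sketch of that search-triggered behavior is below; the flagged-term set and function names are assumptions for illustration, and the real list of banned terms is larger and maintained by the platform.

```python
# Hypothetical illustration: searching for a flagged hashtag surfaces
# the support prompt instead of results.

FLAGGED_HASHTAGS = {"#thinspo"}  # assumed subset of the real banned-term list

def search_hashtag(query: str) -> list[str]:
    if query.lower() in FLAGGED_HASHTAGS:
        # Same support message used by the anonymous-report flow.
        print("If you need support, we'd like to help.")
        return []  # banned terms return no results
    return [f"posts tagged {query}"]  # placeholder for a real search backend

search_hashtag("#thinspo")
```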

Dealing with abuse (and self-directed abuse)

The move is one of several changes Instagram has made in recent days to limit abuse on its network. In September, it also made it possible for anyone to filter their comments using customizable blocklists, meaning they can disallow anyone from posting certain explicit words or bullying terms in their Instagram comments.
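
Conceptually, a blocklist filter of this kind is simple. The sketch below shows one plausible word-level version; the names and matching rules are assumptions for illustration rather than Instagram's implementation, which is certainly more sophisticated.

```python
# Sketch of per-account comment filtering with a customizable blocklist.

def is_comment_allowed(comment_text: str, blocklist: set[str]) -> bool:
    """Reject a comment if it contains any word on the account's blocklist."""
    words = {word.strip(".,!?").lower() for word in comment_text.split()}
    return words.isdisjoint(blocklist)

# An account owner blocks specific bullying terms:
my_blocklist = {"ugly", "loser"}
print(is_comment_allowed("Great photo!", my_blocklist))   # True
print(is_comment_allowed("What a loser.", my_blocklist))  # False
```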

Steps like these are critical to establishing the community's tone. Unregulated free speech can devolve into anonymous bullying, which is now so prevalent on Twitter that it has at least partially impacted the network's ability to find an acquirer. (Reportedly, Disney stepped away from an acquisition because of this problem.)

Posts about self-harm are a different matter than bullying, of course, but they fall under the broader umbrella of user protection and safety.

[Image: updated self-injury support tools]

They are about establishing a community where it's safe to share, but also one where certain types of sharing can be moderated for potential problems, whether that's someone posting harmful comments directed at another person, or someone posting harmful comments directed at themselves.

Not having these sorts of limitations and policies in place can lead to dangerous results, especially when networks cater to a younger audience. For instance, the teen Q&A network Ask.fm became so well-known for bullying back in the day that it ended up being a contributing factor in a series of teen suicides.

Over the years, the major social media companies have learned from past tragedies and now offer better ways to protect their users. Many partner with support organizations to craft and run PSAs when users search for harmful terms; Tumblr, Pinterest and Instagram have all done so. And most today use a combination of automated systems, human moderators and flagging tools to address problems that range from user-led searches for concerning terms and tags (like "thinspo" or "suicide") to prompts that are triggered by users' posts themselves.

Instagram's new flagging tools are modeled after those already in place on parent company Facebook, and they're an example of how the smaller network can benefit from the infrastructure Facebook has already built to address problems related to self-harm.

Last year, Facebook launched an almost identical tool, right down to the wording, for Facebook users in the U.S. And earlier in 2016, it rolled this out to international users as well.

Along with the launch of the new tools, Instagram also partnered with Seventeen on a campaign focused on body image and self-confidence, National Body Confidence Day. The campaign is running now under the hashtag #PerfectlyMe. The November issue of Seventeen will also include 16 pages in support of body confidence and #PerfectlyMe.

Read more: https://techcrunch.com/2016/10/19/instagram-tackles-self-harm-and-suicide-with-new-reporting-tools-support-options/