Ask an Expert: How Can We Help Teens Posting About Self-harm?

An unseen teen on their bed with cozy socks and a sweater and fairy lights in the background; they're scrolling on their phone.


Key Takeaways

  • A recent study found that Instagram posts containing hashtags related to self-harm have been on the rise.
  • More posts mean a higher risk of exposure to the content, which can be harmful.
  • Experts say that policy changes, such as improved post flagging, and increased and organized attention from parents, schools, and friends can help.

Hashtags on social media can help popularize ideas, but what happens when hashtags help popularize dangerous thoughts and actions?

A recent study suggests that posts related to non-suicidal self-injury are increasing among social media users, especially on Instagram. Posts containing hashtags like #Cutting, #SelfHarm, and #HateMyself in the captions rose from around 60,000 in January and February 2018 to 112,000 in December of that year.

What Is Non-suicidal Self-Injury?

Non-suicidal self-injury (also referred to as self-harm) is defined as "deliberately hurting one's own body without clear suicidal intent." For example, it might involve cutting or burning one's skin. These acts can be a dangerous way to cope with emotional pain, intense anger, and/or frustration.

Past research estimated that about 1 in 5 people will experience self-harm at some point in their lives. The average age for beginning to experiment with self-injury is 12.

Although the data is now three years old, it suggests that self-harm-related content has been increasing. It also showed that posts containing self-harm-related hashtags frequently contained other tags related to suicide, depression, general mental distress, anxiety/panic, and eating disorders.

The researchers said that the study's findings suggest that Instagram users associated non-suicidal self-injury with psychological distress. Exposure to related content, especially for teenagers, may popularize self-harm as a way to cope with that distress.

In light of the data, the researchers recommend that mental health professionals consider their clients' online activity when making treatment plans. However, Diana Herweck, PsyD, LMFT, LPCC, a psychotherapist and clinical director at the University of Phoenix, told Verywell that more recommendations can be made, from the clinical to the corporate world.

Verywell: How have you seen self-harm and social media interact?

Herweck: I have increasingly seen social media being used by those who self-harm, and not just teens. While those making the posts might not be the ones reporting them, others who view the posts often do.

I hear from more and more adolescents (and the counselors who are working with them) who mention either wanting to report such posts or having reported them to the sites directly. Adolescents and younger children have shown me posts on different platforms in which others have shared their own self-harming behaviors and even ways to hide such behaviors from teachers, counselors, and parents.

Verywell: What concerns do you have about that interaction?

Herweck: We know that social media for many, including these kids, is about getting more followers, more likes, and more comments. While sites often have minimum age requirements, we know younger kids can easily access them. This means young children have access to the same images and content. They start following the trendsetters and want to be like them. This can be dangerous for everyone.

The other piece of this is our kids often know more about social media than the adults in their lives. How many of us go to our kids to ask about the latest sites? What might take us 10 minutes or more to find, they can often access in a matter of seconds. In some cases, that is great news! In others, it can be quite dangerous. 

Even if the original poster of a message does not make their content “shareable,” it can be shared easily via a screenshot. This means the content can be shared with hundreds, even thousands or more, in a matter of seconds. 

Verywell: How could social media companies change to minimize harm?

Herweck: This is something I have talked with several kids about in the past. Although they like getting all the likes, followers, hearts, and positive comments on their posts, they often mention it would be better if these things were not available. It seems there is too much competition to be the next social media star, influencer, or trendsetter. This in itself can be damaging to kids.

Social media sites incentivize their users. Without calling out any companies directly, if users reach a certain number of followers or clicks, they can share their videos and posts and start getting paid through advertisements. This is a big draw for kids, as they see influencers across all platforms making a living (or at least appearing to) from this "job." I wonder what would happen if these incentives were removed.

Some sites do have policies against posting graphic images and inappropriate content. It's not always clear what counts as inappropriate, though. There is no foolproof system in place yet to identify these posts automatically (although artificial intelligence is improving). This means posts are not blocked and only get removed (or even considered for removal) if they are reported.

People get upset when their posts are removed for questionable content, or when they get blocked for a few days or longer. However, that's a small sacrifice if it means these posts will be blocked or removed more quickly. I think it would require a human, not a computer, to review posts, though.

There are some helpful posts that might include some of the same wording or images. For instance, some people make posts about how they got help or how they overcame their self-injurious behavior. 

Some sites have pop-ups, sort of as a warning to the poster. Instagram, for instance, has a pop-up when a post might seem similar to others that have been reported. It says something like, "Keep IG a supportive place." While this does not prohibit the message, it can at least give someone time to pause and decide if the post should be made. Maybe this could go further, perhaps even prohibiting the post for a short time. If someone waits and decides that they still want to post, they can do so after the time has passed.

I would love to see these social media companies invest in ways to block and better manage content. I wonder if they could have teams of staff that could have these discussions and review removed or questionable posts, perhaps even work with mental health professionals? We have to be able to increase the benefits of social media while limiting the risks. 

Verywell: If someone expresses a desire to self-harm on social media, what's a good way to reach out?

Herweck: Blocking and removing these posts from social media is only one piece of the puzzle. While those steps help keep the posts from spreading, they don’t do anything to help the person who posted in the first place.

Self-injury does not necessarily mean someone is suicidal. It is often used to self-regulate, a way to cope with what the person is feeling or experiencing. Self-harm is not a long-term solution for self-care, though, and either way, there is a risk of suicidal ideation, even suicide attempts.

Research has shown a strong association between self-harming behavior and suicidality. That is why all self-harming behavior needs to be taken seriously.

I’ve often wondered if there is some way to get a mental health team involved when posts are flagged and reported. There are ways to get police and even the FBI involved when certain social media posts are discovered. What if we had the same [system for] alerting mental health teams, school counselors, or others?

If those seeing the posts know the person making the self-harm statements or graphics (not just as a "follower" on social media), they can reach out directly to school counselors. I know many kids and parents who have done so. Those seeing the posts can take a screenshot to share with the professionals (teachers, counselors), as the original post might be removed by the social media company or by the original poster.

The problem—and its solution—is bigger than the social media companies, though. It involves parents and even the education system.

While society is constantly changing, our education system is slow to change. Teaching some of these social and emotional skills in the classroom could be a big help. Academic knowledge is needed, but so is emotional and social development.

Including education on using the internet and social media would be helpful as well. Parents and caregivers also need to be involved and educated, and this is another topic for discussion. I now include questions about social media use in my own intakes with clients (teens or otherwise), and I educate my students and interns to do the same. This was certainly not an issue in my own training, but it's one mental health providers need to be aware of today.

What This Means For You

If you or someone you know might be engaging in self-harm, you can text HOME to 741741 to reach the Crisis Text Line or call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255). All contact is confidential and available 24/7, in English and in Spanish.

2 Sources
  1. Giordano AL, Lundeen LA, Wester KL, et al. Nonsuicidal self-injury on Instagram: examining hashtag trends. Int J Adv Counselling. 2021. doi:10.1007/s10447-021-09451-z

  2. UGA Today. Adolescents use social media to post about self-injury.

By Sarah Simon
Sarah Simon is a bilingual multimedia journalist with a degree in psychology. She has previously written for publications including The Daily Beast and Rantt Media.