Pretty as a pixel

14th Mar 2022
Influencing policies
Body image

This content mentions eating disorders and body image or generally discusses weight, which some people may find triggering.

As the Government prepares to launch its Online Safety Bill, we look at the impact of image-editing apps on mental health and the changes we'd like to see to the Bill to help protect young people.

How image-editing apps are hurting our mental health and the urgent need for action

YouTuber Haley is trying to show girls how their Instagram heroes tweak their bodies digitally, presenting unrealistic – in fact, unreal – images of themselves to their followers.

“First, we’re going to fix my hairline. I hate my hairline. It’s really, like, weird.”

Yet as she demonstrates the image-editing app FaceTune, she becomes visibly excited, drawn into what she can do to perfect herself. In seconds her hairline is ‘fixed’.

She moves on. “We’re going to just make my waist teeny-tiny.” “Of course,” she adds.

She’s busting the myths of how the Insta-famous get their looks, but at the same time can’t hide her delight as a flick of her finger gives results one would need to pay thousands to achieve surgically, and with considerably less discomfort.

These apps are, quite clearly, fun. Fun like a house of mirrors if you push the extreme waist-shrinking slider to its maximum. Or, if you have a little more restraint, fun like having your own Vogue airbrushing team: sharpening your cheekbones just slightly, ironing out those wrinkles you always get when you smile for the camera.

As with many of life’s pleasures, though, we ought to ask whether these apps are good for us. The answer is unsurprising. Creating perfect versions of our bodies - the bodies we grow up in, live in and know, with all their imperfections - and launching these buffed, slimmed avatars into a digital parallel universe is not, it turns out, good for our mental health.

We were so concerned about the shame and distress people can feel about their bodies that we made it the focus of Mental Health Awareness Week in 2019. As part of this, we carried out research showing that nearly half of teenagers have worried about their body image, worries they directly attribute to social media. It affects girls more, but boys are not immune.

In addition to these worries and the shame young people can feel about themselves, the research shows that having a negative body image is associated with severe mental health problems such as body dysmorphic disorder and eating disorders (like anorexia and bulimia).

We should also ask why people feel they need to alter their images in this way, and what can be done about it. These are timely questions, as the Government prepares to launch its Online Safety Bill. The Bill attempts to balance freedom of expression with the duties and responsibilities technology companies have to their users. It covers a lot of bases, attempting to address everything from terrorism-related content to online harassment.

It says nothing, though, on the more subtle harm of promoting unrealistic images of so-called ‘perfect’ bodies. Unlike, say, sending a death threat online, this is a harm that functions by accretion.

Let’s not pretend a single instance of airbrushed cellulite will hurt anyone very much at all.

But when all you see is conventionally ‘perfect’ legs, tummies, faces, served to you algorithmically because that’s what keeps you scrolling, that’s a different matter. Scrolling even as your mood sinks, and you start to experience your own body as more and more inadequate, a sign of failure, or an object of disgust. That’s when young people start to face real risks.

To be clear, teenagers are not to blame for this. Image-editing apps are made by profit-making companies that know exactly what they’re doing. They function within a social media ecosystem designed to keep people scrolling, regardless of the emotional consequences.

Much of the most dangerous content is not really made by individuals at all, but by ‘influencers’ working in paid partnerships with beauty companies and other advertisers - partnerships that are sometimes declared transparently, but often are not.

The Mental Health Foundation is calling for three changes to the Bill, to protect young people from this insidious, daily bombardment.

We want apps designed to adjust users’ bodies and faces to be available only to adults.

We want a requirement for individuals to have control over the types of content presented to them algorithmically on social media, with the safest setting being the default. They can then avoid being presented with such an overwhelming volume of images of conventionally ‘perfect’ bodies.

And we want users to be given control over the type of advertising they receive, so they can avoid being presented with excessive amounts of advertising showing this sort of imagery.

These are proportionate recommendations. As it happens, they would do more than just protect young people from this type of material: we could all do with some more control over the content served to us by secret and impenetrable algorithms.

We and others have been raising the alarm loudly and clearly on these dangers for some time now. The new Bill presents the perfect opportunity to fix this part of the online world that’s harming us so greatly: the Government must seize it.

If you feel affected by the content you have read, please see our get help page for support.

Related content

Body image and mental health

Our research found that 30% of all adults have felt so stressed by body image and appearance that they felt overwhelmed or unable to cope. That’s almost 1 in every 3 people.

Image-editing apps and mental health report

We consider it vital that we take action to understand how these apps influence people’s body image and their mental health. It is an industry that is growing largely unchecked, in a space where the potential to negatively impact people’s lives is significant.
