Mental health experts warn Big Tech about image-editing apps aimed at children
The Mental Health Foundation, the University of Birmingham and the Cochrane Common Mental Disorders Group – alongside representatives from 12 other organisations and experts by experience – are calling for the introduction of guidelines to protect children from image-editing technology.
The expert-led group conducted a rapid evidence review of the effects and regulation of image-editing apps and has today published a series of recommendations for the sector.
One of the group’s recommendations is that Google Play (Android apps) and the App Store (Apple apps) should introduce specific guidelines for app developers that give equal weight to risks of harm to mental health and to physical health. The group also recommends that all body and face-editing apps should be age-restricted, and that this restriction should be enforced, as some apps are targeting children as young as five years old.
These recommendations build on the body image campaigning work that the Mental Health Foundation has led in recent years. In July 2018, the charity wrote to the Advertising Standards Authority warning that strategically placed adverts that appeared during ITV’s Love Island "painted a false picture of perfection" and "exacerbated young people’s insecurities". The Advertising Standards Authority agreed that the advert breached advertising rules and withdrew it.
In Mental Health Awareness Week 2019, the charity published ‘Body Image: How we think and feel about our bodies’, a detailed report on the relationships between body image and our mental health.
Dr Antonis Kousoulis, Director for England and Wales at the Mental Health Foundation, commented: “Image-editing technology is growing largely unchecked and under the radar, in a space where the potential to negatively impact on children and young people’s lives is significant. We have written to five big tech companies to share our concerns and offer our expertise to co-produce guidelines that safeguard and protect children.
“Our work demonstrates that there are serious societal pressures on body image that are driven by the commercial sector. We must act now to understand how standalone image-editing apps, and popular in-app filters on platforms such as TikTok and Instagram, influence children and young people’s body image and their mental health. All of us, including multinational companies, have a responsibility to protect children from technology which is harmful to their mental health.”
Having a healthy body image is important for our mental health [1]. Research has found that higher body dissatisfaction is associated with a poorer quality of life and psychological distress [2], a higher likelihood of depression symptoms, and the risk of unhealthy eating behaviours and eating disorders.
A UK-wide survey by the Mental Health Foundation of 1,118 teenagers [3] (13-19 years old) in 2019 revealed that one in four girls and one in ten boys had edited photos of themselves in order to change their face or body shape because of concerns about their body image. One in eight (12%) young adults [4] (aged 18-24) had edited pictures of themselves to change their face or body shape in the previous year.
Among young people aged 18-24, more than one in three (37%) had felt shame about their body image in the previous 12 months and one in four (25%) said they felt so stressed by body image and appearance that they had felt overwhelmed or out of control.
Heather Widdows, Professor of Global Ethics at the University of Birmingham, author of ‘Perfect Me’ and founder of the #everydaylookism campaign, added: “As our culture becomes more visual, the pressure to have a perfect body intensifies, as do feelings of failure and shame. Body image anxiety is overwhelming for many and apps like this add to pressure and give the message that your face and body need changing and aren’t good enough as they are. We need to take this seriously, it’s not just fun and games.”
As part of the evidence, the group heard from young people who had used these apps; from people with experience of eating disorders or body dysmorphia; and from academic experts, who highlighted the challenges of collecting good quality data in a rapidly changing landscape.
Dr Kousoulis added: “We cannot wait for the academic research to catch up with the rate of development of social media, nor can we expect that public education alone will be enough to improve people’s health. Using the ‘precautionary principle’ by taking action on the balance of evidence, as a society we need to take steps to protect young people from harmful factors over which they have no personal control.”
The full briefing, which includes six recommendations, is available to download here. Visit mentalhealth.org.uk/bodyimage to find out more or follow #BeBodyKind on social media.
Notes to editors:
Interviews with case studies or expert spokespeople are available on request.
For further information and interview requests contact Muireann Kirby in the Mental Health Foundation Press Office on 07761274159 or at [email protected] or [email protected]
Based on this accumulating evidence, the Mental Health Foundation, University of Birmingham and the Cochrane Common Mental Disorders Group propose the following actions:
1. Body positivity and kindness activists, interested organisations and individuals should engage with the #EverydayLookism campaign. Negative comments about other people's bodies matter. When we shame bodies, we shame people. These are lookist comments. Visit #EverydayLookism and #BeBodyKind on social media to find out more.
2. Google Play and App Store should update their guidelines for developers to explicitly include ‘mental health’ in the range of harms that are unacceptable. We recognise there will be issues regarding liability. However, guidance on the definition of mental health can take this into account and this can be co-produced with the Mental Health Foundation and other experts. We recommend that at a minimum the guidelines are updated to clearly state that apps should not promote images that are discriminatory, shaming, or triggering of past trauma or eating disorders.
3. Google Play and the App Store should make it mandatory that all body and face image-editing apps are rated PEGI 12/16 and 13+ respectively, to ensure that children and young people who are below the legal age for having a social media account (13 years old) are not using these apps. All in-app purchases for additional features should be restricted to people over the age of 18, to ensure predatory promotion is restricted. Currently, only a handful of these apps are restricted in this way, and most have no age restrictions, often allowing children as young as five to download and use them.
4. Research should focus on understanding the features of image-manipulation apps that are most harmful to body satisfaction and mental health. Research can take an ethical perspective in defining more clearly the line which determines which image-manipulation apps or features are acceptable and which are unacceptable because of the high risk they pose to mental health.
5. Researchers and experts who design services should consider developing new social media literacy training for children and young people. All training and other programmes should employ a co-production approach, involving children and young people in their development, as well as parents and carers. There appears to be relatively little applied research that looks at parents in relation to body image and modelling positive behaviours. Given that parents have a significant influence on the way in which children view their bodies, parents need to be included more in the discussion about image-manipulation apps.
6. Everyone should be more aware that if they see an advert in a magazine, on television or online that they think presents an unhealthy body image as aspirational, they can complain to the Advertising Standards Authority. This includes online or other predatory advertising in relation to image editing apps. Advertisements that promote these apps to more vulnerable groups, for instance young people belonging to BAME communities, warrant greater scrutiny and investigation.
2. Griffiths, S., Hay, P., Mitchison, D., Mond, J. M., McLean, S. A., Rodgers, B., ... & Paxton, S. J. (2016). Sex differences in the relationships between body dissatisfaction, quality of life and psychological distress. Australian and New Zealand Journal of Public Health, 40(6), 518-522.
3. Research undertaken by YouGov Plc. Total sample size was 1118 teenagers. Fieldwork was undertaken between 15th- 21st March 2019. The survey was carried out online. The figures have been weighted and are representative of all GB children (aged 13-19).
4. Research undertaken by YouGov Plc. Total sample size was 4505 adults. Fieldwork was between 25th - 26th March 2019. The survey was carried out online. The figures have been weighted and are representative of all UK adults (aged 18+).
About the expert-led group on image-editing apps
The Mental Health Foundation led a group of representatives from 12 other organisations and experts by experience to review the image editing app industry. This builds on the charity’s body image campaigning work.
The University of Birmingham supported the conception of the work, participated in the roundtable, and contributed expert input to the briefing, campaign and recommendations. The Cochrane Common Mental Disorders Group oversaw the work from start to finish, conducted and collated the rapid evidence review underpinning this work, and participated in the roundtable.
Representatives from the following organisations and academic institutions also participated in this work: University of Essex, University of York, University of Strathclyde, King’s College London, University of Bath, Queen Mary University of London, Aneurin Bevan University Health Board, Scottish Advisory Group on Healthy Body Image, Joint Council for Cosmetic Practitioners, and the Centre for Appearance Research.