Ofcom: one third of UK children use adult social media accounts
Most children aged between eight and 17 in the UK have a profile on at least one social media platform, and many are younger than the minimum age required.
A study commissioned by the regulator Ofcom and carried out by Yonder Consulting revealed that 47% of children aged eight to 15 with a social media profile have a user age of 16 and over, and 32% of children aged eight to 17 have a user age of 18 and over.
In a younger subcategory of eight to 12-year-olds surveyed, the study estimated 39% had a user age profile of someone aged 16 and over, and 23% had a user age of 18 and over.
Despite most platforms having a minimum age of 13, Yonder Consulting found 77% of children aged between eight and 17 who used social media had their own profile on at least one platform.
The research found that 60% of children aged between eight and 12 were signed up to social media with their own profile. Up to half of this group had set up the profile themselves, and up to two thirds had help from a parent or guardian.
Ofcom’s report highlighted that a child holding an account with a false age is at greater risk of coming across age-inappropriate and potentially harmful content, because new permissions are granted once an adult user age is reached, either 16 or 18 depending on the platform.
Online hazards have a range of short- and long-term effects
An online hazard was defined as the ‘thing’ someone encounters online, such as potentially harmful content seen or potentially harmful contact with another user.
The report detailed a framework of “Four Cs” — content, contact, conduct and contract — which was used to categorise the harmful content in the study.
The research described children’s lives as “increasingly enmeshed with the online world”, with many finding it “very difficult” to disengage from online platforms as they rely on these spaces for activities across all aspects of their lives including friendship, connection, education and engaging with culture.
Children surveyed reportedly “struggled” to imagine what life would be like without the internet and experienced a range of harms online, with varying impact, the research discovered.
The research was based on interviews with children and parents and described a variation in the severity of negative effects of social media.
These effects ranged from:
- “Minimal transient emotional upset”, such as confusion or anger;
- Short-term behaviour change or deep emotional impact, which could manifest in physical aggression or short-term food restriction;
- Far-reaching, severe psychological and physical harm, such as social withdrawal or self-harm.
Hazards online that were described as “harmful” included, but were not limited to, stumbling across or being sent a violent or sexual video in a social media feed, being immersed in body-focused content in social media feeds, and engaging with and participating in pro-anorexia communities online.
Cumulative passive and active exposure to these hazards over time had more serious long-term effects than isolated exposure, the report stated.
Delay in Online Safety Bill putting young people ‘at risk’
Ofcom was given powers in autumn 2020 to regulate UK established video-sharing platforms (VSPs), defined as online services which allow users to upload and share videos with other people and engage with a wide range of content and social features.
Regulation of VSPs is meant to protect users from specific types of harmful material in videos like inciting violence or hatred, content constituting criminal offences relating to terrorism, child sexual abuse material, and racism and xenophobia. VSPs are also required to ensure standards around advertising are met.
In December 2020, the Government confirmed its intention to nominate Ofcom as the regulator for online safety in the UK, under the Online Safety Bill, which is currently delayed in Parliament.
This report into under-age use of social media is one of a series of studies into online safety that will inform Ofcom’s preparations for implementing the new online safety law, it said in its introduction.
Ofcom’s specific objectives for this piece of research were to “go beyond” a purely descriptive account of what children experience online and to explore the risk factors that may lead them to harm, and why, with a focus on social media, video-sharing platforms, gaming and search platforms.
The report comes after an inquest found the 2017 death of teenager Molly Russell, from Harrow, north-west London, was due in part to her viewing “extensive amounts of content”, particularly on Instagram and Pinterest, related to self-harm, suicide, anxiety and depression.
Her father, Ian Russell, said any delay to the Online Safety Bill, intended to make the UK the “safest place to go online”, “endangers” young people.
This was echoed by Jake Dubbins, co-founder of The Conscious Advertising Network when he told The Media Leader another delay to the bill was “as disappointing as it is predictable”.
Culture Secretary Michelle Donelan reportedly told Russell the Online Safety Bill would resume its progress through Parliament before Christmas, after it was paused in July when former Prime Minister Boris Johnson announced he would step down. Liz Truss has not yet announced if or when the Bill will be discussed in Parliament again.
Ofcom recruited 42 children aged between seven and 17 from all four UK nations, with a broad spread by gender, socio-economic group, family structure, ethnicity, special educational need, device and online use, to contribute to the research project. Seven of these children were “looked after children” in either foster or residential care.
The sample included children who had a range of types and frequency of experiences online; from those who had no or few negative experiences, to those who sometimes had negative experiences, and others who had frequent or significant negative experiences online.