Online Safety
Online safety continues to be a critical component of safeguarding practice, recognising the evolving risks that children and young people face in digital environments. The Online Safety Act 2023 strengthens the UK's legislative framework by placing new duties on online platforms and introducing a range of criminal offences designed to protect children from harm, including false communications, threatening communications, cyberflashing, encouragement of self-harm and updated intimate image offences.
Amendments introduced in 2024-25 further clarified the law in relation to sexually coerced extortion ('sextortion') and strengthened protections against AI-generated child sexual abuse material. Together, these measures reflect the rapidly changing nature of online threats and reinforce the responsibility of agencies and professionals to identify, prevent and respond to online abuse as part of their core safeguarding responsibilities.
Social networking sites can be exploited by perpetrators as a convenient way to access children and young people for sexual abuse. The Serious Crime Act 2015 introduced an offence of sexual communication with a child. This applies where an adult communicates with a child in a sexual manner, or intends to elicit a sexual response from the child, and does not reasonably believe the child to be 16 or over. The Act also amended the Sexual Offences Act 2003, making it an offence for an adult to arrange to meet someone under 16 after communicating with them.
'Internet abuse' relates to four primary areas of abuse of children:
- Abusive images of children (although these are not confined to the Internet);
- A child or young person being groomed for the purpose of Sexual Abuse;
- Exposure to pornographic images and other offensive material via the Internet; and
- The use of the internet, and in particular social media sites, to engage children in extremist ideologies.
The 4 Cs of online safety
Online safety risks are broken into four areas: content, contact, conduct and commerce (sometimes referred to as contract). These are known as the 4 Cs of online safety.
Content
Content is anything posted online - it might be words or it could be images and video. Children and young people may see illegal, inappropriate or harmful content when online. This includes things like pornography, fake news, racism, misogyny, self-harm, suicide, anti-Semitism, radicalisation and extremism.
Contact
Contact is about the risk of harm young people may face when interacting with other users online. This includes things like peer-to-peer pressure or seeing inappropriate commercial advertising. Sometimes adults pose as children or young adults with the intention of grooming or exploiting a child or young person for sexual, criminal, financial or other purposes.
Conduct
Conduct means the way people behave online. Some online behaviour can increase the likelihood, or even cause, harm - for example, online bullying. Conduct also includes things like sharing or receiving nudes and semi-nude images and viewing or sending pornography.
Commerce
Commerce is about the risk from things like online gambling, inappropriate advertising, phishing or financial scams. Children and young people may be exposed to these risks directly.
Cybercrime refers to criminal activities conducted using electronic devices, the internet, and various forms of information and communication technology.
The rise in the use of these systems has facilitated various types of crime, including economic cybercrime, organised crime, malicious and offensive communications, cyberstalking and harassment, and cyberterrorism. The primary goal of the perpetrators of these offences is to achieve financial gain by deceiving individuals. They rely on deliberate deception and often exploit people's basic needs and desires to manipulate and trick their victims.
The term 'digital and interactive technology' includes a range of electronic tools used across devices such as mobile phones, laptops, computers, tablets, webcams, cameras and games consoles.
Social networking sites are frequently used by perpetrators as an easy way to contact children and young people for sexual abuse. Radical or extremist groups may also use social networking to draw children into rigid ideologies. These processes have parallels with grooming and exploit similar vulnerabilities. The groups concerned include those linked to extreme Islamist or Far Right/Neo-Nazi ideologies, Irish Republican and Loyalist paramilitary groups, extremist animal rights groups and others who justify political, religious, sexist or racist violence.
Online abuse may also include cyberbullying or online bullying (see Bullying). This is when a person is tormented, threatened, harassed, humiliated, embarrassed or otherwise targeted using the internet and/or mobile devices. It is essentially behaviour between children, although it is possible for one victim to be bullied by many perpetrators. In severe cases, behaviour by one child against another may constitute child-on-child abuse.
As well as bullying, the online space may be used to spread hatred through misinformation or disinformation.
Misinformation, or 'fake news', is online content that misleads or provides false information on a particular topic. Stories are often fabricated to cause panic or concern, and rely heavily on users' ability to determine critically what is trustworthy.
Disinformation aims to create a false perception of a person or group, sometimes to support a hateful ideology or conspiracy theory.
Misinformation on Social Media - Guidance, Impact and Support (SWGfL).
The sharing of 'nudes', sometimes referred to as sexting, describes the use of technology by children under the age of 18 to create indecent images or videos of a sexual nature. The content can vary, from text messages to images of partial nudity to sexual images or video. These images are then shared between young people and/or adults, including with people they may not even know. Young people are not always aware that their actions are illegal, and the increasing use of smartphones has made the practice much more commonplace.
Online Safety is a term that refers to raising awareness about how children, young people and adults can protect themselves when using digital technology, and includes examples of interventions that can reduce risk. Schools deliver lessons on how to navigate the online space through PSHE lessons as part of the curriculum offer.
The chapters relating to Organised and Complex Abuse and Allegations Against Staff or Volunteers should be borne in mind depending on the circumstances of the concerns.
The Online Safety Act 2023 introduced new criminal offences including:
- Sending a message containing information known to be false with the intention of causing non-trivial psychological or physical harm;
- Sending a message with a threat of death, serious injury, rape or serious financial loss where the sender intends the recipient to fear that threat will be carried out (or is reckless as to whether the recipient has such fear);
- Sending or showing an electronic communication with flashing images with the intention of causing harm to a person with epilepsy. This offence, known as 'Zach's Law', protects people with epilepsy and allows the prosecution of trolls who post flashing images with intent to trigger a seizure. See: #ZachsLaw: world leading legislation (Epilepsy Society) for more information;
- Communicating, publishing or showing material intended or capable of encouraging or assisting serious self-harm, even if the sender cannot identify the recipients and even if the self-harm does not occur;
- Intentionally sending or giving images of another person's genitals with the intention to cause the recipient alarm, distress or humiliation, or for the purposes of sexual gratification;
- Intentionally sharing or threatening to share intimate images without consent (replacing previous 'revenge porn' offences);
- Sexually coerced extortion or 'sextortion' is a type of blackmail in which an offender uses intimate, naked or sexual images or videos of a child or young person to force them to pay money or engage in unwanted actions. Offenders often meet children via dating apps, social media, livestreaming sites or platforms linked to pornography. They might pretend to be someone else online and befriend the child, then later threaten to share pictures or videos with the child's family and/or friends.
Help if you're worried about 'sextortion' or online blackmail - Internet Watch Foundation
Sextortion - UK Safer Internet Centre
The National Crime Agency's CEOP Education has issued an alert to education settings across the UK in response to this threat: FMSE Alert - CEOP Education. The alert will help professionals to:
- Recognise and understand financially motivated sexual extortion;
- Raise awareness and help-seeking behaviours amongst children and young people;
- Give suitable messaging and support to parents and carers; and
- Support victims of financially motivated sexual extortion.
Research indicates that individuals who possess indecent images of children may escalate to direct abuse. During assessment and investigation, the individual's access to children should be established, including within the family, in employment contexts, in voluntary work with children or in other positions of trust, to consider whether they may be actively involved in the abuse of children.
Any indecent, obscene image involving a child has, by its very nature, involved a person who, in creating that image, has been party to abusing that child.
Exposure to extremist content may normalise harmful ideologies. Children may be drawn to adopt a radical ideology through a failure to appreciate the bias in extremist material, and repeated viewing of extreme content may diminish their ability to recognise bias or danger and lead them to view it as normal.
There is a correlation between online risk and real-life vulnerability. Care-experienced children and those with communication difficulties, mental health issues, eating disorders or other vulnerabilities are at heightened risk. Such young people may use technology to communicate and socialise in ways they cannot achieve without it, so denying online access may itself increase risk through loss of social opportunity and resilience building.
Safe internet use
The internet plays a crucial role in modern life, and it is essential for children in residential care and at home to learn how to navigate it safely and responsibly. Effective guidance provided in the home environment can significantly enhance their understanding of online safety.
Children and young people should receive support from home staff to use the internet and social media safely. They must understand the importance of not sharing personal information, such as their name, address, school or mobile phone number, with anyone they do not know or trust. Conversations should also address the necessary precautions they must consider if they plan to meet someone in person whom they have only interacted with online.
Establishing clear internet rules within the home can set boundaries and define expectations for children and young people while they are at home. See: Parents and Carers - UK Safer Internet Centre for more details.
Moreover, computers and web-enabled devices owned by the family/caregiver should have adequate protections in place, including access controls and site restrictions. Firewalls and other safety filters must be installed, regularly monitored, and maintained.
Online abuse often becomes known through accidental detection of images or messages. Discovery may be unexpected and distressing for families or professionals. This in itself can make accepting the fact of the abuse difficult for those who know and may have trusted that individual. Partners, colleagues and friends often find it difficult to believe and may require support.
Children may show behavioural or mood changes, secretiveness around devices, changes in friendships, anxiety when receiving messages, or reluctance to be with certain individuals. Clearly such changes can also be attributed to many innocent events in a child's life and cannot be regarded as diagnostic. However, changes to a child's circle of friends or a noticeable change in attitude towards the use of a computer or phone could have their origin in abusive behaviour.
Increased time online, being socially isolated and being secretive can be an indicator of online abuse.
Children often show rather than tell that something is upsetting them. There may be many reasons for changes in their behaviour, but if we notice a combination of worrying signs it may be time to call for help or advice.
Suspected or actual evidence of indecent images of children must be referred to the Police and Children's Social Care in line with the Referrals Procedure.
Concerns about grooming, exposure to pornography or inappropriate online contact should also be referred.
The Serious Crime Act 2015 introduced an offence of 'sexual communication with a child', applying where an adult communicates sexually with a child, or attempts to elicit a sexual communication, while believing the child to be under 16. The Act also amended the Sexual Offences Act 2003, so it is now an offence for an adult to arrange to meet someone under 16 having communicated with them on just one occasion (previously at least two occasions were required).
Due to the risk of evidence destruction, professionals should consult Police and Children's Social Care before informing families. This will enable a joint decision to be made about informing the family and ensuring that the child's welfare is safeguarded.
All such reports should be taken seriously. Most referrals will warrant a Strategy Discussion to determine the course of further investigation, enquiry and assessment. Any intervention should be continually under review especially if further evidence becomes known.
See Sharing nudes and semi-nudes advice for education settings working with children and young people, for recent government guidance and Sexting: how to respond to an incident - an overview for all teaching and non-teaching staff in schools and colleges.
Where there are concerns in relation to a child's exposure to extremist materials, the child's school may be able to provide advice and support: all schools are required to identify a Prevent Single Point of Contact (SPOC) who is the lead for safeguarding in relation to protecting individuals from radicalisation and involvement in terrorism.
Suspected online terrorist material can be reported through www.gov.uk/report-terrorism. Content of concern can also be reported directly to social media platforms - see Safety features on Social Networks.
Children often speak more openly online than they would in person. Both adults and young people may use the internet to harm children. Some do this by looking at, taking and/or distributing photographs and video images on the internet of children naked, in sexual poses and/or being sexually abused.
Children and young people should be supported to understand that when they use digital technology they should not give out personal information, particularly their name, address or school, mobile phone numbers to anyone they do not know or trust: this particularly includes social networking and online gaming sites.
The Digital Passport is aimed specifically at children in care, but may be a useful resource that can be adapted for any vulnerable child.
If they have been asked for such information, they should always check with their parent or other trusted adult before providing such details. It is also important that they understand why they must take a parent or trusted adult with them if they meet someone face to face whom they have only previously met on-line.
Children and young people should be warned about the risks of taking sexually explicit pictures of themselves and sharing them on the internet or by text. It is essential that young people understand the legal implications and the risks they are taking. The initial risk posed by the sharing of nudes/sexting may come from peers and others in their social network who may share the images; once shared, images may be redistributed beyond their control.
Where young people are voluntarily sending/sharing sexual images or content with one another the police may use the recently introduced 'outcome 21' recording code. This allows the Police to record a crime as having happened but for no formal criminal justice action to be taken. Crimes recorded this way are unlikely to appear on future records or checks, unless the young person has been involved in other similar activities which may indicate they are at risk.
The discretion about whether to disclose non-conviction information rests with each Chief Constable managing the process.
In some cases adults may also groom a young person into sending such images, which can then be used to blackmail and ensnare them – see Child Sexual Exploitation.
Artificially generated child sexual abuse material (CSAM) includes images produced wholly or partly using artificial intelligence (AI). AI images are usually produced using software which converts a text description into an image. This technology is developing rapidly, and the images created can be very realistic, with recent examples difficult to differentiate from unaltered photographs. Many popular, publicly available AI tools automatically block attempts to create abusive material, but the large number of child sexual abuse images made using them that have been detected shows that offenders have found ways around these controls, using and manipulating such tools to produce images (and, increasingly, videos) depicting child sexual abuse.
See also How AI is being abused to create child sexual abuse material (CSAM) online (iwf.org.uk).
This term describes children under 18 sending or posting nude or semi-nude images, videos or livestreams online or via device-to-device transfer. Sharing can happen publicly online, in 1:1 messaging or via group chats and closed social media accounts, and the images, videos or livestreams may include more than one child or young person. The term 'nudes' is used as it is most commonly recognised by young people and most appropriately covers all types of image sharing incidents. Alternative terms used by children and young people may include 'dick pics' or 'pics'. Many professionals may refer to 'nudes and semi-nudes' as:
- Youth produced sexual imagery or 'youth involved' sexual imagery;
- Indecent imagery. This is the legal term used to define nude or semi-nude images and videos of children and young people under the age of 18;
- 'Sexting'. Many adults may use this term, however some young people interpret sexting as 'writing and sharing explicit messages with people they know' rather than sharing images;
- Image-based sexual abuse. This term may be used when referring to the nonconsensual sharing of nudes and semi-nudes.
Terms such as 'revenge porn' and 'upskirting' are also used to refer to specific incidents of nudes and semi-nudes being shared. However, these terms are more often used in the context of adult-to-adult non-consensual image sharing offences outlined in s.33-35 of the Criminal Justice and Courts Act 2015, Voyeurism (Offences) Act 2019 and s.67A of the Sexual Offences Act 2003. Terms such as 'deep fakes' and 'deep nudes' may also be used by adults and young people to refer to digitally manipulated and AI-generated nudes and semi-nudes.
See Sharing nudes and semi-nudes: advice for education settings working with children and young people.
Cyberflashing
Cyberflashing is when someone sends a photo of genitals to another person without their permission; it is sometimes called sending 'dick pics' or 'flashing', and it is illegal. It may be done to cause the recipient alarm, distress or humiliation, to embarrass them, or for the sender's sexual gratification. It does not matter whether the photo shows the person's real genitals or has been computer-generated to look like genitals.
This is different from someone deliberately exposing their genitals in person to frighten or upset another, which is known as indecent exposure.
The Professionals Online Safety Helpline is a free service for professionals and volunteers working with children and young people. It provides signposting, advice and mediation to resolve online safety issues staff face about themselves, such as protecting professional identity and online harassment, or problems affecting young people, for example cyber-bullying or sexting issues.
Professionals Online Safety Helpline - UK Safer Internet Centre
Coram Children's Legal Centre - LawStuff is run by Coram Children's Legal Centre and gives free legal information to young people on a range of different issues. See Children's rights in the digital world in particular.
Child Safety Online - A Practical Guide for Parents and Carers whose Children are Using Social Media.
Misinformation and Fake News - Stop Hate UK
Teaching online safety in schools (DfE)
Sharing nudes and semi-nudes: advice for education settings working with children and young people
Talking to Your Child About Online Sexual Harassment (Children's Commissioner)
NSPCC Report Remove Tool - The tool enables young people under the age of 18 to report a nude image or video of themselves which has appeared online. The Internet Watch Foundation will review these reports and work to remove any content which breaks the law.
UK Council for Internet Safety (UKCIS) Digital Passport - a communication tool to support children and young people with care experience to talk with their carers about their online lives.
Social Media as a Catalyst and Trigger for Youth Violence (Catch 22)
Behaviour that is illegal if committed offline is also illegal if committed online. A number of pieces of legislation may apply, and it is recommended that legal advice is sought in the event of an online issue or situation.
Keeping Children Safe in Education
Protecting children from harmful sexual behaviour - NSPCC Learning
Internet Matters - advice for professionals, parents and young people on a wide range of digital safety issues including the digital passport.
Refuge and Risk: Life Online for Vulnerable Young People - research into the risks and dangers for vulnerable young people online. The report discusses the types of risk they encounter, which are exacerbated by their vulnerabilities.
Child sexual exploitation by organised networks Investigation Report
Child Exploitation Disruption Toolkit
Last Updated: March 20, 2026
v71