
Respected and Safe
Key things we have learned from Respected and Safe P7s in 2024-25
Through Respected and Safe we continually learn more from all of the children about the kinds of harm they are experiencing in different spaces online, as well as how different safety settings and changes to design impact their experiences.
This is contextual safeguarding in action - working in partnership with schools to build the picture around each child and community in all parts of their lives (including digital).
We can only keep children safe together by sharing information about these spaces as we learn.
Here are three examples of things we have learned from P7s on Respected and Safe this year (2024-25), which can inform wider strategy on protecting children online. We will share more examples at staff training later this year.
Online Misogyny - so much more than Andrew Tate
Adults often focus on the most prominent influencers (think Andrew Tate...) when thinking about online misogyny, but the P7s describe a much more nuanced picture of the many - perhaps less obvious - ways that harmful messages appear in their online spaces.
The patterns range from algorithmic loops - particularly on YouTube, TikTok and Snapchat Spotlight, where the apps frequently recommend content that treats banter about, and the meme-ification of, violence against women and girls as standard, and push it to accounts because it generates high engagement - to adverts for apps that let users AI-generate images of others without their consent (for example, AI kissing apps, promoted mainly to the boys' accounts). These patterns are complex but important to be aware of, and we will explore them in much greater depth at training.
Reporting harm in the apps is ineffective
We have spent a lot of time understanding children's experiences of responding to and reporting harm on different platforms, and this has highlighted the obstacles and limitations they face in the reporting and moderation mechanisms on major platforms and games.
On Roblox alone, children tell us:
- Moderation on chat functions does not effectively censor inappropriate and harmful language - partly because users find ways around the filters, such as 'hashing' blocked words.
- They often report harm on the platform and see the perpetrator banned only temporarily, returning to the game after a few days.
- When they try to respond to harm by using their voice - for example, by asking someone in a chat to 'stop' - they often face bans on their own accounts.
We can counter these failings in the products by recognising them, and by continually increasing and promoting opportunities for children to talk about any harmful online experiences during their time in school.
Safety Settings
The experiences of P6s and P7s on Respected and Safe also inform questions and learning on the effectiveness of safety settings designed for children on major platforms.
For example, many children under 13 do not benefit from the automatic safety settings platforms have built in for 'teen' accounts (under 18), because when they sign up they often enter a date of birth that makes them appear much older than 18 - typically by choosing a year that is easy to remember, like 2000 or 1990.
As a result, if they are to access any safety settings at all without losing their (often carefully developed) accounts, they have to be taught to opt in to those settings. This is usually possible but rarely happens - either because they don't know how to change the settings or because they don't want the settings to limit the connections that are already central to social norms around the space.
As an example, Snapchat has a feature called ‘QuickAdd’ (or ‘Find Friends’), which allows users to expand their network of friends by making new friend suggestions. P7s in the project this year highlighted to us how this feature often puts children, and especially girls, at higher risk of receiving unwanted contact on the app.
This feature should be disabled on a 'teen' account if the date of birth has been set correctly. While it is very important that we encourage parents and carers not to allow their children to set up accounts before the age of 13, it is also critical we teach them that if children are going to set up accounts early, they should do so with a date of birth as close to their real date of birth as possible.
Alongside this, we also need to work to safeguard learners who already have accounts, and have done for some time, without judgment - as this only prevents them feeling safe to tell us about harm that has happened and is continuing to happen to them on these platforms.
We found that:
- Most of our P7s didn't know how to turn QuickAdd off, but after we supported them to change this setting, many girls told us it had really reduced the amount of unwanted contact they were dealing with.
- Some P7s were reluctant to turn QuickAdd off because it was an important part of how they made new friend connections with their peers, but we were still able to empower them with knowledge about the feature and encourage conversations about their agency to turn it off at certain times.
It is so important to bring insights like these from children into conversations about design and about changes aimed at increasing their safety. Through this programme, children in Aberdeen continually allow and encourage us to interrogate these changes by representing their experiences and raising their voices to influence national conversations.
An important part of the Respected and Safe lesson programme this year has been to encourage P7s to recognise ways that online platforms they use are designed, and how this impacts them. Combining this learning with the creation of a space for children to tell us about their experiences has enabled us to gain deeper insight into how different design features (and changes to them) actually impact norms, behaviour and risks in the online space as the children see it.
As teachers in our partner schools, you are an important part of supporting this work, and we are thankful for it. We would love you to contact us at any time with the experiences of the children you teach: email katrina.murray@cybersafescotland.org.