CYBER SAFE BLOG


PEGI is the system by which games and apps are age-rated to give parents guidance on whether they are appropriate for their children. The PEGI rating considers the age suitability of a game, not its level of difficulty. Its ratings are the suggested MINIMUM AGE at which a game should be played.


For the quickest access to information on all games (incredibly useful if your child comes home from school saying "Can I download THIS game that x or y or z has? It's amazing! Please, please, please!!!!"), you can download the free PEGI app, which allows you to make a first check on the game. You can then cross-check on www.netaware.org and www.commonsensemedia.org for further detail about the game/app.


It's useful to understand the categories before you start. The categories are:

PEGI 18 - suitable for ages 18+


The adult classification is applied when the level of violence reaches a stage where it becomes a depiction of gross violence, apparently motiveless killing, or violence towards defenceless characters. The glamorisation of illegal drug use also falls into this category. PEGI 18 games may also contain explicit sexual activity (intercourse) - check for the SEX content descriptor.






PEGI 16 - suitable for 16+


This rating is applied once the depiction of violence (or sexual activity) reaches a stage that looks the same as would be expected in real life.










PEGI 12 - suitable for 12+


This category includes violence towards fantasy characters or non-realistic violence towards human-like characters, sexual innuendo or posturing, mild bad language and/or gambling.









PEGI 7 - recommended for ages 7+


Game content with scenes or sounds that may be frightening to younger children, and very mild forms of violence (implied, non-detailed, or non-realistic).









PEGI 3 - recommended for ages 3+


Content is considered suitable for all age groups.










PEGI also uses 7 "CONTENT DESCRIPTORS" to help you further in deciding what is appropriate for your child - these are particularly helpful for identifying individual features within games which you may want your child to avoid. The descriptors are: Sex, Violence, Bad Language, Fear, Gambling, Drugs, and Discrimination. Here is the shortest summary we could make!


SEX - PEGI 12 = sexual posturing or innuendo, PEGI 16 = erotic nudity or sexual intercourse without visible genitals, PEGI 18 rating = explicit sexual activity in the game.


VIOLENCE - in PEGI 7 games, violence = non-realistic or non-detailed. PEGI 12 = violence in a fantasy environment or non-realistic violence towards humans, PEGI 16 or 18 = increasingly realistic violence.


BAD LANGUAGE - in PEGI 12 = mild swearing, PEGI 16 & 18 = swearing + sexual expletives.


FEAR - PEGI 7 = frightening pictures or sounds for young children, PEGI 12 = horrific sounds or horror effects (but without any violent content).


GAMBLING - Can only apply to PEGI 12, 16 or 18 = the game contains elements that encourage or teach gambling.


DRUGS - The game refers to or depicts the use of illegal drugs, alcohol or tobacco. Games with this content descriptor are always PEGI 16 or PEGI 18.


DISCRIMINATION - Used if the game contains depictions of ethnic, religious, nationalistic or other stereotypes likely to encourage hatred. This content is always restricted to a PEGI 18 rating (and likely to infringe national criminal laws).


Parental Control Tools

Parental control tools allow you to protect your children's privacy and online safety according to various parameters. You can select which games children are allowed to play (based on the PEGI age ratings), limit and monitor their online spending, control access to internet browsing and online interaction (chat), and set the amount of time children can spend playing. A full list of parental controls on ALL devices and how to set them is found here.


The Information Commissioner announced a new 15-point Age Appropriate Design Code yesterday that app and game developers will have to comply with from Autumn 2021.


The code will legally require developers to assess their sites for sexual abuse risks and incorporate measures to ensure that users under 16 no longer see self-harm and pro-suicide content. It will also require digital services to automatically provide children with a built-in baseline of data protection whenever they download a new app, game or visit a website.





The code sets out the standards expected of those responsible for designing, developing or providing online services like apps, connected toys, social media platforms, online games, educational websites and streaming services. It covers services likely to be accessed by children and which process their data.


The code includes that:


- Privacy settings should be set to high by default and nudge techniques should not be used to encourage children to weaken their settings.


- Location settings that allow the world to see where a child is should also be switched off automatically.


- Data collection and sharing should be minimised and profiling that can allow children to be served up targeted content should be switched off by default too.


The code says that the best interests of the child should be a primary consideration when designing and developing online services. And it gives practical guidance on data protection safeguards that ensure online services are appropriate for use by children.


Ms Denham, Information Commissioner, said:

“One in five internet users in the UK is a child, but they are using an internet that was not designed for them.

“There are laws to protect children in the real world – film ratings, car seats, age restrictions on drinking and smoking. We need our laws to protect children in the digital world too.

“In a generation from now, we will look back and find it astonishing that online services weren’t always designed with children in mind.”

The standards of the code are rooted in the General Data Protection Regulation (GDPR) and the code was introduced by the Data Protection Act 2018. The ICO submitted the code to the Secretary of State in November and it must complete a statutory process before it is laid in Parliament for approval. After that, organisations will have 12 months to update their practices before the code comes into full effect. The ICO expects this will be by autumn 2021.


Since 25 May 2018, the ICO has had the power to impose a civil monetary penalty (CMP) on a data controller of up to £17 million (€20 million) or 4% of global turnover. Once the code is in place, these will be the punitive measures attached to any breaches.





Updated: Jan 21

The Victoria Derbyshire programme today reported on the issue of Social Media Influencers being offered thousands of pounds regularly for sex. One influencer said social media had become "a catalogue for men to select their next conquest".


It's important for parents and all professionals working with children to understand that the use of social media in this way is continuing to grow, and that it is hugely tempting for young people to reply to messages when they are offered considerable amounts of money. The programme featured Tyne-Lexy Clarkson, who was only 19 when she was first offered £20,000 for dinner and drinks.


"It's high-end prostitution - it's just scary to think if they've messaged me, they've probably sent it to thousands of pretty girls on Instagram,"


After starring in series two of Love Island, an agency emailed, offering her £50,000 for five nights in Dubai. It contained a non-disclosure agreement, stating that the details of what she would be required to do would remain confidential.


Tyne-Lexy says she refused the offer, but fears that struggling influencers who do not receive luxury items for free would feel pressure to "keep up appearances" and become vulnerable to these kinds of transactions.


"It's a lot of money for some people, it's life-changing amounts of money."

You can read about it in full here.


The problem affects children and young people in Scotland too: the same approaches are made to them, but with smaller sums of money, merchandise or free items on offer. It is important that this is fully understood.


What can you do?

- Understand that on Instagram and Snapchat, even if a child or young person's profile is set to private, anyone can still send them messages. They then have a choice whether to read a message and, having read it, whether to accept the sender as a friend


- make sure you teach your child the law specific to this area - it is an offence under Scottish law for anyone to message a child online with a view to meeting them for sexual activity/a sexual act or the taking of intimate photos:

"Meeting a child following certain preliminary contact - Protection of Children and Prevention of Sexual Offences (Scotland) Act 2005" here.


- encourage parents (even when under pressure) and children to stick to the age limits for social media (not to set up accounts on TikTok, Instagram or Snapchat before the child is 13), unless the parent is managing the account themselves on their own phone/device


- ensure that children are not recognisable as children from their profile pictures on their accounts (whether by using avatars or other photos)


- be aware of the specific risk this presents for your child and their friends - ask them regularly to tell you if they ever receive a message offering them anything (money, merchandise, free drinks etc)










© 2020 CyberSafe Scotland