Children in the UK will lead safer online lives now that Ofcom has finalised safety measures for sites and apps to introduce from July. Tech firms must act to prevent children from seeing harmful content and meet their duties under the Online Safety Act.
Children in the UK will have safer online lives, under transformational new protections finalised by Ofcom.
The regulator is laying down more than 40 practical measures for tech firms to meet their duties under the Online Safety Act. These will apply to sites and apps used by UK children in areas such as social media, search and gaming, and follow consultation and research involving tens of thousands of children, parents, companies and experts.
The steps include preventing minors from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography. Online services must also act to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.
In designing the Codes of Practice, researchers heard from over 27,000 children and 13,000 parents, alongside consultation feedback from industry, civil society, charities and child safety experts. Ofcom also conducted a series of consultation workshops and interviews with children from across the UK to hear their views on its proposals in a safe environment.