Online Harm – Social Media, the Age Appropriate Design Code and Protection of Children Online

Sadly, online harm is becoming increasingly prevalent. We have had a spike in enquiries from clients asking how to deal with trolls, bullies and other harmful material that is available to, or pushed at, their children. The first step will always have to come from the government, and measures to address this have been on the table for some time.

Theresa May’s government published its own Online Harms White Paper with the aim of tackling this by applying to companies that allow users to discover and share content, and to interact with each other, on their platforms.

This of course includes social media platforms and other public discussion forums. The White Paper sought to better regulate these sites and also proposed an independent regulator to monitor them and ensure the duty of care is being fulfilled.

The White Paper also put forward better transparency rules and has a key focus on the protection of children. Online use by minors is prevalent and ever growing, and there has been a rise in online bullying and, sadly, teenage suicide. In response, the Information Commissioner's Office (ICO) published an Age Appropriate Design Code to tackle this.

The code applies to providers of online information services that are likely to be accessed by under-18s in the UK. It is linked to the Data Protection Act 2018 and so carries statutory force. The purpose of the code is to ensure that these service providers conform to the following standards and measures when providing services to minors:

  1. The services available to the minor must show that they have the child's best interests as their primary consideration, and this must take precedence over the commercial interests of the provider.
  2. There must be a Data Protection Impact Assessment undertaken prior to the services being made available to minors.
  3. The services being accessed by the minor must be age appropriate.
  4. Various data protection and data sharing safeguards must be put in place. This may include collecting only the minimum amount of data necessary.
  5. The code also provides that, if parental controls are in place on the platform, the child must be able to see clearly when these are being used.

It will be important for service providers to have policies and procedures in place that allow parents to see what safeguards each platform offers before deciding whether to allow their children access. The ICO is responsible for enforcing the code and, positively, it has the power to enforce it in the same way as GDPR requirements, and the code can be relied upon in court proceedings.

The point of all the above is to provide some reassurance to parents and teenagers who are concerned about online harms and wish to limit their exposure to them. If the concerns relate more to trolls and bullies, the matter can be raised with the platform provider so that the content is blocked, and a legal representative may be able to send a letter threatening action if the harassment or defamation (where the attack is public) persists.
