Nexus Response
Ofcom Protecting Children from Harms Online Consultation
#BreakTheCycle
Nexus Consultation Response:
Ofcom Protecting Children from Harms Online
Submitted July 2024
As part of the passage and implementation of the Online Safety Act (2023), Ofcom was named as the independent regulator of online safety. In this role, Ofcom must set out the guidance, framework, and steps that service providers (i.e. technology companies hosting content which may be accessible to children, including social media, search, and pornography services, as well as apps, programs, connected toys and devices, streaming services, online games, news or educational websites, and websites offering goods or services to users over the internet) must take to fulfil their legal duties as set out in the Online Safety Act. Ofcom will also possess enforcement powers to ensure providers’ compliance with the agreed safety frameworks and codes of practice.
The comments, suggestions, and questions in our response are based on our expertise and experience supporting children, young people, and their families and carers through therapeutic interventions across Northern Ireland. You can view snippets of each of our responses below.
If you have any questions or comments about this consultation response, please contact communications@nexusni.org.
This consultation focuses on Ofcom’s proposals for how internet services that enable the sharing of user-generated content (‘user-to-user services’) and search services should approach their new duties relating to content that is harmful to children. Definitions and interpretations of terms such as ‘user-generated content’ and ‘user-to-user services’ are set out in Ofcom’s consultation documents.
As providers of Relationship and Sexuality Education as well as a therapeutic intervention service for young people impacted by sexual abuse and abusive relationships, Nexus welcomes this robust, well-evidenced volume of proposals from Ofcom to strengthen protections and accountability measures for online services. As society continues to spend large quantities of time online or on social media, we strongly believe it is the responsibility of service providers, government, and strategic stakeholders to protect our children and young people from harms online.
Volume 2: Identifying the Services that Children are Using
Volume 2 sets out Ofcom’s approach to the Draft Children’s Access Assessment Guidance, including age assurance technologies, the meaning of “significant number of children” and how Ofcom proposes services assess whether they are likely to attract a significant number of children.
Nexus agrees with the proposals in Volume 2. We believe it is vital that age assurance technology is accurate in determining the age of a user, is bespoke to the service type, and is continually tested for accuracy and reliability. Research shows that children are accessing content that can be classified as a risk of harm to their wellbeing but is not necessarily advertised to a child audience. We are encouraged to see that the proposed process for children’s access assessments is rigorous and intentionally wide-ranging to capture all aspects of service provision, business practices, and child user activity.
Volume 3: The Causes and Impacts of Harms to Children
Volume 3 presents Ofcom’s draft Children’s Register of Risks as part of Ofcom’s duty to assess the risk of harm to children from content online. The draft Guidance on Content Harmful to Children complements the Register of Risks, providing examples of what Ofcom considers to be, or not to be, content harmful to children.
On Ofcom’s assessment of the causes and impacts of online harms, we would like to see a fuller explanation of the kinds of content harmful to children included in the guidance, to support service providers in identifying harmful content. We agree with Ofcom’s analysis of risk factors and content harmful to children, in particular relating to pornographic content, abuse and hate content, and violent content. We agree that the use of anonymous profiles, private versus open profiles, direct messaging, and tagging can increase the risk of children being exposed to harmful content. We would like to highlight the growing use of “sextortion”, which involves “the threat of sharing images or videos – often ‘nudes’ or sexually explicit content – to extort money or force someone to do something against their will”. Through our work with children and young people, we have also come across a concerning trend of “Live” features on TikTok, Instagram, and Facebook being used to target children and young people with sexual content. We also included links in our full response to research on the risks of Generative AI to children, including the production and distribution of Child Sexual Abuse Material.
Nexus agrees with the proposed approach to the draft Guidance on Content Harmful to Children. We welcome Ofcom’s guidance on how content can be highly subjective and context specific, meaning that different kinds of harms can vary in nature due to the presentation of the content and the specific nature of the user and poster. We also welcome Ofcom’s efforts to differentiate between content harmful to children and recovery content, which can be beneficial for children and other users who are on or beginning a recovery journey.
We agree with the proposal to include codewords, hashtags, substitute terms/phrases, sounds, pornographic GIFs, sexualised emojis, and comments as elements for services to consider as content that poses a risk of harm to children. We would appreciate some clarification on when the use of language strongly associated with sexual activity and pornography is considered to be pornographic content. Our engagement with young people has shown that they rarely use explicit, sexualised language and are more likely to use acronyms, emojis, and other codewords to convey sexualised content.
Volume 4: Assessing the Risks of Harm to Children Online
In this volume, Ofcom explain their proposals about the governance measures service providers should put in place to manage risk to children and how service providers should go about assessing the risk of harm to children encountering harmful content online.
We agree with the proposed governance measures. There is a balance of accountability between governance, senior leadership accountability, internal monitoring, and staff policy implementation, creating a holistic approach to governance and accountability. We would like to reiterate that assessing risk is often complex and nuanced, and needs to be supported by other forms of specific training on Child Sexual Exploitation, Child Sexual Abuse, Safeguarding, Child Protection, etc.
We believe that the proposals in relation to the Children’s Risk Profiles are comprehensive and informative. It is vital that Ofcom creates guidance and assessments that remove any guesswork on the part of services and directly highlight the harms that children are at risk of coming into contact with, thereby ensuring that services completing their Risk Assessments do not exclude themselves from scope. We appreciate the emphasis on the detrimental effects of poor governance and accountability on children and vulnerable people, especially if there is a disjointed approach across different codes.
Volume 5: What Should Services Do to Mitigate the Risks of Online Harms to Children?
Volume 5 outlines draft measures that Ofcom proposes providers of services likely to be accessed by children could take to comply with their child safety duties in the Online Safety Act. These are set out in the draft Children’s Safety Codes in Annexes 7 and 8, which will be finalised following consultation, and include age assurance measures, content moderation, search moderation, user reporting and complaints, terms of service, recommender systems, user support, impact assessments, and the statutory tests.
We agree with the measures proposed in the Children’s Safety Codes, with some recommendations to add content types relating to sexual abuse and abusive relationships. We also recognise the need for a tiered approach to measures for different services according to risk, size of service, functionalities, and type of service. However, we want to emphasise the need to employ expertise to continually scope for software that might be able to bypass age assurance technology, as this landscape moves quickly. It is vital that the age assurance technology is accurate in determining the age of a user.
We agree with the proposals on content moderation for user-to-user services. Ofcom’s recommendations include cost-effective, proportionate measures for all services, with extra requirements for large and multi-risk services, where there is a greater need to protect service users. We also welcome Ofcom’s commitment to an additional consultation later this year on automated content moderation and detection tools, as we see a growing trend in online technology turning to automated features, AI content and recommendation functions, and automated customer support. We agree with the proposals around search moderation; however, we would like to emphasise the need for services to employ rigorous testing and scoping for any technologies that can enable bypassing or circumventing search moderation and age assurance technologies.
We agree with the proposed measures for user reporting and complaints. In particular, we agree that services should create accessible, easy to understand, and transparent complaint and reporting systems that will appeal to children and vulnerable service users, thereby increasing the likelihood of harmful content and user profiles being reported. We also agree with the proposed Terms of Service and Publicly Available Statements measures; we believe that the language, presentation, and length of the document are key considerations for engaging children, young people, and vulnerable users in the Terms of Service. For example, we have found audio and visual media, as opposed to text, to be more accessible for young people.
We agree with the proposed risk mitigation measures for recommender systems (which filter information through machine-learning algorithms). Recommender systems can create a constant stream of harmful content for child users, making triggering content harder to avoid. Children are at risk of coming across content that they may not wish to see, but because they have engaged with it, they will continue to be recommended similar content.
We agree with proposals around user support and see the importance of these proposals being rolled out to all users to protect them from harmful content. We agree with the proposals around search features, functionalities, and user support, however, recommend adding crisis prevention information relating to sexual abuse and abusive relationships.
Conclusion:
We welcome the comprehensive, equitable, and supportive work that Ofcom have conducted with their draft publications for Protecting Children from Harms Online. Overall, the proposals are detailed, extensive, and accessible for service providers. It is clear that the risk of harm to children needs to account for and recognise each kind of harmful content, the likelihood of a child interacting with the harmful content on the service, the effectiveness of existing safety measures, the impact of the content on children (both direct and indirect), and the severity and reach of the content. It is important that the implementation of the Online Safety Act captures both legal and illegal content, and in order to effectively protect child users, we agree that there need to be two separate Codes that work together to protect children.
We would like to see the growing use of “sextortion” reflected in Ofcom’s risk assessment guidance, as well as clarification on the Guidance for Pornographic Content, specifically the use of language, emojis, GIFs, and alternative forms of sexualised language that are not text-based. We would also like to see the Children’s Safety Codes and the Measures for Search features expanded to include content related specifically to issues around sexual abuse and abusive relationships.
We would recommend that Ofcom highlight to services the need for additional training in Child Sexual Exploitation, Child Protection and Safeguarding, and Child Sexual Abuse. Ensuring that staff and technologies are trained and kept up to date with any new advancements is imperative to protecting children online.
Finally, we want to reiterate our recommendation that terms of service and complaints procedures be presented in accessible, clear, and alternative formats, so that they reach more children and young people who might need support online and ensure that users are fully informed before consenting to use a service.