Ofcom, the UK’s communications regulator, has released its first guidance for technology companies since stricter online safety laws came into force last month. Under the draft codes, higher-risk services should not present children with lists of suggested friends, and children’s location details should not be visible to other users. To tackle illegal images of minors, the regulator proposes hash matching: comparing the digital fingerprint (hash) of an uploaded image against a database of known harmful content and flagging any matches. Hash matching is already used by the largest social media companies, but it cannot be applied to end-to-end encrypted messaging services, where the provider never sees message content. The regulator is also considering keyword detection to combat harmful online content.
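As an illustrative sketch only: Ofcom’s guidance does not prescribe an implementation, and production systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding rather than exact digests. The snippet below uses a plain SHA-256 digest, an invented hash database, and an invented keyword list purely to show the shape of the two techniques described above.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal images.
# Real deployments use perceptual hashes (e.g. PhotoDNA) that also
# match visually similar images, not only byte-identical files.
KNOWN_HARMFUL_HASHES = {
    "placeholder_hash_value_1",
    "placeholder_hash_value_2",
}

# Hypothetical keyword list for the keyword-detection approach.
HARMFUL_KEYWORDS = {"example_banned_term"}


def fingerprint(image_bytes: bytes) -> str:
    """Compute the image's digital fingerprint (here, SHA-256)."""
    return hashlib.sha256(image_bytes).hexdigest()


def flag_image(image_bytes: bytes) -> bool:
    """Hash matching: flag an upload whose fingerprint appears in
    the database of known harmful content."""
    return fingerprint(image_bytes) in KNOWN_HARMFUL_HASHES


def flag_text(message: str) -> bool:
    """Keyword detection: flag text containing any listed term."""
    return any(word in HARMFUL_KEYWORDS for word in message.lower().split())
```

The sketch also makes the encryption limitation concrete: with end-to-end encryption the service never sees the decrypted image bytes, so a server-side check like flag_image has nothing to hash.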

Dame Melanie Dawes, chief executive of Ofcom, stated that the regulator’s objective is to create a safer online environment for young people. She also cited research showing that most secondary school children have been contacted online in a way that made them feel uncomfortable, with many experiencing such contact repeatedly.

Companies operating under the draft codes will be required to have dedicated teams and an accountable individual responsible for online safety. They will also need to run safety tests when altering recommendation algorithms and provide easy-to-use reporting and blocking tools. The Online Safety Act grants Ofcom the power to impose fines of up to £18m or 10% of an offending company’s global revenue, whichever is greater.
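For a sense of scale, that ceiling is simply the greater of the two figures. A minimal sketch (the revenue value below is invented):

```python
def max_fine_gbp(global_revenue_gbp: float) -> float:
    """Ceiling on a fine under the Online Safety Act: the greater
    of £18m or 10% of the company's global annual revenue."""
    return max(18_000_000.0, 0.10 * global_revenue_gbp)


# A company with £1bn in global revenue faces a ceiling of £100m,
# since 10% of revenue exceeds the £18m floor.
print(max_fine_gbp(1_000_000_000))  # 100000000.0
```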

Ofcom is expected to release final versions of the guidance in autumn 2024, following parliamentary approval. The regulator has stated that it will release further guidance on adult sites later this year and will consult in the spring on methods to restrict the promotion of harmful content, such as material encouraging self-harm.