New UK ‘Duty of Care’ Rules To Apply To Social Media Companies
The new ‘Online Harms’ white paper marks a world first: the UK government plans to introduce regulation holding social media and other tech companies to account for the content they host, backed by the policing power of an independent regulator and the threat of fines or a ban.
Duty of Care
The proposed new legal framework from the Department for Digital, Culture, Media and Sport (DCMS) and the Home Office would place social media and tech companies under a duty of care to protect users from harmful content, including cyberbullying, terrorism, disinformation, child sexual exploitation and the encouragement of damaging behaviours.
The need for such regulation has been recognised for some time and was brought into sharper focus recently by the death in the UK of 14-year-old Molly Russell, who was reported to have viewed online material about depression and suicide. In March this year, the live streaming on one of Facebook’s platforms of the mass shooting at a mosque in New Zealand led Australia to propose fines for social media and web-hosting companies, and imprisonment for executives, if violent content is not removed.
The Proposed Measures
The measures proposed in the UK government’s white paper include:
GDPR-Style Fines (Or A Ban)
Digital, Culture, Media and Sport Secretary Jeremy Wright has said that tech companies that do not do everything reasonably practicable to stop harmful content on their platforms could face fines comparable to those imposed for serious GDPR breaches, i.e. up to 4% of a company’s annual turnover.
It has also been suggested that, under the new rules policed by an independent regulator, bosses could be held personally accountable for failing to stop harmful content on their platforms, and that in the most serious cases companies could be banned from operating in Britain if they do not do everything reasonably practicable to prevent harmful content being spread via their platforms.
Although there is general recognition that regulation to protect people, particularly the young, from harmful content is a good thing, a proportionate and predictable balance needs to be struck between protecting society and supporting innovation and free speech.
Facebook is reported to have said that it looks forward to working with the government to ensure the new regulations are effective and apply a standard approach across platforms.
The government’s proposals will now undergo a 12-week consultation, but the main criticism to date has been that parts of the government’s approach are too vague and that regulation alone cannot solve every problem.
What Does This Mean For Your Business?
Clearly, the UK government believes that self-regulation among social media and tech companies has not worked. The tech industry has generally responded positively to the government’s proposals and to an approach that is risk-based and proportionate rather than one-size-fits-all. The hope is that the vaguer elements of the proposals can be clarified and improved over the three months of consultation.
To ensure the maximum protection for UK citizens, any regulations should be complemented by ongoing education for children, young people and adults to make sure that they have the skills and awareness to navigate the digital world safely and securely.