The new Labour government is poised to take significant strides in enhancing digital governance and safeguarding online spaces through the ‘Digital Information and Smart Data Bill’.
Highlighted in The King’s Speech, this bill aims to set new standards for how personal data is handled, enhancing transparency and accountability across digital platforms.
Its objectives are to ensure that people have greater control over their personal information while harnessing the power of data for economic growth, to support a modern digital government, and to improve people’s lives within a robust and ethical framework.
The question of how to regulate the proliferation of harmful online content and create a safer online environment is one that tends to polarise opinion.
As Mark Zuckerberg put it in 2020, “We have to balance promoting innovation and research against protecting people’s privacy and security.”
Recently, I had the privilege of organising a media law event at Google’s vibrant Dublin office, bringing together experts from around the globe. I chaired a legal panel on internet litigation and online regulation, which featured insights from industry leaders including Adam Smyth of BBCNI, Dualta O’Broin of Meta Platforms Ireland Ltd, and Judy O’Connell of the Media Commission for Ireland.
As a litigator, staying ahead of regulatory challenges is crucial, not just for horizon scanning but to understand where society is setting legal boundaries as our grasp of technology grows. In Northern Ireland, the law is fairly well settled with regard to online liability and the point at which a social media platform loses its safe harbour defence, thereby becoming a secondary publisher and liable in law.
The starting point for the legal principles is the EU E-Commerce Directive (2000/31/EC), which was transposed into national law by the Electronic Commerce (EC Directive) Regulations 2002. This statutory framework provides both transparency for a complainant and access to a timely Notice and Take Down reporting mechanism, as well as immunity from liability for social media platforms in certain circumstances.
The framework gives social media platforms a qualified safe harbour defence and first established the Notice and Take Down procedure for complainants.
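For readers who want to see the mechanics, the safe harbour logic can be modelled as a simple timing check. The sketch below is purely illustrative and not a statement of the law: the 48-hour window, the names and the types are hypothetical assumptions, since the framework requires “expeditious” removal once a platform has actual knowledge rather than fixing a numeric deadline.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical placeholder: the framework requires "expeditious" removal
# once a platform has actual knowledge; it does not fix a numeric window.
REMOVAL_WINDOW = timedelta(hours=48)

@dataclass
class Notice:
    """A complainant's Notice and Take Down report for one item of content."""
    content_id: str
    received_at: datetime
    removed_at: datetime | None = None

def retains_safe_harbour(notice: Notice, now: datetime) -> bool:
    """True while the platform's qualified safe harbour defence holds.

    The platform keeps immunity if it removed the content within the
    window, or if the window has not yet expired on live content.
    """
    if notice.removed_at is not None:
        return notice.removed_at - notice.received_at <= REMOVAL_WINDOW
    return now - notice.received_at <= REMOVAL_WINDOW

# Example: a notice received three days ago and never actioned.
stale = Notice("post-123", received_at=datetime(2024, 9, 1, 9, 0))
print(retains_safe_harbour(stale, now=datetime(2024, 9, 4, 9, 0)))  # False
```

The point the sketch makes is the one the courts have stressed: once a platform has knowledge of unlawful content, its immunity is a function of how quickly it acts.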
One landmark case in Belfast, CG v Facebook Ireland Ltd, highlighted the importance of social media platforms acting swiftly to remove unlawful content. The court noted how quickly comments can multiply, emphasising the need for prompt action to curb the spread of harmful material.
In the UK, Ofcom has confirmed that its focus under the Online Safety Act 2023 (OSA) is Governance, Design, Trust, and enabling choice for social media users. Ofcom’s roadmap for implementing the OSA consists of three key phases:
- Illegal harms duties: Ofcom has published draft codes and guidance on these duties, which are expected to come into effect from 2025.
- Child safety, pornography and the protection of women and girls: Draft guidance is due to be published by spring 2025.
- Transparency, user empowerment, and other duties on categorised services: Regulated services that meet certain thresholds will have duties to produce transparency reports; provide user empowerment tools; operate in line with their terms of service; protect certain types of journalistic content; and prevent fraudulent advertising.
As we move forward, it is crucial to maintain an open dialogue among stakeholders, continually refine our approaches, and stay ahead of emerging challenges. By doing so, we can ensure that the digital world remains a place of innovation, trust, and safety for all users.
The journey is ongoing, and our resolve to protect and empower online communities will shape the future of the internet. Watch this space.
- Olivia O’Kane is partner and head of technology, media, cyber & data privacy disputes and public law in Northern Ireland & RoI for DWF Law (https://dwfgroup.com)