Public input is being sought on addressing digital content regulation following the release of a discussion document by the Department of Internal Affairs.
“Safer Online Services and Media Platforms” is the culmination of a lengthy period of consideration and engagement with the public to understand the nature of the issue, particularly the types of harm people are experiencing.
The discussion document seeks input from the public on addressing digital content regulation. Current regulation of digital content is outdated and inconsistent, and in some cases virtually non-existent. The overarching proposal is to have one regime that regulates all forms of platform across all types of digital content. In terms of designing the regime, the discussion document poses 26 questions on which comment is sought. Many of the questions are general ones, such as what is thought about the different approaches that could be taken. The opportunity to help shape this area before draft legislation is prepared runs until 31 July 2023. Now is the time to have your say.
For readers interested in the international context, the discussion document includes an appendix which compares the features of the proposal to the regulatory environment in Australia, Canada, the EU, Ireland and the UK.
What is proposed?
Key elements of the proposed regime are:
- one cohesive framework that meets New Zealand’s social and cultural standards;
- codes of practice setting out specific safety obligations (ie standards and processes to manage risks to consumer safety) with which large or risky platforms will need to comply;
- a new independent regulator to promote safety on online services and media platforms, with strong Māori representation being built into the development of the regulator;
- codes of practice, meeting high-level safety objectives and minimum expectations set by legislation, would be developed by industry and approved by the new regulator (following community input), although the regulator could take the lead if industry did not do so;
- the new regulator would provide guidance on meeting the codes, check compliance by platforms with codes, recommend or direct the taking of remedial steps, and would be able to issue penalties for serious non-compliance; and
- the new regulator would also have the power to issue formal warnings and require illegal material to be removed quickly from public availability in New Zealand.
Two key aspects are discussed in depth in the discussion document: what would be regulated, and how compliance would be approached.
What would be regulated?
The framework would apply to content considered harmful or unsafe, regardless of the manner in which it is delivered. The content regulated would be content that causes or has a risk of causing loss or damage to rights, property, or physical, social, emotional, and mental wellbeing. However, the framework would not regulate the content providers, rather it would regulate the platforms that distribute or make content available. It would also only regulate the larger platforms, such as those with at least 25,000 account holders in New Zealand or an annual audience of more than 100,000 persons.
What would not change is what constitutes objectionable material and its nature as illegal. However, platforms would need to ensure any illegal material is identified and removed from their platforms.
The discussion document notes that the protections should support and not detract from human rights.
An interesting consequence of the proposal is that the current classification system for restricted content would cease. Classification would instead become a recommendation made through the applicable code of practice, with consumer warnings and age ratings in line with New Zealand standards (as is the case for broadcast content).
The discussion document also indicates that the codes of practice could be used to influence the practices of social media platforms. An example given is that the code could set rules for responsible and transparent design of ranking algorithms (as these can amplify harmful content). This shows the potentially wide scope that codes of practice could take.
The discussion document sets out initial expectations on what the codes of practice would cover:
- processes to prevent, remove, or reduce exposure to unsafe content;
- consumer focused processes, including how consumers are alerted to content that could cause harm and processes for making complaints, including an ability to appeal to a quasi-judicial body; and
- how platforms will report on the measures.
Examples of how these expectations could be met are also included in the discussion document.
How will compliance be approached?
In terms of monitoring compliance, the discussion document contemplates that the regulator could require platforms to submit transparency reports, undergo third-party audits, and provide information on their operations to enable the regulator to assess compliance. This will add a further compliance burden to platforms.
However, one area that has not yet been firmed up is how to deal with persistent and serious non-compliance by platforms. Options include larger fines or court-ordered service disruption. Comment is sought on these options, as well as on whether the regulator should take a supportive or prescriptive approach to monitoring compliance.
How to have a say on the proposal
Feedback on the discussion document is being sought now, and you have until 31 July 2023 to make a submission. Platforms that provide access to digital content should make a submission, as should anyone with a view on how digital content in New Zealand should be regulated. Submissions should address the questions posed by DIA, but submitters should not feel constrained by those questions if there are other matters they feel should be considered.
Submissions can be made by email, post or online. Page 10 of the discussion document provides information on how to make a submission.
Please get in touch with the team at Lane Neave if you would like assistance to make a submission to DIA.