Without bold reform our online spaces will remain toxic and dysfunctional, for people of all ages

October 1, 2024

Lizzie O’Shea is a lawyer and writer who sues technology companies that have done the wrong thing. She is a founder and the chair of Digital Rights Watch.


Australia’s privacy laws have not been meaningfully updated in nearly 40 years, so the recent tabling of reforms in the Privacy and Other Legislation Amendment Bill 2024 is long overdue. It’s been a long road: for nearly four years, the Attorney-General’s Department has worked through the reform process, making over a hundred recommendations for change. The vast majority of these were accepted by the government, setting the stage for bold reform. It was therefore deeply disappointing that the reforms eventually tabled (described as a ‘first tranche’) represent only a small handful of those recommendations.

The good news for those with a focus on children’s wellbeing in the digital age is that one of the reforms that did make it into this first tranche is a children’s online privacy code. This is a significant achievement, and it is emblematic of the importance many Australians place on regulating to protect children online. The proposed code will be drafted by the Office of the Australian Information Commissioner in consultation with civil society and other stakeholders. The expectation is that it will align with and follow the lead set by the UK children’s code, which was introduced in 2020. The UK code requires online services to comply with 15 standards. These include privacy by default, strict constraints on using children’s personal information, and prohibitions on dark patterns or nudges that encourage young people to share more personal information.

The other key reform introduced in this first tranche is a statutory tort allowing those who have experienced a serious violation of their privacy to sue, including for emotional distress. This proposal has been the subject of several law reform reports, and while it is unlikely to be used to remedy mass data breaches (the right to sue for such events was not included in the first tranche), it will nonetheless fill a gap in the law.

The problem for those interested in these issues is how much has been left out. The government accepted that the definition of ‘personal information’ in the Privacy Act needed to be updated. This definition sets the scope of the Act’s coverage, and it is limited and unfit for the digital age. The creation of a ‘fair and reasonable’ test – which would end tick-a-box consent and instead impose more onerous obligations on companies to use personal information fairly and reasonably – has also not made the cut. These are key reforms that the government agreed were necessary, but there is no timeline for when they might be coming. This will unavoidably have an impact on the Children’s Online Privacy Code, which, no matter how well drafted, will be limited in scope by its enabling legislation.

Interestingly, industry is getting ahead of the curve. In the wake of the bill being tabled, Instagram immediately announced changes to accounts held by young people, introducing content restrictions, new rules about how users can be contacted, and features designed to curb screen time. One way to view this is as a sign of progress; another is that industry is aware of the very low regard in which these companies are held by many, many people. Changes of this kind should have come decades ago, and indeed they reflect a design regime that adults should be able to benefit from as well.

And here lies one of the more critical tensions in piecemeal reforms aimed at children, whether it is the code or other policies like age assurance: it is one thing to make online spaces better for young people, but if they end up tossed into a toxic environment once they become adults, do we end up failing them anyway? We need a tide of privacy reform to lift all boats, so that users with vulnerabilities are not subject to exploitation and all people benefit from improvements in these information ecosystems. Without being too cynical about it, proposals aimed at children sometimes succeed in neutralising the political imperative for reform, which is a problem if such reforms leave the broader trends untouched.

There is no way to avoid it: we need to talk about how data-extractive business models are harmful not just for children but for everyone, and for our democracy more generally. Data-extractive business models prioritise engagement above all, which has significant negative consequences for society. Such business models incentivise extremist and misleading content because they allow users to monetise virality, and they permit predatory industries to microtarget people via advertising. Without bold reform that takes on these fundamentals of the digital economy, our online spaces will remain toxic and dysfunctional, for people of all ages. We must encourage our government to act to reform these laws for everyone, without delay.
