GDPR: How do Australian privacy and data protection laws stack up?

Over the last few weeks our inboxes have been flooded with privacy reminders, a direct response to the implementation of a new European regulation on privacy, the General Data Protection Regulation (GDPR). In this piece, Gerard Brody looks at how Australia’s privacy and data protection laws stack up and finds our laws and enforcement bodies are ill-equipped to protect privacy and consumer rights in a world of “big data”.  



Over the last few weeks, all our inboxes have been flooded with privacy reminders. Social media platforms, subscription services, and seemingly every other business that holds your personal information may have sent you an email to remind you of their privacy policy. 

These reminders are a direct response to the implementation of a new European regulation on privacy, the General Data Protection Regulation (GDPR). Any website that collects personal data from users in the EU must satisfy the GDPR's requirements as of 25 May, when the law came into effect. Even a service or website based here in Australia must comply if it collects personal data from users in the EU. 

So, how do Australia’s privacy and data protection laws stack up in comparison? The unfortunate answer is that our laws and enforcement bodies are ill-equipped to protect privacy and consumer rights in a world of “big data”.  

The new EU regulation has been welcomed by consumer and privacy advocates around the world. The EU rules have been developed directly in response to the rise of big data, that is, data sets that may be analysed computationally to reveal patterns, trends, and associations, especially relating to human behaviour and interactions.  

Big data and algorithms are valuable to big business, as they can aid the development of new products and services, predict the preferences of individuals, help tailor services and commercial opportunities, and guide individualised marketing. We at Consumer Action, however, are only too aware of how big data analytics can also harm, particularly lower income and underserved people. 

Insurance provides an example. Traditionally, the business of insurance was about pooling risk for the benefit of all. One 19th century American insurance entrepreneur described insurance as 'a great fund' to 'share the burden of suffering and calamity'.  

Today, however, decisions such as whether an insurer will insure someone’s house, car or even life, and on what terms, are made by algorithms. This creates efficiencies for the insurer, but outsourcing decision-making to machines risks bizarre and even discriminatory outcomes.  

Insurers are teaming up with banks, retailers, airlines and many others to access information about our finances, lifestyles and diets. Right now, insurers are setting our car insurance premiums based on the ever-growing store of personal information they hold about us.  

You might pay more if you did not finish high school, eat a lot of pasta and rice, or fill up on petrol late at night. Insurers justify this because the data they hold tells them such people are more likely to make a claim. In the UK, it was reported that one insurer charges you more if you have a Hotmail address, and several will hike up the price if your name is Mohammed. 

As insurers rely on increasingly personal data to individualise and fragment risk profiles, we risk losing the mutual nature of insurance, and the hard realities of financial and social exclusion emerge. The person who fills up on petrol at night may do so because that is when they work. It would be unfair if they therefore found insurance increasingly unaffordable. 

Consent is often viewed as the first line of defence for privacy: we can agree, or not, to share our personal information. The reality, though, is that consent offers little protection. Often there is no real choice but to accept the collection and use of data if you want to use the service. That is certainly the case with services like insurance; we're conditioned to answer a heap of personal questions when buying insurance. 

Under the GDPR, in addition to a requirement that people must give clear and free consent to how their data is collected and used, corporations can only collect ‘necessary’ data. In Australia, businesses often just need ‘implied’ consent to collect a lot of personal information. 

Recognising the limitations of consent, the GDPR goes further. It gives people the right to obtain, transfer and erase their own data, and to opt out of 'automated decision making' or profiling. Corporations must also assess the impact of any 'high risk' data processing. Failure to comply can lead to fines of up to €20 million or 4 per cent of a company's annual global turnover, whichever is greater. By comparison, a breach of Australian laws attracts a comparatively paltry maximum of $1.8 million. 

Australian privacy laws are yet to grapple with the world of big data. There certainly are not any specific protections in place. 

What we are getting is a new Consumer Data Right to access and transfer our data. This was announced with some fanfare in the recent Federal Budget, together with $44.6 million to aid its establishment. While we’re yet to know exactly what the proposed Consumer Data Right will provide, there is some indication that the new laws will allow Australians the right to ask a business to delete data held about them, as is the case in the EU. A right of deletion is essential if we’re to remedy the imbalance of power between big business and everyday people. 

More than that, the pendulum needs to swing towards consumer empowerment and protection if privacy and data law reform is to match the safeguards and rights that now exist in the EU. If not, we risk some people losing access to important services like insurance that provide protection when we most need it.


Gerard Brody is the Chief Executive Officer of the Consumer Action Law Centre. Gerard holds a Masters in Public Policy and Management, Bachelor of Laws (Hons) and Bachelor of Arts (Hons) from the University of Melbourne.

posted by @jrostant