To build trust in data, empowering consumers and citizens is key
After a string of data breaches in 2018, privacy and the protection of people's personal information became a hot issue that shows no signs of abating. Digital privacy made a global top-ten list of technology trends for 2019, and GDPR was the data privacy acronym of the year.
In the second of a four-part series, The Mandarin explores the online privacy landscape and how building and maintaining trust means empowering consumers and citizens.
Privacy and ethics top trend in 2019
Technology research firm Gartner named digital ethics and privacy as one of the top ten strategic technology trends for 2019. Privacy sits alongside AI-driven development, quantum computing and blockchain as a rapidly growing trend, expected to reach a tipping point over the next five years.
According to Gartner vice-president David Cearley, people have a growing awareness of the value of their personal information. They are increasingly concerned about how this information is being used by both public and private sector organisations. Those organisations that don’t proactively address these concerns are at risk of consumer and citizen backlash.
“Any discussion on privacy must be grounded in the broader topic of digital ethics and the trust of your customers, constituents and employees. While privacy and security are foundational components in building trust, trust is actually about more than just these components,” said Cearley.
For Cearley, the shift from privacy to ethics means moving the conversation from ‘are we compliant’ to ‘are we doing the right thing’.
Australian online privacy concerns growing
The Office of the Australian Information Commissioner conducts the Australian Community Attitudes to Privacy Survey, the longest-standing study of Australian attitudes to privacy. The study canvasses all aspects of privacy from handling of personal information to online privacy to credit reporting.
The latest report shows Australians are increasingly concerned about online privacy risks, with 69 per cent of respondents more concerned than they were five years ago. The online environment is perceived as higher risk, with 83 per cent of Australians believing there are greater privacy risks in dealing with an organisation online than in traditional settings.
Curiously, these concerns do not necessarily translate to privacy protection. The survey found over three in five Australians do not regularly read online privacy policies while two in five do not regularly adjust the privacy settings on social media accounts.
The Consumer Policy and Research Centre (CPRC) in Victoria commissioned research which explored Australian consumers’ understanding of data collection, use and sharing when accessing products and services. This included online shopping; services such as email, search engines and maps; digital media platforms; and mobile payment technology. The findings are included in their report on consumer data and the digital economy.
Once again, the majority of participants admitted they do not read privacy policies or terms and conditions, with only six per cent reading the documents for the products or services they signed up to in the past 12 months. The research also revealed that consumers do not fully understand what type of information is being collected and shared.
Focus group participants found company policy documents are not an effective communication tool. Length, complexity and readability are frequent issues. Researchers from Carnegie Mellon University found it would take the average person 244 hours per year (roughly six working weeks) to read all the privacy policies that apply to them, not including the time it would take to check websites for changes to these policies.
“I skim through them, read any text that is interesting, highlighted in red, but even then, I don’t understand what it means, and I don’t get much out of reading it.”
– Focus group participant
Levels of privacy knowledge differ
“I confess, I sometimes wonder what I am agreeing to…”
– Focus group participant
Consent is vital to the social contract between individuals as data providers and organisations as users of that data. The design of an effective consent regime can empower people to control what data they share, why they share it, when they share it and for what purposes.
Building trust and empowering consumers is at the heart of the General Data Protection Regulation (GDPR), the European Union's new data privacy law which came into effect in May 2018. It is one of the EU's most wide-ranging legislative reforms, making sweeping changes to protect citizens' data and reshape how organisations approach data privacy.
One of the reforms is specifying a set of rights individuals have to access and control their data. These rights include:
The right to be informed: Organisations must inform individuals about what data is being collected, what it’s being used for and how it is being shared.
The right of access: Individuals can access their data held by an organisation.
The right to rectification: Individuals can request that incorrect, inaccurate or incomplete personal data is corrected.
The right to erasure: Also known as the “right to be forgotten”, individuals can request their personal data is erased or deleted.
Additionally, the GDPR includes a new definition of consent which gives people genuine choice and control over their data. Under the GDPR, consent must be freely given, specific, informed and unambiguous. People also have the right to withdraw consent.
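These consent conditions can be illustrated with a small data model: consent is recorded per purpose, and withdrawal is always available. This is a hypothetical sketch for illustration, not a compliance implementation; all names are invented.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """One specific, informed consent for one processing purpose."""
    subject_id: str
    purpose: str                              # consent must be specific to a purpose
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None   # supports the right to withdraw

    def withdraw(self) -> None:
        """Withdrawing consent should be as easy as giving it."""
        self.withdrawn_at = datetime.now(timezone.utc)

    @property
    def is_active(self) -> bool:
        return self.withdrawn_at is None

# Usage: one record per purpose, never a single blanket opt-in.
consent = ConsentRecord("user-42", "email newsletter",
                        granted_at=datetime.now(timezone.utc))
consent.withdraw()
print(consent.is_active)  # False
```

Keeping a separate record per purpose is what makes the consent "specific": agreeing to a newsletter says nothing about agreeing to profiling.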
The GDPR also imposes obligations on how personal data is processed. There must be a "lawful basis" to process personal data, and consent is one of the conditions for lawful basis. No single basis is 'better' or has primacy over the others; which basis is most appropriate will depend on the purpose of the processing and the relationship with the individual.
Other conditions include:
Contractual necessity – for the performance of, or entry into, a contract.
Legal obligations – for compliance with a legal obligation.
Vital interests – the processing is necessary to protect someone’s life.
Public task – for the performance of a task carried out in the public interest or in the exercise of official authority.
Legitimate interests – for legitimate interests, except where such interests are overridden by the interests, rights or freedoms of the individual.
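The six bases above can be sketched as an enumeration that a processing operation must name before it runs. This is an illustrative sketch only; the function and field names are hypothetical.

```python
from enum import Enum

class LawfulBasis(Enum):
    CONSENT = "consent"
    CONTRACT = "contractual necessity"
    LEGAL_OBLIGATION = "legal obligation"
    VITAL_INTERESTS = "vital interests"
    PUBLIC_TASK = "public task"
    LEGITIMATE_INTERESTS = "legitimate interests"

def process(data: dict, basis: LawfulBasis, purpose: str) -> dict:
    """Record a lawful basis and purpose before any processing happens.

    No basis has primacy: the caller chooses whichever fits the purpose
    and the relationship with the individual.
    """
    audit_entry = {"basis": basis.value, "purpose": purpose}
    # ... actual processing of `data` would happen here ...
    return audit_entry

print(process({"email": "a@example.com"},
              LawfulBasis.LEGITIMATE_INTERESTS, "fraud detection"))
```

The point of the sketch is that the basis is chosen and recorded up front, rather than rationalised after the data has already been used.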
In complying with GDPR requirements, some privacy policies have become more complex. A readability study analysed a sample of 3000 pre-GDPR and post-GDPR policies. The results showed that reading privacy policies post-GDPR has become more demanding. On average, the text featured 28 per cent additional words and 33 per cent more sentences. For the user, these policies demand more time and effort.
Another analysis was undertaken of ten major websites spanning email, social media, retail, search engines and media services. The analysis examined word count, reading time and reading grade level before and after GDPR to determine how easy it was for users to understand their policy changes.
This has led to the emergence of just-in-time privacy notices. A just-in-time notice appears at the point where an individual provides a particular piece of information. The notice gives the individual a brief message explaining how the information they are about to provide will be used. The UK Information Commissioner’s Office has endorsed the use of just-in-time notices in its guidance to organisations on the GDPR.
Additionally, the ICO recommends a blended approach to privacy information rather than a single notice. Providing privacy information through a variety of media allows for flexibility and avoids overly complex notices. Just-in-time notices are one element of a suite of tools which also includes layering of information, dashboards and icons.
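The just-in-time pattern amounts to mapping each collected field to a short, plain-language explanation shown at the moment the field is requested, instead of burying everything in one long policy. A minimal sketch, with entirely hypothetical field names and notice text:

```python
# Hypothetical mapping of collected fields to plain-language notices.
NOTICES = {
    "email": "We use your email only to send order confirmations.",
    "date_of_birth": "Your date of birth is used to verify you are over 18.",
    "postcode": "Your postcode helps us estimate delivery times.",
}

def collect(field: str, value: str) -> str:
    """Show the relevant notice at the point of collection."""
    notice = NOTICES.get(
        field, "See our privacy policy for how this information is used.")
    print(f"[notice] {notice}")
    return value

collect("email", "user@example.com")
```

In a layered approach, each short notice would link to the fuller policy, dashboard or icon set rather than replacing them.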
Prescriptive requirements for privacy notices can also be counterproductive: privacy regulation carries costs as well as benefits, and those costs may outweigh the harms the rules are intended to reduce.
“It should be my data. I should have rights to it.”
– Focus group participant
What about consumer data rights?
The Australian Government is establishing a Consumer Data Right (CDR) in response to the Productivity Commission's 2017 inquiry into data availability and use. The CDR will improve consumers' ability to compare and switch between products and services by allowing data to be shared with third parties, such as comparison sites.
In the first instance, the CDR will apply to the banking sector from July 2019, with the energy and telecommunications sectors proposed to follow.
Facebook, Google, Microsoft and Twitter are collaborating on the Data Transfer Project. The project is extending data portability beyond a user’s ability to download their data from their service provider. It will provide the user with the ability to initiate a direct transfer of their data into and out of any participating provider.
This will allow users to easily switch to another product or service rather than being locked in. Data portability can also provide security benefits for users.
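The Data Transfer Project's approach is to have each provider supply adapters that translate its data into and out of shared, provider-neutral data models, so a transfer can run service-to-service. The sketch below illustrates the idea with hypothetical types (the real project's adapters are written in Java, and these names are invented for illustration):

```python
from dataclasses import dataclass
from typing import Iterable, List, Protocol

@dataclass
class Photo:
    """A shared, provider-neutral data model."""
    title: str
    url: str

class Adapter(Protocol):
    """Each participating provider supplies an exporter and an importer."""
    def export_photos(self) -> Iterable[Photo]: ...
    def import_photos(self, photos: Iterable[Photo]) -> int: ...

def transfer(source: Adapter, destination: Adapter) -> int:
    """Direct service-to-service transfer: the user initiates it once,
    with no intermediate download-and-reupload step."""
    return destination.import_photos(source.export_photos())

class InMemoryService:
    """A toy provider used to demonstrate the adapter interface."""
    def __init__(self, photos: List[Photo] = None):
        self.photos = list(photos or [])
    def export_photos(self) -> Iterable[Photo]:
        return list(self.photos)
    def import_photos(self, photos: Iterable[Photo]) -> int:
        new = list(photos)
        self.photos.extend(new)
        return len(new)

old_service = InMemoryService([Photo("holiday", "https://example.com/1.jpg")])
new_service = InMemoryService()
print(transfer(old_service, new_service))  # 1
```

Because both ends speak the same shared model, adding one adapter per provider is enough to connect it to every other participant.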
Trust is still king
Winning the trust of consumers and citizens over the collection and use of their data is pivotal to harnessing the economic and social value of data. While the regulatory landscape and legislation can help build public trust, there is also an imperative for transparency and accountability from private and public sector organisations who collect, use and share data.
As David Cearley observed, this means shifting from a position of ‘are we compliant’ to ‘are we doing the right thing’.
Posted by @jrostant