We’re closing out this year with some reflections on the growing global concern over data privacy, evidenced in actions by regulators, private companies and, importantly, consumers. If you’d like to explore better ways of delivering relevant services to your customers and partners while maintaining privacy, security and convenience - get in touch. We’d love to support you.
Action by Regulators
Regulatory bodies have a range of levers to pull to effect change, and in this past year we’ve seen different approaches across the jurisdictions we work in, all designed to reshape the behaviour of organisations that hold personal data. Our prediction for the coming year is that the combined effect of these measures will impact how a wide range of service providers operate, especially those with a global footprint.
Penalties for cybersecurity breaches
The proliferation of data breaches in Australia in the second half of 2022 shone a light on the dangers of collecting and storing personal information, as first major telco Optus, then healthcare provider Medibank and international real estate company Harcourts, notified the Office of the Australian Information Commissioner (OAIC) under the Notifiable Data Breaches scheme.
In response to these breaches, the Federal Government acted quickly to pass a bill introducing more severe penalties (up to $50 million, or 30 percent of adjusted turnover) intended to “send a clear message to large companies that they must do better to protect the data they collect.”
While steeper penalties may sharpen focus on security practices, it’s likely that a wholesale shift in the approach to data collection will have a longer-term impact. In a submission to the Australian Parliament, the Australian Banking Association (ABA) picked up on this, highlighting the shortcomings of a Government response focused solely on fines. The ABA suggested that “the OAIC has a number of other regulatory powers available that involve supporting entities or instructing entities to improve their security and retention capabilities. Effective use of these tools would do as much, or more, to promote improved awareness of and compliance with Privacy Act obligations”.
For the most part, companies are collecting personal data for one of two reasons: to meet specific Know Your Customer (KYC) requirements or as part of customer onboarding. As is the case in many countries, the KYC data retention rules are enshrined in legislation that was established prior to widespread digital onboarding practices and should be revised as part of a holistic national cyber security strategy. Interestingly - and despite common practice - the customer identification procedure of the Australian Transaction Reports and Analysis Centre (AUSTRAC, the Australian government’s financial intelligence agency) does not require organisations to copy documents. The guidance is clear: “you don’t have to copy documents (for example you can record details of a driver’s licence or passport rather than photocopying them)”. It must simply be convenience that leads a great number of service providers to automatically reach for the photocopier.
With respect to the collection of personal information as part of customer onboarding, additional guidance is most definitely required. For example, it’s common practice across the world for hotels to photocopy passports and driver’s licences, and for digital sign-ups to over-collect personal information. Now that Australia’s review of the Privacy Act is finally complete, all the signs are there that we will see sweeping reforms to ensure better processes and outcomes.
Significantly, revisions to Australia’s Privacy Act, originally enacted in 1988, include provisions for new information sharing among domestic and international regulators, as well as clarity about how the Australian laws apply to overseas companies. As these new provisions come into force next year, it’s likely that service providers will need to rethink current practices.
Improved guidelines for identification of individuals
Taking a leadership position in offering better guidance for the identification of individuals, the USA’s National Institute of Standards and Technology (NIST) has published a revised draft of its Digital Identity Guidelines for comment. With some helpful framing for the digital economy NIST highlights that the proposed amendments are intended to address “the need for reliable, equitable, secure, and privacy-protective digital identity solutions.” And in line with the wider global shift, there’s increased focus on taking a “human-centered” approach, addressing the impact on and needs of individuals and communities, in addition to the earlier focus on organisations.
The consultation document specifically asks “How might equity, privacy, and usability impacts be integrated into the assurance level selection process and digital identity risk management model?” Further, it cites the need to take “revocation of consent” and “limitations of use” into consideration. Helpfully, the review also points to Mobile Driver’s Licenses and Verifiable Credentials and asks how this “digital evidence” might be incorporated in identity proofing.
The guidelines make interesting reading for public and private sector entities alike because they specifically consider the challenges that digital identity systems present, acknowledging the potential “to create privacy-related problems for individuals”. Building on this, they refer to the processing of personally identifiable information (PII) as “a problematic data action”. Our expectation is that this helpful framing will directly shape systems design in the coming year and that adherence to such guidelines will become a hygiene factor in data and identity solutions that come to market. As a result, organisations seeking to verify the identity of an individual may ask for a credential from an approved body, such as their bank, rather than raw PII.
Improved guidelines for data sharing
Widely recognised as a pioneer of data privacy thanks to the GDPR, the European Commission announced new measures that came into effect in June 2022 and will apply in full from September 2023.
The Data Governance Act (DGA) acknowledges that data-driven innovation will deliver significant benefit to citizens and the economy, and looks to ensure equitable access to data while ensuring “data portability and interoperability, and avoiding lock-in effects.” Alongside this, it recognises that because of low trust (among other issues) in data sharing, the potential benefits are not being realised.
Like NIST, the EU calls out the need for a “human-centric” approach, with greater transparency regarding data use. In the words of the European Commission “The Data Governance Act provides a framework to enhance trust in voluntary data sharing for the benefit of businesses and citizens.”
To facilitate this, the DGA outlines a new category of service provider – a Data Intermediary – that has a fiduciary duty to act in the interests of data subjects. As part of this, a Data Intermediary must also help individuals exercise their rights under GDPR.
Helpfully, the Commission’s explanatory guide provides examples of organisations that fit the intermediary characteristics, including MyData Global, whose goal is to “empower individuals by improving their right to self-determination regarding their personal data.” Meeco has held MyData Operator status for the last three years, and this recognition by the Commission is important validation of the far-sighted work of MyData Global.
These new measures will be overseen by a newly established European Data Innovation Board (EDIB), which will oversee Data Intermediaries and ensure data interoperability once the measures come into effect next year. Building on GDPR, this very practical next step will undoubtedly determine which types of organisations will succeed in the European data economy.
Penalties for data privacy breaches
If the DGA intends to provide a carrot for trusted data sharing, there are plenty of examples of hard-hitting sticks in the global arena.
Most recently Epic Games was fined in the US for violating federal child privacy laws. Reflecting calls for transparency and more ethical data practices, the makers of Fortnite were charged for underhand practices that led to accidental purchases, as well as collecting personal information from children under the age of 13, without parental consent.
The violation of the Children’s Online Privacy Protection Act (COPPA) attracted a US$245 million fine from the US Federal Trade Commission and an instruction to adopt stronger privacy protections. In what we hope will be the turning of a much bigger tide, Epic Games met the fine with a commitment to “build safeguards that help keep our ecosystem safe”.
Also in the US, the Northern District Court of California San Jose Division, approved a $90 million settlement with Meta Platforms. The settlement highlights that Meta’s alleged practice of tracking users’ activity on pages that displayed a “Like” button violated four different state and federal laws, including the California Invasion of Privacy Act.
In a similar vein, the European Data Protection Board (EDPB) announced this month that it had adopted three dispute resolution decisions relating to Meta’s Facebook, Instagram and WhatsApp platforms. While the specifics of the settlement and any fines are not yet in the public domain, informed commentary suggests that Meta will no longer be able to bundle consent to the use of personal data for advertising into acceptance of its terms of service. As a result, Meta may need to offer a version of the apps that does not use personal data for advertising. The fallout from this ruling is likely to have a significant impact on the monetisation of personal data and consent more broadly.
Changes in consumer attitudes
What’s interesting about the above actions against the social media platforms is that they were initiated by private individuals and organisations representing individuals, rather than by regulators.
In particular, NOYB (initiator of the action against Meta), whose name is derived from “(my privacy is) None Of Your Business”, has a clear mission to address the fact that there is only limited enforcement of privacy laws. NOYB’s policy work embeds the view that “as we are transitioning to the digital age we have the challenge of privacy and how to implement it”. The recent ruling by the EDPB came about as a result of action that NOYB commenced in 2018.
Increased consumer awareness and action has also been catalysed by Apple’s moves to enable customers to “take control over their data”. Announced in January 2021, the requirement for apps to get users’ permission prior to tracking data across third-party apps and websites had a snowball effect. Less than a year after the changes were implemented, Facebook pinned its steep fall in earnings in one quarter on the iPhone’s new privacy settings.
With adoption as the primary goal of any service provider, this attitudinal shift must form part of every organisation’s data strategy. And with this in mind, it seems likely that we’ll see greater focus on innovation in this area. A snapshot of the Australian Fintech landscape¹ published recently by KPMG found that Payments remains the largest fintech sector, with Blockchain and Crypto Currency recording the fastest growth since last year’s snapshot. As 2022 draws to a close, it seems very plausible that the proliferation of data breaches and increased focus on privacy will see rapid growth in Meeco’s home category of Data and Analytics in the year ahead.
Where to from here?
With new guardrails and guidelines in place we’re confident that the coming year will see the necessary step change in data privacy. And NIST’s newly released guidance, with its call for reliable, equitable, secure and privacy-protective digital identity solutions, provides a fitting description of the new paradigm that we must all embrace.
If you intend to heed this call to action, get in touch: we have ISO 27001-accredited tools that you can deploy to accelerate your mission.
¹ The 2022 Australian Fintech landscape document, published by KPMG, is no longer available. The 2023 version can be viewed here.