
12 December, 2018

AR and the trailing footsteps of Australian privacy law

Augmented reality (‘AR’) technology allows its owner to gather, process and eventually monetise information. Whilst undoubtedly brimming with potential to positively impact human experience, AR also gives rise to very specific and significant privacy concerns under Australian privacy laws. Although the need to address these concerns is widely recognised, the rate of growth of augmented reality technologies is likely to cause challenges for some time. We take a closer look below.

What is AR?

“Augmented reality” or “AR” refers to the use of technology to overlay digital information on images or video feeds of physical reality. It is part of the same digital revolution as “virtual reality” or “VR”, but is distinguishable in that it does not aim to completely simulate reality through the use of computers. Whereas VR aims to create a new, all-encompassing reality, AR’s ultimate goal is to enhance human interaction with current reality.

Centred on a desire to minimise the interface between the physical and digital worlds, AR technologies have reimagined the ways we collect, analyse and present information. Used appropriately, AR thus presents profound advancement opportunities for the human race. However, used irresponsibly, AR applications and hardware threaten the privacy of individuals, and pose challenges for regulators as they try to augment the law to keep pace.

Commercial applications of AR

AR was traditionally confined to niche uses, such as overlaying advertising and visual representations of commentary onto sports broadcasts. However, following huge leaps in computing power and data storage capabilities, AR is now one of the most commercially lucrative technologies in industries such as tourism, marketing, retail, education and health services. Examples of the most popular applications include Wikitude World Browser (travel), Pokémon Go (gaming), functions of Google Translate (real-time text translation), and Ink Hunter (real-time “trying on” of tattoos).

In some of the most significant endorsements of AR, Facebook announced last year that it would increasingly support developers who wish to integrate AR technologies into its platform, whilst smartphone developers and content hosts such as Apple and Google are marketing the technology as one of the core features of their new products.

Some argue that AR is already set to reshape our society as significantly as the internet, but the rate at which this will occur is now sure to accelerate given the size of Facebook’s user base (roughly 2 billion monthly active users at the time of writing) and the extensive consumer reach of the Apple App Store and the Google Play Store.

AR and general privacy challenges

The value of AR technology stems from its capability to gather, process and eventually monetise information. Each of these steps gives rise to very specific and significant privacy concerns that in Australia predominantly fall under the mandate of the Privacy Act 1988 (Cth) (‘Act’), though other legislation and legal doctrines also frequently come into play.

It is often the case with rapidly evolving technology that the law lags behind. In Australia, the privacy of individuals is somewhat haphazardly protected by a patchwork of narrow legislation and circumstantially specific legal principles, whose complexity can pose compliance challenges for businesses. Whilst the need to address these inadequacies is widely recognised, the current rate of growth of AR technologies is likely to cause challenges for both individuals and businesses for some time.

The Privacy Act aims to regulate the collection and use of individuals’ personally identifiable information (‘PI’) by certain bodies in Australia. Some key categories of PI include:

•   Sensitive information, such as information about individuals’ racial or ethnic origin, political or religious beliefs, or other highly personal information such as sexual orientation or criminal record;

•   Personal information, such as an individual’s contact details or date of birth;

•   Health information, such as someone’s medical record;

•   Employee records; and

•   Credit and taxation information.

The substantive requirements of the law are outlined by the Australian Privacy Principles (‘APPs’), which apply to “APP entities” such as businesses with annual turnovers of more than $3 million, or organisations whose primary trade is in the information of others. As at 2018, breaches of the APPs may result in fines of up to A$1.7 million for businesses or A$340,000 for individuals.

However, unlike privacy legislation in Europe, the Privacy Act and related laws often fail to establish clear rules and benchmarks for businesses to avoid liability. Instead, much of Australian privacy law is based on subjective thresholds of “reasonableness”. Whilst this was a flexible approach in the past, the ambiguity is increasingly a hindrance for businesses seeking to avoid liability stemming from emerging technologies.

In the remainder of this paper we focus on the following specific privacy protection challenges that AR poses to the PI of Australians:

•  New ways of collecting PI; and

•  New ways of analysing, manipulating and using PI.

New ways of collecting PI

The first set of challenges arises from the novel ways in which information is collected via AR. Given improvements in hardware, AR technologies can now record information in a wide variety of media and data formats including pictures, video, audio, logs of wireless and network frequencies, spatial tracking data, and other statistical information gathered from user inputs.

Gathering information in some of these ways can pose clear threats to the privacy of individuals. For instance, individuals are increasingly likely to be the subject of unwanted surveillance by hardware such as wearable cameras. Whilst Australian privacy law ostensibly deals with this sort of behaviour, the extent of protection for individuals is ultimately limited by State laws that do not align and are often severely outdated. In NSW, for example, visual surveillance is addressed by restricting the installation or maintenance of optical devices, whilst the corresponding laws in Victoria, Western Australia and the Northern Territory focus on the nature of the activity being recorded and instead restrict the surveillance of “private activities”. This lack of legislative uniformity not only creates unnecessary confusion for privacy-minded individuals who travel interstate, but can also cause compliance headaches for businesses who intend to roll out AR technology in multiple states.

To a degree, concerns about collection of PI through digital surveillance can be addressed by obtaining the consent of the individuals whose information is being collected. However, in practice the issue is far more complex and challenging, and there is significant likelihood of AR technologies breaching important consent provisions of Australian law.

For example, a tourist might download an application like Wikitude World Browser to their smartphone through which they aim to enhance a travel experience. In exchange for access to the device’s camera the application may offer trivia about sights, but in doing so may also lead to the tourist documenting data such as the presence of other individuals at a tourist site. Whilst this action of itself may be permissible under Australian law on the basis of “reasonable necessity”, it is important to recognise AR’s capacity to record and analyse many forms of data concurrently. Consequently, it is possible for the tourist’s video feed of third parties to be processed by real-time algorithmic applications of facial and body recognition technology which can not only identify the parties’ physical appearance, but also “sensitive” biometric PI about them such as their ethnicity, age or even psychological state.

A business whose use of AR technology does this, even unintentionally, will be in breach of the consent requirement of APP 3 (“collection of solicited personal information”). APP 3 stipulates that entities may only collect sensitive information that is “reasonably necessary” for the entity’s functions with the express or implied consent of the individual. In the scenario above, even if the information about the third party might meet the requirement of reasonable necessity, it cannot rightly be said that an unaware individual in a public space has implicitly agreed to such invasive biometric analysis.

APP 3 permits the use of AR technology to collect personal information, such as contact details or consumer preferences, without consent as long as the information is “reasonably necessary” for the entity’s functions. As AR technologies improve, their ability to collect and analyse data will increasingly blur the line between seemingly innocuous information and PI. For example, in January 2018 a significant threat to military security was revealed when analysts realised that a user-generated Global Heatmap released by fitness app creator Strava effectively revealed sensitive military information. In this case, even the anonymised publication of legally collected data was capable of revealing other protected, sensitive information. Unfortunately, given our current legal framework, incidents like this will not be isolated, and it is likely the legal challenges and risks for individuals and businesses will only increase.

New ways of analysing, manipulating and using PI

Once information has been collected, the real power of AR technologies lies in their ability to manipulate and use such information. In this regard APPs 4 to 8, and APP 11, are particularly important for Australian individuals wanting to protect their privacy and for businesses seeking to avoid legal liability. At a high level, these privacy principles provide that APP entities:

•   Are restricted in how and for what purposes they can use unsolicited PI (APP 4: Dealing with unsolicited personal information);

•   Must notify individuals of certain matters when their PI is collected (APP 5: Notification of the collection of personal information);

•   Can normally use or disclose PI only for the purposes for which it was collected (APP 6: Use or disclosure of personal information);

•   Can only use personal information for direct marketing purposes when it is reasonable to do so (APP 7: Direct marketing);

•   Must ensure PI disclosed overseas is adequately secure and subject to similar regulation and security measures as in Australia (APP 8: Cross-border disclosure of personal information); and

•   Must destroy or de-identify PI in certain circumstances, or must otherwise take reasonable steps to protect collected PI from misuse, interference, loss, unauthorised access, modification or disclosure (APP 11: Security of personal information).

Whilst the concerns for individuals’ privacy are evident, businesses who utilise AR technology must also be wary of the significant liabilities they face in the storage, transmission and use of data collected. For example, an organisation that has taken less than “reasonable steps” in maintaining server or device security against data interference will be in breach of APP 11 (“security of personal information”).

Specifically, relevant causes of concern lie in how information is processed and dealt with once collected. To this end it is important to understand the role that standardised browsers play in the development of AR applications. AR browsers are similar to pre-packaged toolboxes that provide AR application developers with ways of controlling the hardware components of devices. In this sense, AR browsers are the “baton” used by AR developers to conduct and control the gathering, transmission and analysis of information.

Given their nature, AR browsers offer possibilities for mala fide parties looking to intercept or interfere with data. For example, browsers may be exploited in modern interpretations of traditional cybersecurity threats such as phishing schemes, which could overlay user interfaces with intentionally misleading links that redirect data transmissions.

Alternatively, malicious parties may also develop innovative means of harnessing the capabilities of AR devices. For instance, the tourist’s AR device could be compromised and its hardware programmed to respond to visual cues such as certain landmarks or even everyday objects. This could result in a range of intrusive outcomes, from cataloguing or redirecting the tourist’s video feed to passing control of the entirety of the device’s components and data into the wrong hands.

Other matters

As global society and business models become increasingly integrated and internet-dependent, companies using AR technology will most likely transfer or hold Australians’ information overseas at some point. Accordingly, they must remain mindful of their obligations under APP 8 (“Cross-border disclosure of personal information”) to take “reasonable steps” to ensure overseas recipients are similarly obligated to protect the information. This will be of concern if work is outsourced to less regulated jurisdictions and will require specific contractual clauses, but Australian businesses will also need to be prepared to meet the privacy requirements of more stringent jurisdictions. For example, the General Data Protection Regulation (‘GDPR’), which became enforceable in May 2018, requires companies to comply with strict requirements for explicit consent, as well as mandating the default inclusion of data protection measures such as data minimisation in emerging AR technologies. These measures are more demanding than Australia’s current laws, but they also represent an opportunity to establish best practice for businesses seeking to minimise their legal liabilities in Australia.

Finally, as of 22 February 2018 all APP entities must also comply with the Notifiable Data Breaches (‘NDB’) scheme. If an APP entity suffers a data breach that is likely to result in serious harm to an affected individual, the entity must notify the relevant individual(s) and the Office of the Australian Information Commissioner. It does not matter whether the breach was caused by a malicious third party or by accident: the relevant entity will need to “reasonably” and quickly assess whether a breach is serious enough to be reported. If it gets this assessment wrong, it may face serious fines for non-compliance with the APPs.

Concluding remarks

The growth of AR presents a duality for modern consumers. On the one hand, the benefits of the technology arise from the innovative ways information can be collected, amalgamated and analysed. However, these factors also provide the basis through which the privacy of individuals can be readily breached and, in turn, highlight the risks faced by organisations utilising AR technologies.

Presently, Australian privacy laws lag behind emerging technologies such as AR. Nevertheless, individuals and businesses alike should be aware of the relevant privacy laws that apply, and stay abreast of the inevitable reforms as regulators attempt to catch up to the trailing footsteps of this technology.

The information above is general in nature. If you would like to learn more about the collection of personal information under the Privacy Act, please contact us below.