Augmented- and virtual-reality technologies have the potential to create immersive experiences with a host of applications. The amount of personal data they must gather to be genuinely useful, though, poses serious challenges for user privacy.
A recent report from the Future of Privacy Forum sets out recommendations to tackle the privacy risks associated with immersive augmented-reality (AR) and virtual-reality (VR) technologies, which are increasingly being used in education and training, gaming, multimedia, navigation and communication.
As AR and VR applications that let users explore a shared digital overlay of the physical world in real time become more widely adopted and refined, they are likely to converge into a single ‘extended reality’, or XR. These technologies accumulate and process vast amounts of sensitive personal information, including biometric data, unique device identifiers, location, and details about homes and businesses. As with other emerging technologies such as artificial intelligence and 5G communications, this data collection creates risks for data subjects that could undermine the further adoption of AR and VR platforms. Yet without this data, XR technologies simply cannot function.
The Future of Privacy Forum is a think tank that brings together academics, consumer advocates and industry to explore the challenges posed by technological innovation and to develop privacy protections, ethical norms and workable business practices. Its report considers current and future use cases for XR technologies and recommends privacy guidelines through which industry can implement them responsibly. It also offers advice for policymakers, who need to consider how data-protection obligations apply to the collection of personal data by XR technologies: for example, how hardware manufacturers can remain transparent about their data collection, use and sharing, and how developers can process data locally.
The report makes several key recommendations. First, policymakers should carefully consider how existing or proposed data-protection laws can give consumers meaningful rights and companies clear obligations regarding XR data. Second, hardware makers should consider how XR data collection, use and sharing can be made transparent to users, bystanders and other stakeholders.
For their part, XR developers should consider the extent to which sensitive personal data can be processed locally and kept on-device, while ensuring that sensitive personal data is encrypted in transit and at rest.
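On-device processing of this kind can be sketched in a few lines. The example below is purely illustrative and not drawn from the report: the function name, the gaze-sample data and the rounding threshold are all hypothetical. The idea is that raw, biometric-adjacent readings never leave the headset; only a coarse, less identifying aggregate is ever shared.

```python
import statistics

def summarise_gaze_on_device(samples_ms):
    """Reduce raw fixation durations (sensitive, kept on-device) to a
    coarse summary suitable for sharing off-device.

    Hypothetical sketch: real systems would also encrypt anything
    stored on the device or sent over the network.
    """
    return {
        # Rounded to the nearest 10 ms so the value is less identifying
        "mean_fixation_ms": round(statistics.mean(samples_ms), -1),
        "sample_count": len(samples_ms),
    }

# Raw eye-tracking samples: processed locally, never transmitted
raw_samples = [182, 240, 199, 310, 225]
payload = summarise_gaze_on_device(raw_samples)
print(payload)
```

The point of the sketch is the direction of data flow, not the statistics: the sensitive series stays local, and only the derived payload would cross the network (encrypted in transit, per the report's recommendation).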
Platforms and XR experience providers should implement rules about virtual identity and property that mitigate, rather than increase, online harassment, digital vandalism and fraud. They also need to establish clear guidelines that mitigate physical risks to XR users and bystanders, and provide a wide range of customisable avatar features that reflect the broader community, encouraging representation and inclusion.
It’s also their responsibility to consult the larger community of stakeholders – industry experts, advocates, policymakers, XR users and non-XR users – and to integrate community feedback into decisions about software and hardware design and about data collection, use and sharing.
For their part, researchers working in this area need to obtain informed consent prior to conducting research via XR technologies and consider seeking review by an institutional or ethical review board if consent is impractical.
Many of these recommendations are aimed at industry, and they are best reflected in privacy policies and end-user license agreements (EULAs). The difficulty with retrofitting existing policies and agreements for this industry is that those documents may not fully reflect how the technologies operate. The usual box-ticking method of agreeing to EULAs and privacy policies, familiar from software and websites, may not translate easily to XR.
The extensive personal data collected by XR technologies helps create better immersive experiences, but it can also exacerbate privacy risks. The unique nature of these technologies makes it difficult to mitigate those risks by applying existing privacy policies and practices from other digital media sectors, and demands new, innovative approaches to choice, security and transparency. For example, VR headsets can capture large amounts of personal data such as dexterity, ease of movement and reaction times, potentially building up a health profile – eye-tracking technology is far more intrusive than cookies.
VR manufacturers need to ensure that privacy is protected and that data is processed and stored securely – and only with users’ express consent. They should also consider, where a device’s processing of medical data assists with diagnosis or treatment, whether medical-device authorisation is necessary and whether the privacy risks arising from research could be mitigated by anonymising the data.
The FPF recommendations aim to give industry and policymakers a way to tackle these risks without compromising the benefits provided by XR technologies. Recognising that XR technologies are ever-evolving, the report addresses this by focusing on actual harms tied to user data.
The desired outcome is clearly for policymakers to foster an innovation-friendly regulatory environment by clarifying and harmonising existing rules and introducing industry-standard recommendations tailored specifically to the XR industry. The best way to achieve this is for industry and policymakers to get behind recommendations such as the FPF’s and consider how best to implement them within existing practices.
Rayyan Mughal is a commercial contract lawyer and associate with Marks & Clerk.