Data is a measure of reality. Observations become records; records translate into patterns; patterns allow predictions. It's a simple arithmetic that governs scientific inquiry and, increasingly, the conclusions drawn about our online lives. But in a world where we are constantly monitored online, data becomes reality. The purchase of seeds may reflect a new gardening hobby. A surge of searches about an illness may suggest a recent diagnosis. A habit of browsing home design ideas may signal dissatisfaction with living conditions. Every click, text, or purchase adds to a record that reflects a user's thoughts and intentions. Of course, tracking user data is nothing new.

Data analytics is a daily internet phenomenon. Nearly every internet user has encountered tracking cookies or social media feeds tailored to their interests. For the most part, data-protection measures can stop Big Data from prying into people's minds. Privacy policies and the choice to opt out of tracking cookies help users set limits on how much of their data is extracted. Although some data slips through the cracks, these measures create a veil of separation between Big Data and the casual browser. But what happens when a new technology rips through that fabric?

Enter virtual reality. Since the 2000s, VR has plunged users into virtual worlds. With astounding visuals and gloves that simulate touch, users interact with people and places through a headset and gear. VR is not just a laptop with a few buttons; its features promise true immersion. Yet with immersion comes greater penetration into our personal lives.

VR's miraculous powers grow from biometric data: hyper-specific data that tracks physical behavior. Through a headset, VR systems can collect data on a user's walking pattern, body height, shoulder breadth, eye movement, and hyperacute responses such as skin tension [1,2]. By compiling and analyzing this data, companies can refine and perfect virtual experiences to emulate the real world. However, biometric data can also be used in other, more nefarious ways.

Biometric data can reveal quite a bit about users' minds, even uncovering the tics of the subconscious. In a study from the University of Melbourne, Dr. Imogen Bell shows that biometrics can betray our levels of truthfulness, trust, and stress [2,3]. Not only can VR technologies act as lie detectors, they can also indicate whether we have a psychological condition such as psychosis or PTSD [3]. Paired with algorithms, data processors can know what makes users tick before they've started the clock. To reach that level of mind-reading, all a VR company needs is the user's biometrics.

Judging by their data policies, VR companies have ample user data; Meta, Apple, and Sony ship built-in biometric software that begins monitoring users upon first use [4,5,6]. These three companies alone own over seventy percent of the VR market, so biometric monitors have already mapped users from top to bottom [7]. Whether the companies interpret this data to infer psychological information is unclear, yet the ability to do so remains. For now, these are privacy-breach possibilities, not realities. Because biometrics pose such a profound threat to mental privacy, laws should exist to prevent Big Data from reading our minds.

Unfortunately, technology companies must often make decisions before relevant laws are passed [8]. In the US, this leaves company policy as the user's first line of defense against data misuse [9]. But when profiting from data is in a company's financial interest, users have a right to worry about biometric data leaking onto suspicious platforms [10]. As long as user biometrics remain unregulated, only the promises of Big Data guard the door to the user's mind. Who says they won't turn the key themselves?

As mind-reading companies become a reality, our future sounds more like the plot of a dystopian novel than the twenty-first century. But our reality is fact, not fiction. So, to safeguard our rights, it might be time to write to our representatives to stop Big Data from reading our minds.

1. Pfeuffer, Ken, et al. "Behavioral Biometrics in VR: Identifying People from Body Motion and Relations in Virtual Reality." Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 2 May 2019, pp. 1-12, https://doi.org/10.1145/3290605.3300340.

2. Holzwarth, Valentin, et al. "Towards Estimating Affective States in Virtual Reality Based on Behavioral Data." Virtual Reality, vol. 25, no. 4, 1 Dec. 2021, pp. 1139-52, https://doi.org/10.1007/s10055-021-00518-1.

3. Bell, Imogen H. "Virtual Reality as a Clinical Tool in Mental Health Research and Practice." Dialogues in Clinical Neuroscience, vol. 22, no. 2, 22 June 2020, pp. 169-77, https://doi.org/10.31887/DCNS.2020.22.2/lvalmaggia.

4. "Biometric Data Policy for the United States." Sony Pictures, Sony, www.sonypictures.com/corp/biometricdatapolicyfortheunitedstates#:~:text=As%20part%20of%20the%20Spidersona,(%E2%80%9CBiometric%20Data%E2%80%9D. Accessed 28 Nov. 2023.

5. Gonzalez, Bianca. "Apple Vision Pro Headset to Have Optic ID Biometrics Built-In." Biometric Update, Biometrics Research Group, 6 June 2023, www.biometricupdate.com/202306/apple-vision-pro-headset-to-have-optic-id-biometrics-built-in. Accessed 28 Nov. 2023.

6. Hunter, Tatum. "Surveillance Will Follow Us into 'the Metaverse,' and Our Bodies Could Be Its New Data Source." The Washington Post, 13 Jan. 2022, www.washingtonpost.com/technology/2022/01/13/privacy-vr-metaverse/.

7. "AR and VR Headsets Market Share." International Data, 6 Oct. 2023, www.idc.com/promo/arvr. Accessed 28 Nov. 2023.

8. Edquist, Alex, et al. "Data Ethics: What It Means and What It Takes." McKinsey Digital, 23 Sept. 2022, www.mckinsey.com/capabilities/mckinsey-digital/our-insights/data-ethics-what-it-means-and-what-it-takes. Accessed 28 Nov. 2023.

9. "Which States Have Consumer Data Privacy Laws?" Bloomberg Law, Bloomberg Industry Group, 27 nove 2023, pro.bloomberglaw.com/brief/state-privacy-legislation-tracker/#:~:text=The%20law%20went%20into%20effect,targeted%20advertising%20and%20sales%20purposes. Accessed 28 Nov. 2023.

10. Stewart, Lauren. "Big Data Discrimination: Maintaining Protection of Individual Privacy without Disincentivizing Businesses' Use of Biometric Data to Enhance Security." Boston College Law Review, vol. 60, no. 1, 1 Jan. 2019, pp. 349-86.