Mark Zuckerberg, the CEO of Facebook, predicts that VR will be the next major computing platform. If he’s right, it will change everything… from how we work to how we use social media to how we shop.
But there’s a privacy component to virtual reality that a lot of people are overlooking. As you use VR technology, it records data about your movements, your facial expressions, and the tone of your voice. All that data gets stored for later analysis.
The aim of that analysis is to understand human emotion and motivation well enough to influence human behavior, not just at an individual level but at a societal level.
That may sound crazy, but it isn’t unprecedented. Not too many years ago, Facebook conducted an experiment on nearly 700,000 of its users. It set up algorithms to skew what appeared in people’s news feeds, tilting those feeds either toward positive or negative content. Researchers then measured the response by analyzing the emotional tone of users’ own posts. The experiment showed that Facebook could influence emotion on a large scale simply by controlling what people see and read.
On the surface, all this research into emotion and movement will simply be used to create more persuasive marketing messages.
But there’s a darker component. The same research can be used to steer behavior toward whatever politicians or certain organizations want, and it could lead to deliberate attempts to change societal behaviors. Most groups frame these efforts in benign terms: better environmental choices, less racist behavior… that kind of thing.
But think about how this could be misused…