If you think virtual reality (VR) is all about video games, think again. Big data researchers are betting on VR technologies to explore new ways to visualize and analyze complex and dynamic datasets. This can make a big difference in how we experience data and make decisions that impact businesses across industries.

Step into your data

Some clues to this intertwining of VR and big data may soon come from EU researchers at CEEDs (Collective Experience of Empathic Data Systems), who are transposing big data into an immersive, interactive virtual environment at Pompeu Fabra University in Barcelona. Funded by nine EU countries, the eXperience Induction Machine deploys VR technologies to let users step inside large datasets. The idea is to present data in a more “empathic” way, continuously adapting the presentation so that people can make sense of the numbers without overloading the brain.

“They have an ambitious vision, but EU researchers are not alone,” points out Arnie Lund, who heads GE’s Connected Experience Labs, which explores the use of immersive environments in the industrial world. The University of Washington’s Human Interface Technology Lab and others have received substantial funding from DARPA and have been working in this space for 20 years. Several other researchers, at Caltech and elsewhere, are also using immersive virtual reality to visualize data.

Navigate the subconscious

It is easy to see why VR technologies are catching on. “Anywhere where there’s a wealth of data that either requires a lot of time or an incredible effort, there is potential,” explains Jonathan Freeman, professor of psychology at Goldsmiths, University of London, and coordinator of CEEDs.

One of the biggest challenges of big data is extracting information in a way that the human mind can comprehend; unaided, we simply cannot take in such vast amounts of information. Immersive environments like the one at CEEDs measure people’s reactions to visualizations of large datasets, tapping the subconscious processes of the brain to determine the optimum amount of information to display. When the system detects that a participant is becoming fatigued or overloaded, it adapts: either it simplifies the visualization to reduce cognitive load, keeping the user less stressed and better able to focus, or it guides the person to areas of the data representation that are less dense with information.
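The adaptive loop described above can be sketched in a few lines. This is a conceptual illustration only, not CEEDs code: the load signal, thresholds, and the `adapt_view` function are all assumptions standing in for whatever physiological measures and tuning the real system uses.

```python
# Conceptual sketch of load-adaptive visualization (illustrative, not CEEDs code).
# A normalized "cognitive load" reading in [0, 1] drives the level of visual detail.

def adapt_view(load: float, detail: int, min_detail: int = 1, max_detail: int = 10) -> int:
    """Return a new detail level given the current load estimate."""
    if load > 0.8:
        # User appears overloaded: simplify the view to reduce cognitive load.
        return max(min_detail, detail - 2)
    if load < 0.3:
        # User has spare capacity: reveal more of the data.
        return min(max_detail, detail + 1)
    # Comfortable range: leave the display unchanged.
    return detail

# Example: load readings streaming in from hypothetical gaze/heart-rate sensors.
detail = 8
for load in [0.9, 0.85, 0.4, 0.2]:
    detail = adapt_view(load, detail)
print(detail)  # detail drops under high load, then recovers as load falls
```

In a real system the load estimate would come from sensor fusion (eye tracking, heart rate, posture) rather than a single number, but the control structure is the same: measure, compare against comfort thresholds, and adjust presentation density.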

Building immersion into the industrial world

The industrial world is also experimenting with VR technologies to dive deep into complex, fast-changing datasets. Take GE’s Grid IQ™, for instance. It brings immersive experiences to operations centers and control rooms, where engineers can move through visualizations of data consolidated from smart meters, other intelligent grid power equipment, sensors, and unstructured social media content. The goal is actionable information that predicts and resolves failures and other issues affecting the performance of power grids.

“We use the immersive nature of big screens to provide situational awareness and shared context, as well as collaborative touch tables so groups can interact with the data visualizations collaboratively as they explore the relationships within the data. We also use wearable and mobile devices as they move around and collaborate, and workstations that relate to the context of the overall room. And we are building out a matrix of connections between these devices to all the visualizations, models and data to be shared, moved, and integrated in different ways,” explains Arnie Lund.

Having tasted success in the power sector, researchers are now extending these immersive experiences to interactive data forensic applications and efficient path prediction for transport as well as training and design. Possible applications, according to CEEDs, include satellite imagery inspection, oil prospecting, astronomy, economics, and historical research.

“We can create even better immersive experiences when we develop a deeper understanding of neuroscience in terms of what our senses, emotions, and brains are optimized to detect, synthesize, and model, and where we are better at deriving insights than the analytics themselves. The next step is to explore the characteristics of the data and visualizations that leverage those capabilities most effectively,” predicts Lund. And that could open up many more Industrial Internet possibilities, especially as data streams from connected devices multiply.

About the author

Pragati Verma

