Shayna Solis and colleagues have teamed up with NASA to use immersive technologies to visualise geoscientific data in near-real time
When a natural disaster strikes, rapid access to reliable, relevant and real-time geoscientific data is essential. But the datasets involved in natural disasters and their management can be vast and complex, and the people who must jointly interpret the data and manage the response frequently come from disparate fields and may not share a common technical language. Virtual and augmented reality (VR and AR) offer a way to rapidly visualise and simplify complex datasets, and thus have the potential to revolutionise disaster-response efforts.
Shayna Solis, co-founder and CEO of Navteca (a company focused on emerging technologies and innovation), was working with NASA on cloud computing, whereby data storage and computing power are managed by remote data centres, enabling storage volumes and performance capabilities far beyond those of any single computer. When a friend first showed Shayna an early version of a VR headset, she started thinking about the possibilities of VR for visualising the scientific, geospatial and observational data produced by Earth scientists.
“When you view Earth inside a VR headset, you see the spherical environment and a more realistic representation of the continents, the ocean, the polar regions, and their related data sets.”
Now Shayna and her team are collaborating with NASA to explore the potential of immersive technologies for disaster response – a project that Shayna says has “grown tremendously into a multi-year effort to push the envelope of real-time data rendering and the use of game engines for science data visualization.”
Immersive technologies can be used to visualise complex, multiscale data, such as air quality, sea level or seismic data. Shayna suggests this approach to data visualisation is especially intriguing when the geospatial context and other information, such as topography, population centres, utilities and infrastructure, are also layered into the virtual environment, because the impacts on people then become visible – an approach that could transform communication between stakeholders and improve decision making.
“The decisions made by city planners, engineers, politicians, and emergency managers shape the fates of millions. Immersive technologies can help them understand different scenarios and visualize the outcomes of their decisions. By making data more understandable, we will create more open, inclusive, and accessible ways for people to obtain scientific information.
“VR/AR is one way we can translate complex information into a visual format that is more accessible. For example, when we talk about data numerically, such as a two-meter storm surge, it is difficult to know what the impacts will be. If we can see a two-meter storm surge as a water layer on top of the specific streets and buildings in a coastal town, then we can better understand the potential impacts. This type of scenario could be used for disaster preparedness and to build resilience in communities.
“Another example is in understanding risks and creating a visual framework where people from different backgrounds, like civil engineers, city planners, emergency managers, and the city mayor, could view data in an interactive, immersive environment. They could run scenarios and models and see what the potential impacts from a flood, fire, or earthquake might be. This could lead to enhanced, data-driven decision making.”
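The storm-surge scenario described above can be approximated, at its very simplest, by a "bathtub" inundation pass over an elevation grid: every cell lower than the surge height is marked wet. This is a minimal illustrative sketch, not the team's implementation – real surge models also account for connectivity to the coastline, waves and drainage – and the function name and toy grid are assumptions:

```python
def inundate(elevation, surge_m):
    """Mark grid cells flooded by a uniform surge of surge_m metres.

    A 'bathtub' model: every cell whose ground elevation (metres above
    sea level) is below the surge level is considered wet. A renderer
    could draw a water layer over exactly these cells.
    """
    return [[cell < surge_m for cell in row] for row in elevation]


# Toy 2x3 elevation grid for an imagined coastal block (metres).
dem = [
    [0.5, 1.2, 1.8],   # low-lying waterfront
    [2.4, 3.0, 3.6],   # rising ground inland
]
wet = inundate(dem, 2.0)   # a two-metre storm surge
```

In a VR scene, the boolean mask would drive which street and building footprints get the translucent water overlay, turning "a two-metre surge" into something viewers can stand in.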
While VR and AR are now used more routinely in a number of geoscientific disciplines, and particularly the applied geosciences, one major advance offered by the collaboration between Navteca and NASA is the ability to ingest and visualise geoscientific data in near-real time, during or shortly after a natural disaster unfolds. NASA’s Disasters Program collects data and tracks a variety of natural disasters, including earthquakes, volcanoes, floods, landslides and oil spills, as they happen, and their Disaster Mapping Portal provides a Geographic Information Systems (GIS)-based interface that allows anyone to view, analyse and download data. Shayna and her team were able to connect to the portal using an API (Application Programming Interface, a way for two or more computer programs to interact) and visualise near-real-time data products such as the Global Landslide Nowcast – a machine-learning-based model that factors in variables such as slope and morphology, rock composition, rainfall and soil moisture, as well as distance to faults, to accurately estimate the likelihood of a landslide within a 1-km-square area, in near-real time.
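As an illustration of the kind of API access described above, the sketch below builds a bounding-box query in the style of an ArcGIS REST endpoint and flattens a GeoJSON response into points that a game-engine renderer could consume. The endpoint path is a placeholder rather than a real layer on the Disaster Mapping Portal, and the field names in the sample are assumptions:

```python
from urllib.parse import urlencode

# Placeholder endpoint: real layer paths on the Disaster Mapping Portal differ.
BASE_URL = ("https://example.invalid/arcgis/rest/services/"
            "EXAMPLE_LAYER/MapServer/0/query")

def build_query(bbox, out_fields="*"):
    """Build an ArcGIS-style 'query' request URL for features
    intersecting a (xmin, ymin, xmax, ymax) lon/lat bounding box."""
    xmin, ymin, xmax, ymax = bbox
    params = {
        "geometry": f"{xmin},{ymin},{xmax},{ymax}",
        "geometryType": "esriGeometryEnvelope",
        "inSR": 4326,                        # WGS84 lon/lat
        "spatialRel": "esriSpatialRelIntersects",
        "outFields": out_fields,
        "f": "geojson",                      # GeoJSON is easy for any client to parse
    }
    return f"{BASE_URL}?{urlencode(params)}"

def features_to_points(geojson):
    """Flatten GeoJSON point features into (lon, lat, properties)
    tuples ready to hand to a renderer."""
    points = []
    for feature in geojson.get("features", []):
        lon, lat = feature["geometry"]["coordinates"][:2]
        points.append((lon, lat, feature.get("properties", {})))
    return points
```

Polling such a query on a timer, and re-feeding the flattened points to the game engine, is one simple way to keep a virtual scene in step with a near-real-time product like the Landslide Nowcast.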
Shayna envisions a world where immersive technologies form a standard and integral part of disaster response: “I think AR technologies will be very useful, for example, in head-up displays on helmets for first responders so they can visualize information overlaid onto the scene of a disaster.” However, she emphasises that these technologies are still in their infancy. For example, there has been a lot of buzz about the metaverse, an interactive and interoperable network of three-dimensional virtual worlds that are accessed through a VR headset and navigated using things like eye movements and voice commands, but this doesn’t exist – yet. “What we are creating are the building blocks needed for a metaverse.”
Before we can achieve this, hardware improvements are required. In particular, the headsets need to be lighter and non-tethered, and must be able to handle large volumes of data. Headsets are evolving quickly and Shayna suggests that the adoption of these innovative tools will become increasingly widespread as more experiences and objects become interconnected and interoperable (and when people feel secure about their data privacy), but she emphasises that the innovative applications are not limited to disaster response.
“There are a lot of exciting advances in technology that will impact the geosciences. The creation of digital twins is one example of how many different types of information can be combined into a virtual replica that can be used for monitoring and modelling everything from buildings to satellites to Earth systems. Other companies are working on incorporating data and information into devices that will seamlessly integrate into our lives, such as vehicle windshields.
“At Navteca we are very interested in human-computer interfaces. We have developed Voice Atlas, a conversational artificial intelligence (AI) search system that allows interaction with unique and specific knowledge sets using natural language. Voice Atlas can connect information across multiple locations and repositories making it easier to find answers via a centralized natural language interface. We are experimenting with combining our conversational AI with the game engine visualization to create a whole new layer of interactivity, and are very excited to see how these enhanced interfaces can improve the way that humans access and interpret science data.”
Shayna Solis is CEO and co-founder of Navteca, based in Washington DC, USA.
(Image: Marco Librero, NASA/Ames)
Interview by Amy Whitchurch
Solis, S. Visualising disasters. Geoscientist 32 (3), 38-39, 2022; https://doi.org/10.1144/geosci2022-028