SIERA (the Seismic Information Extended Reality Analytics tool) is an application for visualizing and analyzing geological data within a virtual reality (VR) environment. Presenting inherently 3D data in a 3D VR environment allows for a more immersive and intuitive way to interact with it, with the intent of providing an improved experience over conventional tools. Users can also customize data visualizations according to their needs through data-linked color & transparency manipulation, culling object intersections, and inverted rendering options.


Features

Seismic Data Visualization

Using SIERA, one can visualize complex sub-surface seismic data sets as 3D data volumes such as the one in the image directly below. Volumes are loaded from either a 3D texture or a MongoDB database, and .segy files can be converted into either of these formats. Related data, such as seismic machine learning results, can also be visualized on its own or paired with the original seismic data, producing visualizations that allow different geological data sets to be compared against one another. Because numerous data volumes can be present in the same VR environment at once, SIERA makes it possible to visualize and analyze multiple data sets, or different aspects of the same data set, simultaneously.
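The conversion step can be pictured as reshaping a flat set of seismic traces into an (inline, crossline, sample) volume whose amplitudes are normalized for upload as a 3D texture. The sketch below uses synthetic data in place of amplitudes read from a .segy file; the function name and layout are illustrative, not SIERA's actual pipeline:

```python
import numpy as np

def traces_to_volume(traces, n_inlines, n_crosslines):
    """Reshape flat seismic traces into a 3D volume and normalize
    amplitudes to 0-255 so the result can back a 3D texture."""
    volume = np.asarray(traces, dtype=np.float32)
    volume = volume.reshape(n_inlines, n_crosslines, -1)  # (inline, crossline, sample)
    lo, hi = volume.min(), volume.max()
    normalized = (volume - lo) / (hi - lo)  # scale amplitudes into [0, 1]
    return (normalized * 255).astype(np.uint8)

# Synthetic stand-in for trace amplitudes read from a .segy file
rng = np.random.default_rng(0)
traces = rng.normal(size=(4 * 5, 100))  # 20 traces of 100 samples each
volume = traces_to_volume(traces, n_inlines=4, n_crosslines=5)
print(volume.shape)  # (4, 5, 100)
```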


Color & Transparency Manipulation

The color and transparency of each of the many small cells that together make up a visualization's volume can also be manipulated. This allows data one does not want to view to be hidden efficiently, or emphasis to be placed on the important aspects of a data set. As can be seen in the two images below, transparency manipulation is especially useful for revealing 3D sub-surface structures that may not be apparent when viewing the visualization in its entirety. This is achieved by altering customizable color and transparency gradients that are unique to each visualization and mapped directly to each cell's corresponding data values.
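Such a gradient can be thought of as a transfer function: a set of control points pairing data values with RGBA colors, with per-cell colors interpolated between them. The control points below are purely illustrative, not SIERA's defaults:

```python
import numpy as np

# Hypothetical gradient: (normalized data value, RGBA) control points.
# An alpha of 0.0 fully hides cells in that value range.
CONTROL_POINTS = [
    (0.0, (0.0, 0.0, 1.0, 0.0)),   # low amplitudes: blue, fully transparent
    (0.5, (1.0, 1.0, 1.0, 0.1)),   # mid amplitudes: white, nearly hidden
    (1.0, (1.0, 0.0, 0.0, 1.0)),   # high amplitudes: red, fully opaque
]

def map_value_to_rgba(value):
    """Linearly interpolate RGBA along the gradient for a value in [0, 1]."""
    xs = [x for x, _ in CONTROL_POINTS]
    channels = list(zip(*(rgba for _, rgba in CONTROL_POINTS)))
    return tuple(float(np.interp(value, xs, ch)) for ch in channels)
```

For example, a cell whose value is 0.25 falls halfway between the first two control points, giving a half-blue, nearly transparent color.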


Culling Object Intersection

By intersecting the cube, sphere, or plane culling objects with a 3D data volume, large sections of the visualization can be cut away at once, as can be seen below. This makes it easy to cut into a data volume and analyze its interior. These objects can also lock to the axes of their corresponding volumes, making it easier to position them exactly as needed.


Inverted Rendering

When switched into inverted rendering mode, only what intersects the culling objects is visible. This lets users focus on small portions of a data volume at a time. Combined with the plane culling objects, as in the image below, it is especially useful for viewing one or more slices of a data volume at a time.
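In other words, inverted rendering flips the visibility predicate: instead of hiding cells that intersect a culling object, only those cells are kept. A sketch of the idea, using a hypothetical slab test to stand in for a pair of plane culling objects bounding one slice:

```python
def inside_slab(p, z_min, z_max):
    """Stand-in for plane culling objects bounding a slice along z (hypothetical)."""
    return z_min <= p[2] <= z_max

def visible(cell_pos, culling_tests, inverted=False):
    """Normal mode hides cells intersecting a culling object;
    inverted mode renders only those cells."""
    culled = any(test(cell_pos) for test in culling_tests)
    return culled if inverted else not culled

slab = lambda p: inside_slab(p, 0.4, 0.6)
print(visible((0.0, 0.0, 0.5), [slab], inverted=True))   # True: cell inside the slice is shown
print(visible((0.0, 0.0, 0.9), [slab], inverted=True))   # False: everything else is hidden
```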


Interactions

A hand-mounted menu allows for compact navigation between all of SIERA's features, including the spawning of data visualizations and culling objects. Once created, these can be moved around the VR environment by simply grabbing them with one hand, or scaled and rotated by grabbing with two hands. For finer rotation and scaling, their toggleable bounding boxes can also be manipulated. Menu panels can likewise be repositioned, letting users fully customize the layout of their surroundings to fit their needs and preferences.
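Two-hand scaling of this kind is commonly driven by the ratio of the distance between the hands now versus at the moment of the grab. A sketch of that scheme (an assumption about the interaction model, not SIERA's actual code):

```python
import numpy as np

def two_hand_scale(left_at_grab, right_at_grab, left_now, right_now):
    """Scale factor from the change in distance between the two grabbing hands."""
    d_grab = np.linalg.norm(np.asarray(right_at_grab) - np.asarray(left_at_grab))
    d_now = np.linalg.norm(np.asarray(right_now) - np.asarray(left_now))
    return float(d_now / d_grab)

# Hands start 1 unit apart and pull to 2 units apart: the object doubles in size.
scale = two_hand_scale((0, 0, 0), (1, 0, 0), (0, 0, 0), (2, 0, 0))
print(scale)  # 2.0
```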


Developed By:

Bryson Lawton, Hannah Sloan, Patrick Abou Gharib, Dr. Frank Maurer

In Collaboration With the Department of Geoscience’s CREWES group:

Dr. Daniel Trad, Dr. Marcelo Guarido de Andrade, Dr. Ali Fathalian https://www.crewes.org/

Acknowledgements

This research was funded by the IBM Center for Advanced Studies Alberta (IBM CAS) and the Natural Sciences and Engineering Research Council of Canada (NSERC), whom we thank for their support.