Mixed reality HoloLens 3D data visualisation

Have you heard of Microsoft HoloLens? No? Neither had I, until I was lucky enough to spend some time with one of our technology lab engineers, Layla Gordon, to find out more.

While VR (virtual reality) headsets and AR (augmented reality) apps were once pioneering, Microsoft HoloLens uses an even more cutting-edge technology: mixed reality.

VR headsets have been the latest visualisation trend and are best known for their popularity in the gaming industry. As I am sure many of you know, VR headsets simulate entirely virtual worlds and require both a console and a controller. The experience has no connection with reality and, as such, is completely immersive for the user.

To understand AR, the example Layla offered was the Pokémon Go app. While its visuals are seemingly placed into reality, they do not interact with it or with its geospatial (locational) information – which is why PokéStops have controversially led people to collect Pokéballs at both Auschwitz and the 9/11 Memorial Pool.

Combining the benefits of AR and VR, the Microsoft HoloLens offers what is termed mixed reality: a virtual world that is both integrated with and affected by reality. Essentially, it is a headset for viewing and interacting with holograms within the world around you.

The HoloLens interface is driven by gaze, gestures and voice input. Simple gestures open apps, as well as select, size, drag and drop holograms in your world. Built-in sensors track the user's gaze to move the cursor around, and voice commands are used to navigate, select, open, command and control the apps.

Here at OS, we capture and build 3D models from various sources. More recently, as part of the CityVerve project, new 3D data has been captured using drones. The data is just as detailed as before; however, it is much better suited to visualisation, as roof shapes and other geometry are far better preserved. The example below is a clip of Layla exploring Manchester University using HoloLens technology in what we've termed a geo-immersive reality.

In our app, spatial mapping and understanding are used to place holograms correctly in relation to environmental features such as furniture. Our app benefits from the TapToPlace feature, which means the model hologram can be pinned to the surface mesh rather than hovering in mid-air like a Pokémon.
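At its core, tap-to-place works by casting a ray from the user's gaze into the spatial-mapping mesh and snapping the hologram to the point where the ray hits a surface. The real app does this through the HoloLens spatial mapping APIs; the sketch below is just the underlying geometry (a single ray-versus-surface test, with made-up example coordinates) to show the idea.

```python
def gaze_hit(origin, direction, plane_point, plane_normal):
    """Return the point where the gaze ray meets a surface plane,
    or None if the ray is parallel to it or the surface is behind
    the user. All arguments are 3D (x, y, z) tuples."""
    # How directly the gaze points at the surface.
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # gaze is parallel to the surface
    # Distance along the ray to the intersection.
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / denom
    if t < 0:
        return None  # surface is behind the user
    return tuple(o + t * d for o, d in zip(origin, direction))

# Example: user at head height (1.7 m), gazing forward and down
# at the floor (a horizontal surface at y = 0). The hologram is
# pinned at the hit point instead of floating in mid-air.
hit = gaze_hit(origin=(0.0, 1.7, 0.0),
               direction=(0.0, -1.0, 1.0),
               plane_point=(0.0, 0.0, 0.0),
               plane_normal=(0.0, 1.0, 0.0))
print(hit)  # a point on the floor, 1.7 m in front of the user
```

In the actual app, the surface mesh comes from the headset's depth sensors rather than a single idealised plane, but the placement logic is the same test repeated against each mesh triangle.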

The scope for HoloLens across industries is vast, with Microsoft's own YouTube channel demonstrating its use in everything from product design to crime scene investigation. In terms of mapping, there are multiple directions OS could take: this technology could be used to improve mountain rescues or explore cityscapes, and everything in between. While our geospatial information is already in use, a HoloLens map app would offer quick insight into elements such as mountain height and underground pipes. It would also be helpful for flooding or coastal erosion, such as simulating what could happen to a city following a natural disaster, or giving an accurate depiction of what our coasts will look like 30 years down the line. In addition, we could explore BIM (Building Information Modelling) further to map everything from windows to electrical sockets within buildings.

At OS, we are very excited about using this technology to improve our insight and services. While we are still ironing out some creases in our app, we couldn't help but share the news even at this development stage!

Find out more about our work with augmented reality.
