We’ve just launched the new augmented reality (AR) layer in our OS Maps app, which uses your phone’s camera view to display over 200,000 locations across Great Britain. You can identify hills, lakes, settlements, transport hubs and woodland around you and on the horizon. It’s the first time we’ve made AR widely available, but not the first time we’ve used AR. Our Computer Scientist, Layla Gordon, leads the team that experiments with geospatial data and new technologies to create proofs of concept that are shared with partners. Find out about Layla’s work on OS Maps, and the AR projects that came before it.
It’s fantastic to see the OS Maps app AR layer released and being used. You simply point the camera of your Android or iOS device at the landscape and, using GPS and the compass, the points of interest that sit within that view are accurately highlighted.
Taking a look behind the scenes, I created it using Apple’s iOS Core Location and Core Motion frameworks. The app reads the gyroscope and accelerometer to give the accuracy we need. It calls the OS Placenames API to retrieve OS populated places, which delivers points of interest within a set radius based on position and orientation. We’ve then set rules within the app to decide which points of interest to prioritise, as the screen could otherwise get cluttered with too many points.
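The app itself is native iOS, but the radius-and-orientation filtering described above can be illustrated in a few lines. This is a minimal Python sketch of the general idea, not OS’s implementation: the place names, radius, field of view and label cap are all made-up assumptions.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from observer to point, in degrees from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def visible_pois(observer, heading_deg, pois, radius_km=20.0, fov_deg=60.0, max_labels=10):
    """Keep POIs inside the search radius and the camera's horizontal field
    of view, then prioritise the nearest ones so the screen stays uncluttered."""
    lat, lon = observer
    candidates = []
    for name, plat, plon in pois:
        dist = haversine_km(lat, lon, plat, plon)
        if dist > radius_km:
            continue
        bearing = bearing_deg(lat, lon, plat, plon)
        # smallest angular difference between camera heading and POI bearing
        offset = abs((bearing - heading_deg + 180) % 360 - 180)
        if offset <= fov_deg / 2:
            candidates.append((dist, name))
    candidates.sort()  # nearest first
    return [name for _, name in candidates[:max_labels]]
```

A distance-only priority rule is the simplest choice; a production app might also weight by feature type or prominence.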
If you haven’t tried it yet, take a look at https://www.os.uk/getoutside/AR. But while this is the first AR experience I’ve created which made it to public release, I’ve been working on AR projects for a couple of years.
My first foray into virtual spaces goes back to May 2015, when OS was sponsoring Digital Shoreditch, a celebration of creative, technical and entrepreneurial talent. In previous years, visitors had highlighted problems with navigating the venue (a Victorian basement with multiple corridors and rooms) and with finding exhibitions. From this, the idea formed to produce a visitor app that acts as a guide.
The first step was to use Blender to create a virtual 3D version of the building, so users could take a virtual walk through it before the event. Then, on the day of the event itself, the app let visitors search for exhibitions and guided them with simple AR arrows providing turn-by-turn navigation.
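The arrow idea comes down to comparing the user’s current heading with the bearing of the next waypoint: the difference tells you how far to rotate the on-screen arrow. A small sketch of that calculation (the function name and sign convention are my own, not from the original app):

```python
def arrow_rotation(user_heading_deg, target_bearing_deg):
    """Signed angle in degrees to rotate a guidance arrow so it points
    toward the next waypoint: positive means turn right (clockwise),
    negative means turn left. Result is always in [-180, 180)."""
    return (target_bearing_deg - user_heading_deg + 180) % 360 - 180
```

Normalising into [-180, 180) matters near north: a user facing 350° heading to a waypoint bearing 10° should see a gentle 20° right turn, not a 340° spin.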
Putting the ‘AR’ in Mars
Twelve months later, planetary scientist Peter Grindrod from the UK Space Agency asked OS to create an OS-style paper map of a section of Mars. I thought it would be interesting to focus an AR experience on the dramatic landscape of Schiaparelli crater.
Using a set of height data for the planet captured by NASA, and with Peter’s advice, I produced a greyscale height map. Then, using Blender, I made a 3D terrain model of the crater and its surroundings.
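Blender did the heavy lifting here, but the underlying mapping from a greyscale height map to terrain geometry is straightforward: each grey level (0–255) is linearly scaled to an elevation between the scene’s minimum and maximum heights. A rough Python sketch of that step, with entirely hypothetical elevation values and cell size:

```python
def heightmap_to_vertices(grid, cell_size_m, min_elev_m, max_elev_m):
    """Convert a greyscale heightmap (rows of 0-255 values) into a list of
    3D vertex positions (x, y, z), linearly mapping grey levels onto the
    elevation range. Faces would then be built from adjacent grid cells."""
    span = max_elev_m - min_elev_m
    vertices = []
    for row_idx, row in enumerate(grid):
        for col_idx, grey in enumerate(row):
            x = col_idx * cell_size_m          # east-west position
            y = row_idx * cell_size_m          # north-south position
            z = min_elev_m + (grey / 255.0) * span  # elevation from grey level
            vertices.append((x, y, z))
    return vertices
```

In practice a displacement modifier in Blender does exactly this kind of mapping, driven by the greyscale image as a texture.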
To complete the Mars AR experience, I used Vuforia to create an image target in the cloud, which allows its image recognition tools to pick up the feature points in the map. After a few tweaks, the image target score for the augmentable reached four stars, the level required for reliable target recognition.
I made a proof-of-concept native iOS app that uses the device’s live camera feed to scan continuously for the target. Computer vision techniques trigger the augmentation in real time as soon as the target is found. The app then displays the augmentation on the same camera view and snaps it to the real-world target (in this case the area of Mars). The 3D model is attached, and the user can move the device around; for as long as the target remains in view, the app tracks and augments in real time, as if the digital content (the augmentation) were truly part of the real-world object.
Experimenting with HoloLens
For CityVerve, the UK’s Internet of Things (IoT) demonstrator project set in Manchester, the interior and exterior of Manchester Town Hall were captured by our surveying team using the latest Leica scanning equipment, and I used that data to create a model in HoloLens. This was demoed at the World Institute of Ideas Forum. The HoloLens app was made using the HoloToolkit for Unity and Visual Studio. It loads the data, captures the surface mesh of the room, and pins the model to that mesh for a mixed reality experience.
For an OS and Geovation sponsored British Library event in February, celebrating the future of mapping, I created an AR experience using OS simple building heights and aerial imagery in QGIS, generating a HoloLens mixed reality 3D app that showed Canary Wharf and London. Seeing London in this unexpected way, by simply slipping on a pair of glasses, delighted those who tried it, but the uses of such models are genuinely exciting, especially in relation to construction and Building Information Modelling (BIM).
AR has been around longer than most people would expect, but my feeling is that technology is catching up and we’re only scratching the surface of what it can achieve. As with the BIM examples above, I think we are on the verge of a revolution that will bend reality to allow us to simulate a lot more besides.
Find out more about OS Maps AR at: https://www.os.uk/getoutside/AR