We’ve just launched the new augmented reality (AR) layer in our OS Maps app, which uses your phone’s camera view to display over 200,000 locations across Great Britain. You can identify hills, lakes, settlements, transport hubs and woodland around you and on the horizon. It’s the first time we’ve made AR widely available, but not the first time we’ve used AR. Our Computer Scientist, Layla Gordon, leads the team that experiments with geospatial data and new technologies to create proofs of concept that are shared with partners. Find out about Layla’s work on OS Maps, and the AR projects that came before it.
It’s fantastic to see the OS Maps app’s AR layer released and being used. You simply point the camera of your Android or iOS device at the landscape and, using GPS and the compass, the app highlights accurate points of interest that sit in that view.
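The geometry behind “points of interest that sit in that view” can be sketched in a few lines: given the device’s GPS position and compass heading, compute the bearing to each point of interest and check whether it falls inside the camera’s horizontal field of view. The coordinates, field-of-view angle and function names below are illustrative assumptions, not the app’s actual code:

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing in degrees from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def in_camera_view(heading_deg, poi_bearing_deg, fov_deg=60.0):
    """True if the POI's bearing lies within the camera's horizontal field of view."""
    # Wrap the difference into [-180, 180) so north-crossing headings work.
    diff = (poi_bearing_deg - heading_deg + 180) % 360 - 180
    return abs(diff) <= fov_deg / 2

# Example: a device in Southampton, camera facing due north;
# Winchester lies roughly north-north-east of it.
b = bearing_to(50.9097, -1.4044, 51.0577, -1.3081)
print(in_camera_view(0.0, b))    # inside a 60-degree view centred on north
print(in_camera_view(180.0, b))  # not visible when facing south
```

On the device itself this heading would come from the compass (fused with the motion sensors), but the angular test is the same.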
Taking a look behind the scenes, I created it using Apple’s iOS Core Location and Core Motion frameworks. The app accesses readings from the gyroscope and accelerometer to give the accuracy we need. It calls the OS Placenames API to retrieve OS populated places, which delivers points of interest within a set radius based on position and orientation. We’ve then set rules within the app to identify which points of interest to prioritise, as the screen could get cluttered with too many points.
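The actual prioritisation rules aren’t spelled out here, but a minimal sketch might keep only the points inside the query radius, rank them by place type and distance, and cap how many labels are drawn. The type priorities, radius, label limit and sample places below are entirely hypothetical:

```python
# Hypothetical POI records; in the app these would come from the OS Placenames API.
pois = [
    {"name": "Old Winchester Hill", "type": "hill", "distance_m": 4200},
    {"name": "Petersfield", "type": "settlement", "distance_m": 6100},
    {"name": "Butser Hill", "type": "hill", "distance_m": 2900},
    {"name": "Queen Elizabeth Country Park", "type": "woodland", "distance_m": 1800},
    {"name": "Heath Pond", "type": "lake", "distance_m": 4700},
]

# Illustrative ranking: lower number = shown first. Unknown types go last.
TYPE_PRIORITY = {"settlement": 0, "hill": 1, "lake": 2, "woodland": 3, "transport": 4}

def prioritise(pois, radius_m=5000, max_labels=3):
    """Keep POIs within the radius, rank by (type priority, distance), cap the count."""
    in_range = [p for p in pois if p["distance_m"] <= radius_m]
    in_range.sort(key=lambda p: (TYPE_PRIORITY.get(p["type"], 99), p["distance_m"]))
    return in_range[:max_labels]

for p in prioritise(pois):
    print(p["name"])
```

Capping the label count is the important part: on a busy horizon, drawing every match would defeat the purpose of the overlay.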
If you haven’t tried it yet, take a look at https://www.os.uk/getoutside/AR. But while this is the first AR experience I’ve created which made it to public release, I’ve been working on AR projects for a couple of years.