By Layla Gordon
Back in February, OS ventured off this planet to produce a paper map of Mars.
This inspired the Tech Labs team, who had already been involved in Augmented Reality (AR) work, to produce a Mars AR experience using this map.
As with all good augmentation work, the first step was to create some 3D content with which to augment the map. Using height data for the planet captured by NASA, and with advice from Peter Grindrod of the UK Space Agency, I produced a greyscale height map. Then, using Blender, I created a 3D terrain model of the Schiaparelli crater and its surroundings.
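The core of producing a greyscale height map is rescaling raw elevation values so that the lowest point maps to black and the highest to white. A minimal sketch of that step, using invented elevation values rather than real NASA raster data:

```python
# Sketch: converting raw elevation samples into 8-bit greyscale values.
# The elevations below are invented for illustration; real NASA data
# would arrive as a large raster grid, processed row by row.

def to_greyscale(elevations):
    """Linearly rescale elevation values to the 0-255 greyscale range."""
    lo, hi = min(elevations), max(elevations)
    span = hi - lo or 1  # avoid division by zero on perfectly flat terrain
    return [round(255 * (e - lo) / span) for e in elevations]

# A tiny strip of elevations in metres (invented values):
strip = [-4500, -3900, -1200, 0, 2100]
print(to_greyscale(strip))  # darkest = lowest terrain, brightest = highest
```

Blender can then displace a subdivided plane by such an image, pushing each vertex up in proportion to pixel brightness to form the terrain mesh.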
The final stage was to create the AR experience itself. Using a package called Vuforia, I created an image target in the cloud that allows image-recognition tools to pick up its feature points. After a few iterations of tweaking the image target, its augmentable rating reached four stars, which makes it a reliable target for recognition.
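The star rating rewards images rich in high-contrast detail that feature detectors can lock on to. This is not Vuforia's actual scoring algorithm, but a toy proxy illustrates the idea: count how often the brightness jumps sharply between neighbouring pixels.

```python
# Toy illustration (NOT Vuforia's real rating algorithm): image targets
# score well when they contain many high-contrast details. This proxy
# measures the fraction of strong horizontal brightness jumps in a
# small greyscale grid.

def feature_richness(grid, threshold=40):
    """Fraction of adjacent pixel pairs whose brightness jump exceeds threshold."""
    jumps = total = 0
    for row in grid:
        for a, b in zip(row, row[1:]):
            total += 1
            if abs(a - b) > threshold:
                jumps += 1
    return jumps / total if total else 0.0

flat_sky = [[200, 201, 199, 200]] * 3    # near-uniform: a poor target
crater_map = [[10, 90, 30, 180]] * 3     # high contrast: a good target
print(feature_richness(flat_sky), feature_richness(crater_map))
```

A detailed crater map scores far higher than a near-uniform image, which is why tweaking the map's contrast improved the rating.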
Vuforia publishes SDKs for both Android and iOS for producing AR apps using either on-device or cloud image targets. Since we might want to change the target in future, we decided to experiment with the cloud version. A proof-of-concept native iOS app was produced that takes the live camera feed from an iOS device (iPhone/iPad) and constantly scans it for the target. Computer-vision techniques run in real time to trigger the augmentation as soon as the target is found: the app then displays the augmentation in the same camera view, snapped to the real-world target (in this case the area of Mars). With the 3D model attached, the user can move the device around, and for as long as the target remains in view, the app tracks and augments in real time, as if the digital content were truly part of the real-world object.
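The per-frame flow just described amounts to a small state machine: scan each frame until the target is detected, then keep tracking it until it is lost, at which point scanning resumes. A minimal sketch, where `detect_target` and `track_target` stand in for the computer-vision calls a framework like Vuforia provides:

```python
# Minimal sketch of the detect-then-track loop described above.
# `detect_target` and `track_target` are hypothetical stand-ins for the
# real computer-vision calls; here they are simulated with simple flags.

SEARCHING, TRACKING = "searching", "tracking"

def run_frames(frames, detect_target, track_target):
    """Process a sequence of camera frames, returning the state after each."""
    state, history = SEARCHING, []
    for frame in frames:
        if state == SEARCHING and detect_target(frame):
            state = TRACKING       # target found: anchor the 3D model to it
        elif state == TRACKING and not track_target(frame):
            state = SEARCHING      # target lost: resume scanning the feed
        history.append(state)
    return history

# Simulated feed: the map is visible only in the middle two frames.
frames = [{"target": False}, {"target": True},
          {"target": True}, {"target": False}]
visible = lambda f: f["target"]
print(run_frames(frames, visible, visible))
```

In the real app each TRACKING frame would also update the 3D model's pose so the terrain stays pinned to the printed map as the device moves.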
The next step would be to experiment with more sophisticated AR experiences, such as interactive 3D models the user can control, video content, and real-time data streams from sensors such as wind speed or air quality monitors, as well as head-mounted displays such as Gear VR alongside mobile devices. We have already started migrating the native development over to Unity, which allows more streamlined, cross-platform development from a single code base. Now, as we have proven with Mars, the sky is no longer the limit, so watch this space!