Driverless cars depend on Artificial Intelligence (AI) to function. Everything they do, from processing the data captured on their cameras and sensors, to interacting with human passengers, is underpinned by AI.
Here at Ordnance Survey (OS), we are researching how the cars of tomorrow will interact with passengers to receive instructions and keep people aware of what’s going on. One of the challenges we face is the way our language works. When we give a command such as ‘take me to work through the park’, an autonomous vehicle must understand the different parts of the sentence (‘me’, ‘to work’ and ‘through the park’), the locations those phrases refer to – for instance, where ‘work’ is for you – and the spatial relationships: ‘through the park’ presumably means on a road crossing the park.
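To make this concrete, here is a minimal sketch of how a command might be split into a destination phrase and route constraints. This is purely illustrative and is not OS’s actual system: the rule-based grammar, the `parse_command` function and the list of spatial prepositions are all assumptions for demonstration. A real vehicle would use a trained language model, and a geocoder to resolve a phrase like ‘work’ to the passenger’s actual workplace.

```python
import re

# Assumed for illustration: spatial prepositions that introduce a route
# constraint ('through the park') rather than the destination itself.
ROUTE_PREPOSITIONS = ("through", "via", "past", "along", "avoiding")

def parse_command(command: str) -> dict:
    """Split a command like 'take me to work through the park' into
    a destination phrase and a list of (preposition, place) constraints."""
    text = command.lower().strip().rstrip(".!?")
    # Everything after 'to' is treated as the destination clause.
    match = re.search(r"\bto\s+(.*)", text)
    if not match:
        return {"destination": None, "constraints": []}
    clause = match.group(1)
    # Split the clause on spatial prepositions; the capturing group keeps
    # each preposition in the result so it can be paired with its place.
    parts = re.split(r"\b(" + "|".join(ROUTE_PREPOSITIONS) + r")\s+", clause)
    destination = parts[0].strip()
    constraints = [
        (parts[i], parts[i + 1].strip())
        for i in range(1, len(parts) - 1, 2)
    ]
    return {"destination": destination, "constraints": constraints}

print(parse_command("Take me to work through the park"))
# → {'destination': 'work', 'constraints': [('through', 'the park')]}
```

Even this toy version shows why the problem is hard: the parser only separates the phrases, while the real difficulty lies in resolving what each one means for a particular passenger in a particular place.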
If a vehicle has been driving for an hour and needs to hand back control to a human, how and when in that process does it bring the person up to speed on where they are along the route? We’re exploring how the human-machine interface can support our navigational awareness in a safe and effective way.
In these ways we’re combining our knowledge of geography and human behaviour with human-machine interaction techniques to support the development of autonomous vehicles.