“Vision, although our most dominant sense, has its limits,” says Sanjay Sood, head of Highly Automated Driving at HERE, a company that creates digital maps far more precise than those currently used for navigation. “That cannot be the case with self-driving vehicles, however, which need to see through buildings, around corners and 20 miles ahead to manoeuvre safely.”
“To fully realise the vision of autonomous driving, vehicles will need to understand the road environment beyond the range of their on-board sensors.”
Sanjay Sood, Head of Highly Automated Driving at HERE
Billions of pixels
These latest HD maps show a wealth of information, not just roads and routes. Accurate to the centimetre, they contain the entire environment in many billions of pixels, from the trees at the roadside to the height of the kerb, all captured and displayed in three-dimensional images.
The raw material for the map is provided not by a camera but by Lidar (Light Detection and Ranging). A highly sensitive, vehicle-mounted laser scanner shoots high-frequency pulses of laser light. These are reflected by objects and returned to the sensor, which measures the distance to each individual reflection point. NASA, for example, used this technology to map the exact topography of the moon.
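The underlying distance measurement is a time-of-flight calculation: the pulse travels to the object and back, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
# Illustrative only: recovering distance from a Lidar time-of-flight
# measurement. The pulse covers the distance twice (out and back),
# so the round-trip time is halved.

SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def lidar_distance_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting point, in metres."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A reflection returning after roughly 66.7 nanoseconds corresponds
# to a point about 10 metres away.
print(round(lidar_distance_m(66.7e-9), 2))
```

The nanosecond timescales involved are why the scanner must be so highly sensitive: centimetre accuracy requires timing resolution well below a nanosecond.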
How does an automated vehicle see things?
Cameras provide information about the driving environment and traffic; they also recognise road signs and traffic lights.
Radar and ultrasound measure distances to other vehicles and objects in the vicinity.
Lidar (Light Detection and Ranging) is a laser scanner which creates a 3D image of the vehicle’s surroundings. Lidar also works in the dark, and in extremely bright light.
HD maps function as a fourth sensor. The data from the other three technologies is compared with the HD maps, providing the car with all pre-recorded details about the environment, including information outside the vehicle's sensor range.
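The "map as a fourth sensor" idea can be sketched in a few lines. This is a hypothetical illustration, not any manufacturer's actual logic: a live camera detection (here, a speed-limit sign) is cross-checked against the pre-recorded HD map, which fills the gap when the sensor misses the sign.

```python
# Hypothetical sketch: the HD map acts as a redundant, pre-recorded
# "sensor". A live camera reading takes priority; if the sign is
# covered, bent, or out of range, the map value is used instead.

def effective_speed_limit(camera_reading, hd_map_value):
    """Prefer the live camera reading; fall back to the HD map."""
    if camera_reading is not None:
        return camera_reading
    return hd_map_value

print(effective_speed_limit(80, 100))    # camera sees a temporary limit: 80
print(effective_speed_limit(None, 100))  # sign obscured, map fills the gap: 100
```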
Even in the future, sensors will not be able to supply every detail about a vehicle's surroundings; it is always possible that road markings are hard to see, or that a traffic sign is covered or bent. HD maps help in such cases and ultimately increase safety for cars and their drivers: each piece of information is available twice, which enables foresighted driving.
“We see the map as a sort of additional sensor,” says Klaus Büttner, BMW Group's Vice President Projects Autonomous Driving. At BMW, Büttner’s focus is on making vehicles so intelligent that, in automated mode, they behave correctly in every traffic situation. “We’re working with reinforcement learning,” Büttner explains. “In other words, we feed the computer as many traffic situations as possible, together with an assessment of each situation. Gradually, it develops its own understanding of which driving strategies are most suitable.” In the experts' terms: the algorithm is being trained.
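The training loop Büttner describes can be illustrated with a toy tabular example. This is not BMW's system; the situations, strategies and assessments below are invented. The agent repeatedly "plays" a traffic situation, receives a reward for the strategy it chose, and gradually shifts its value estimates towards the strategies that score best:

```python
# Toy reinforcement learning sketch (invented data, not BMW's system):
# the agent learns which driving strategy suits which traffic situation
# by repeatedly trying strategies and averaging in the assessments.

import random

random.seed(0)

SITUATIONS = ["clear_motorway", "slow_truck_ahead"]
STRATEGIES = ["keep_lane", "change_lane"]

# Invented assessments: reward for each (situation, strategy) pair.
REWARD = {
    ("clear_motorway", "keep_lane"): 1.0,
    ("clear_motorway", "change_lane"): -0.5,
    ("slow_truck_ahead", "keep_lane"): -1.0,
    ("slow_truck_ahead", "change_lane"): 1.0,
}

q = {(s, a): 0.0 for s in SITUATIONS for a in STRATEGIES}
alpha, epsilon = 0.1, 0.2  # learning rate, exploration rate

for _ in range(2000):
    situation = random.choice(SITUATIONS)
    if random.random() < epsilon:   # occasionally explore a random strategy
        strategy = random.choice(STRATEGIES)
    else:                           # otherwise exploit current knowledge
        strategy = max(STRATEGIES, key=lambda a: q[(situation, a)])
    # Nudge the value estimate towards the observed assessment.
    q[(situation, strategy)] += alpha * (
        REWARD[(situation, strategy)] - q[(situation, strategy)]
    )

# The learned policy now picks the better strategy for each situation.
for s in SITUATIONS:
    print(s, "->", max(STRATEGIES, key=lambda a: q[(s, a)]))
```

A production system works with vastly richer state (sensor and map data) and learned function approximators instead of a lookup table, but the principle is the same: play situations, assess outcomes, improve the strategy.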
Even today, assistance systems in production models such as the current BMW 5 Series can regulate the speed of the vehicle depending on traffic, make sure that the vehicle remains in its lane, and assist in manoeuvres such as changing lanes. However, drivers have to keep their hands on the steering wheel, and need to stay alert and ready at all times to take back control of the vehicle.
From 2021, BMW will offer a package for highly automated driving on motorways. The driver will only need to stay alert to take over control quickly if the system reports a problem. Theoretically, highly automated driving can also support the blind in their daily lives.
Along with constantly increasing computing power, a new generation of sensors has led to remarkable progress in artificial intelligence. These sensors are considered the nucleus of the next big development: real-time maps.
Crowdsourcing on the road
BMW is already setting the course for such real-time maps in 2018. The Munich-based carmaker is cooperating with Israeli technology company Mobileye, an Intel subsidiary and the global leader in vision-based advanced driver-assistance systems.
The idea is for BMW vehicles to provide real-time, camera-based information on the driving environment. The data is then aggregated at the back end and used to update the highly precise digital map. Crowdsourcing on the road would have huge advantages: as soon as a critical mass of vehicles with on-board sensors is reached, the map material can be kept up to date at all times. In other words, the map will achieve real-time capability.
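The back-end aggregation step can be sketched as a simple voting scheme. This is a hypothetical illustration of the crowdsourcing principle, not the actual BMW/Mobileye pipeline; the confirmation threshold and data model are invented. A map value is only replaced once enough independent vehicles report the same change, so a single misread sign cannot corrupt the map:

```python
# Hypothetical sketch of crowdsourced map updating: vehicle reports that
# disagree with the current map are tallied, and the map is updated only
# when enough vehicles confirm the same change.

from collections import Counter

CONFIRMATION_THRESHOLD = 3  # assumed number of confirming vehicles

def update_map(map_value, reports, threshold=CONFIRMATION_THRESHOLD):
    """Return the map value, replaced if enough vehicles agree on a change."""
    disagreements = Counter(r for r in reports if r != map_value)
    if disagreements:
        candidate, votes = disagreements.most_common(1)[0]
        if votes >= threshold:
            return candidate
    return map_value

# Two vehicles reporting a new speed limit are not yet enough; four are.
print(update_map(100, [80, 80]))          # -> 100
print(update_map(100, [80, 80, 80, 80]))  # -> 80
```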
“Steering wheels will be around for a long time to come.”
Klaus Büttner, Vice President Projects Autonomous Driving
Büttner insists, however, that in spite of the recent advances, the car of the future should still not be equated with a computer on wheels. Cars will still need to be robust machines that are completely safe and reliable, even when they are not connected.
With all these technological advances, will the pleasure of driving still be maintained? "Cars will assist us on routes that we are not interested in experiencing as drivers. At first on motorways, then in urban rush hour traffic. But steering wheels will be around for a long time to come," says Büttner.