Self-driving cars? Like child’s play.
Have you ever walked your child to elementary school? Or to kindergarten? You’ll surely have explained road safety rules. How, for example, to react to other road users – pedestrians, cyclists, automobiles, or other obstacles that might be hard to predict. Your child learns step by step to spot the signals and convert them into an appropriate reaction. And as the adult, you gradually need to exert less and less influence over your child’s behavior around traffic. Eventually, after a lot of repetition, he or she can take independent account of oncoming pedestrians and correctly judge the distance and speed of an approaching vehicle when crossing the road. Sooner or later, your child will be able to manage the trip to school unaided. We can apply a similar concept to the development of autonomous driving, a process underway since 2004 at the BMW Group.
As they get used to road conditions, young children face the combined challenges of inexperience, a restricted field of vision, and small stature. They can therefore be rapidly overwhelmed by what are sometimes complex road and traffic situations. Just like children, self-driving cars also need to learn how to behave in real-life road conditions. To enable this, test vehicles are equipped with sophisticated sensor technology. Artificial intelligence and machine learning then teach the car how to recognize and react to objects on the road.
Self-driving cars continually – and without being distracted – “sense” their surroundings. In doing so, they collect a great deal of data that also depicts the wider environment, such as buildings, green spaces, and people. Among the basic kit items in such a “seeing” car are cameras. They detect signs, traffic lights, and other elements of the road environment. Ultrasound sensors measure the distance from the car to other objects, while radar sensors also detect the speed of those objects. Laser scanners create a 3D image of the environment.
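The principle behind an ultrasound distance sensor can be shown in a few lines: the sensor emits a pulse, times the echo, and halves the round-trip path. This is a simplified sketch with illustrative names and values, not code from any production vehicle.

```python
# Hypothetical sketch: turning an ultrasound echo time into a distance.
# The constant and function names are illustrative only.

SPEED_OF_SOUND_M_PER_S = 343.0  # in air at roughly 20 °C

def echo_to_distance(echo_time_s: float) -> float:
    """Distance to an obstacle from a round-trip echo time.

    The pulse travels to the object and back, so we halve the path.
    """
    return SPEED_OF_SOUND_M_PER_S * echo_time_s / 2.0

# An echo returning after 10 milliseconds implies an obstacle
# a little over 1.7 meters away.
print(echo_to_distance(0.010))
```

Radar works on a related idea, additionally using the frequency shift of the returning signal to infer the object’s speed.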
Just like children, self-driving cars also need to learn how to behave in real-life road conditions.
In this context, HD maps act as a kind of safety net, allowing the vehicle to see further ahead. The car is located on the map as real-time data from the sensors is synchronized with the mapping data. The on-board computer processes all the information from the various technological components into an overall picture and calculates the route that the vehicle will take. And on the subject of predictive driving, provided that sufficient data is available and is correctly interpreted, self-driving cars can also predict potential traffic situations.
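The “overall picture” arises from fusing several noisy estimates into one. A minimal sketch of that idea, assuming a one-dimensional position and simple inverse-variance weighting (real vehicles use far more elaborate filters, such as Kalman filters):

```python
# Hypothetical sketch: fuse two noisy position estimates, e.g. one from
# on-board sensors and one from matching against an HD map. Each estimate
# is weighted by the inverse of its variance, so the more certain source
# dominates. Illustrative only, not a production localization algorithm.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Inverse-variance weighted fusion of two 1-D position estimates."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # fused estimate is more certain than either input
    return fused, fused_var

# A precise map match (variance 0.25 m²) pulls the result toward itself
# and away from a noisier sensor fix (variance 4 m²).
position, variance = fuse(100.0, 4.0, 102.0, 0.25)
```

Note that the fused variance is always smaller than either input variance, which is exactly why combining sensors with map data lets the car “see further ahead” with more confidence than any single source.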
Quantity? No, quality.
To enable self-driving cars to cope safely with any road situation, millions of test miles are needed, along with millions of pieces of information. It should be remembered, however, that a high number of test miles doesn’t necessarily equate to better driving. With test driving, as in many other fields, quality matters more than quantity: anybody can drive in perfect conditions. One of the major challenges involved in developing autonomous driving technology is the need for calculations to take account of extreme situations such as evening light, heavy rain or snowfall, and the unpredictable behavior of other road users.
Dealing with such complex situations is simply impossible without a highly developed artificial intelligence system. Simulation therefore plays a key role in the development process. Because test vehicles can’t gather all of the data needed on the actual road, almost 95% of all test miles are driven on a virtual, simulated basis. To ensure that a particular feature will work reliably in all conditions, situations are identified and modified on the basis of real-life data. Here, machines face the exact same challenge that confronts a child walking to school for the first time: the need to learn how to behave in actual road conditions. Only by gaining this knowledge will the child – and the self-learning car – build their own skills. Incidentally, autonomous driving also opens up a whole new future of mobility for people with disabilities.
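The way one real-world recording is “modified on the basis of real-life data” can be pictured as sweeping environmental parameters across a recorded scenario. A hedged sketch, with parameter names invented for illustration rather than taken from any real simulation toolchain:

```python
# Hypothetical sketch: multiply one recorded scenario into many simulated
# variants by sweeping lighting and weather conditions. The field names
# are illustrative, not from a real test framework.
from itertools import product

base_scenario = {"event": "pedestrian_crossing", "speed_kmh": 30}

lighting = ["daylight", "dusk", "night"]
weather = ["dry", "heavy_rain", "snow"]

variants = [
    {**base_scenario, "lighting": light, "weather": wx}
    for light, wx in product(lighting, weather)
]

# One real-world recording becomes nine distinct test cases.
print(len(variants))  # → 9
```

Sweeping even a handful of such parameters is how a modest fleet of test vehicles can generate the millions of virtual miles the article describes.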
The future for automotive brands? As tech businesses.
People represent a further key factor in the future success of self-driving cars. Alongside the customer, this applies particularly to the developer, whose workplace is changing from a traditional corporate structure to that of an agile technology business with a start-up mentality. At the BMW Group, this digital transformation is exemplified by the Autonomous Driving Campus, opened in April 2018 at Unterschleißheim near Munich. Here, experts in every field have been brought together to make the future of mobility a reality.
Data-driven development: 95% of all test miles are driven virtually.
With an area of 23,000 square meters, the campus is the perfect setting in which to design the future of mobility. In small, agile teams, a total of 1,800 experts in their fields, drawn from a variety of disciplines and recruited from all over the world, drive the development of self-driving cars. Traditional team leader and project manager roles are a thing of the past. Instead, a “product owner” defines every aspect of the features and components of a product, which is implemented in parallel by a number of self-managed teams composed of mathematicians, developers, and engineers. The advantages? Easier communication, greater transparency, and shorter decision paths. Each and every member contributes different skills and expertise. Working in 14-day cycles, the teams tackle current, practical problems. The hierarchies are flat enough and the team structures agile enough to ensure that any issues arising can be resolved directly.
The campus is a hub for testing, programming, and simulation. The self-driving cars will cover about 240 million virtual kilometers on the journey to being mass production-ready. Petabytes of data are being collected – every day. On the campus, specialists evaluate the data and are then able to code the results directly. Or vice versa – software developers sit in the vehicle with their laptops and test the code they’ve just written.
The process is similar to the move from the horse to the car, in that the mode of transport we’ve grown up with and come to know is undergoing a seismic change. So, what’s the ultimate objective? Will we use self-driving cars to make our lives faster or to slow them down? As a mobile business lounge, a traveling entertainment system, or a moving hotel room? What would you opt for?