
The latest on autonomous driving

When will the robot cars be on the road?

For several years now, manufacturers have promised that self-driving cars will soon be ready to launch on the market. But the industry is gradually becoming disillusioned and schedules are being pushed back. So what's the problem?

‘Next year’, says Elon Musk in a YouTube video that went viral, a promise the famous tech billionaire and CEO of US electric car manufacturer Tesla repeated in interviews from 2014 to 2021: the self-driving car would finally arrive. But 2022 is now drawing to a close, and a look at our roads still shows people behind the steering wheel. Except perhaps in Phoenix in the US state of Arizona, where Google’s sister company Waymo has offered a pilot service with driverless taxis for several years now – but in a limited, precisely mapped operational area under very controlled conditions: almost always in good weather and with lots of expensive sensors in every car.

General Motors subsidiary Cruise is also working on a driverless taxi service in San Francisco that unfortunately repeatedly makes the headlines whenever its robot cars stall and block the roads. Added to this is the fact that Ford and Volkswagen have now pulled out of the autonomous mobility start-up Argo AI after investing billions in it, which suggests that there is not a great deal of confidence in the technology. So when will self-driving robot cars be available to everyone – perhaps never? To answer this question, we need to look at what is already reality in German cars today.

Mercedes-Benz has launched the first Level 3 system on the international market

Modern new cars from many manufacturers can already pull into and out of parking spots on their own, keep to their lane, maintain a set speed and distance from the car in front on motorways and even overtake under their own steam. To do so, the vehicles combine established technologies such as cruise control, distance control and lane departure warning systems. This ‘partially automated driving’ (Level 2 on the SAE scale) is thus already a reality. However, these are not the self-driving cars that Musk and other CEOs in the vehicle industry have been promising for many years, because human drivers still need to constantly monitor the vehicle's actions, pay attention to the traffic and be able to intervene within seconds if there is a problem. Even Tesla's somewhat misleadingly named ‘Autopilot’ is merely a Level 2 system: the driver must keep observing at all times and keep their hands on the steering wheel. And the person driving is always liable for any accidents, even if the automated systems were in control at the time.
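The difference between the SAE levels mentioned here comes down to one question: who is responsible for monitoring the road? A minimal sketch (an illustrative mapping of the SAE J3016 levels, not any manufacturer's actual software) makes the dividing line explicit:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving automation levels (simplified)."""
    NO_AUTOMATION = 0           # human does everything
    DRIVER_ASSISTANCE = 1       # steering OR speed assistance
    PARTIAL_AUTOMATION = 2      # steering AND speed; driver must supervise
    CONDITIONAL_AUTOMATION = 3  # car drives itself in limited conditions
    HIGH_AUTOMATION = 4         # no driver needed within a defined domain
    FULL_AUTOMATION = 5         # no driver needed anywhere

def driver_must_monitor(level: SAELevel) -> bool:
    # Up to and including Level 2, the human must supervise constantly
    # and be ready to intervene; from Level 3 upwards, supervision is
    # handed to the vehicle within its permitted operating conditions.
    return level <= SAELevel.PARTIAL_AUTOMATION

print(driver_must_monitor(SAELevel.PARTIAL_AUTOMATION))      # True  (e.g. Autopilot)
print(driver_must_monitor(SAELevel.CONDITIONAL_AUTOMATION))  # False (e.g. DRIVE PILOT)
```

On this scale, Tesla's Autopilot sits at Level 2, while the Mercedes-Benz system described below is the first internationally certified Level 3 system.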

Alexandros Mitropoulos is spokesperson for autonomous driving at Mercedes-Benz

It was therefore a huge step forward when Mercedes-Benz launched the world's first internationally certified Level 3 system in Germany in May 2022 as a special option on the S-Class and the premium model EQS; the US states of California and Nevada are to follow shortly. With ‘highly automated driving’, the driver can finally take their hands off the steering wheel and hand over control to the vehicle without having to supervise. If an accident happens within the permitted areas of application because of a malfunction of the automated driving system, the car manufacturer will generally be liable. Mercedes drivers can therefore now read, surf the Internet, chat, watch films and play video games behind the wheel. ‘With DRIVE PILOT, we have developed an innovative technology for our vehicles that offers the customer a unique, luxury driving experience and gives them back something very precious: time’, says Alexandros Mitropoulos, Mercedes-Benz spokesperson for autonomous driving. Honda previously launched a similar Level 3 system for the Legend Hybrid Ex saloon, but only in Japan.

Has artificial intelligence in cars run into a dead end?

In a nutshell: driving is much more complex than previously thought, while progress in artificial intelligence (AI) is slower than expected. ‘After the hype at the start of all this ten years ago, the industry is now experiencing a certain degree of disillusionment,’ says Dr Christian Müller, Head of the Competence Centre Autonomous Driving at the German Research Centre for Artificial Intelligence (DFKI). It is Germany's leading research institution in this field, and its partners include technology companies such as Google and Nvidia as well as car manufacturers such as Daimler, BMW and Volkswagen. At the DFKI, Müller researches topics such as predicting pedestrian behaviour and how machine perception differs from human perception.

Dr Christian Müller is head of the Competence Centre Autonomous Driving at the German Research Centre for Artificial Intelligence (DFKI).

Müller says that the AI approach the industry is mostly pursuing for autonomous driving, so-called deep learning, is increasingly proving to be a dead end. In this approach, machines analyse large sets of data (e.g. camera images) at lightning speed with the help of neural networks and learn continuously and independently, without human intervention. What sounds incredibly promising in theory has its pitfalls in practice, says Müller: ‘Deep learning AI has an uncanny ability to recognise and classify objects, but can be completely useless if the context changes even slightly.’ Rain, fog, snow, darkness, damaged road signs, light reflections, unexpected behaviour by other road users: these and other factors, in countless possible combinations, mean that robot cars constantly encounter new situations in which they do not know what to do. This is in contrast to the brain of a human driver, who intuitively knows whether a green light is real or simply painted on and can stop immediately, even if they have never seen a kangaroo hop across the road before.

Even AI trained over a billion kilometres still makes mistakes

The US car manufacturer Tesla, one of the technology leaders in this field, shows just how difficult it is to perfect deep learning AI. Like no other competitor, it has fed its high-performance supercomputers with masses of customer data for several years to train the AI. Recordings of situations where human drivers had to take over from the Autopilot, or where accidents occurred, are particularly relevant. Since 2020, over 160,000 Tesla drivers in the USA and Canada have received the ‘Full Self-Driving Beta’ update, which lets the car take over the driving on country roads and in cities thanks to pedestrian, cyclist and traffic light detection – but it is still a Level 2 system in which the driver has to keep paying attention and be able to intervene.

So what is the verdict after several billion Autopilot kilometres driven across varying landscapes, traffic situations and weather conditions? YouTube offers examples where Tesla drivers have documented their trips in the robot car through cities such as San Francisco, evaluating each action like a driving instructor would. There are repeated moments where they had to take over the wheel at lightning speed because the car came to a standstill in traffic or looked like it might hit something. According to the National Highway Traffic Safety Administration, Tesla leads the accident statistics in the USA for Level 2 systems: from July 2021 to May 2022, there were 273 accidents in Autopilot mode, and five people died. This may be a small number given the hundreds of thousands of Teslas on US roads, and compared to human drivers, but one of the biggest hurdles autonomous driving needs to overcome is ensuring maximum safety. In Germany, it would be unthinkable for customers to test this technology in inner cities and for manufacturers to improve it gradually only after collating the data.

Redundancy should ensure greater safety

Mercedes-Benz, for example, made its DRIVE PILOT market-ready by testing it on its own test tracks in Immendingen in southern Germany, and was granted approval for use on public motorways only after comprehensive tests by the authorities. To guarantee the highest possible safety in automated driving, the German manufacturer relies on redundancy. In addition to the brakes, steering and power supply, the sensors and even the AI algorithms are duplicated so they can check each other, explains company spokesperson Mitropoulos. ‘This includes radar, lidar, a special stereo camera in the windscreen, a mono camera in the rear window and microphones, particularly to detect the blue lights and other special signals of emergency vehicles, as well as a moisture sensor in the wheel housing.’
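The redundancy principle Mitropoulos describes can be illustrated with a toy sketch (hypothetical function and values, no relation to Mercedes-Benz's actual software): two independent sensor channels estimate the same quantity, and the system only trusts the result when both agree, otherwise it degrades safely.

```python
def redundant_estimate(primary: float, secondary: float,
                       tolerance: float = 0.5):
    """Cross-check two independent readings of the same quantity,
    e.g. radar vs lidar distance to the car ahead, in metres.

    If the channels agree within the tolerance, return the fused
    value; otherwise report a fault so the system can degrade
    safely (e.g. request that the driver take over)."""
    if abs(primary - secondary) <= tolerance:
        return (primary + secondary) / 2, "OK"
    return None, "FAULT: sensors disagree, request driver takeover"

print(redundant_estimate(42.1, 42.3))  # channels agree -> fused distance
print(redundant_estimate(42.1, 55.0))  # channels disagree -> fault
```

The point of duplicating sensors of different physical types (camera, radar, lidar) is that they rarely fail in the same way at the same time, so a disagreement between channels is itself a usable warning signal.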

This is a different approach from the one taken by Tesla, which relies primarily on video from the eight external cameras on the vehicle; no radar (detection of the environment using radio waves) or lidar (using laser beams) is used. This is intended to save costs and free up the computers' power for video processing. But AI researcher Müller thinks this is the wrong approach. ‘The robustness of passive camera data suffers in darkness, rain and fog. Active sensors such as radar and lidar, however, can continue to detect objects easily’, he explains.

Real autonomous driving in 15 years at the earliest

At the same time, Müller tempers expectations that customers will be able to buy a robot car in the near future that can independently drive them, or pick them up, at any time and from anywhere in today's infrastructure, just as a human chauffeur can. ‘I can only see us developing an AI like that, one which fulfils the necessary safety regulations, in 15 years’ time at the earliest’, he says – and only if manufacturers take a new approach to development. ‘The limits of deep learning show that we should not leave machines to their own devices. Instead, we need trustworthy AIs that do not learn from data alone, but also with human support.’

Widespread use of Level 4 systems without human drivers, he says, can only happen sooner if we reduce the demands on the AI by adapting our infrastructure. If the robot car no longer has to deal with pedestrians or cyclists, for example, its task becomes much easier. This could be achieved with barriers separating lanes and with pedestrian bridges and tunnels, such as those in Chinese cities. ‘We need to ask ourselves what our cities, villages and roads should look like’, says Müller, adding: ‘These are discussions for the whole of society, and to date they have not taken place often enough in the context of autonomous driving.’

Torsten Gollewski, Executive Vice President Autonomous Mobility Systems at ZF

‘A shortage of drivers in transport and logistics companies, driving bans in inner cities and ambitious but necessary climate goals – these are major challenges, but they come with huge opportunities. We need to learn to rethink mobility. At ZF, we are convinced that autonomous transport systems can help us to redesign public transport and reorganise supply chains in the logistics industry. This means that in the foreseeable future, autonomous driving will make traffic safer, more efficient and more convenient. This is already evident in our project in Rotterdam, for example, where six fully electric autonomous third-generation ZF shuttles will shortly be transporting up to 3000 passengers a day in a separate lane’.

Ralph Müller, Press spokesperson for technology, Toyota Deutschland GmbH

‘Toyota is striving for a society that offers safety, seamless movement and freedom of movement for everyone. To achieve our goal of no transport-related fatalities, we are continuing to develop technologies that will prevent as many accidents as possible. Toyota’s development philosophy is Mobility Teammate Concept (MTC), which optimally supports drivers in the respective driving situation. It is important that the driver retains control. Our long-term goal is continuous development until we achieve autonomous driving’.

Volker Wetekam, Executive Vice President Automated Driving at Bosch

‘Fewer emissions, connected and autonomously driving vehicles: with a higher degree of automation, modern mobility services can relieve the burden on cities in the future and make them more attractive for their inhabitants – more green spaces instead of parking spaces, less noise, better air quality. At the same time, new mobility concepts enable even more comfort and safety. Bosch is pursuing Vision Zero: emission-free mobility and no serious injuries or fatalities in road traffic. In addition, autonomous driving is starting where there is a shortage of drivers: driverless vehicles can help fill the shelves in supermarkets and supply industry with the necessary materials. Bosch is an innovation leader in this field and laid the foundation for all levels of automation early on with driver assistance systems and the associated sensor technology. We are working persistently to make mobility more comfortable, flexible and safe for everyone.’

Tags

  • Future Mobility