A traffic jam on a multi-lane highway. A royal blue Tesla sedan is slowly crawling along with the flow of the stop-and-go traffic. So far nothing unusual. But a driver takes a closer look in the adjacent lane, does a double take, and can scarcely believe what he is seeing.
The man at the wheel of the Tesla Model S has nodded off, fast asleep, his head bent to the side, his eyes closed. The sleeping driver in his electric car repeatedly creeps past the bewildered and disconcerted observer, who is now filming the unusual occurrence with his mobile phone.
The recent video of the slumbering Tesla driver has caused a sensation worldwide. Millions of people viewed it on YouTube and many raved about the wonderful new world of automobiles in which electronics chauffeur us through traffic.
But then the news of a fatal accident suddenly shook technology fans awake from their dreams. In late June, it became known that a driver of a Tesla Model S was killed in an accident when the car was in self-driving mode.
The wreck that claimed the life of Joshua Brown, 40, happened in Florida in early May. It was most likely the first death in an – almost – self-driving car. The very computer software, cameras, sensors and other digital assistants on board were supposed to prevent just such an accident.
The U.S. Securities and Exchange Commission has launched an investigation into whether the Silicon Valley-based carmaker has violated securities laws by failing to disclose the deadly crash to investors, the Wall Street Journal reported Monday, citing a person familiar with the matter.
A Tesla spokeswoman told the paper the company disclosed the accident in a blog post on June 30 and that the crash did not require disclosure to investors. The SEC declined to comment to the Wall Street Journal.
The accident has not only alerted regulators, but has also sparked a debate over the capabilities of Tesla’s “Autopilot,” as the company self-confidently calls it, and yes, about automated driving itself.
Without a doubt, better technology is urgently needed to prevent accidents. Each year, 1.25 million people die on the roads around the world, and in about 90 percent of the cases the cause is driver error. As the technology experts see it, humans are the uncertainty factor. Replace them with computers, they say, and the streets will be safer.
But is that really true? Germany’s BMW, Audi and Mercedes-maker Daimler are being more cautious. For example, they have consciously refrained from calling the existing functions that assume control of the car for short periods "autopilot." Instead, they're considered driver assistance systems.
“This accident is really tragic. In our view, the technology today isn’t yet ready to go into production,” Harald Krüger, the chief executive of luxury carmaker BMW, said two weeks ago when he announced a partnership with U.S. chipmaker Intel and the Israeli software firm Mobileye to develop self-driving technologies.
The company announced its intention to bring its first fully autonomous vehicle to the market by 2021. Until then, the manufacturer is taking autonomous driving one step at a time. “Safety comes first,” Mr. Krüger said.
The phased approach towards self-driving has five levels that have been or will be rolled out in the coming years. Each level involves the computer assuming a bit more control from the driver.
At level one, for example, the vehicle maintains a driving distance by itself. At level two, it stays in the driving lane and can park itself. Level three means the driver can take his or her hands off the steering wheel. At level four, a human driver is no longer required, and engineers are speaking of level five as a “robotaxi” without even a steering wheel.
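The five-level progression described above can be sketched as a simple lookup table. This is purely an illustration: the level descriptions paraphrase the article, and the names and the supervision rule are not an official SAE or manufacturer specification.

```python
# Hypothetical sketch of the five automation levels described above.
# Descriptions paraphrase the article; this is not an official
# SAE or manufacturer specification.
AUTOMATION_LEVELS = {
    1: "Vehicle maintains driving distance by itself",
    2: "Vehicle keeps its lane and can park itself",
    3: "Driver may take hands off the steering wheel",
    4: "No human driver required",
    5: "Robotaxi: no steering wheel at all",
}

def driver_must_supervise(level: int) -> bool:
    """Through level 3, a human must still be ready to take back control."""
    return level <= 3

for level, capability in AUTOMATION_LEVELS.items():
    print(f"Level {level}: {capability} "
          f"(driver supervises: {driver_must_supervise(level)})")
```

The key boundary, as the next paragraphs make clear, lies between the supervised levels and level four, where responsibility shifts entirely to the machine.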
But whole new dangers threaten to arise, particularly in the transitional stages, when sensors, cameras and algorithms are not yet capable of mastering every conceivable situation. How does the computer alert the human driver in time to get back behind the wheel in risky situations? And how can it be ensured that the driver doesn’t lose concentration while constantly switching between operating the vehicle and being a mere passenger?
Those aren’t just questions for specialists, and the answers will decide the future of the still-nascent technology. Should accidents like Mr. Brown’s wreck in Florida be repeated, customers will be slow to entrust themselves and their families entirely to electronics.
Should that trust fail, car companies such as BMW, Daimler and Audi, along with Tesla and internet giant Google, would have to write off billions in investments. And the concept of robotaxis, which transportation services such as Uber and Lyft plan to use to make private transport superfluous in cities, would remain just a dream.
Following Mr. Brown’s accident, Tesla CEO Elon Musk released a statement explaining that the company’s Autopilot system was not designed to turn Tesla into a self-driving vehicle. Autopilot is an assistance system that “does not allow the driver to abdicate responsibility” for constantly keeping an eye on the traffic.
That may be, but the marketing, critics argue, has been dangerous. With “the term ‘Autopilot,’ Tesla is suggesting to its customers a function that the vehicle can neither fulfill nor is allowed to fulfill,” said Heiko Wolframm of the German automobile association ADAC.
Mr. Wolframm is the specialist for driver-assistance systems in the car club’s technical center in the southwestern Bavarian town of Landsberg. He had tested the range of capabilities of the Tesla Model S at the start of the year, and the results were worrisome.
“The system immediately shuts down in ambiguous driving situations without sufficient advanced warning,” Mr. Wolframm said.
European guidelines for such assistance systems dictate that the driver must maintain control of the vehicle at all times when driving faster than walking speed. While German and European lane-keeping assistance systems raise an alarm within seconds after a driver lets go of the steering wheel, the model made by U.S.-based Tesla didn’t react at all for minutes. That was the result of the ADAC test and was confirmed in test drives conducted by the German weekly business magazine Wirtschaftswoche. The Tesla model drove on unperturbed.
By contrast, Daimler’s Mercedes-Benz E Class has an “Intelligent Drive Active Emergency Stop Assist” that ensures the warning lights will activate and the vehicle will come to a safe stop if a driver takes their hands off the steering wheel for too long.
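The escalation described above (a warning within seconds of hands-off driving, then a safe stop) can be sketched as a simple decision rule. The thresholds below are made-up placeholders for illustration, not actual ADAC, EU or Daimler values.

```python
# Illustrative hands-off-wheel escalation, loosely modeled on the
# behavior described above. Both thresholds are hypothetical
# placeholders, not actual ADAC, EU or Daimler values.
WARN_AFTER_S = 10.0    # assumed: warn after 10 s of hands-off driving
STOP_AFTER_S = 30.0    # assumed: initiate a safe stop after 30 s

def assistant_action(hands_off_seconds: float) -> str:
    """Decide the assistance system's response to hands-off driving."""
    if hands_off_seconds >= STOP_AFTER_S:
        return "activate warning lights and bring car to a safe stop"
    if hands_off_seconds >= WARN_AFTER_S:
        return "warn driver to retake the wheel"
    return "continue normal assistance"

print(assistant_action(5))    # -> continue normal assistance
print(assistant_action(15))   # -> warn driver to retake the wheel
print(assistant_action(45))   # -> activate warning lights and bring car to a safe stop
```

The point of contention in the ADAC test was precisely this logic: the European systems escalate within seconds, while the tested Tesla software let minutes pass without reacting.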
The developers at rival Audi took particular care with the new Audi A8’s traffic-jam pilot so that the driver isn’t surprised by suddenly having to take over the wheel. The feature will be available in 2017.
Tesla, by contrast, places few limits on where its system can be switched on: “At least with the version of software tested by ADAC,” Mr. Wolframm said, “it was also possible to activate the system away from the freeways – such as while driving in the cities.”
That, he said, is unacceptable.
“The producers must effectively prevent the misuse of the technology,” instead of passing on the blame for autopilot errors to the driver through legal clauses, he said. “Bringing insufficiently safe technology onto the road and letting it mature in regular traffic is irresponsible.”
For automotive newcomers such as Tesla, Google, Uber and Lyft, there's no time to wait. They've learned lessons from the internet industry: The company that’s first to put robotaxis on the roads will dominate the market. And so Tesla is also following the IT industry’s principle: The product matures with the customer, and new software versions can iron out problems and mistakes.
That can work out well. Or, as has now happened, can end up being deadly.
On the other hand, Audi, BMW, Mercedes, Sweden-based Volvo and the rest want to market fully-developed products that are safe according to the state of the technology. But a new model represents the technical level of maturity of three years ago. It doesn’t improve a bit until the next model generation comes out.
“There is a huge power behind the robotaxi business model of Google and Uber,” said Wolfgang Bernhart, a partner at the management consultancy Roland Berger’s Automotive Competence Center in Stuttgart. By 2030, autonomous robotaxis are expected to account for just under 30 percent of total mobility, and so-called self-driving cars for about 40 percent of the auto industry’s total earnings.
In order for the vehicles to be really safe, the automakers will have to adopt different strategies than they do now. Since genuine autopilots are extremely expensive and their development is time-consuming, the car companies will be forced to partner more and more with software companies and competitors. The cooperation between BMW, Intel and the Israeli camera and software specialist Mobileye is an example. The companies’ objective is an open technical platform and standards for environment sensing in autonomous driving.
But the partners have a lot of convincing to do.
“We are pressing forward with the subject with our own resources,” announced rival Daimler rather coolly in response to the BMW deal, saying common standards are not necessarily needed technically. But the Stuttgart-based carmaker doesn’t want to rule out cooperation. The company says it’ll consider cooperating when it comes to communication or on infrastructure matters, such as when traffic signals or bridges warn vehicles of accidents or icy roads.
A possible reason for the reluctance is that Daimler and Toyota are the only globally-active car producers that aren’t relying on Mobileye’s technology regarding intelligent image capture and processing.
While both U.S.-based General Motors and Germany’s Volkswagen are partnering with the startup from Jerusalem, Daimler is programming its intelligent camera systems on its own. Apparently with success: the new E-Class is considered a benchmark for semi-autonomous driving. Why should Daimler jeopardize its lead?
As it is, German carmakers are generally well positioned in the global race to robo-mobility. Among traditional carmakers, German firms are slightly ahead of those in the United States, Japan and Sweden, according to a Roland Berger study conducted for Wirtschaftswoche.
The Germans’ leading role, however, could soon come under threat. That’s according to an exclusive patent analysis conducted for Wirtschaftswoche by the FIZ Karlsruhe – Leibniz Institute for Information Infrastructure. No firm is applying for more patent rights in the field of autonomous driving than Japan’s Toyota. But in line with the company’s philosophy of being cautious, the Japanese are still not making much ado of it.
“Our main objective is avoiding accidents,” said Seigo Kuzumaki, Toyota’s chief safety technology officer.
And Tesla, Google and Uber? The U.S.-based companies forcing the whole industry into action show up in the statistics only under “other applicants.” The firms are apparently relying mainly on technology purchased from outside sources.
But something can definitely be achieved with externally-purchased technology.
In the aftermath of Mr. Brown’s accident, Tesla pointed out that its own vehicles with its Autopilot feature had driven over 200 million kilometers (over 130 million miles) without a single fatal incident. In the United States, on the other hand, a person in a normal vehicle is killed on average every 150 million kilometers or 94 million miles.
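Tesla’s comparison rests on simple arithmetic. A back-of-the-envelope check using the figures cited above shows what the claim amounts to; this is illustrative arithmetic, not a statistically rigorous comparison, since a single fatality is a tiny sample.

```python
# Back-of-the-envelope check of the fatality figures cited above.
# Input numbers come from the article; this is illustrative only,
# not a statistically rigorous comparison (one death is a tiny sample).
autopilot_km = 200e6          # km driven on Autopilot, per Tesla
us_km_per_fatality = 150e6    # average km per road death in the U.S.

# Fatalities one would expect over the same distance at the U.S. rate:
expected_at_us_rate = autopilot_km / us_km_per_fatality
print(f"Expected deaths at U.S. average rate over "
      f"{autopilot_km / 1e6:.0f}M km: {expected_at_us_rate:.2f}")
```

In other words, over the distance Tesla cites, the U.S. average would predict slightly more than one fatality, which is why Tesla presents its single recorded death as better than average.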
The technology is clearly headed in the right direction, but it wasn’t yet advanced enough to have saved Mr. Brown.
This article first appeared in the German business weekly WirtschaftsWoche, a sister publication of Handelsblatt. To contact the authors: [email protected], [email protected], [email protected] and [email protected]