Audi’s Autonomous Driving Cup: A Big Deal For Small Cars
By Rene Tellers, 2025AD.com
November 13, 2018
Audi prepares for its Autonomous Driving Cup, Uber ends its wound licking, and Waymo defends its driverless tests against skepticism: read our weekly analysis of the most important news in automated driving!
With the winter semester underway at German universities, Audi is moving swiftly to get exposure with students and the public. The Bavarian car manufacturer is finishing its preparations for the three-day “Audi Autonomous Driving Cup 2018” that starts today. It’s kind of a big deal – but for smaller cars. And no, I’m not talking about A1s…I’m talking really small.
The vehicles used are in fact 1:8 scale models of the Audi Q2. The cars may be models but the tech is real: each is equipped with a front camera, reversing camera, laser scanner and ultrasound sensors. Using a standard software tool supplied by Audi as a platform, the teams developed the cars themselves. The Cup – now in its fourth year – is all about demonstrating their technical prowess as they send their cars around the miniature track “autonomously avoiding obstacles, correctly negotiating junctions with crossing traffic, recognizing road signs or following a preceding car at an appropriate distance.” But that’s not all. Teams are also scored on a scientific presentation and a creativity test. The team with the highest combined score takes home the €10,000 prize money.
Fun as such student competitions are, they also serve strategic purposes. First, the Cup is a two-way recruitment tool: students get an insight into a potential employer, and Audi gets to window-shop the best of student talent – which is becoming ever more necessary as firms battle it out to attract top AD talent.
A second purpose of such races is so-called gamification, or the application of game-design elements to other contexts. This is a way of putting autonomous cars – albeit models – on display for visitors and the wider public. And as I’ve said before, every little helps in winning over the public’s trust of the technology.
Uber’s back in the saddle
A different race is also heating up, as Uber wants to resume tests of autonomous cars on public roads less than eight months after one of its test vehicles killed a pedestrian in the first such fatal crash involving a self-driving car. The company has now asked the state of Pennsylvania for permission to resume testing after improving safety features.
In May, Uber had suspended its public tests in the wake of the fatal crash and reportedly laid off hundreds of workers. Then, only two weeks ago, Uber CEO Dara Khosrowshahi tried to reassure observers that his company was under no time pressure to restart tests. “For Uber, this is not a sprint,” he wrote in a blog post in early November.
However, with competitors moving ahead, Uber’s cautious wound licking is now apparently over. In late October, Volvo had already announced that it would begin shipping vehicles to Uber next year despite delays to the testing scheme caused by the accident.
Uber has, however, announced several adjustments to its proposed tests in Pittsburgh, including real-time third-party monitoring of safety drivers, limits on the time drivers can work per day and improved training. Most importantly, both in terms of safety and of costs to Uber, the company said it would henceforth employ two human test drivers to monitor each car. Overkill? Perhaps. But maybe that’s what it takes to regain trust.
Waymo’s accident adds fuel to the guinea pig debate
One of Uber’s competitors that has recently dashed ahead with its testing schemes is Waymo. The company just became the first to obtain a coveted permit to test autonomous cars on Californian roads without a safety driver.
But criticism soon came from organizations such as Consumer Watchdog. The US pressure group warned that Waymo would “turn all of us into human guinea pigs for testing their robot cars.” Such skeptics grew even louder when a report was published showing that, less than two weeks before the permit was granted, a Waymo self-driving car had caused an accident that sent a motorcyclist to hospital.
In defense of its technology, however, Waymo CEO John Krafcik quickly pointed out that a human error by Waymo’s backup driver had been responsible for the crash.
According to the official state accident report, a car had merged into the lane of the Waymo vehicle. The backup driver then took over, merged to the right and collided with the motorcycle. “Our review of this accident confirmed that our technology would have avoided the collision,” Krafcik insisted. “Our simulation shows the self-driving system would have responded to the passenger car by reducing our vehicle’s speed, and nudging slightly in our own lane, avoiding a collision.”
Difficult as that may seem to independently assess, one thing is certain: it fuels the debate as to whether humans or machines are the safer driver.
So long, drive safely (until cars are driverless),