On the last day of his life, Jeremy Banner woke before dawn for his morning commute. He climbed into his red Tesla Model 3 and headed south along the fringes of the Florida Everglades. Swamps and cropland whizzed past in a green blur.
Banner tapped a lever on the steering column, and a soft chime sounded. He’d activated the most complex and controversial auto-safety feature on the market: Tesla Autopilot. It’s a computer system that performs all the functions of normal highway driving without any input from the driver. When the computer is in control, the car can speed up, change lanes, take exits, and—if it spots an obstacle ahead—hit the brakes.
Tesla Inc. aims to dominate the global auto market by building the world’s first self-driving car, and it considers Autopilot to be the crucial first step. Customers adore it. They’ve logged more than 1.5 billion miles on Autopilot, often pushing the limits of the software. Although the owner’s manual warns drivers to closely supervise the car at all times, that hasn’t stopped some from reading books, napping, strumming a ukulele, or having sex. Most of the time, the car gets them where they’re going.
But on that morning in March, Banner’s sedan failed to spot a tractor-trailer crossing the four-lane highway ahead of him. So did Banner, whose attention had apparently strayed. He struck the trailer broadside at 68 mph, the top of his car shearing off like the lid of a sardine can. The 50-year-old father of three died instantly.
Computer mistakes don’t look like human mistakes. Autopilot has lightning reflexes and its attention never flags, but it sometimes fails to spot hazards in its path. Such oversights appear to have played a role in four of five known fatalities since Autopilot was introduced in 2015. Banner’s wreck, in fact, bore an uncanny resemblance to an earlier one. In August, Banner’s estate sued Tesla under Florida’s Wrongful Death Act. The estate’s argument is a straightforward product-liability claim: Tesla promised a safe car and delivered a dangerously defective one.
But Autopilot is unlike almost any other consumer product in history, in ways that offer a preview of the uncomfortable questions we’ll confront in the dawning robot age. Tesla’s flamboyant chief executive officer, Elon Musk, says the technology saves lives, and legions of Tesla owners offer their own testimonies of hazards spotted and collisions avoided. (And they have YouTube videos to prove it.) It’s possible that both sides are right, that the computers are killing a few drivers who otherwise would have lived, but that they’re also saving the lives of many more. In the coming years, society—in particular, regulators and the courts—will have to decide whether that’s an acceptable trade-off.
The question is no longer academic. Musk’s decision to put Autopilot in the hands of as many people as possible amounts to an enormous experiment, playing out on freeways all over the world.
I was in the passenger seat, heading north on Interstate 405 in Los Angeles, when Omar Qazi took both of his hands off his steering wheel. We were going about 50 mph on the most heavily traveled highway in the country, and the wheel of Qazi’s black Model 3 turned slightly to the left, keeping the car centered in the gently curving lane. “This is like L.A. rush-hour traffic, right?” said Qazi, a 26-year-old software engineer. “It’s, like, flawless.”
Tesla has legions of die-hard fans, many of them well-to-do, tech-obsessed, and male. Qazi is pretty close to the archetype. His Twitter handle, @tesla_truth, is a bottomless font of Muskolatry. Before we met in August, he’d emailed Musk to give him a heads-up and encourage him to speak with me. The billionaire CEO, who declined to be interviewed for this story, replied to his fan the same day. “Your Twitter is awesome!” he said, before adding a warning: “Please be wary of journalists. They will sweet talk you and then wack you with a baseball bat.” Musk cc’d me on the message. Tesla also declined to comment.
Qazi met me at the charging station outside Tesla’s L.A.-area offices, with one of Musk’s SpaceX booster rockets looming nearby like an industrial obelisk. Qazi wore a day’s worth of stubble and blue Nike Airs. He immediately showed me the experimental Smart Summon feature, at the time available only to a select group of Tesla beta testers. (Qazi got it after begging Musk on Twitter; the feature rolled out to regular customers in September.) He pressed a button on his phone, and his car pulled out of its spot. Qazi watched it cross the parking lot and roll toward him. “It’s not useful—yet,” he said, grinning. But he loves showing off this trick so much he’s been known to linger in a parking lot, waiting for an audience.
Smart Summon offers a tiny glimpse of the driverless future Musk is promising, but for road driving, Autopilot is as close as it currently gets. Tesla says the technology isn’t reliable enough yet for humans to turn their attention away, even for a second, so it requires them to keep their hands on the wheel. Because most U.S. states are still figuring out how they’ll handle driverless cars, this also serves a legal purpose. To state regulators, Autopilot is just an advanced driver-assistance program—a souped-up cruise control, basically. Autopilot can’t yet tackle off-highway features such as traffic lights and stop signs. But during its four years on the road, it has gradually shouldered more complex tasks: merging smoothly, avoiding cars that cut in, and navigating from one highway to another.
“It can’t drive itself perfectly, but the rate of advancement of the software is like—every couple of weeks you get an update, and the car’s driving a little more humanlike. It’s very eerie,” Qazi said. A few minutes later, a silver sedan cut into our lane, and the car smoothly braked to let it in. “See that?” he asked.
October 14, 2019