A New Code Of Conduct
Fortune India
|June 2018
What are the legal ramifications when a self-driving car is at fault? A look into what happens legally after robots go awry.
MARCH 18 CHANGED EVERYTHING—and nothing—in the frenzied and nascent world of autonomous vehicles. That Sunday evening, in a Phoenix suburb that has become a hub for testing autonomous vehicle technology, an Uber self-driving vehicle struck and killed pedestrian Elaine Herzberg. The vehicle was in autonomous mode at the time of the collision, with a human test driver behind the wheel.
The incident is believed to be the first death caused by a fully autonomous vehicle. It prompted Uber to halt autonomous vehicle testing on public roads in four cities and other companies to pause their own public road tests. It also led Arizona Governor Doug Ducey, a proponent of autonomous vehicle technology, to suspend Uber from operating in the state. Advocacy groups called for a national moratorium on self-driving tests.
A fatal self-driving car crash seemed inevitable. Though the arrival of the robot car promised a dramatic reduction in the 1.25 million road traffic deaths that occur around the globe each year, there was also the sneaking suspicion that someday, for some reason, a self-driving vehicle would cause a collision—perhaps because of a string of bad code, an unexpected equipment failure, or an impossible decision.