Interesting People mailing list archives

Re: Report: Software bug led to death in Uber's self-driving crash


From: "Dave Farber" <farber () gmail com>
Date: Tue, 8 May 2018 16:53:17 -0400

Right on


Begin forwarded message:

From: Mary Shaw <mary.shaw () gmail com>
Date: May 8, 2018 at 4:50:17 PM EDT
To: Dave Farber <dave () farber net>
Subject: Re: [IP] Report: Software bug led to death in Uber's self-driving crash

The telling paragraph:

"Uber had been racing to meet an end-of-year internal goal of allowing customers in the Phoenix area to ride in 
Uber’s autonomous Volvo vehicles with no safety driver sitting behind the wheel," Efrati added.

It has been known for a long time that things can go badly wrong when nontechnical considerations are allowed to 
interfere with "the technological practitioner's first duty: utter probity toward the engineered object -- from its 
conception through its commissioning for use". See, for example, the rich set of case studies in Arthur Squires' "The 
Tender Ship", published in 1986.

On Mon, May 7, 2018 at 8:49 PM, Dave Farber <farber () gmail com> wrote:



Begin forwarded message:

From: Lauren Weinstein <lauren () vortex com>
Date: May 7, 2018 at 6:27:41 PM EDT
To: nnsquad () nnsquad org
Subject: [ NNSquad ] Report: Software bug led to death in Uber's self-driving crash


Report: Software bug led to death in Uber's self-driving crash

https://arstechnica.com/tech-policy/2018/05/report-software-bug-led-to-death-in-ubers-self-driving-crash/

     The fatal crash that killed pedestrian Elaine Herzberg in
   Tempe, Arizona, in March occurred because of a software bug in
   Uber's self-driving car technology, The Information's Amir
   Efrati reported on Monday. According to two anonymous sources
   who talked to Efrati, Uber's sensors did, in fact, detect
   Herzberg as she crossed the street with her bicycle.
   Unfortunately, the software classified her as a "false
   positive" and decided it didn't need to stop for her.
   Distinguishing between real objects and illusory ones is one
   of the most basic challenges of developing self-driving car
   software. Software needs to detect objects like cars,
   pedestrians, and large rocks in its path and stop or swerve to
   avoid them. However, there may be other objects--like a
   plastic bag in the road or a trash can on the sidewalk--that a
   car can safely ignore. Sensor anomalies may also cause
   software to detect apparent objects where no objects actually
   exist.
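
To make the reported failure mode concrete, here is a minimal, hypothetical
sketch (Python; every class, field name, and threshold below is invented for
illustration and does not reflect Uber's actual system) of the kind of
classify-then-ignore decision the article describes:

# Hypothetical sketch of the object-filtering decision described in the
# article -- NOT Uber's actual code. All names and thresholds are invented.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "pedestrian", "plastic_bag", "sensor_noise"
    confidence: float  # classifier's confidence that the object is real
    in_path: bool      # whether the object lies in the vehicle's path

# Labels the planner is allowed to drive past without reacting.
IGNORABLE = {"plastic_bag", "sensor_noise"}

def should_brake(det: Detection, threshold: float = 0.8) -> bool:
    """Decide whether a detection warrants braking.

    The hazard the article points to: if the threshold is tuned too
    aggressively (to suppress phantom braking on bags and sensor noise),
    a real object can be dismissed as a "false positive".
    """
    if not det.in_path:
        return False
    if det.label in IGNORABLE:
        return False
    # A real object discarded here is exactly the failure mode reported.
    return det.confidence >= threshold

# A pedestrian detected with low classifier confidence gets ignored:
print(should_brake(Detection("pedestrian", 0.6, True)))  # False -> no braking

The point of the sketch: tuning the threshold to reduce nuisance stops on
bags and noise is the same knob that can silently discard a real pedestrian.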

- - -

--Lauren--
Lauren Weinstein
