Arizona Governor Bans Uber From Autonomous Testing Over “Unquestionable” Safety Failure

A week after the first pedestrian death caused by an autonomous vehicle, AP reports that Arizona’s Governor has suspended Uber’s license to test self-driving cars on public roads in the state.

In a letter Monday, Gov. Doug Ducey told the company that video footage of the crash raised safety concerns.

He called the crash “an unquestionable failure” to comply with safety expectations.

The move comes days after The New York Times reported that the company’s own documents showed the testing program was rife with issues, including trouble driving through construction zones and a need for far more human intervention than at competing companies.

Experts have told The Associated Press that the company’s technology should have detected the pedestrian in time to avoid the crash.

This ban comes hours after Amnon Shashua, co-founder of the Israeli start-up Mobileye, said on Monday that the company’s computer-vision system would have detected the pedestrian killed by the self-driving Uber vehicle in Arizona, and called for a concerted effort to validate the safety of autonomous vehicles.

In a blog post, Shashua also criticized “new entrants” in the self-driving field that have not gone through the years of development necessary to ensure safety in the vehicles.

Now Is the Time for Substantive Conversations about Safety for Autonomous Vehicles

Society expects autonomous vehicles to be held to a higher standard than human drivers. Following the tragic death of Elaine Herzberg, who was struck last week by an Uber vehicle operating in autonomous mode in Arizona, it feels like the right moment to make a few observations about the meaning of safety with respect to sensing and decision-making.

First, consider the challenge of interpreting sensor information. The video released by the police seems to demonstrate that even the most basic building block of an autonomous vehicle system, the ability to detect and classify objects, is a challenging task. Yet this capability is at the core of today’s advanced driver assistance systems (ADAS), which include features such as automatic emergency braking (AEB) and lane keeping support. It is the high-accuracy sensing inside ADAS that is saving lives today, proven over billions of miles driven. And it is this same technology that must serve as a foundational element of the fully autonomous vehicles of the future, before even tougher challenges can be tackled.
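To make the detect-and-classify building block concrete, the sketch below runs a generic pretrained detector from torchvision over a single frame and keeps the “person” and “bicycle” classes. This is an off-the-shelf stand-in, not the ADAS software discussed in this post; the file name and score threshold are placeholders.

```python
# Minimal detect-and-classify sketch using a generic torchvision detector
# (an illustrative stand-in, not production ADAS software).
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

# COCO category ids used by torchvision detection models: 1 = person, 2 = bicycle.
TARGET_LABELS = {1: "person", 2: "bicycle"}

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# Placeholder file name; any RGB frame will do.
frame = convert_image_dtype(read_image("dashcam_frame.png"), torch.float)

with torch.no_grad():
    detections = model([frame])[0]  # dict with "boxes", "labels", "scores"

for box, label, score in zip(detections["boxes"],
                             detections["labels"],
                             detections["scores"]):
    if label.item() in TARGET_LABELS and score.item() > 0.5:  # illustrative threshold
        print(TARGET_LABELS[label.item()], round(score.item(), 2), box.tolist())
```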

To demonstrate the power and sophistication of today’s ADAS technology, we ran our software on a video feed coming from a TV monitor playing the police video of the incident. Despite the suboptimal conditions, in which much of the high-dynamic-range data present in the actual scene was likely lost, clear detection was achieved approximately one second before impact. The images below show three snapshots with bounding-box detections on the bicycle and Ms. Herzberg. The detections come from two separate sources: a pattern-recognition module, which generates the bounding boxes, and a “free-space” detection module, which generates the horizontal graph in which a red section indicates a “road user” is present above the line. A third module separates objects from the roadway using structure from motion, known technically as “plane + parallax,” to validate the 3D presence of the detected object. Here that module reported low confidence, shown as “fcvValid: Low” in the upper-left of the screen, both because information normally available in a production vehicle was missing and because of the low-quality imaging setup: a video of a video of dash-cam footage that had been subjected to unknown downsampling.

Images from a video feed watching a TV monitor showing the clip released by the police. The overlaid graphics show the Mobileye ADAS system response. The green and white bounding boxes are outputs from the bicycle and pedestrian detection modules. The horizontal graph shows the boundary between the roadway and physical obstacles, which we call “free-space”.

The software used for this experiment is the same as that included in today’s ADAS-equipped vehicles, which has been proven over billions of miles in the hands of consumers.
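The experiment combines three independent cues into one detection grade. The sketch below shows that agreement logic in schematic form; the module names, data layout, and grading rule are illustrative assumptions, not Mobileye internals.

```python
# Hedged sketch of the three-module agreement logic described above:
# a pattern-recognition bounding box, a free-space "road user" flag, and a
# plane+parallax 3D check, combined into a single confidence grade.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Candidate:
    box: Optional[Tuple[int, int, int, int]]  # bounding box from pattern recognition
    free_space_road_user: bool                # free-space module: road user above boundary
    parallax_confidence: str                  # plane+parallax 3D validation: "High"/"Low"

def overall_confidence(c: Candidate) -> str:
    """Grade a detection by how many independent cues support it."""
    support = sum([
        c.box is not None,
        c.free_space_road_user,
        c.parallax_confidence == "High",
    ])
    return {3: "High", 2: "Medium"}.get(support, "Low")

# In the dash-cam experiment the parallax check reported low confidence (the
# "fcvValid: Low" overlay) because of the degraded video-of-a-video input,
# while the other two cues still fired.
print(overall_confidence(Candidate((120, 200, 180, 340), True, "Low")))  # -> Medium
```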

Recent developments in artificial intelligence, like deep neural networks, have led many to believe that it is now easy to develop a highly accurate object detection system and that the decade-plus experience of incumbent computer vision experts should be discounted.

This dynamic has led to many new entrants in the field. While these techniques are helpful, the legacy of identifying and closing hundreds of corner cases, annotating data sets spanning tens of millions of miles, and passing challenging preproduction validation tests on dozens of production ADAS programs cannot be skipped. Experience counts, particularly in safety-critical areas.

The second observation is about transparency. Everyone says that “safety is our most important consideration,” but we believe that to gain public trust, we must be more transparent about the meaning of this statement. As I stated in October, when Mobileye released the formal model of Responsibility-Sensitive Safety (RSS), decision-making must comply with the common sense of human judgement. We laid out a mathematical formalism of common-sense notions such as “dangerous situation” and “proper response” and built a system to mathematically guarantee compliance with these definitions.
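As a concrete illustration of what such a formalism looks like, the sketch below computes the safe longitudinal distance rule from the published RSS paper (“On a Formal Model of Safe and Scalable Self-driving Cars,” Shalev-Shwartz, Shammah, Shashua, 2017); the parameter values in the example are illustrative, not prescribed.

```python
# Worked example of the RSS safe longitudinal distance formula.
def rss_safe_longitudinal_distance(v_rear: float,
                                   v_front: float,
                                   response_time: float,
                                   a_max_accel: float,
                                   a_min_brake: float,
                                   a_max_brake: float) -> float:
    """Minimum gap (m) the rear car must keep so that, even if the front car
    brakes at its maximum rate, the rear car can respond during its response
    time and then brake without collision. Speeds in m/s, accelerations in m/s^2."""
    rho = response_time
    d = (v_rear * rho
         + 0.5 * a_max_accel * rho ** 2
         + (v_rear + rho * a_max_accel) ** 2 / (2 * a_min_brake)
         - v_front ** 2 / (2 * a_max_brake))
    return max(d, 0.0)  # the formula is clipped at zero

# Illustrative numbers: both cars at 20 m/s (~72 km/h), 0.5 s response time,
# 3 m/s^2 max acceleration, 4 m/s^2 guaranteed braking, 8 m/s^2 max braking.
print(rss_safe_longitudinal_distance(20.0, 20.0, 0.5, 3.0, 4.0, 8.0))  # ~43.2 m
```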

The third observation is about redundancy. True redundancy of the perception system must rely on independent sources of information: camera, radar and LIDAR. Fusing them together is good for driving comfort but bad for safety. At Mobileye, to show that we obtain true redundancy, we build a separate end-to-end camera-only system and a separate LIDAR-and-radar-only system.
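The sketch below shows one way to read that architecture: each channel builds its own object list, and a driving decision must pass a safety check against each list separately rather than against a single fused world model. The interfaces, names, and the toy safety rule are illustrative assumptions, not Mobileye’s implementation.

```python
# Hedged sketch of a "true redundancy" check: the plan must be safe with
# respect to BOTH independent perception channels, so an object missed by
# one channel can still veto the plan via the other.
from typing import Callable, Dict, List

ObjectList = List[Dict[str, float]]  # per-channel detections, e.g. {"distance_m": ...}

def is_safe(ego_speed_mps: float, objects: ObjectList) -> bool:
    # Toy stand-in for a real rule (e.g. an RSS-style check): require every
    # object to be farther than a crude speed-proportional stopping distance.
    return all(obj["distance_m"] > 2.0 * ego_speed_mps for obj in objects)

def redundant_check(ego_speed_mps: float,
                    camera_channel: Callable[[], ObjectList],
                    lidar_radar_channel: Callable[[], ObjectList]) -> bool:
    # No fusion before the safety decision: each channel is checked alone.
    return (is_safe(ego_speed_mps, camera_channel())
            and is_safe(ego_speed_mps, lidar_radar_channel()))

# Dummy channels: the camera misses a nearby pedestrian, but the
# lidar+radar channel still sees it, so the check fails (prints False).
print(redundant_check(
    20.0,
    camera_channel=lambda: [{"distance_m": 80.0}],
    lidar_radar_channel=lambda: [{"distance_m": 80.0}, {"distance_m": 15.0}],
))
```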

More incidents like the one last week could do further harm to already fragile consumer trust and spur reactive regulation that could stifle this important work. As I stated during the introduction of RSS, I firmly believe the time to have a meaningful discussion on a safety validation framework for fully autonomous vehicles is now. We invite automakers, technology companies in the field, regulators and other interested parties to convene so we can solve these important issues together.

