
Hands on the Wheel: Consumer Watchdog calls for changes to self-driving regulations

Consumer Watchdog camped out at the Automated Vehicles Symposium 2016 in San Francisco yesterday to protest against Tesla's self-driving cars. The organization turned up with a white truck highlighting the shortcomings of Tesla's Autopilot in the death of Joshua Brown earlier this year.

By Mike Pickford

Peterborough, Ontario — July 18, 2016 — Consumer Watchdog is lobbying both the US government and American automotive giant Tesla Motors to adopt new regulations governing the company’s potentially groundbreaking autonomous driving technology.

The organization sent representatives to the opening day of the Automated Vehicles Symposium 2016, currently taking place in San Francisco. Consumer Watchdog is pushing for a new rule that would require a driver’s hands to remain on the steering wheel at all times while Autopilot is in use, and is also urging the Obama administration to slow its push to deploy self-driving robot car technology.

The move comes in the wake of a fatal collision involving Autopilot in one of Tesla’s vehicles. Joshua Brown died on May 7 after his Tesla Model S failed to stop for a turning tractor-trailer, instead plowing into the underside of the truck.

In a statement released to the media, Tesla officials noted the cameras in Brown’s vehicle failed to distinguish the white side of the tractor-trailer from a brightly lit sky.

In a letter addressed to Tesla CEO Elon Musk and NHTSA administrator Mark Rosekind, Consumer Watchdog president Jamie Court said he was “deeply concerned” about the failure of Tesla and the National Highway Traffic Safety Administration (NHTSA) to accept responsibility for Brown’s death. The letter called on the NHTSA to abandon its voluntary guidelines and adopt enforceable standards.

“The tragic fatal Tesla crash … demonstrates the need for Federal Motor Vehicle Safety Standards rather than voluntary guidelines,” Court wrote. “Not only did Tesla S’ video camera fail to distinguish a white truck from a white sky, but its automatic emergency braking system failed to apply the brake with a tractor-trailer stretched across the road right in front of it.”

He added, “By releasing Autopilot prematurely in Beta mode, Tesla is unconscionably using our public highways as a test lab and its customers as human guinea pigs.”

While he has remained tight-lipped in mainstream media since the incident, Tesla CEO Musk took to Twitter on Thursday, hinting that the company was looking into ways to modify its Autopilot system.

“Working on using existing Tesla radar by itself (decoupled from camera) (with) temporal smoothing to create a coarse point cloud, like lidar … Good thing about radar is that, unlike lidar (which is visible wavelength), it can see through rain, snow, fog and dust,” the tweets state.

Court went on to demand that Tesla and other developers of self-driving technology assume legal liability when their systems cause a crash, as companies like Volvo and Mercedes-Benz have done.

“If the manufacturers lack the confidence in their products to stand behind them and assume responsibility and liability when the systems they design are in control, and innocent people are injured or killed as a result, those vehicles do not belong on the road,” Court said.


Sign-up for the Collision Repair daily e-zine and never miss a story –  SUBSCRIBE NOW FOR FREE!
