
Tesla’s Autopilot fails haven’t shaken my faith in self-driving cars. Here’s why.

It looks bleak out there for autonomous vehicles. A pedestrian was killed in the first self-driving car accident with an Uber test vehicle, and then a driver in a semi-autonomous Tesla fatally crashed into a highway barrier. Earlier this month, a terrifying video purportedly showed Tesla’s Autopilot feature sending a car straight into danger.

These aren’t just mistakes on the road to autonomy; they’re deadly reminders of what’s at stake.

But despite the setbacks to the self-driving industry, I can’t help but be optimistic that pedestrian, driver, and passenger deaths will keep going down. As this new technology is tested and developed, the road will get safer and safer.

Sure, with more self-driving and automated vehicles on the road there’s a greater chance for accidents, mistakes, crashes, and deadly consequences. A recent Morning Consult survey found that 50 percent of adults feel autonomous vehicles are less safe than human drivers. Back in January, only 36 percent of those surveyed said the vehicles were less safe. 

For a chilling experience, watch the many videos of Tesla’s Autopilot failing at its semi-automated job. But remember this: Crossing the street is already a dangerous prospect that we repeatedly do without a second thought. And every time I get into a car, I’m risking my life.

Human driving numbers are scary, but we’ve become numb to the fear. The U.S. National Highway Traffic Safety Administration tracked 5,987 pedestrians killed in traffic crashes in the U.S. in 2016. A total of 37,461 were killed in crashes on U.S. roads that year. Driver error (of the human variety) is involved in 90 percent of accidents.

Suna Taymaz, VP of autonomous vehicle strategy at AAA Northern California, Nevada, and Utah, spoke at a transportation conference in the Bay Area last month. As semi-autonomous and other self-driving features become more common and more familiar, she explained, we’re starting to accept them.

And Madhur Behl, a computer science professor at the University of Virginia who researches high-performance autonomous racing cars a tenth the size of a normal car, is confident in the cars’ ability to be agile and safe. “Overall, these cars will prove to be safer than human counterparts,” Behl said.

The challenge in building a safe vehicle is that testing can never cover every possible traffic situation, but the cars can learn from mistakes and crashes.

Behl understands, of course, that humans are still part of the equation even if they aren’t the ones driving. Trusting the autonomous vehicle is crucial and means the machines need to be more transparent, explaining why they braked or swerved. 

“[The car] has to become more human,” he said. “It’s not about how clever the cars are … but designing them to work with humans.” As he reminded me, we’re not robots.

But we do think we have beyond-human capabilities when it comes to driving. An Arity distracted-driving report released last week found that most drivers think their driving is better than others and that they can avoid distractions better. We don’t realize how distracted we get, or how drowsy or unobservant we really are. Human hubris, that’s what really scares me on the road.

Ching-Yao Chan, associate director of UC Berkeley’s Berkeley DeepDrive, also presented at that transportation conference last month, addressing challenges to autonomous safety. He used the Uber fatality to show that human driver perception and reaction time can’t compete with the potential of self-driving tech. 

In a properly functioning autonomous car, the vehicle should have seen the pedestrian 3 to 4 seconds before the crash, about 200 feet away, he said. A human driver in the same situation would need up to 2.5 seconds just to perceive the danger and react. A car going just under 40 mph needs about 2.5 seconds of hard braking to stop.
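As a rough sanity check on those numbers, here is a back-of-the-envelope stopping-distance calculation. The deceleration rate (~0.7 g), the autonomous system’s latency, and the exact speed are my assumptions, not figures from Chan’s talk:

```python
# Rough stopping-distance comparison for the scenario described above.
# Assumed values (not from the article): hard braking at ~0.7 g,
# human perception-reaction time of 2.5 s, speed just under 40 mph.

G = 9.81             # gravitational acceleration, m/s^2
MPH_TO_MS = 0.44704  # miles per hour -> meters per second
M_TO_FT = 3.28084    # meters -> feet

def stopping_distance_ft(speed_mph, reaction_s, decel_g=0.7):
    """Total stopping distance in feet: reaction distance + braking distance."""
    v = speed_mph * MPH_TO_MS              # speed in m/s
    reaction = v * reaction_s              # distance covered before braking starts
    braking = v ** 2 / (2 * decel_g * G)   # kinematics: v^2 / (2a)
    return (reaction + braking) * M_TO_FT

# An autonomous system braking almost immediately (0.1 s latency, assumed):
auto = stopping_distance_ft(39, reaction_s=0.1)
# A human driver with a 2.5 s perception-reaction time:
human = stopping_distance_ft(39, reaction_s=2.5)

print(f"autonomous: {auto:.0f} ft, human: {human:.0f} ft")
```

Under these assumptions, the car that brakes immediately stops well inside the roughly 200 feet at which the pedestrian becomes visible, while the human driver’s reaction delay alone eats up most of that margin, which is Chan’s point.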

Last month I rode in a self-driving shuttle with no steering wheel, brakes, or gas pedal. It was surprisingly uneventful as we drove around a traffic circle and parking lot at a Bay Area office park. The EasyMile shared autonomous vehicle was abundantly cautious at every turn and stop. It was very smooth, easy, and not in the least bit scary.

There’s always a flip side to the doom and gloom about self-driving tech — just look up “Autopilot saved my life” videos, in which the tech is praised and credited with helping drivers out of sticky situations. The 40 percent crash-reduction figure from an NHTSA report that Tesla often cites is heartening and reassuring.

“They are not perfect and won’t be perfect for a while,” Behl says about self-driving cars. “Accidents will happen.” And that’s OK. Because as he assured me, “Self-driving cars have the ability to learn quickly from these mistakes.”  

There’s so much potential. I’m not about to give up. 


Anith Gopal