Uber Calls Off Self-Driving Tests After Death

Rengar

Level 17
Thread author
Verified
Top Poster
Well-known
Jan 6, 2017
835
Pedestrian hit and killed by autonomous vehicle in Tempe, Arizona.
In the race to get self-driving vehicles and autonomous ride-hailing services on the road, are companies taking a little too much liberty with their efforts? It may appear so, according to reports that a 49-year-old Arizona woman has died of her injuries after she was struck by an autonomous Uber vehicle outside of a pedestrian crosswalk. The vehicle, operating in autopilot mode, had a human safety driver inside.

Uber announced it is now suspending all street testing of its autonomous vehicles and is cooperating completely with the investigation, which marks a refreshingly humble change from a company that has had no shortage of “wrong side of right” issues since its founding.


Uber halts self-driving car tests after death in Arizona.

Series of issues
The race Uber faced against names like Alphabet (Waymo), Apple, Toyota, and Tesla may have been too much, too soon for a fledgling technology. Uber has already come under fire over reports that its autonomous vehicles ran red lights and ignored pedestrian crosswalks in previous street tests. At the time, the company tried to push back against those allegations, but YouTube videos of the incidents quickly surfaced.

Scrutiny
Uber is no stranger to public scrutiny and scorn. From its lengthy battles with privacy violations and sexual harassment in the workplace to its fights with various cab companies, it seems like the ride-hailing service is always in the news, but not in a good way. Most recently, reports surfaced that Nikolas Cruz, the Parkland, Florida, shooter who killed 17 people in a high school on February 14th, actually took an Uber, with his AR-15 in hand, to the school that day, prompting critics to ask why the driver didn't alert the authorities.

Future plans
There is no word yet on when or if Uber plans to resume road tests of its self-driving vehicles, but there is reason to believe it will not be until the police finish their investigation into the pedestrian death.
 

RejZoR

Level 15
Verified
Top Poster
Well-known
Nov 26, 2016
699
I'm a tech geek, and I just don't trust these damn autonomous vehicles, no matter how awesome and fancy they sound. A big part of driving is prediction. A lot of it. If I see a bunch of kids on the sidewalk, I'll ready my foot on the brake, because kids push each other around and one may land on the street in front of my car. Or I watch the third brake lights through the windows of several cars ahead of me; that way I can see three cars ahead whether I'll have to slow down. If I see traffic halting half a kilometer ahead on the highway, I'll fire up the hazard lights first and start slowing down. Cameras and radar will never scan that far ahead with the attached context and prediction. Which is why I don't trust them.

What I do greatly support are secondary safety features like auto braking, lane assist, alertness warnings, and the like, so that when the human driver does fail for whatever reason, the feature kicks in. That, I think, is great stuff and has proven to work really well. With self-driving, though, it's WAY too early to roll it out on the streets. We'll have it eventually, there's no stopping that, but it's just too early.
 

plat1098

The lady in the Uber car is also a victim; she is probably scarred for life by the trauma. Although the concept is amazing in theory, cars were ultimately designed to be controlled by human drivers behind the wheel, over a century ago. The concept should probably be limited further to only certain scenarios; it's got a ways to go, it seems.
 

upnorth

Moderator
Verified
Staff Member
Malware Hunter
Well-known
Jul 27, 2015
5,459
If a self-driving vehicle can't stop, or at least start to brake and slow down, in a case of emergency, it should never be allowed out in public.

 

Deleted member 65228

Who is going to be punished for this death? Let me guess, another failed technology test so it's all okay? That's not right; it shouldn't have happened. Companies will never learn and will keep repeating the same crap as each other until an example is set through punishment.
 

plat1098

True, @Opcode, no one is really justifying anything. When you remove yourself from the driver's seat, you are relinquishing the vast majority of responsibility and handing it over to the technology. The poor lady is probably replaying the final moments over and over. This is what I meant.
 

Deleted member 65228

True, @Opcode, no one is really justifying anything. When you remove yourself from the driver's seat, you are relinquishing the vast majority of responsibility and handing it over to the technology. The poor lady is probably replaying the final moments over and over. This is what I meant.
Oh okay, yes, I agree with you. It would have been sad for the woman as well; she didn't mean for anyone to be harmed. I guess this is the issue: we put too much trust in technology... Well, Uber does.
 

upnorth

Moderator
Verified
Staff Member
Malware Hunter
Well-known
Jul 27, 2015
5,459
2016
The first known death caused by a self-driving car was disclosed by Tesla Motors on Thursday, a development that is sure to cause consumers to second-guess the trust they put in the booming autonomous vehicle industry. The 7 May accident occurred in Williston, Florida, after the driver, Joshua Brown, 40, of Ohio put his Model S into Tesla’s autopilot mode, which is able to control the car during highway driving.

Tesla driver dies in first fatal crash while using autopilot mode

 

Deleted member 65228

Damn, that dude is so lucky. I wouldn't risk doing that, it's too dangerous... I mean, getting yourself injured is one thing, but imagine if you were in an autopilot car and you survived a crash and someone else died; that would be on your mind forevermore. That would really hurt: a family minding their own business, and your self-driving car takes a member of their family away.

I think self-driving cars are just a really silly and irresponsible idea. Technology should go further, yes, but where do you draw the line? By all means test it in your private testing areas, but don't bring it onto the roads, where you put innocent drivers at risk of death.
 

plat1098

My mind was going along similar lines. Let's say robots become autonomous and their creators want to have them mingle with human society (lol). Can you really expect flawless integration? Driverless cars should technically be on designated routes, not mixed in with human-driven cars, but the cost would be astronomical. Good thing I ain't involved in this puppy; my conscience wouldn't permit it.
 

ForgottenSeer 58943

Don't blame the car. The sheriff has already stated that the investigation revealed no fault on the vehicle's part; the fault rests with the pedestrian. According to the report I read, the lady was not crossing at the designated crossing point/light, which was a mere 20 yards from where she crossed. It was very dark. She was lunging out into traffic. She was a felon with a long history of drug abuse and was potentially high. It is highly improbable that anyone, machine or otherwise, would have been able to stop in time.

For me, I trust autonomous/semi-autonomous technologies VASTLY more than I trust humans. 95% of my driving each day is avoiding 'morons' on the road. They don't follow the law, they drive recklessly, they don't merge properly, they don't signal and change lanes properly, they cut in front of me, etc. I'm tired of being a moron avoidance expert.

I'm taking delivery of my semi-autonomous vehicle this week and test drove it for the first time yesterday. It was an enlightening experience, akin to the first time I touched a computer, watched cable, or used a microwave oven. The car drove me 20 miles in rush-hour traffic, and I didn't really do much other than monitor it and touch the controls once every 60 seconds or so. (Semi = you need to be in the seat and provide human feedback every so often.) It was so fast and so predictive that it reacted WAY quicker than I would to 'morons' on the road. For example, it was going along and started to brake, and I didn't see why until someone cut in front of me. The car 'predicted' this action was about to take place from the other vehicle's slight movement and millions of calculations. That's remarkable.

My house is already filled with robots. I have a Robotic Vacuum/duster on each floor, and a Robotic Mop on each floor. I haven't 'manually' vacuumed or mopped in a couple of years. I will actually feel safer with more autonomous or semi-autonomous vehicles on the road because people are idiots.

These cars aren't perfect yet (we'll get there), but as a GENERAL RULE they are vastly safer than idiots on the road. I don't see myself spending 95% of my time on the road avoiding morons when semi-autonomous or emergency assist systems are in place everywhere.. It's crap out there right now.
 

Deleted member 65228

She was lunging out into traffic. She was a felon with a long history of drug abuse and was potentially high. It is highly improbable that anyone, machine or otherwise, would have been able to stop in time.
Oh wow, well that really does change things. In that case, it would feel unfair of me to blame the car as well...
 

Janl1992l

Level 14
Verified
Well-known
Feb 14, 2016
648
Don't want to be rude, but why the hell is he "sleeping" in the self-driving test car? Something is really wrong there! That is just so wrong.
 

ForgottenSeer 58943

Don't want to be rude, but why the hell is he "sleeping" in the self-driving test car? Something is really wrong there! That is just so wrong.

It's a 'she'.. Apparently there is monitoring equipment down in the middle console; she was watching that, glancing up and down as she monitored it.

Also, I read that the LIDAR and other tech SHOULD have spotted the stoned woman lunging into traffic in the middle of the night. So there appears to be a combination of Murphy's Law AND technological failure. I'm thinking that even with a normal driver, she'd have died. I certainly couldn't react that quickly on a dark, winding road!
 

ForgottenSeer 58943

I hate to sound callous, but this lady was a fool. She broke the law and died. To me, it's irrelevant what kind of car killed her; she was still at fault, having ignored a safe crossing area and run across a dark road into oncoming traffic WHILE wearing a black sweater?!?! Letting one possessed drug addict put an end to semi-autonomous progress would be ridiculous, especially when this almost looks like a suicide.. The number of lives that will be saved by this technology is remarkable, all things considered.

It would be like allowing Taxi Cab unions to halt the progress of Lyft or Uber, which by conservative estimates, have saved hundreds of thousands of lives. Some cities have experienced a 60%+ reduction in drunk driving deaths since Uber/Lyft went active. Progress isn't always perfect, but progress in the end can save lives and offer tremendous benefits.
 

kellysi

Level 1
Feb 19, 2018
13
"Dont blame the car" that reminds me of the movie Visitors with Jean Reno when they start smashing the poor mans car.
Anyway, Its the companies fault, yes. But that woman want be brought back. I like the development of technology because we watched movies like back to the future and wondered whats it going to be like. But when technology starts killing people thats not useful, nor helpful nor is it perceived as good. The majority of people use IoT and they are not aware of possible dangers. Few days ago I read about coffee machine boiling empty water tank, that is going to set itself on fire if none is around. Seriously frightening and that self driving car is next to creepy and the "It" clown.
 

Atlas147

Level 30
Verified
Honorary Member
Top Poster
Content Creator
Well-known
Jul 28, 2014
1,990
Tragic as it is, this driverless car has been in only one accident throughout testing; compared to the average number of traffic accidents per year, I think that's a pretty damn good start.

Driverless cars will almost never be 100% safe, but neither will normal drivers.
 
