Self-Driving Shuttle Bus In Crash Two Hours Into First Day

Rengar

Level 17
Thread author
Verified
Top Poster
Well-known
Jan 6, 2017
835
Human to blame, not the autonomous vehicle.
A driverless electric airport shuttle bus that made its debut in downtown Las Vegas was involved in a traffic accident just a few hours after entering service. Ironically, it seems that the human driver of the other vehicle involved was to blame, and not the bus.

The crash happened at low speed; none of the eight passengers aboard the driverless vehicle suffered injuries, and neither did the truck driver responsible for the collision.


The vehicle – carrying “several” passengers – was hit by a lorry travelling at low speed.

The only damage of significance was to the front bumper of the shuttle bus. AAA, which is sponsoring the latest pilot program, confirmed on Twitter that the accident was due to “human error” on the part of the truck driver.

“A delivery truck was coming out of an alley,” public information officer Jace Radke said. “The shuttle did what it was supposed to do and stopped. Unfortunately, the human element, the driver of the truck, didn’t stop.”

First of its kind
Speaking to the BBC, a spokesman for the City of Las Vegas said the crash was nothing more than a minor “fender bender” and that the shuttle would more than likely be back on the road on Thursday, once the mainly cosmetic damage had been repaired and some routine diagnostic tests completed.

Jenny Wong, a passenger on the shuttle at the time of the crash, told local news station KSNV: “The shuttle just stayed still. And we were like, it’s going to hit us, it’s going to hit us. And then it hit us.”

“The shuttle didn’t have the ability to move back. The shuttle just stayed still.” And in that respect, the bus did exactly what it was supposed to do. Taking corrective action, such as swerving, is not something the bus has thus far been approved to do. The experimental shuttle bus predicted that an accident was about to happen and, following its algorithmic protocols, stopped.
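To make the reported behaviour concrete, here is a minimal sketch of a stop-only collision response of the kind the article describes. It is purely illustrative: the class, function and threshold names are hypothetical, and Navya has not published the shuttle's actual control logic.

```python
# Illustrative sketch only: a stop-only collision response like the one the
# article describes. Names and thresholds are hypothetical, not Navya's code.
from dataclasses import dataclass


@dataclass
class Obstacle:
    distance_m: float         # range to the obstacle, in metres
    closing_speed_mps: float  # positive if the obstacle is approaching


def time_to_collision(obstacle: Obstacle) -> float:
    """Predicted time to collision in seconds (infinite if not closing)."""
    if obstacle.closing_speed_mps <= 0:
        return float("inf")
    return obstacle.distance_m / obstacle.closing_speed_mps


def decide_action(obstacle: Obstacle, ttc_threshold_s: float = 3.0) -> str:
    """Brake to a stop when a collision is predicted; no swerving or reversing."""
    if time_to_collision(obstacle) < ttc_threshold_s:
        return "BRAKE_TO_STOP"  # the only approved response in this sketch
    return "CONTINUE"


if __name__ == "__main__":
    truck = Obstacle(distance_m=6.0, closing_speed_mps=3.0)  # roughly 2 s away
    print(decide_action(truck))  # -> BRAKE_TO_STOP
```

Note that nothing in a rule like this lets the vehicle back up or swerve once it has stopped, which matches the passenger's account of the shuttle simply staying still.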

Tickets please
The Metropolitan Police Department said officers responded at 12:07 pm to an accident involving the shuttle and a delivery truck on the 100 block of South Sixth Street, near Fremont Street. Damage was minor, and no one was hurt, police said. The driver of the truck did, however, receive a ticket for not stopping.

Vegas officials were bullish in their response to questions about the overall safety of AI-driven vehicles on public roads. The year-long pilot project, sponsored by AAA Northern California, Nevada and Utah, is expected to carry 250,000 people.

Human error
Self-driving technology has been involved in crashes before, but almost all reported incidents have been due to human error.

The AAA said human error was responsible for more than 90% of the 30,000 deaths on US roads in 2016, and that robotic cars could help reduce the number of incidents.

The bus was developed by French company Navya and uses GPS, electronic kerb sensors and other technology to find its way along Vegas streets with a strict 15mph limit. The oval-shaped shuttle can seat up to eight people and has an attendant and computer monitor, but no steering wheel or brake pedals.
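As a rough illustration of how a hard speed cap like that 15mph limit might be enforced in software, here is a short sketch; the function name, the 10 m slow-down band and the 2 m stop distance are assumptions for the example, not figures published by Navya or AAA.

```python
# Illustrative sketch: enforcing a hard speed cap and slowing near obstacles.
# All names and numbers here are assumptions, not Navya's published design.
MPH_TO_MPS = 0.44704
SPEED_CAP_MPS = 15 * MPH_TO_MPS  # the shuttle's strict 15 mph limit


def command_speed(requested_mps: float, nearest_obstacle_m: float) -> float:
    """Clamp the requested speed to the cap and taper it as obstacles get close."""
    speed = min(requested_mps, SPEED_CAP_MPS)
    if nearest_obstacle_m < 10.0:
        # Scale down linearly inside 10 m; stop completely inside 2 m.
        speed *= max(0.0, (nearest_obstacle_m - 2.0) / 8.0)
    return max(0.0, speed)


if __name__ == "__main__":
    print(round(command_speed(10.0, nearest_obstacle_m=50.0), 2))  # 6.71 m/s (the cap)
    print(round(command_speed(10.0, nearest_obstacle_m=4.0), 2))   # 1.68 m/s, slowed
    print(command_speed(10.0, nearest_obstacle_m=1.5))             # 0.0, full stop
```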
 

Weebarra

Level 17
Verified
Top Poster
Well-known
Apr 5, 2017
836
This is why I can't see these types of vehicles becoming the norm. An AI cannot predict human error, and it's probably a lucky escape for the passengers that the truck was not going at a much higher speed. Why, I wonder, did the bus not have the ability to go backwards? Much more work needs to be done on these projects, but even then, I don't think I would feel very safe, and although certain technologies are to be embraced, this is one that I won't be welcoming any time soon :)
 

gorblimey

Level 2
Verified
Aug 30, 2017
99
These incidents are so-o-o-o interesting. BTW, I'm with @ravi prakash saini on this one.

Most of the interest in these incidents comes in the comments at the bottom of the articles. One such comment went:

"This is why autonomous tech is dangerous. The car saw the crash coming, but was unprepared to escape the danger. All it knew how to do was stop.

A human driver likely would have avoided the accident by:
#1: Speeding up slightly, getting out of the path of the truck.
#2: Changing lanes (after a quick check that it is safe to do so), missing the truck completely.
#3: Backing up (after a quick check that it is safe to do so), missing the truck completely.
#4: Using the shoulder of the road (after verifying it is safe), missing the truck completely.
#5: Blowing the horn furiously while stopping, catching the attention of the rogue truck driver.
#6: Detecting the problem sooner, stopping sooner, missing the truck completely.
#7: A combination of these or other methods.

Autonomous vehicles undoubtedly possess reaction times usually attributed to gods, but they are ill prepared to actually think and act like a human being on the road."

It would seem the quotee above doesn't consider the truck driver to be human?

I wonder if the quotee above has actually seen any tech reports like this, from the article above:

"Self-driving technology has been involved in crashes before, but almost all reported incidents have been due to human error.

The AAA said human error was responsible for more than 90% of the 30,000 deaths on US roads in 2016, and that robotic cars could help reduce the number of incidents."

I witnessed a similar sort of thing some years ago on a narrow-ish lane in Osborne Park, a suburb a few km north of Perth WA. A truckie wanting to reverse a semi up a commercial driveway was getting the required twisting and turning fairly well, until a young lady approached from the east and continued creeping her car forward. At the same time, a young bloke brought his car in from the west... With no room to make the next swing, the truckie switched off and opened his door. By this time there were six cars on the west side and five on the east. The truckie swung down, locked his door and said "I'm off for a coffee. When you're all gone, I'll come back."

Nobody is saying Robo-Cars have it all together yet. But human drivers lost it completely back in 1902. I would take the Robo-Tech today and feel perfectly safe. Oh yes. Some people consider me a "Senior Citizen".
 

viktik

Level 25
Verified
Well-known
Sep 17, 2013
1,492
First, we need to develop A.I. that will send messages to our girlfriends daily, so that we don't need to do it ourselves.

Machine learning will learn what she wants and send the relevant message.
 

grumpy_joe

Level 1
Verified
Oct 18, 2017
38
Wait, let's say I am the truck driver: can I legally even get a ticket for an accident with a robot? I mean, let's say that I don't consider it a car because there is no human responsible for operating it.

I bet there are some legal loopholes around this. :) I won't research them, but I'll just point them out for others to find out.
 

jogs

Level 22
Verified
Top Poster
Well-known
Nov 19, 2012
1,112
Whichever way you see it, it's human error only. AI is ultimately programmed by humans. If a robot does anything wrong, it's an error on the part of the programmer. So there will always be human error, unless we get a robot designed by aliens. ;)
 

Ink

Administrator
Verified
Staff Member
Well-known
Jan 8, 2011
22,361
"I do not trust this technology." :oops:
I do not trust a human driver. A car with some autonomous tech and/or sensors may prevent (or reduce the seriousness of) a collision in a way that a human simply cannot.

For example, automatic braking sensors or blind spot indicators.
Automatic Emergency Braking: MyCarDoesWhat.org

From the article:
"... And in that respect, the bus did exactly what it was supposed to do. Taking corrective action, such as swerving, is not something the bus has thus far been approved to do. The experimental shuttle bus predicted that an accident was about to happen and, following its algorithmic protocols, stopped."
 
