Self-Driving Shuttle Bus In Crash Two Hours Into First Day

Discussion in 'Technology News' started by Rengar, Nov 12, 2017.

  1. Rengar

    Rengar Level 14

    Jan 6, 2017
    690
    4,390
    Greece
    Windows 8.1
    Avast
    Human to blame, not the autonomous vehicle.
    A driverless electric airport shuttle bus that made its debut in downtown Las Vegas was involved in a traffic accident just a few hours after entering service. Ironically, it seems the human driver of the other vehicle involved was to blame, not the bus.

    The crash happened at low speed; none of the eight passengers aboard the driverless vehicle was injured, and neither was the truck driver responsible for the collision.

    The vehicle – carrying “several” passengers – was hit by a lorry driving at slow speed.

    The only damage of significance was to the front bumper of the shuttle bus. AAA, which is sponsoring the latest pilot program, confirmed on Twitter that the accident was due to “human error” on the part of the truck driver.

    “A delivery truck was coming out of an alley,” public information officer Jace Radke said. “The shuttle did what it was supposed to do and stopped. Unfortunately, the human element, the driver of the truck, didn’t stop.”

    First of its kind
    Speaking to the BBC, a spokesman for the City of Las Vegas said the crash was nothing more than a minor “fender bender” and that the shuttle would most likely be back on the road on Thursday, once the mainly cosmetic damage had been repaired and some routine diagnostic tests had been run.

    Jenny Wong, a passenger on the shuttle at the time of the crash, told local news station KSNV: “The shuttle just stayed still. And we were like, it’s going to hit us, it’s going to hit us. And then it hit us.”

    “The shuttle didn’t have the ability to move back. The shuttle just stayed still.” And in that respect, the bus did exactly what it was supposed to do. Taking corrective action, such as swerving or reversing, is not something the bus has thus far been approved to do. The experimental shuttle bus predicted that an accident was about to happen and, according to its algorithmic protocols, stopped.
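
    The stop-only behaviour described above can be sketched as a toy decision rule. This is purely illustrative, not Navya's actual software; the function and constant names (`shuttle_action`, `BRAKE_THRESHOLD_S`) and the threshold value are assumptions:

    ```python
    # Toy sketch of a stop-only collision response, as described in the
    # article: the shuttle is approved to brake and hold, but not to
    # swerve or reverse. All names and values are illustrative.
    from typing import Optional

    BRAKE_THRESHOLD_S = 3.0  # assumed: brake when impact is predicted within 3 s

    def shuttle_action(time_to_collision_s: Optional[float]) -> str:
        """Return the shuttle's action for a predicted time to collision.

        None means no collision is predicted. Swerving and reversing are
        deliberately absent from the possible return values, mirroring
        what the vehicle is approved to do.
        """
        if time_to_collision_s is not None and time_to_collision_s <= BRAKE_THRESHOLD_S:
            return "stop_and_hold"
        return "proceed"
    ```

    Under this sketch, a truck creeping out of an alley would trigger `stop_and_hold`; with no approved reverse manoeuvre, the shuttle then simply waits, which matches the passenger's account.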

    Tickets please
    The Metropolitan Police Department said officers responded at 12.07pm to an accident involving the shuttle and a delivery truck on the 100 block of South Sixth Street, near Fremont Street. Damage was minor and no one was hurt, police said. The driver of the truck did, however, receive a ticket for failing to stop.

    Vegas officials were bullish in their response to questions about the overall safety of AI-driven vehicles on public roads. The year-long pilot project, sponsored by AAA Northern California, Nevada and Utah, is expected to carry 250,000 people.

    Human error
    Self-driving technology has been involved in crashes before, but almost all reported incidents have been due to human error.

    The AAA said human error was responsible for more than 90% of the 30,000 deaths on US roads in 2016, and that robotic cars could help reduce the number of incidents.

    The bus was developed by French company Navya and uses GPS, electronic kerb sensors and other technology to find its way along Vegas streets with a strict 15mph limit. The oval-shaped shuttle can seat up to eight people and has an attendant and computer monitor, but no steering wheel or brake pedals.
     
  2. frogboy

    frogboy Level 61
    Trusted

    Jun 9, 2013
    6,227
    64,788
    Heavy Duty Mechanic.
    Western Australia
    Windows 10
    Emsisoft
    I do not trust this technology. :oops:
     
    Rengar, Jake Miguel, plat1098 and 8 others like this.
  3. ravi prakash saini

    Apr 22, 2015
    604
    3,199
    india
    Windows 10
    Kaspersky
    Let it come to India, fully loaded with all sorts of sensors and a supercomputer on board; we will crash it within a minute. American people are so slow they took two hours for a one-minute job.
     
    jogs, Rengar, Jake Miguel and 5 others like this.
  4. Weebarra

    Weebarra Level 7

    Apr 5, 2017
    338
    8,380
    Somewhere in Scottieland
    Windows 7
    Kaspersky
    This is why I can't see these types of vehicles becoming the norm. An AI cannot predict human error, and it's probably a lucky escape for the passengers that the truck was not going at a much higher speed. I wonder why the bus did not have the ability to go backwards? Much more work needs to be done on these projects, but even then, I don't think I would feel very safe. Although certain technologies are to be embraced, this is one that I won't be welcoming any time soon :)
     
  5. gorblimey

    gorblimey Level 2

    Aug 30, 2017
    83
    224
    Eastern Indian Ocean
    Windows 7
    Zemana
    These incidents are so-o-o-o interesting. BTW, I'm with @ravi prakash saini on this one.

    Most of the interest in these incidents comes in the Comments at the bottom of the articles.
    It would seem the quotee above doesn't consider the truck driver to be human?

    I wonder if the quotee above has actually seen any tech reports like this:
    I witnessed a similar sort of thing some years ago on a narrow-ish lane in Osborne Park, a suburb a few km north of Perth, WA. A truckie wanting to reverse a semi up a commercial driveway was getting the required twisting and turning fairly well, until a young lady approached from the east and kept creeping her car forward. At the same time, a young bloke brought his car in from the west... With no room to make the next swing, the truckie switched off and opened his door. By this time there were six cars on the west side and five on the east. The truckie swung down, locked his door and said, "I'm off for a coffee. When you're all gone, I'll come back."

    Nobody is saying Robo-Cars have it all together yet. But human drivers lost it completely back in 1902. I would take the Robo-Tech today and feel perfectly safe. Oh yes, some people consider me a "Senior Citizen".
     
    Weebarra, XhenEd, Rengar and 2 others like this.
  6. Jake Miguel

    Jake Miguel Level 2

    Nov 14, 2016
    98
    566
    Singapore
    It is very funny. :ROFLMAO:
     
  7. viktik

    viktik Level 24

    Sep 17, 2013
    1,377
    3,794
    Unoccupied
    Hazaribagh
    Windows 10
    Kaspersky
    First, we need to develop an A.I. that will send messages to our girlfriends daily, so that we don't need to do it ourselves.

    Machine learning will learn what she wants and send the relevant message.
     
    XhenEd likes this.
  8. grumpy_joe

    grumpy_joe Level 1

    Oct 18, 2017
    26
    134
    Unspecified
    Other OS
    Wait, let's say I am the truck driver: can I legally even get a ticket for an accident with a robot? I mean, let's say I don't consider it a car because there is no human responsible for driving it.

    I bet there are some legal loopholes around this. :) I won't research them, but I'll point them out for others to find.
     
    Weebarra and XhenEd like this.
  9. Joniantrey

    Joniantrey Level 1

    Nov 8, 2017
    9
    32
    London, UK
    mac OS X
    Kaspersky
    Whether you like it or not, autonomous electric vehicles are the future. Why? With improvements in battery technology and rapid growth in computing power, electric vehicles are far more efficient and economical than combustion engines.
     
  10. jogs

    jogs Level 11

    Nov 19, 2012
    525
    1,138
    Whichever way you see it, it's human error only. AI is ultimately programmed by humans. If a robo does anything wrong, it's an error on the part of the programmer. So there will always be human error, unless we get a robo designed by aliens. ;)
     
  11. Spawn

    Spawn Administrator
    Staff Member Content Creator

    Jan 8, 2011
    16,256
    24,183
    I do not trust a human driver. A car with some autonomous tech and/or sensors may prevent a collision, or reduce its seriousness, in ways that a human simply cannot.

    For example, automatic braking sensors or blind spot indicators.
    Automatic Emergency Braking: MyCarDoesWhat.org
    Safety
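
    The automatic-braking idea linked above is usually built on a time-to-collision check: brake when the gap to the obstacle, divided by the closing speed, drops below a threshold. A minimal sketch (function names and the threshold are illustrative assumptions, not from any real AEB system):

    ```python
    # Minimal automatic-emergency-braking (AEB) sketch: brake when the
    # time to collision with the obstacle ahead drops below a threshold.
    # All names and values are illustrative assumptions.

    def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
        """Seconds until impact at the current closing speed (inf if the gap is opening)."""
        if closing_speed_mps <= 0:
            return float("inf")
        return gap_m / closing_speed_mps

    def aeb_should_brake(gap_m: float, closing_speed_mps: float,
                         threshold_s: float = 1.5) -> bool:
        """True when the predicted time to collision is inside the braking window."""
        return time_to_collision(gap_m, closing_speed_mps) < threshold_s
    ```

    For example, a 10 m gap closing at 10 m/s gives a 1 s time to collision, which is inside the assumed 1.5 s window, so the sketch would brake; the same gap closing at 3 m/s would not trigger it.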

    From article:
    "... And in that respect, the bus did exactly what it was supposed to do. Taking corrective action, such as swerving or reversing, is not something the bus has thus far been approved to do. The experimental shuttle bus predicted that an accident was about to happen and, according to its algorithmic protocols, stopped."
     
    frogboy likes this.