
Uber Death

PostPosted: March 20th, 2018, 12:48 am
by Event Horizon
Sadly, I have to report that a woman has been run down and killed by a driverless car operated by Uber (using, if I recall correctly, a Google technology) in a road-going trial in the States.

This reminded me of Asimov's three laws of robotics, and in particular the first: "A robot may not injure a human being or, through inaction, allow a human being to come to harm."

Is it not time to make these laws binding and universal?

Re: Uber Death

PostPosted: March 20th, 2018, 2:15 am
by zetreque
I have been 100% against this technology being used on public roadways for multiple reasons. I actively comment against it in policy decisions, but as always, money and ignorance rule. There is significant money behind this tech. It's going to cut costs for corporations, it will make the rich richer, it will make the lazy lazier, it's good for publicity, and companies like Tesla are taking advantage of taxpayers on all of this. Ohhhh, but it's cool technology (sarcasm) and supposedly going to boost the economy by MAYBE supplying SOME jobs (while eliminating thousands) and keeping a technological lead over other states or countries, as if this is the one technology that can do that.

I just don't see how they can get through the ethical programming problem.


Re: Uber Death

PostPosted: March 20th, 2018, 3:14 am
by someguy1
A couple of interesting facts from this tragedy. One, the car was going 40 mph (in a 35 zone) and, according to the police, never slowed down. Clearly Uber's software is not ready for prime time if it failed to see a pedestrian.

Secondly, there was a human aboard whose ironic title was "human safety driver." It will be interesting to find out what the so-called safety driver was doing while the car plowed into a pedestrian without even slowing down.

According to one report I read, the local DA is considering filing charges. Now we will begin to see the legal issues develop. Who exactly is criminally responsible? Uber? The human "safety" driver? The programmers? It's a real legal conundrum.

I do hope that all the state legislators (such as in California) who are rushing to legalize autonomous cars on the public roads will wake up and realize that these vehicles are not ready to be unleashed on a defenseless public. Last year a Tesla in autonomous mode slammed into a semi truck in broad daylight, killing the driver/passenger.

RIP to the victim, 49 year old Elaine Herzberg, who was walking her bicycle across the street at the time. She was not in the crosswalk, a fact that will no doubt be seized upon by Uber's lawyers, but that in no way mitigates the engineering screwup here.

Event Horizon » March 19th, 2018, 10:48 pm wrote:
Is it not time to make these laws binding and universal?

It's not that the car didn't know not to kill a person. It's that the car's sensors and software didn't detect the person. It's an engineering problem.
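That distinction can be made concrete with a toy sketch (purely illustrative, not any real AV stack; all names here are made up): a hard-coded "never harm a human" rule lives downstream of perception, so if the perception layer never detects the pedestrian, the rule has nothing to act on.

```python
# Toy illustration: a safety rule is only as good as the perception
# layer feeding it. Here, "poorly lit" objects are simply missed.

def perceive(sensor_frame):
    """Stand-in for a perception stack: returns detected objects,
    missing anything flagged as poorly lit."""
    return [obj for obj in sensor_frame if not obj.get("poorly_lit")]

def plan(detections):
    """The 'first law' lives here: brake if any human is detected."""
    if any(obj["kind"] == "human" for obj in detections):
        return "BRAKE"
    return "CONTINUE"

# A pedestrian the sensors fail to pick up:
frame = [{"kind": "human", "poorly_lit": True}]
print(plan(perceive(frame)))  # prints CONTINUE -- the rule never fired
```

The rule itself is fine; the failure is entirely upstream, in `perceive()`, which is why legislating Asimov-style laws would not have changed the outcome.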

Re: Uber Death

PostPosted: March 20th, 2018, 11:53 am
by SciameriKen
This article provides better details of the accident and links to a Twitter page with pictures of the roadway where the incident occurred: ... ult-police

This one would have been difficult to avoid, even for a human driver, even one who was paying attention. A pedestrian crossing in an area with signage not to cross there, in the dark, with bushes obscuring the view -- not good!

I am with Zet on most of what he says, and I am mostly against the implementation of these cars, although I also see this as the inevitable future, arriving within the next 10 to 30 years. In any case, the robot is innocent!

Re: Uber Death

PostPosted: March 20th, 2018, 12:05 pm
by TheVat
Pretty clear that the lack of innocence goes to the management team at Uber, who thought it was okay for the pedestrians and cyclists of an Arizona community to be guinea pigs in the testing of a technology still in the experimental phase. Someguy is right: it's an engineering problem. Ergo, you work out the problem in a laboratory setting (one with an outdoor track and maybe a mockup of an urban street).

The ethics seem to be more clearly defined with experimental aviation. You put the new jet through its paces over an empty desert region of Nevada, rather than buzzing the Strip in Las Vegas and hoping everything goes okay. Surely, management can figure out that basic logic applies whenever you have experimental machinery and any degree of lethality when it meets up with a human body.

We have two threads going on this topic. Maybe that's a good thing. It seems important to be thinking about a future with this level of automation, and whether or not we have any say in the matter.

Re: Uber Death

PostPosted: March 20th, 2018, 1:12 pm
by zetreque
Braininvat » Tue Mar 20, 2018 9:05 am wrote:Pretty clear that the lack of innocence goes to the management team at Uber that thought it was okay for the pedestrians and cyclists of an Arizona community to be guinea pigs in the testing of a technology that is still in the experimental phase.

I would have thought these states would be first to jump on board.
These states have introduced bills to protect drivers who run over protesters ... index.html

PS: Nevada isn't so barren or "lonely" anymore for multiple reasons.

Re: Uber Death

PostPosted: March 20th, 2018, 3:21 pm
by toucana
There is a much-quoted urban myth that back in 1895 there were only two automobiles in the entire state of Ohio, and that they managed to collide with each other, creating America's first-ever automobile accident. It isn't actually true.

But what probably did happen is almost as grimly amusing. In 1891 there was just one automobile, a prototype constructed by a man named Lambert. He was driving around town testing a new stirrup-based steering system (presumably for the benefit of drivers more accustomed to horses) when he lost control of his contraption and crashed it into a tree, destroying the vehicle in the process.

The lesson seems to be that when it comes to testing prototype vehicles on public roads, even one is too many.

Re: Uber Death

PostPosted: January 1st, 2019, 11:46 am
by PaulN
... e=Homepage

I admit to some sympathy with the couple whose son was nearly struck by one of the Waymo test cars. This underscores the point made above that this highly experimental technology belongs on a testing ground, not in a populous suburb. Keep slashing tires, maybe the company will get the message that people don't care to be involuntary guinea pigs.

Re: Uber Death

PostPosted: January 5th, 2019, 1:57 pm
by Event Horizon
I thought perhaps that if the three laws were invoked, the experiments would have to be run on private land until the technology complied.
The laws were designed for the future we are now in. It's probably better to do it now, as it may take years to implement. You wouldn't be able to send your NinjaBot to assassinate anyone anymore!

Re: Uber Death

PostPosted: January 8th, 2019, 10:47 am
by toucana
Promobot Humanoid Robot

Tesla has found itself involved in yet another self-driving car accident – and this time, its victim was a $2,000-per-day rentable humanoid robot.

In what many are speculating was an over-the-top PR stunt, Promobot revealed one of its model v4 robots was ‘killed’ by a Tesla Model S on a Las Vegas street ahead of CES.

The accident occurred on Paradise Rd Sunday night as engineers transported the firm’s robots to the display booth.

According to Promobot, a number of robots were making their way to the booth around 7 p.m. when one of them stepped out of line and into the parking lot roadway.

As it did, it was struck by a Tesla Model S operating in autonomous mode.

The crash tipped the robot onto its side, causing ‘serious damage,’ Promobot says.

Now, with parts of its body, head, arm mechanisms, and movement platform destroyed, it cannot be put on display.

The firm says the damage is likely irreparable.

Re: Uber Death

PostPosted: January 8th, 2019, 11:31 am
by TheVat
If that's a PR stunt, it's a weirdly awful one, uniquely suited to set back the progress of both companies. LoL.