Cyclist killed by self-driving car while walking her bike in AZ
This topic has 56 replies, 18 voices, and was last updated 7 years, 3 months ago by VikingMariner.
March 20, 2018 at 4:27 am #920300
n18 (Participant)
She was walking her bike at 10 PM Sunday, outside the crosswalk, heading from west to east at this spot on Google Maps (Street View link). The Uber self-driving car was going around 40 MPH in a 45 MPH zone, and it showed no signs of slowing down before hitting the pedestrian. It was in autonomous mode with a driver behind the wheel. AZ is not a pure contributory negligence state; only four are left: VA, MD, NC, and AL.
March 22, 2018 at 6:31 pm #1086029
Judd (Participant)
@mstone 176772 wrote:
Yes and no: cameras typically have more limited range than human eyes, but human eyes vary widely from person to person.
It's kinda important to realize that the cars are still in development and not finished products. And Uber's program is nowhere near the most mature. (I honestly don't know why they even have one, except that they have too much cash.)
I'm in the category of people who don't see very well at night. My GoPro and my Fly12 also don't see in the dark as well as I do.
Uber has a driverless car program because pay is so low that labor shortages are the most critical challenge to its long-term viability. I get a letter from Uber about once a month trying to get me to sign up to be a driver.
March 22, 2018 at 6:43 pm #1086030
mstone (Participant)
@Judd 176776 wrote:
Uber has a driverless car program because pay is so low that labor shortages are the most critical challenge to its long-term viability. I get a letter from Uber about once a month trying to get me to sign up to be a driver.
Yes, but they aren't a car company. The odds that they'll end up making self-driving cars faster and better than actual car companies are near zero. It's like buggy whip manufacturers deciding to get into building locomotives (iron horses!) instead of, I don't know, leather steering wheel covers or something similar within their field of competence.
March 22, 2018 at 6:53 pm #1086031
Brendan von Buckingham (Participant)
I had to stop myself from crying when I saw that footage. It showed exactly what I predicted. But I had a new feeling of why, oh why, was she crossing there in those conditions. She made more than one bad decision that I hope I never make, or if I do, that some driver is paying attention to save my ass. A driverless car couldn't do that. I think this is a terrible example of a driverless system being programmed according to maps and regulations of fallible human design. I bet human drivers intuitively drive more alertly at that spot because they have real knowledge that pedestrians cross there against the rules. They know the intersection as culture rather than data.
Anyone know if driverless cars use incident or accident data in their programming? Seems to me that if they did, they could drive slower or be ready to react in high-incident areas.
March 22, 2018 at 6:59 pm #1086032
LhasaCM (Participant)
@mstone 176777 wrote:
Yes, but they aren't a car company. The odds that they'll end up making self-driving cars faster and better than actual car companies are near zero. It's like buggy whip manufacturers deciding to get into building locomotives (iron horses!) instead of, I don't know, leather steering wheel covers or something similar within their field of competence.
But if they view themselves as a mobility company (i.e., their mission is to get people from A to B), then this avenue actually makes a bit of sense, at least as much as their partnership with JUMP! And if that's their goal, then they don't need to be faster or better than the actual car companies or Googles of the world, just cheaper for them.
March 22, 2018 at 7:14 pm #1086035
Steve O (Participant)
A human driver paying reasonable attention would have slammed on the brakes in advance of the collision. The resulting reduction in speed may not have prevented the crash, but it may have been enough to prevent a fatal injury.
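To put rough numbers on that point, here is a back-of-the-envelope sketch in Python. The 40 MPH starting speed comes from the thread; the 7 m/s² deceleration and the braking distances are assumed values for illustration, not a reconstruction of this crash:

```python
# Back-of-the-envelope: how much does braking earlier cut impact speed?
# All numbers are assumed for illustration, not a reconstruction of this crash.
import math

MPH_TO_MS = 0.44704
v0 = 40 * MPH_TO_MS   # initial speed, ~17.9 m/s
DECEL = 7.0           # m/s^2, hard braking on dry pavement (assumed)

def impact_speed_mph(brake_distance_m):
    """Speed at impact if full braking starts brake_distance_m from the person."""
    v_sq = v0 ** 2 - 2 * DECEL * brake_distance_m
    return math.sqrt(max(0.0, v_sq)) / MPH_TO_MS

for d in (5, 10, 15, 20, 25):
    print(f"braking {d:2d} m out -> impact at ~{impact_speed_mph(d):4.1f} mph")
# 5 m: ~35 mph; 15 m: ~23 mph; 25 m: stopped before impact
```

Under those assumptions, braking even 15 m out cuts the impact speed from about 40 mph to the low 20s, a range where pedestrian crashes are far more survivable.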
March 22, 2018 at 7:18 pm #1086036
lordofthemark (Participant)
My preliminary take: everyone is at fault.
Bad infrastructure: a long stretch between crosswalks on a fast multilane road, in a place where trails might induce someone to cross away from the crosswalk. Especially likely to tempt someone unfamiliar with the area?
Mistake by the ped/cyclist: an example of the "wrong kind" of jaywalking. As much as I might resent the placement of the crosswalks, I would not cross at a place like that, even in broad daylight, much less at night.
Mistake by the driver and/or AV: it does look like (on the video, and I gather from the discussion?) there was still a chance to slow down, if not actually stop in time, that was not taken.
My own concern as a bike/ped advocate is to deal with the infra. In other contexts I advocate being a PAL, and I leave policy concerns about AVs to others.
March 22, 2018 at 7:28 pm #1086038
Steve O (Participant)
@lordofthemark 176783 wrote:
My preliminary take: everyone is at fault.
I like this article (https://www.economist.com/blogs/democracyinamerica/2013/11/cycling-v-cars) for the way it contrasts attitudes about "fault" in European countries and the US. I think we agree that the ped/cyclist who was killed in AZ did not act prudently.
However, in the Netherlands the driver of this car would have been found at fault for not taking due care to protect vulnerable road users, even when those users act stupidly. The right to pilot a 2,500-pound missile that can maim and kill comes with a much greater responsibility there than here. They have adopted the attitude that making a human error like that should not be punishable by death.
March 22, 2018 at 8:10 pm #1086040
dasgeh (Participant)
@lordofthemark 176783 wrote:
Mistake by the ped/cyclist: an example of the "wrong kind" of jaywalking. As much as I might resent the placement of the crosswalks, I would not cross at a place like that, even in broad daylight, much less at night.
I get that everyone is judging the woman. From the video, it seems like it was very dark out and she picked a horrible place to cross. But as has been mentioned, this is a video, and it may not have been that dark out. This may be a spot where people cross all the time. There doesn't seem to be another car on the road, and she made it across two other wide lanes of traffic. If the Uber car had braked just a bit a while back, or just changed lanes, she would have been fine. It is entirely possible that people cross there all the time and just expect that the one car on the road will brake a little, or change lanes, or do whatever is possible to avoid a collision.
It's also possible that it was this dark out and she's the only one who crosses there. I just don't think we know enough to know it's her fault.
March 22, 2018 at 8:19 pm #1086041
n18 (Participant)
@Steve O 176782 wrote:
A human driver paying reasonable attention would have slammed on the brakes in advance of the collision. The resulting reduction in speed may not have prevented the crash, but it may have been enough to prevent a fatal injury.
Yep, a human driver would not only slam on the brakes but also swerve and use the horn at the same time, reducing the severity of the crash. Also, as others suggested, the video may not represent what a human would perceive. Drivers in Arizona who visit the crash site may have a better idea of whether it was avoidable when a driver is paying attention.
March 22, 2018 at 8:23 pm #1086043
lordofthemark (Participant)
@dasgeh 176787 wrote:
I get that everyone is judging the woman. From the video, it seems like it was very dark out and she picked a horrible place to cross. But as has been mentioned, this is a video, and it may not have been that dark out. This may be a spot where people cross all the time. There doesn't seem to be another car on the road, and she made it across two other wide lanes of traffic. If the Uber car had braked just a bit a while back, or just changed lanes, she would have been fine. It is entirely possible that people cross there all the time and just expect that the one car on the road will brake a little, or change lanes, or do whatever is possible to avoid a collision.
It's also possible that it was this dark out and she's the only one who crosses there. I just don't think we know enough to know it's her fault.
I should have used a better word than "fault," and again, as an advocate my focus would be on infra. If there are regular ped crossings there, that makes the state of the infra there more egregious. But crossing outside a crosswalk anywhere is technically illegal. There are places where it's reasonably prudent. I have a hard time seeing a five (!) lane road, with a median in the middle (which typically encourages FASTER speeds) and apparently a 45 MPH posted speed limit, as a prudent place to do so (in fact, I would probably question the placement of an unbuffered bike lane on a road like that), even if most of the time things work out okay.
If we are going to challenge AVs and how they operate, and also challenge bad infra, based on particular incidents, as we should, we can't ignore when contributory factors such as ped mistakes play some role. Because if we don't acknowledge those, others will point them out.
I agree with Steve O: the penalty for a ped mistake should not be death. That is why I support lowering speed limits in the City of Alexandria, and considering speed an issue even when a pedestrian jaywalked. The spirit of VZ, IIUC, is to reduce deaths, period, not to assign blame. I just wanted to acknowledge the complexity of the causal factors here.
March 22, 2018 at 8:32 pm #1086044
accordioneur (Participant)
@dasgeh 176787 wrote:
I get that everyone is judging the woman. From the video, it seems like it was very dark out and she picked a horrible place to cross. But as has been mentioned, this is a video, and it may not have been that dark out. This may be a spot where people cross all the time. There doesn't seem to be another car on the road, and she made it across two other wide lanes of traffic. If the Uber car had braked just a bit a while back, or just changed lanes, she would have been fine. It is entirely possible that people cross there all the time and just expect that the one car on the road will brake a little, or change lanes, or do whatever is possible to avoid a collision.
It's also possible that it was this dark out and she's the only one who crosses there. I just don't think we know enough to know it's her fault.
Whether the video looks dark or not is somewhat immaterial, as the sensors in self-driving cars see differently than we do. In addition to the visual spectrum, they use both passive light outside the visual spectrum (IR) and active LIDAR (and sometimes other parts of the spectrum). Given that she was a pretty good sensor target with little background clutter, I suspect this was an algorithmic failure rather than a sensor failure ("the car saw something but didn't react to it" rather than "the car didn't see it"). A sober, alert human driver paying attention would have done better. But an imperfect human driver, someone texting while driving, or my son driving home at 10 PM (which is the time of the accident) after a 14-hour shift at the hospital? Maybe not.
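To make the sensor-vs-algorithm distinction concrete, here is a heavily simplified, hypothetical sketch in Python; the names and thresholds are invented for illustration and say nothing about Uber's actual stack:

```python
# Heavily simplified, hypothetical sketch of where each failure mode would
# show up in a perception pipeline. None of these names are Uber's software.
from dataclasses import dataclass

@dataclass
class Detection:
    distance_m: float   # range to the object
    lateral_m: float    # offset from the vehicle's intended path
    label: str          # classifier output, e.g. "pedestrian", "unknown"

def plan_reaction(detections, path_half_width_m=1.5, brake_range_m=40.0):
    """A sensor failure would mean `detections` is empty even though something
    is there; an algorithmic failure means the object IS in `detections` but
    the classifier or this planning step fails to act on it."""
    for det in detections:
        in_path = abs(det.lateral_m) <= path_half_width_m
        if in_path and det.distance_m <= brake_range_m:
            return "BRAKE"  # should fire even when label == "unknown"
    return "CONTINUE"

# The scenario described above: the object is detected but poorly classified.
seen = [Detection(distance_m=35.0, lateral_m=0.3, label="unknown")]
print(plan_reaction(seen))  # -> BRAKE; reacting shouldn't require a label
```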
This is a tragic outcome, but I wouldn’t write off “self-driving cars” as a whole because this particular prototype did poorly compared with what an ideal human driver would have done.
March 22, 2018 at 8:56 pm #1086046
jabberwocky (Participant)
My favorite tech site (Ars Technica) has been covering this. Their take on the video that police posted:
https://arstechnica.com/cars/2018/03/video-suggests-huge-problems-with-ubers-driverless-car-program/
My take on it is that a human driver likely wouldn't have done much better (she's just outside the pool of a streetlight, which is the worst place to be, because human eyes just don't see into the dark from a well-lit area very effectively). I was watching for her on the first viewing, and even knowing what was coming, I didn't see her until about 2 seconds before the collision, which even with an attentive driver likely wouldn't have made a lot of difference.
But as the Ars article says, the driverless sensors absolutely should have seen her and reacted accordingly. Human eyes may have had trouble making her out, but lack of light doesn't affect LIDAR sensors, so she should have been registered by those well beforehand. The fact that they did not means something was wrong with their self-driving system.
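As a rough sense of the margins involved (the detection range, latency, and braking figures below are assumptions; the actual specs of the Uber stack aren't given in this thread):

```python
# Quick margin check: an object picked up at LIDAR-ish ranges leaves a 40 mph
# vehicle ample room to stop. All values are assumed, illustrative numbers.
MPH_TO_MS = 0.44704

speed = 40 * MPH_TO_MS   # ~17.9 m/s
detect_range = 80.0      # m, assumed first-detection range
latency = 1.0            # s, assumed sense-plan-act delay
decel = 7.0              # m/s^2, hard braking (assumed)

stopping_dist = speed * latency + speed ** 2 / (2 * decel)
time_available = detect_range / speed

print(f"stopping distance: ~{stopping_dist:.0f} m")            # ~41 m
print(f"time until reaching object: ~{time_available:.1f} s")  # ~4.5 s
print("can stop in time:", stopping_dist < detect_range)       # True
```

Under those assumptions there are several seconds of margin, which is why a non-reaction points at the software rather than at the dark.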
March 22, 2018 at 9:27 pm #1086050
trailrunner (Participant)
Now that I've seen the video: the autonomous sensors should have done better. There wasn't a lot of clutter in the scene. The fact that we only see the victim for a second is irrelevant, an artifact of the video. The vehicle was most likely using a LIDAR, which is an active system (typically in NIR or SWIR), and maybe a passive LWIR sensor (these are already commercially available on some cars, with algorithms to detect humans or animals). The sensors on the Uber also probably had a wider field of regard than what we're seeing.
As I said earlier, what might've confused the algorithm is that she was walking with a bike. Some of the algorithms I developed looked for certain traits in humans, such as swinging legs or certain aspect ratios (height-to-width ratios). Having a bike in front of the person might've messed that up. However, the sensor still should have detected a human-sized object on a crossing trajectory and should have stopped the vehicle. But these sensors take some time to build up a history of the scene, and if the person walked out from the bushes, and the car was going 40-45 mph, there might not have been enough time to build an adequate history. And although I said that there didn't seem to be much clutter in the scene, without seeing the LIDAR point cloud or the IR image, I can't say any of this with certainty. She might've been behind a bush in the median, and it's hard to know what the scene might've looked like in LWIR, which senses heat around ambient and body temperatures.
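For a toy illustration of the kind of heuristics described above, with invented thresholds rather than anyone's production code:

```python
# Toy illustration: classify a tracked object as a pedestrian from its
# height-to-width ratio, and require a few frames of track history before
# trusting it. All thresholds here are hypothetical.

def looks_like_pedestrian(height_m, width_m):
    """Upright humans are tall and narrow. A person pushing a bike broadside
    presents a wide silhouette, and this naive aspect-ratio test fails."""
    if width_m <= 0:
        return False
    aspect = height_m / width_m
    return aspect >= 1.5 and 1.2 <= height_m <= 2.2

class Track:
    CONFIRM_FRAMES = 5  # assumed: frames of consistent detection before acting

    def __init__(self):
        self.hits = 0

    def update(self, detected_this_frame):
        self.hits = self.hits + 1 if detected_this_frame else 0

    def confirmed(self):
        # Someone stepping out from behind a bush, with the car closing at
        # 40-45 mph, may not stay in view long enough to reach this threshold.
        return self.hits >= self.CONFIRM_FRAMES

print(looks_like_pedestrian(1.7, 0.5))  # walking alone, ratio 3.4 -> True
print(looks_like_pedestrian(1.7, 1.8))  # pushing a bike, ratio ~0.9 -> False
```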
And as I also said earlier, I'm not at all surprised something like this happened. Despite the optimistic claims being made, I think we're a long way from completely self-driving cars. I'm not an expert in this area, but I've developed these kinds of sensors and know what they really can and can't do.
March 22, 2018 at 9:38 pm #1086051
mstone (Participant)
@jabberwocky 176794 wrote:
But as the Ars article says, the driverless sensors absolutely should have seen her and reacted accordingly. Human eyes may have had trouble making her out, but lack of light doesn't affect LIDAR sensors, so she should have been registered by those well beforehand. The fact that they did not means something was wrong with their self-driving system.
Sure, that's why they're still in development and not mass-produced. It's clear that there was a bug, because regardless of whether the system identified the ped as a human, it shouldn't have hit whatever it was. It should have noticed there was a collision course with something and reacted. Or its close-in systems should have noticed something directly in front of it and braked. There were likely multiple failures involved, but that doesn't say anything about autonomous vehicles generally (just this one Uber car) or how they'll do in the future (when they're actually out of testing). My main point was just that there's a good chance a normal human driver might not have done any better. ("Normal" here means playing with their phone on a stroad nowhere near a stoplight. The main difference is that a human driver probably would have been speeding more.) 30k people still die every year from non-autonomous vehicles, and not many of those deaths get nationwide coverage or much more than a "the pedestrian came out of nowhere," with no evidence other than the surviving driver's word. In this case, at least there's some data. If it had been a non-autonomous vehicle, there probably wouldn't have even been an investigation.
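That "collision course with something" check doesn't require knowing what the something is. A minimal sketch under simplified 2D kinematics, with hypothetical numbers:

```python
# Minimal sketch of a classification-agnostic collision-course check:
# simplified 2D kinematics with hypothetical numbers, not any real AV stack.

def on_collision_course(obj_x, obj_y, obj_vx, obj_vy, car_speed,
                        horizon_s=5.0, dt=0.1):
    """Car travels along +x at car_speed (m/s) from the origin; the object
    starts at (obj_x, obj_y) metres with velocity (obj_vx, obj_vy) m/s.
    Returns True if their predicted positions overlap within the horizon."""
    steps = int(horizon_s / dt)
    for i in range(steps + 1):
        t = i * dt
        car_x = car_speed * t
        ox = obj_x + obj_vx * t
        oy = obj_y + obj_vy * t
        if abs(ox - car_x) < 2.0 and abs(oy) < 1.5:  # rough car footprint
            return True
    return False

# A pedestrian 50 m ahead and 4 m to the left, walking right at ~1.4 m/s,
# with the car doing ~18 m/s (40 mph): paths intersect around t = 2.8 s.
print(on_collision_course(50.0, 4.0, 0.0, -1.4, 18.0))  # -> True
```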
March 22, 2018 at 9:47 pm #1086052
n18 (Participant)
@lordofthemark 176790 wrote:
If we are going to challenge AVs and how they operate
In software, you see the phrase "this software is licensed, not sold." What this means is turning the software from a product you can do anything with (such as loaning it to a friend, like a book) into a contract with conditions and limits on how it's used. Likewise, because of liability, I think manufacturers and dealers in the consumer market are going to switch from selling you a car to a service requiring a contract that absolves them of liability for crashes caused by software errors or defective parts, and requires the driver to pay full attention and switch to manual mode to avoid any crash. So basically it would be treated more like a driver-assist rental than a self-driving car. The driver doesn't own the car; he or she just has the full right to use it. The car can't be "sold," but the driver can transfer the contract to another person who agrees to all the conditions. Of course, whether manufacturers follow this software model will depend on the outcome of this case and others like it.
If testing of driverless cars continues, limiting speed to 20 MPH at night and training drivers to always watch the road and not trust the technology would help reduce crashes until the technology is perfected.
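The physics behind a 20 MPH cap is simple: braking distance grows with the square of speed, so halving speed roughly quarters it. A quick sketch with assumed latency and deceleration values:

```python
# Braking distance scales with the square of speed, so halving speed roughly
# quarters the braking component. Latency and deceleration values are assumed.
MPH_TO_MS = 0.44704
DECEL = 7.0     # m/s^2, hard braking (assumed)
LATENCY = 1.0   # s, assumed reaction/system delay

for mph in (20, 40):
    v = mph * MPH_TO_MS
    dist = v * LATENCY + v ** 2 / (2 * DECEL)
    print(f"{mph} mph -> ~{dist:.0f} m to stop")
# 20 mph -> ~15 m; 40 mph -> ~41 m
```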