People have mentioned this elsewhere, but there are a couple of factors that make this more complex:
The video is quite dark, but probably less dark than it would've been in real life; see this picture, for example, for a better feel for what it actually looked like. If the car didn't notice her, the driver should have (or would they have waited, assuming the car would stop? What's their policy?)
Besides that, the LIDAR should've caught her. Here's an alleged expert giving some interesting insight; mostly, that Uber claims to be much better than it actually is.
Thirdly, I've heard some half-assed attempts at arguing the person in the car should've been paying more attention, plus philosophising about blame and how to assign it. But we've already got something veeeery analogous to someone sitting in a self-driving car: train drivers. They also mostly just sit there, hitting buttons when necessary and adjusting for the circumstances, but the basics are more or less the same: you get the thing going, assume everything is OK, and jump in when necessary. I think that analogy might make this issue easier to solve. The driver wasn't paying attention, sure, but on the other hand, what kind of training did they get from Uber? If that training was lacking: a hefty fine (near-crippling if possible, imo), along with mandated changes to their training programmes and permits; only then look at punishing the driver. I mean, apparently they were in Tempe precisely because the law there is a clusterfuck that allows this kind of easy testing… What is it they say? "Privatize profits, socialize losses."
As a kid, I got to drive trains a bit. Maybe the first thing you notice when you sit down is that there's no steering wheel, only a throttle. Trains have been increasingly automated for decades. Sometimes there's nobody in the yard anymore; or, like, one guy with chest-pack controls. All of which means the legal standard of "who had the last clear chance to avoid the accident" dissolves into a simple arithmetic equation of mass and velocity.
As in, whoever has the most mass wins. Same as it is with bike couriers and garbage trucks. Inevitably, when you do hit a cow, after you finish peeling it off the front of the engine, it turns out to be tragic: the farmer will tell you it's a crying shame, because that particular cow had just won a gold medal at the state fair. Every time. And the railroad pays, every single time. Seems like that's going to be one more factor in the cost of doing business around automated cars as well.
As policy develops, we'll have to make sure the incentives aren't perverse. In China, when a driver hits a pedestrian, the reputed rule is to back up and run them over a few times to be sure they're dead, because the penalties for killing a pedestrian are so much lower than if the pedestrian survives. Let's hope that kind of calculation doesn't get rolled up into the AI.
It looks to me like the car just ran over her when she was walking across the road. I don't see how the car couldn't have seen her.
Especially with all the radar and other tech it's supposedly equipped with.