Published 7 years ago by geoleo with 5 Comments
  • ChrisTyler
    +5

    Yes, because Mercedes wants to actually sell these cars, and not very many people would buy one if they knew that, in some circumstances, the car was literally programmed to kill them.

    • NinjaKlaus
      +4

      Right, I know I wouldn't buy one that was set up to kill me, for that matter I'm not really happy with everybody wanting to make cars I don't control to begin with.

  • kxh (edited 7 years ago)
    +4

    An AI cannot really tell what the things in front of the car are, just that they are there. AI is really not that sophisticated. Do you want it to spend minutes deciding whether the thing in front of the car is a human, an animal, a cardboard cutout of a human, a cardboard box, or a concrete post? By then it's too late when you're driving. AIs are not human, and they don't work or think like a human.

  • SteveRoy
    +3

    This was a topic once brought up by someone who doesn't know anything about programming. Elon Musk addressed it like an engineer, effectively saying that if the car is ever in a situation where this question matters, the AI has already totally failed and likely has no control over anything that would matter.

    This is an interesting thought experiment: "Say the car is spinning out of control, and on course to hit a crowd queuing at a bus stop. It can correct its course, but in doing so, it'll kill a cyclist for sure. What does it do?" If it's spinning out of control, then there is no longer any control. It cannot correct itself. If it had control, the spin never would have started.

    This is a decision by the marketing team, not the AI development team. I guarantee you they are more worried about how to tell whether the center berm in a dirt road with two well-worn tire tracks is too high or not.
