In short, Tesla’s Autopilot is not as safe as Tesla would have you believe.

  • RandomBit@sh.itjust.works
    1 year ago

    I don’t think this is a fair comparison, since an Autopilot crash is a two-stage failure: both the Autopilot and then the driver failed to avoid the crash. The statistics do not include the incidents where Autopilot would have crashed but the human took control and prevented it. If all instances of human intervention were included, I doubt Autopilot would be ahead.

    • Kepler@lemmy.world
      1 year ago

      If all instances of human intervention were included, I doubt Autopilot would be ahead.

      Why would you interpret non-crashes due to human intervention as crashes? If you’re doing that for autopilot non-crashes you’ve gotta be consistent and also do that for non-autopilot non-crashes, which is basically…all of them.

      • RandomBit@sh.itjust.works
        1 year ago

        If a human crashes and their action/vehicle is responsible for the crash, the crash should be attributed to the human (excepting mechanical failure, etc.). I believe that if an advanced safety system, such as automatic braking, prevents a crash that would otherwise have occurred, that prevented crash should also be included in the human tally. Likewise, if Autopilot would have crashed if not for the intervention of the driver, the prevented crash should be attributed to Autopilot.
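
        The symmetric attribution rule described above can be sketched as a quick calculation. Every number here is hypothetical, invented purely to illustrate the bookkeeping, not drawn from any real Tesla or NHTSA data:

        ```python
        # Sketch of the attribution rule proposed above: a near-miss that
        # the *other* party had to prevent is counted as a crash
        # attributable to the system that would have crashed.
        # All figures below are made up for illustration only.

        def attributed_crash_rate(actual_crashes, prevented_by_other, miles):
            """Attributed crashes per million miles: actual crashes plus
            crashes avoided only because the other party intervened."""
            return (actual_crashes + prevented_by_other) / miles * 1_000_000

        # Human driving: actual crashes, plus crashes avoided only because
        # an automatic safety system (e.g. emergency braking) stepped in.
        human_rate = attributed_crash_rate(
            actual_crashes=400, prevented_by_other=100, miles=500_000_000)

        # Autopilot: actual crashes, plus crashes avoided only because the
        # human driver took over in time.
        autopilot_rate = attributed_crash_rate(
            actual_crashes=30, prevented_by_other=120, miles=100_000_000)

        print(human_rate)      # 1.0 attributed crashes per million miles
        print(autopilot_rate)  # 1.5 attributed crashes per million miles
        ```

        With these invented numbers, Autopilot looks better than humans on raw crashes per mile (0.3 vs. 0.8 per million miles) but worse once driver takeovers are counted against it, which is exactly the point: crash-only statistics hide the interventions.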

        As has often been studied, the major problem for autonomous systems is that until they are better than humans WITHOUT human intervention, the combination can be worse than either alone. People are much less likely to pay full attention, and react more slowly, when the autonomous system is in full control the majority of the time.