Tesla under investigation by California attorney general over Autopilot safety, marketing

The California attorney general is investigating Tesla over the electric car company’s driver assistance technology, CNBC has learned.

  • XTornado@lemmy.ml
    1 year ago

    I’m talking about the pileup, because the other case is very clear. Don’t get me wrong: if the Tesla hadn’t stopped, it wouldn’t have caused the accident, that I agree with… and it’s terrible and the feature should be disabled until these issues are solved.

    But in the end most of the pileup was caused by people being people. It could have been a normal car, or even a Tesla stopped for a good reason, and they would have caused that pileup anyway… The first cars stopped fine, but then some people who didn’t keep a safe distance, were going too fast, or were distracted started crashing.

    • ghariksforge@lemmy.world
      1 year ago

      You can see the pileup in the video. The Tesla phantom-brakes for no reason, and the cars driving behind slam into it.

      • shinjiikarus@mylem.eu
        1 year ago

        They wouldn’t have slammed into it if they’d kept a safe distance, as @XTornado@lemmy.ml wrote. I’m in no way defending Tesla’s “Autopilot”; it should be banned until it passes a very difficult test proving true self-driving capability and multiple layers of fail-safes (which it can’t right now). But examples where an Autopilot Tesla did something stupid and other people made human errors are disingenuous: if somebody drops their cigarette and brakes unexpectedly, and the cars behind don’t keep their distance and slam into them, the reason they have an accident is not the cigarette but their unsafe following distance.
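        To put rough numbers on why following distance dominates here (my own illustrative figures, not from the thread): total stopping distance is reaction distance plus braking distance, and it grows quickly with speed.

        ```python
        # Toy stopping-distance estimate: reaction distance + braking distance.
        # The 1.5 s reaction time and 7 m/s^2 deceleration are illustrative
        # assumptions, not measured values for any particular car.
        def stopping_distance(speed_mps: float, reaction_s: float = 1.5,
                              decel_mps2: float = 7.0) -> float:
            reaction_dist = speed_mps * reaction_s             # travelled before braking starts
            braking_dist = speed_mps ** 2 / (2 * decel_mps2)   # v^2 / (2a), constant deceleration
            return reaction_dist + braking_dist

        # At highway speed (~30 m/s, about 108 km/h) that's roughly 109 m to stop,
        # far more than the gap many tailgaters leave.
        print(round(stopping_distance(30.0), 1))
        ```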

        • jtk@lemmy.sdf.org
          1 year ago

          It’s not just Teslas that have automatic braking that can fail. Even basic cars have that “feature” now. I fucking hate it. My entire family was in the car and we almost got rear-ended super hard because ours kicked in for the right reasons, but did it super far back from the incident and braked fast as hell. The person behind us had less than a second to react; luckily there was no one in the next lane and they were able to swerve around us. I’ve been terrified to drive it ever since.

          I agree no one should be tailgating, but the algorithm needs to factor that in when it’s happening. I knew exactly how I wanted to handle the situation, but the stupid car prevented me from doing it the safer way.
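          The kind of policy being asked for could look something like this toy sketch (entirely hypothetical, with made-up thresholds; no real AEB system is claimed to work this way): cap deceleration when a tailgater is detected, unless a full emergency stop is unavoidable.

          ```python
          # Hypothetical AEB policy sketch that considers rear traffic.
          # All thresholds are illustrative assumptions, not from any real system.
          EMERGENCY_DECEL = 8.0   # m/s^2: full braking when a crash is imminent
          CAPPED_DECEL = 4.0      # m/s^2: gentler braking when someone is tailgating
          MIN_REAR_GAP = 10.0     # m: below this gap, assume a tailgater

          def choose_decel(required_decel: float, rear_gap_m: float) -> float:
              """Pick a braking strength from what the situation requires
              and how close the vehicle behind is."""
              if required_decel >= EMERGENCY_DECEL:
                  return EMERGENCY_DECEL                     # imminent collision: brake fully regardless
              if rear_gap_m < MIN_REAR_GAP:
                  return min(required_decel, CAPPED_DECEL)   # tailgater close: ease off if possible
              return required_decel

          print(choose_decel(6.0, 5.0))   # tailgater close, no emergency: braking is capped
          ```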