Tesla knew Autopilot caused death, but didn’t fix it: Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • Doug7070@lemmy.world · 1 year ago

    “A bit misleading” is itself a misleading way to describe their marketing. The feature is literally called Autopilot, and their marketing material has aggressively pitched it as a “full self driving” capability since the beginning, even before you get to Musk’s own constant, ridiculous hyperbole when advertising it. This is software that should never have been tested outside vehicles driven by company employees under controlled conditions, but Tesla chose to push it to the public as a paid feature and significantly downplay the fact that it is a poorly tested, unreliable beta, specifically to profit from the data its widespread use generates. That’s to say nothing of the price they charge for it, as if it were a normal, ready-to-use consumer feature. Everything about their deployment of the system has been reckless, careless, and actively disdainful of their customers’ safety.

    • Thorny_Thicket@sopuli.xyz · 1 year ago

      You don’t even seem to get the terms right, which makes me question how well informed you really are on the subject.

      Autopilot is the basic driver-assist system that comes free with every Tesla. Then there’s Enhanced Autopilot, which costs extra and is more advanced, and lastly there’s Full Self-Driving Beta. Even the name indicates you probably shouldn’t trust your life with it.

    • anlumo@feddit.de · 1 year ago

      Everybody who has even a rough idea of what an autopilot in a plane actually does is not misled. Do people really think commercial airline pilots just hit the “autopilot” button in the cockpit after the boarding ramp is detached and then lean back until the ramp at the destination is attached?

      • Einar@lemm.ee · 1 year ago (edited)

        So I need to understand a plane’s autopilot before I buy a car?

        I would be misled then, as I have no idea how such autopilots work. I also suspect that the two systems don’t really work the same way. One flies, the other drives. One has traffic lights, the other doesn’t. One is operated by well-paid professionals, the other, well, by me. Call me simple, but there seem to be some major differences.

        • Caculon@lemmy.world · 1 year ago

          I would have thought people read “autopilot” and think “automatic.” At least that’s what I do. I guess “pilot” is closely associated with planes, but that certainly isn’t what I think of.

        • CmdrShepard@lemmy.one · 1 year ago

          This is a pretty absurd argument. You could apply this to literally any facet of driving.

          “I have to learn what each color of a traffic light means before driving?”

          “I have to learn what white and yellow paint means and dashes versus lines? This is too confusing”

          God help you when you get to 4-way stops and roundabouts.

          • Einar@lemm.ee · 1 year ago (edited)

            Not absurd, but reality. We do that in driving school.

            I don’t know where you are from and which teaching laws apply, of course, but I definitely learned all those lessons you mentioned.

            • CmdrShepard@lemmy.one · 1 year ago

              That’s precisely my argument, and why “learning my new car’s features is too confusing” is an absurd one.

        • anlumo@feddit.de · 1 year ago

          Yeah, there are some major differences between the vehicles, but both disengage when anything out of the ordinary is going on. Maybe people base their understanding of autopilots on the movie “Airplane!”, where the inflatable pilot gropes the stewardess afterwards.

            • anlumo@feddit.de · 1 year ago (edited)

              True, good point. As far as I know, it does turn itself off if it detects something it can’t handle, though. The problem with cross traffic is that it obviously can’t detect it; otherwise, turning itself off would already be a way of handling it.

              Proximity detection is far easier up in the air, especially if you’re not bound by the weird requirement to only use visible spectrum cameras.

              (To make things clear, I’m just defending the engineers there who had to work within these constraints. All of this is a pure management failure.)

                • meco03211@lemmy.world · 1 year ago

                  At least for the base Autopilot (AP), the only system I have experience with, this is not true. My car doesn’t know it’s on a road with cross traffic. It only knows it’s on a road where it can see lane lines to differentiate lanes. It doesn’t even know which direction the lanes are supposed to travel: if I crossed the center line, I could activate AP and it would keep me centered in a lane of oncoming traffic. This was all thoroughly explained to me when I bought it. I had no misconceptions about its capability.

                  It really feels like the people who are so opposed to it are working off major disinformation, which only muddies the conversation when they state things that are plainly wrong or misuse terms.

            • Ocelot@lemmies.world · 1 year ago

              I’m sorry, what? If you set an airplane to maintain altitude and heading with autopilot, it will 100% fly you into the side of a mountain if there’s one in front of you.

      • r00ty@kbin.life · 1 year ago

        They’re not buying a plane though. They’re buying a car with an autopilot that is labeled as “full self driving”. That term does imply it will handle a complete route from A to B.

        People are wrongly buying into the marketing hype and that is causing crashes.

        I’m very concerned about some of the things I’ve seen regarding FSD on Teslas, such as sudden hard braking on highways, failing to avoid an accident (but it’s OK, because it disengaged seconds before impact, so the human was “in control”), and of course the viral video of FSD trying to kill a cyclist.

        They should not be allowed to market the feature this way and I don’t think it should be openly available to normal users as it is now. It’s just too dangerous to put in the hands (or not) of normal drivers.

        • Ocelot@lemmies.world · 1 year ago (edited)

          Autopilot has never been “Full Self Driving”. FSD is an additional $15,000 package on top of the car. Autopilot is the free system that provides lane keeping with adaptive cruise, the same as “Pro Pilot Assist”, “Honda Sensing”, or any of the other packages from other car companies. The only difference is that whenever someone gets in an accident using one of those other technologies, we never see headlines about it.

        • anlumo@feddit.de · 1 year ago

          I’ve never sat in a Tesla, so I’m not really sure, but based on what I’ve read online, Autopilot and FSD are two different systems on Tesla cars that you can engage separately. There shouldn’t be any confusion about this.

          • Miqo@lemmy.world · 1 year ago

            I’ve never sat in a Tesla, so I’m not really sure

            There shouldn’t be any confusion about this.

            U wot m8?

          • r00ty@kbin.life · 1 year ago

            Well, if it’s just the lane-assistance Autopilot that is causing this kind of crash, I’d agree it’s likely user error. The reason I say “if” is that I don’t trust journalists to know or report on the difference.

            I am still concerned the FSD beta is “out there”, though. I do not trust normal users to understand what “beta” means, and of course no one is going to read the agreement before clicking agree. They just want to see their car drive itself.

            • anlumo@feddit.de · 1 year ago

              If it were about the FSD implementation, things would be very different. I’m pretty sure that the FSD is designed to handle cross traffic, though.

              I do not trust normal users to understand what beta means

              Yeah, Google kinda destroyed that word in the public consciousness when they ran their search with the beta flag for more than a decade while growing into one of the biggest companies on Earth with it.

              When I first heard about it, I was very surprised that the US even allows vehicles with beta self-driving software on public roads. That’s like testing a new fire truck by randomly setting buildings on fire in a city and then trying to put them out with the truck.

            • Ocelot@lemmies.world · 1 year ago (edited)

              Yeah, I don’t trust a machine that has been trained for millions of hours, has simulated every possible traffic scenario tens of millions of times, has millisecond reaction times, and sees the world in full 360 degrees. A system that never drives drunk, distracted, or fatigued. You know who’s really good at driving, though? Humans. Perfect track record, those humans.

      • El_illuminacho@lemmy.world · 1 year ago

        Why do you think companies need to put warnings like “Caution: Contents Hot” on paper coffee cups? People are stupid.

        • anlumo@feddit.de · 1 year ago

          Those labels are there because people made a quick buck suing companies when they messed up, not to protect stupid customers.

          If the courts applied a reasonable level of common sense, the labels wouldn’t exist.