Tesla knew Autopilot caused death, but didn’t fix it: Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • Zink@programming.dev · 1 year ago

    I did not know that about Mercedes, so I had to go read about it. Level 3 is huge because that’s when the system is approved to operate without constant human monitoring. It’s the difference between being able to read a book or use your phone on a boring drive, even if it might not get you fully door to door on many trips.

    It can’t drive you home drunk, and you can’t sleep in the car (you have to be available to take over when requested), but it’s a huge jump in most practical usage.

    • renohren@partizle.com · 1 year ago

      Realistically, I think FSD has the potential to be Level 3 officially, and some other carmakers probably have the tech to do it too. BUT in the EU, if a car offers Level 3 autonomous driving, the carmaker becomes legally responsible for accidents whenever the qualifying driving conditions are met (most EU states limit it to highways). For the time being, only Mercedes has had the courage to try it (probably because they have ample knowledge of driving assistance from their truck production).