• amanneedsamaid@sopuli.xyz
      1 year ago

      Bill the manufacturer 100%, IMO. That’s why I think self-driving cars pose an unanswerable legal question: when the car drives for you, why would you be at fault? And how will businesses survive if they have to take full accountability for accidents caused by self-driving cars?

      I think it’s almost always pointless to hold back innovation, but in this case I think a full ban on self-driving cars would be a great move.

      • DauntingFlamingo@lemmy.ml
        1 year ago

        The most basic driving, like long stretches of highway, shouldn’t be banned from using AI/automated driving. Fast-paced inner-city driving should be augmented but not fully automatic. The same goes for driving in inclement weather: augmented, with hard limits on speed and automated braking for anything that could result in a crash.

        • snooggums@kbin.social
          1 year ago

          “Self-driving with driver assist,” or whatever they call it when it isn’t 100% automated, is basically super-fancy cruise control and should be treated as such. The main problem with the term “autopilot” is that for airplanes it means 100% control, so it’s very misleading when applied to fancy cruise control in cars.

          I agree that its use should be limited to highways and other open roads, the same situations where cruise control should be used. People using cruise control in the city without staying ready to brake is the same basic issue.

          100% fully automated driving with no expectation of driver involvement should only be allowed once it has surpassed regular drivers. To be honest, we might even be there already, given how terrible human drivers are…

          • GonzoVeritas@lemmy.world
            1 year ago

            Autopilot systems on airplanes make fewer claims about autonomous operation than Tesla does. No pilot relies completely on autopilot functionality.

          • Amju Wolf@pawb.social
            1 year ago

            Autopilot in aircraft is actually kind of comparable: it still needs a skilled human operator to set it up and monitor it (and the other flight controls) the whole time. And in most modes it isn’t even really all that autonomous - at most, it follows a pre-programmed route.

              • Amju Wolf@pawb.social

                They can, but the setup is still non-trivial, and full auto-land capability isn’t used all that much even when technically available. It also isn’t just about the capability of the aircraft; it requires a shitton of supporting infrastructure on the ground (at the airport), and many airports don’t support it.

                The automotive equivalent would be installing new intersections that broadcast the current signal state for each lane, which would help self-driving cars immensely (and eventually regular cars too, through assistive technologies that help drivers drive more safely), but that’s simply not a thing yet.

        • Dudewitbow@lemmy.ml

          It’s why I’m all for automated trucking. Truck drivers are a dwindling resource, and the lifestyle of a cross-country truck driver isn’t a highly sought-after job. The self-driving system should do the long trip from hub to hub, and drivers at each hub should do the last few miles. That keeps drivers local and fixes a problem that is only going to get worse.

          • DauntingFlamingo@lemmy.ml

            That would be the augmented part, plus the AI. ANYTHING that presents a potential hazard already takes a vehicle out of automated driving in most models, because after a few Teslas didn’t stop, people started suing.

          • Amju Wolf@pawb.social

            I mean that’s a huge issue for human drivers too.

            We need assistive technologies that protect us, but if at any point the driver is no longer driving, the car manufacturer needs to take full responsibility.

      • Skull giver@popplesburger.hilciferous.nl

        Current laws in most places still require a human with a license to drive self driving cars (and I don’t see that changing any time soon with how terrible self driving cars still are). That makes the human driver, who should intervene in these scenarios, responsible.

        Once we remove the human override, I would consider a self driving car breaking the law to be a faulty product, possibly requiring a recall if it happens more often. If any other part of the car is prone to breaking, you’d demand a recall too.

        As for the fines, you’d probably see something like “the driver receives a fine but they can hold the company that sold them the car liable for a faulty product”.

        Fining the manufacturer directly is a nice idea, but if Tesla does go bankrupt, where do we send the fines then?

        • sin_free_for_00_days@sopuli.xyz

          I’m pretty sure there are autonomous cars driving around San Francisco, and have been for some time.

          EDIT: Here’s an uplifting story about San Francisco-ians(?) interacting with the self-driving cars.

      • stanleytweedle@lemmy.world

        I think its almost always pointless to hold back innovation, but in this case I think a full ban on self driving cars would be a great move.

        I agree on both points. Also, I think it’s important to characterize the ‘innovation’ of self-driving as more socio-economic than technological.

        The component systems - sensing, processing, communications, power, etc. - have a wide range of engineering applications, and their research and development will inevitably continue no matter the future of self-driving. Self-driving only solves a very particular socio-economic-technological issue, one that exists only because of how humans historically chose to address the same issue with older technology. Self-driving is more of a product than a ‘technology’ in my book.

        So my point is that I don’t think a ban on full self-driving really qualifies as ‘holding back innovation’ at all. It’s just telling companies not to develop a specific product. As a hyperbolic example, nobody would say banning companies from building a nuclear-powered oven was ‘holding back innovation’. If anything, forcing us to re-envision human transportation without integrating legacy requirements advances innovation more than using AI to solve the problems created by using humans to solve the original problem of moving people around in cars.

        • amanneedsamaid@sopuli.xyz

          I see it the same way, but an incredible number of people I’ve discussed this with say that it’s stupid to hold back technological innovation “like self-driving cars”. It’s an unnecessary piece of technology.

          I also just think the whole ethical complication is fucked. The way we have it now, every driver is responsible for their own actions, and no driver ever glitches out on the freeway (and if they do, they bear the consequences). Imagine a man’s wife and kids being killed by a drunk driver versus by a self-driving car. In one scenario you can clearly place blame and take action in a much more meaningful way than just suing a car manufacturer.

      • sugar_in_your_tea@sh.itjust.works

        The responsible party should be the owner of the vehicle, not the manufacturer or the passenger. If a company runs an automated ride-share service, for example, that company should be liable. Likewise, if you own a car and use the self-driving feature, you are at fault if it goes wrong, so you should use it at your own risk.

        That said, for the owner to be truly responsible, they need ownership of the self-driving code, as well as diagnostics for them to be able to monitor it. If they don’t have that, do they truly own the car?

        Of course, there’s nothing stopping a manufacturer or dealer from making a deal to cover self-driving fines.

        • amanneedsamaid@sopuli.xyz

          Well, exactly. I see no way all the self-driving source code will ever be FOSS (I don’t think corporations would willingly sign onto that). So in the case of a malfunction, the responsible party should be the company, because in a full self-driving setup the occupant is not controlling the vehicle and has no reasonable way to ensure the safety of the code.

          • sugar_in_your_tea@sh.itjust.works

            Which is why it should be dual responsibility. The owner of the vehicle chose to use the feature, so they bear responsibility. If it malfunctions while the driver was following the instructions, the manufacturer bears responsibility. Both are culpable, so they should share it.

    • schroedingershat@lemmy.world

      Nah. Give Tesla the same number of points everyone else gets on their license. If the company runs out, no more cars controlled by Tesla on the roads…

      • MeshPotato@lemmy.world

        We already had that in the ’70s and ’80s. Those were RoRo trains.

        You put your car on a drive-on ramp, settle into a comfy cabin (maybe even a sleeper cabin for overnight journeys), get out at the other end, drive your car down the carrier, and explore the area you’ve journeyed to with the vehicle you own. Look up the ’80s ABC film about the Ghan railway closing down.

        I live in Australia and love seeing the centre of the country, distant as it is from my home. Unfortunately, long-distance trains here have become a luxury lifestyle experience rather than transportation. The same goes for bicycles and motorcycles.