• Trollception@lemmy.world
    8 months ago

    You do realize humans kill hundreds of other humans a day in cars, right? Is it possible that autonomous vehicles may actually be safer than a human driver?

    • KredeSeraf@lemmy.world
      8 months ago

      Sure. But no system is 100% effective, and all of their questions are legitimate and important to answer. If I got hit by one of these tomorrow, I'd want to know that the processes for fault, compensation, and a pathway to improvement are already settled, not something my accident is going to become the landmark case for.

      That being said, I was a licensing examiner for two years and quit because they kept making it easier to pass, and I was forced to pass so many people who should not be on the road.

      I think the idea is sound, but that doesn't mean there aren't things to address around it.

      • Trollception@lemmy.world
        8 months ago

        Honestly, I'm sure there will be a lot of unfortunate mistakes until computers and self-driving systems can be relied upon. However, there needs to be an entry point for manufacturers, and this is it. Technology gets better over time; it always has. Eventually self-driving autos will be the norm.

        • KredeSeraf@lemmy.world
          8 months ago

          That still doesn't address all the issues surrounding it. I'm unsure whether you're just young and unaware of how these things work, or terribly naive. But companies will always cut corners to protect profits; regulation forces a certain level of quality control (ideally). Just letting them do their thing because "it'll eventually get better" is a gateway to absurd amounts of damage. Also, not all technology gets better. Plenty of it just gets abandoned.

          But to circle back: if I get hit by a car tomorrow and all these things you think are unimportant are unanswered, does that mean I might not get legal justice or compensation? If there isn't clearly codified law, I might not. And you might be callous enough to say you don't care about me, but what about you? What if you got hit by an unmonitored self-driving car tomorrow and were then told you'd have to go through a long, expensive court battle to determine fault, because no one had settled it yet? So you're in and out of a hospital recovering, draining all of your money on bills both legal and medical, to eventually, hopefully, get compensated for something that wasn't your fault.

          That is why people here are asking these questions. Few people actually oppose progress; they just need to know that reasonable precautions are taken for predictable failures.

          • Trollception@lemmy.world
            8 months ago

            To be clear, I never said that I didn't care about an individual's safety; you inferred that somehow from my post, and frankly that's disrespectful. I simply stated that autonomous vehicles are here to stay and that the technology will improve with time.

            The legal implications of self-driving cars are still being determined, as this is literally one of the first approved technologies of its kind. Tesla doesn't count, as it's not an SAE Level 3 autonomous driving system. There are some references in the liability section of the wiki article:

            https://en.m.wikipedia.org/wiki/Regulation_of_self-driving_cars

          • Llewellyn@lemm.ee
            8 months ago

            But then it's good that the manufacturer states the driver isn't obliged to watch the road, because it shifts responsibility toward the manufacturer, and that's a great incentive to make the technology as safe as possible.

        • MeDuViNoX@sh.itjust.works
          8 months ago

          Can't the entry point just be that you have to pay attention while it's driving for you, until they figure it out?

        • stoly@lemmy.world
          8 months ago

          You’re deciding to prioritize economic development over human safety.

    • Adanisi@lemmy.zip
      8 months ago

      *at 40 mph, on a clear straight road, on a sunny day, in a constant stream of traffic, with no unexpected happenings. Ts&Cs apply.

    • stoly@lemmy.world
      8 months ago

      Only on closed courses. The best AI lacks the basic heuristics of a child, and you simply can't account for all possible outcomes.