• celsiustimeline@lemmy.dbzer0.com · 3 hours ago

    In the past, autonomous vehicle development dwelled on ethical hypotheticals like “do you hit an old lady crossing the road to avoid crashing into a school bus full of children?”, but what about safety hypotheticals? Like, if you were actually driving your vehicle, there are moments when it’s in your best interest not to be stopped, such as when people are physically surrounding your car and potentially mean to cause you harm, which is extremely common in America. When does the driverless car get you out of a tight spot and run over some carjackers if need be?

    • AbsoluteChicagoDog@lemm.ee · 2 hours ago

      How the fuck do you figure that’s “extremely common”? You need to spend less time on the Internet, my dude…

    • Blackmist@feddit.uk · 3 hours ago

      If an AI car ever has to make a decision on who dies, the answer should always be “whoever agreed to the terms and conditions before they got in the vehicle”.

      • Skates@feddit.nl · 20 minutes ago

        This will never be the case. Because nobody will buy an overpriced “yo, if there’s ever any doubt about, like, anything - just put a bullet in my head” machine. So nobody will sell it.

        Face it - you have the same thousands of pounds of metal today, and you’re the only one making decisions. You (drivers, as a community) have killed before, for selfish reasons; “because you don’t want to die” is the least selfish of them. Other hits include “didn’t wanna not get drunk with the homies”, “I really needed to answer that text”, and “I have 10 minutes till home but the game starts in 5, it’s my favorite team, I can make it”. And you somehow want non-drivers (the passengers of AI cars) to have the same expectation that they will be the victim, even when they get in a car?

        Drivers are so self-centered it’s goddamn ridiculous.

        • Blackmist@feddit.uk · 13 minutes ago

          I’m talking about pedestrians, not other drivers.

          If autonomous vehicles can’t be trusted not to run people over, then they shouldn’t be allowed to go above, say, 20 mph in a built-up area where there are likely to be people walking about. And frankly neither should human drivers, but good luck getting them not to call it a “war on motorists” if you try.

    • BluesF@lemmy.world · 3 hours ago

      Surely you can just take over? You can’t expect the car to run people over for you lol