New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times

Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired.

  • EndOfLine@lemmy.world · 1 year ago

    Officers injured at the scene are blaming and suing Tesla over the incident.

    And the reality is that any vehicle on cruise control with an impaired driver behind the wheel would likely have hit the police car at a higher speed. Autopilot might be maligned for its name, but drivers are ultimately responsible for the way they choose to pilot any car, including a Tesla.

    I hope those officers got one of those “you don’t pay if we don’t win” lawyers. The responsibility ultimately rests with the driver, and I don’t see them getting any money from Tesla.

    • friendlymessage@feddit.de · 1 year ago

      Well, in the end it comes down to whether Tesla’s ADAS complies with laws and regulations. If there really were 150 warnings from the ADAS without it disengaging, that might indicate faulty software and therefore make Tesla at least partially at fault. It goes without saying that the driver is mostly to blame, but an ADAS shouldn’t just keep on driving when it senses that the driver is incapacitated.

  • CaptainProton@lemmy.world · 1 year ago

    This is stupid. Teslas can park themselves; they’re not just on rails. It should be pulling over and putting the flashers on if the driver is unresponsive.

    That being said, the driver knew about this behavior, acted with wanton disregard for safe driving practices, and so the incident is the driver’s fault and they should be held responsible for their actions. It’s not the court’s job to legislate.

    It’s actually NHTSA’s job to regulate car safety, so if it doesn’t already have the authority, Congress needs to grant it the power to regulate what AI behavior is acceptable and to define safeguards against misbehaving AI.

    • dzire187@feddit.de · 1 year ago

      It should be pulling over and putting the flashers on if a driver is unresponsive.

      Yes. Actually, just stopping in the middle of the road with the hazard lights on would be sufficient.

  • chakan2@lemmy.world · 1 year ago

    I hope the cops win. Autopilot allows a driver to completely disengage their attention from the car in a way that’s not possible with just cruise control.

    There’s no way you can drop a human into a life-threatening critical situation with 2.5 seconds to make a decision and expect them to respond reasonably. Even stone-cold sober, that’s a lot to ask of a person when the car makes a critical mistake like this.

    On cruise control, the driver would still have to be aware that they were driving. With Autopilot, the driver had likely passed out and the car carried on its merry way.

    • Thorny_Thicket@sopuli.xyz · 1 year ago

      A Tesla on Autopilot/FSD is almost 4 times less likely to be involved in a crash than a human-driven Tesla, which even then is half as likely to end up in an accident as the average car. You not liking Musk fortunately doesn’t change these facts.

      In the 2nd quarter, we recorded one crash for every 4.41 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.2 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.

      Source
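Taking the quoted figures at face value, the arithmetic behind the “almost 4 times” claim can be checked in a few lines. This is a quick sketch: the labels are mine, and the figures compare different fleets and driving conditions, so it is illustrative only.

```python
# Miles driven per recorded crash, as quoted from Tesla's Q2 safety report.
# Higher is better. Category labels are informal.
miles_per_crash = {
    "Tesla, Autopilot engaged": 4_410_000,
    "Tesla, Autopilot off": 1_200_000,
    "US average (NHTSA)": 484_000,
}

# Ratio behind "almost 4 times less likely to crash":
autopilot_vs_manual = (
    miles_per_crash["Tesla, Autopilot engaged"]
    / miles_per_crash["Tesla, Autopilot off"]
)

# Ratio of a manually driven Tesla to the national average:
manual_vs_average = (
    miles_per_crash["Tesla, Autopilot off"]
    / miles_per_crash["US average (NHTSA)"]
)

print(f"Autopilot vs. manual Tesla: {autopilot_vs_manual:.2f}x the miles between crashes")
print(f"Manual Tesla vs. US average: {manual_vs_average:.2f}x the miles between crashes")
```

With these numbers, 4.41M / 1.2M ≈ 3.7, which is where “almost 4 times” comes from; the second ratio comes out closer to 2.5 than 2. Note that miles per crash is not a like-for-like comparison, since Autopilot is engaged mostly in easier highway driving.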

        • Thorny_Thicket@sopuli.xyz · 1 year ago

          Perhaps. I’m sure you’ll provide me with the independent data you’re basing that “Teslas are not safe” claim on.

            • narp@feddit.de · 1 year ago

              You made the first comment: “Teslas aren’t safe”, without providing proof.

              And now you’re calling someone a hypocrite because he asks for data on exactly what you claimed, while you’re redefining your first argument as “the contrary.”

              So, do you have proof that Teslas aren’t safe compared to other cars, or is it just your opinion?

                • narp@feddit.de · 1 year ago

                  But you can’t base a fact on one accident. Or even multiple. What if newspapers like to write especially about Tesla accidents to generate clicks?

                  Teslas seemingly have a lot of accidents, but without checking the statistics and comparing them with other manufacturers’, you wouldn’t really know whether the perceived truth is a fact.