Last year, two Waymo robotaxis in Phoenix “made contact” with the same pickup truck that was in the midst of being towed, which prompted the Alphabet subsidiary to issue a recall on its vehicles’ software. A “recall” in this case meant rolling out a software update after investigating the issue and determining its root cause.

In a blog post, Waymo has revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane. Apparently, the tow truck didn’t pull over after the incident, and another Waymo vehicle came into contact with the pickup truck a few minutes later. Waymo didn’t elaborate on what it meant by saying that its robotaxis “made contact” with the pickup truck, but it did say that the incidents resulted in no injuries and only minor vehicle damage. The self-driving vehicles involved in the collisions weren’t carrying any passengers.

After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to “persistent orientation mismatch” between the towed vehicle and the one towing it. The company developed and validated a fix for its software to prevent similar incidents in the future and started deploying the update to its fleet on December 20.
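
The blog post doesn’t include any of Waymo’s actual code, so the following Python sketch is only an illustration of the failure mode it describes: a predictor that extrapolates an object’s motion from the direction its body is facing will get a backwards-facing towed truck exactly wrong, while one that uses the velocity actually observed for the whole tow rig will not.

```python
import math

# Illustrative sketch only; Waymo's planner is not public. It shows the general
# failure mode described above: predicting motion from the direction an
# object's body faces (its heading) versus from its observed track velocity.

def predict_from_heading(x, y, heading_rad, speed, dt):
    """Naive prediction: assume the object moves the way its body is facing."""
    return (x + speed * dt * math.cos(heading_rad),
            y + speed * dt * math.sin(heading_rad))

def predict_from_track(x, y, vx, vy, dt):
    """Prediction from the velocity actually observed for the object over time."""
    return (x + vx * dt, y + vy * dt)

# A backwards-facing pickup on a tow rig: the whole rig moves east at 5 m/s,
# but the pickup's body points west (heading 180 degrees).
x, y = 0.0, 0.0
observed_vx, observed_vy = 5.0, 0.0
heading = math.radians(180)

print(predict_from_heading(x, y, heading, 5.0, 2.0))            # ~(-10, 0): "it's moving away from my lane"
print(predict_from_track(x, y, observed_vx, observed_vy, 2.0))  # (10.0, 0.0): it is actually still ahead
```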

  • noodlejetski@lemm.ee (+114) · 5 months ago

    I love the corpospeak. Why say “crashed into” when you can use “made contact”, which sounds futuristic and implies that your product belongs to an alien civilization?

  • bstix@feddit.dk (+79/-9) · 5 months ago

    The company says the truck was being towed improperly

    Shit happens on the road. It’s still not a great idea to drive into it.

    The company developed and validated a fix for its software to prevent similar incidents

    So their plan is to fix one accident at a time…

    • DoomBot5@lemmy.world (+21/-2) · 5 months ago

      Rules are written in blood. Once you figure out all the standard cases, you can only try to predict as many edge cases as you can think of. You can’t make something foolproof, because there will always be a greater fool who comes along.

      • bstix@feddit.dk (+15/-2) · 5 months ago

        Unexpected or not, it should do its best to stop or avoid the obstacle, not drive into it.

        An autonomous vehicle shouldn’t ever be able to actively drive forward into anything. It’s basic collision detection that ought to brake the car here. If something is in the position the car wants to drive to, it simply shouldn’t drive there. There’s no reason to blame the obstacle for being towed incorrectly…
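
        A minimal sketch of the veto being described here, using a made-up occupancy-grid check rather than any real planner’s API: whatever the planner predicts, never command motion into space that is currently observed to be occupied.

```python
# Minimal sketch of the "don't drive into occupied space" veto described above.
# The grid cells and helper names are invented for illustration.

def path_is_clear(planned_cells, occupied_cells):
    """True only if none of the cells the car is about to enter are occupied."""
    return not (set(planned_cells) & set(occupied_cells))

def next_speed(planned_cells, occupied_cells, desired_speed):
    # Hard safety veto: observed occupancy overrides any motion prediction.
    if not path_is_clear(planned_cells, occupied_cells):
        return 0.0           # brake / hold position
    return desired_speed     # otherwise follow the plan

# The towed pickup occupies a cell the robotaxi planned to drive through:
print(next_speed(planned_cells=[(1, 0), (2, 0)],
                 occupied_cells=[(2, 0)],
                 desired_speed=3.0))    # -> 0.0: the car stays put
```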

        • NotMyOldRedditName@lemmy.world (+7) · 4 months ago

          In this case it thought the vehicle had a different trajectory due to how it was improperly set up.

          The car probably thought it wasn’t going to hit it until it was too late and the trajectory calculation proved incorrect.

          Every vehicle on the road is a few moments away from crashing if we calculate that incorrectly. It doesn’t matter if it knows it’s there.

          • bstix@feddit.dk (+2) · 4 months ago

            The same thing applies to a human driver. Most accidents happen because the driver makes a wrong assumption. The key to safe driving is not getting into situations where driving is based on assumptions.

            Trajectory calculation is definitely an assumption, and it shouldn’t be allowed to override whatever sensor is checking for obstructions ahead of the car.

            • NotMyOldRedditName@lemmy.world (+2) · 4 months ago

              The car can’t move without trajectory calculations though.

              If the car ahead of you pulls forward when the light goes green, your car can start moving forward as well, keeping in mind the lead car’s trajectory and speed.

              If the rule were just “don’t move while there’s an object in your path,” the car wouldn’t move forward until the lead car was halfway down the block.

              The car knew the truck was there in this case; it wasn’t a failure to detect. Due to a programming failure it thought it was safe to move because it predicted the truck wouldn’t be there.

              If you’re following a vehicle at a proper distance and it slams on the brakes, you should be able to stop in time, since you’ve calculated its trajectory and a safe speed behind it. But if that same vehicle slams on the brakes and goes into reverse, well… good luck.

              It’s all assumptions assuming the detection is accurate in the first place.
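
              A hedged sketch of this point, with invented numbers and helper names rather than anything from a real AV stack: whether it is safe to keep moving depends on the lead vehicle’s speed and trajectory, not just on whether something is detected ahead.

```python
# Sketch of the argument above (illustrative values only): safety depends on
# the lead vehicle's motion, not merely on detecting that something is there.

def safe_to_proceed(gap_m, own_speed, lead_speed,
                    own_brake=7.0, lead_brake=7.0, reaction_s=0.5):
    """Can we stop without hitting the lead vehicle even if it brakes hard?"""
    own_stop = own_speed * reaction_s + own_speed ** 2 / (2 * own_brake)
    lead_stop = lead_speed ** 2 / (2 * lead_brake)
    return gap_m + lead_stop > own_stop

# Light turns green, lead car 8 m ahead starts rolling away at 3 m/s:
print(safe_to_proceed(gap_m=8, own_speed=2, lead_speed=3))   # True: creep forward
# Same gap, but the "lead vehicle" is a towed truck that isn't actually moving away:
print(safe_to_proceed(gap_m=8, own_speed=8, lead_speed=0))   # False: hold back
```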

              • bstix@feddit.dk (+1) · 4 months ago

                If you’re following a vehicle at a proper distance and it slams on the brakes, you should be able to stop in time, since you’ve calculated its trajectory and a safe speed behind it.

                You don’t need to calculate their trajectory. It’s enough to know your own.

                If a heavy box falls off a truck and stops dead in front of you, you need to be able to stop. That box has no trajectory, so it’s an error to include other vehicles’ trajectories in the safe-distance calculation.

                Traffic can move through an intersection closely by calculating a safe distance, which may be smaller than the legal definition but still large enough to stop for anything suddenly appearing on the road. All that’s needed is for the distance to be calculated from your own speed and a visually confirmed position of other things. It can absolutely be done regardless of the speed or direction of other vehicles.

                Anyway. A backwards-facing truck is a weird thing to misinterpret. Trucks sometimes face backwards for whatever reason.

                It would be interesting to know how the self-driving car would react to a wrong-way (“ghost”) driver.

                • NotMyOldRedditName@lemmy.world (+2/-1) · 4 months ago

                  You don’t need to calculate their trajectory. It’s enough to know your own.

                  This doesn’t make sense. It’s why I was saying the car won’t move at a stop light when it goes green until the lead car is halfway down the street.

                  If the car is 2.5 seconds ahead of me at 60 mph on the highway, it’s only 2.5 seconds ahead of me if the other car is doing 60 mph. If the car is doing 0 mph, then I’m going to crash into it.

                  It needs to know how fast and in what direction the obstacle is going, and how to calculate the rate of acceleration/deceleration and extrapolate from there.
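
                  Putting rough numbers on that 2.5-second example, assuming about 7 m/s² of braking and a 1-second reaction delay (illustrative figures, not measured data):

```python
# Rough numbers for the 2.5-second example above (illustrative figures only).
MPH_TO_MS = 0.44704

v = 60 * MPH_TO_MS             # ~26.8 m/s
gap = v * 2.5                  # ~67 m gap at a 2.5 s headway

brake = 7.0                    # m/s^2, roughly a hard stop on dry pavement
reaction = 1.0                 # s
stop_dist = v * reaction + v ** 2 / (2 * brake)   # ~78 m needed to stop from 60 mph

# If the lead car is also doing 60 mph, it needs ~51 m of braking distance of
# its own, so the 67 m gap plus that distance is plenty. If the "lead car" is
# actually doing 0 mph, the 67 m gap is less than the ~78 m we need, and knowing
# only our own speed isn't enough to avoid the crash.
print(f"gap: {gap:.0f} m, distance needed to stop: {stop_dist:.0f} m")
```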

    • Tetsuo@jlai.lu (+15/-2) · 5 months ago

      Honestly, I think only trial and error will let us get a proper autonomous car.

      And I still think autonomous cars will save many more lives than they endanger once they become reliable.

      But for now this is bound to happen…

      To be clear, they are still responsible for these cars and the safety of others. They didn’t test properly.

      They should be trying every edge case they can think of.

      A large screen on the side of a truck? What if a car is displayed on it? Would the car’s sensors notice the difference?

      A farmer dropped a hay bale on the road? It got flattened by rain? Does the car understand that this might not be safe to drive on or to brake on?

      There are hundreds of unique situations that they should be trying before an autonomous car gets even close to a public road.

      But even if you try everything there will be mistakes and fatalities.

      • threelonmusketeers@sh.itjust.works (+1) · 4 months ago

        There are hundreds of unique situations that they should be trying before an autonomous car gets even close to a public road.

        Do you think “better than human drivers” is sufficient for deployment on public roads, or do you think the bar should be higher?

        • Tetsuo@jlai.lu (+2) · 4 months ago

          Honestly, I’m pragmatic: if fewer people die in accidents involving autonomous cars, then yes.

          The thing is, we shouldn’t be trusting the manufacturers for these stats. They have to be reported by a government agency or something.

          Similarly, autonomous car software should have to be certified by an independent organization before being deployed, and the same goes for updates to the software. Otherwise we would get deadly updates from time to time.

          If we deploy and handle autonomous cars with the same safety approach as in aviation, I’m sure this transition can be done fairly safely.

    • Chozo@kbin.social (+16/-4) · 5 months ago

      So their plan is to fix one accident at a time…

      Well how else would you do it?

      • bstix@feddit.dk (+32/-4) · 5 months ago

        You drive a car and can’t quite figure out what is happening in front of you.

        Do you:

        • A: Turn up the music and plow right through.
        • B: Slow down (potentially to a full stop) and assess the situation.
        • C: Slow down, close your eyes and continue driving slowly into the obstacle
        • D: Sound the horn and flash the lights

        From the description offered in the article the car chose C, which is wrong.

        • lengau@midwest.social (+18/-3) · 5 months ago

          Given the millions of global road deaths annually I think B is probably the least popular answer.

          • Tetsuo@jlai.lu (+1/-1) · 4 months ago

            Honestly slowing down too much can easily create an accident that didn’t exist in the first place.

            Not every situation can be handled by slowing down.

            If that’s the default behavior on a high-speed road, this could be deadly for the car behind you.

        • Chozo@kbin.social (+9/-8) · 5 months ago

          I wasn’t asking about the car’s logic algorithm; we all know that the SDC made an error, since it [checks notes] hit another car. We already know it didn’t do the correct thing. I was asking how else you think the developers should be working on the software other than one thing at a time. That seemed like a weird criticism.

          • bstix@feddit.dk (+13/-7) · 5 months ago

            Sorry, I didn’t answer your question. Consider the following instead:

            Your self-driving car has crashed into a god damn tow truck with a backwards-facing truck.

            Do you:

            • A: Program your car to deal differently with fucking backwards-facing trucks on tow trucks
            • B: Go back to question one and make your self-driving car pass a simple theory test.

            According to the article, the company has chosen A, which is wrong.

      • Turun@feddit.de (+1) · 4 months ago

        Ideally they don’t need actual accidents to find errors, but discover said issues in QA and automated testing. Not hitting anything sounds like a manageable goal to be honest.

  • cestvrai@lemm.ee (+63/-3) · 5 months ago

    Hmm, so it’s only designed to handle expected scenarios?

    That’s not how driving works… at all. 😐

  • Overzeetop@lemmy.world (+56/-5) · 5 months ago

    The description of an unexpected (impossible?) orientation for an on-road obstacle works as an excuse, right up to the point where you realize that the software should, explicitly, not run into anything at all. That’s got to be, like, the first law of (robotic) vehicle piloting.

    It was just lucky that it happened twice as, otherwise, Alphabet likely would have shrugged it off as some unimportant, random event.

    • dan1101@lemm.ee (+27/-7) · 5 months ago

      Billionaires get to alpha test their software on public roads and everyone is at risk.

      • nivenkos@lemmy.world (+7/-25) · 5 months ago

        It’s great though - that’s how you get amazing services and technological advancement.

        I wish we had that. In Europe you’re just stuck paying 50 euros for a taxi in major cities (who block the roads, etc. to maintain their monopolies).

        Meanwhile in the USA you guys have VR headsets, bioluminescent houseplants and self-driving cars (not to mention the $100k+ salaries!), it’s incredible.

          • Patches@sh.itjust.works (+3) · 4 months ago

            Bruh in the US of A the grass is greener because it’s made of polypropylene and spray painted green. Just don’t smell it, or look too hard.

        • vaultdweller013@sh.itjust.works (+15/-1) · 5 months ago

          Most of us are in poverty. I don’t know when it happened, but we’re in another gilded age, and just like the last one, underneath the gold is rusty iron.

        • BakerBagel@midwest.social (+10) · 5 months ago

          Yeah, it’s $40 for an Uber in Columbus or Cleveland as well. There isn’t a monopoly on taxis creating that price; that’s just how much it actually costs to rent a car for cross-city travel.

          If you want no regulations and the free market at the helm, you want to move to India. They have all the rules you love.

        • JungleJim@sh.itjust.works (+5) · 5 months ago

          Bioluminescent house plants are cool but as an American I can tell you right now that my luxury bones hurt.

          • nivenkos@lemmy.world (+2/-3) · 5 months ago

            I can tell you right now that my luxury bones hurt.

            That’s the same in Europe though; dentistry isn’t covered by public insurance in the UK, Spain, Sweden, etc.

            But we have even less net salary to cover it when there are problems.

            • JungleJim@sh.itjust.works (+4) · 5 months ago

              True, but your savings on non-luxury bones helps with the fees associated with luxury ones, I’m sure. I can’t do anything for my bones with a $30 glowing petunia.

        • RedFox@infosec.pub (+3/-1) · 4 months ago

          I appreciate/understand your envy. I’m not sure why everyone disagrees so much unless they have also lived under similar constraints.

          Unless sarcasm.

          I also agree it might be perception, or grass-is-greener thinking, like the other comment said 😉

    • LesserAbe@lemmy.world (+12/-2) · 5 months ago

      I didn’t read it as them saying “therefore this isn’t a problem”; it was an explanation for why it happened. Think about human explanations for accidents: “they pulled out in front of me,” “they stopped abruptly.” Those don’t make it ok that an accident happened either.

    • Bizzle@lemmy.world (+6/-1) · 5 months ago

      It should of course not run into anything, but it does need to be able to identify obstacles at the very least for crash priority when crazy shit inevitably happens. For instance, maybe it hits a nice squishy Pomeranian that won’t cause any damage to itself instead of swerving to avoid it and possibly totalling itself by hitting a fire hydrant.

      Or maybe it hits the fire hydrant instead of a toddler.

      At any rate, being able to identify an obstacle and react to unexpected orientations of those obstacles is something I think a human driver does pretty well most of the time. Autonomous cars are irresponsible and frankly I can’t believe they’re legal to operate.

    • ___@lemm.ee (+2) · 4 months ago

      It would have been a different article if two waymos decided to take a wrong turn off a cliff.

  • JCreazy@midwest.social (+62/-14) · 5 months ago

    I’m getting tired of implementing technology before it’s finished and all the bugs are worked out. Driverless cars are still not ready for prime time yet. The same thing is currently happening with AI: companies are utilizing it without having any idea what it can do.

    • corsicanguppy@lemmy.ca (+41/-13) · 5 months ago

      tired of implementing technology before it’s finished

      That is every single programme you’ve ever used.

      Software will be built, sold, used, maintained and finally obsoleted, and it will still not be ‘complete’. It will have bugs, sometimes lots, sometimes huge, and those will not be fixed. Our biggest accomplishment as a society may be the case where we patched software on Mars or in the Voyager probe still speeding away from Earth.

      Self-driving cars don’t need to have perfectly ‘complete’ software, though; they just need to work better than humans. That’s already been accomplished, long ago.

      And with each fix applied to every one of them, it’s a situation none of them should ever repeat. Can we say the same about humans? I can’t even get my beautiful, stubborn wife to slow down, leave more space, and quit turning the steering wheel in that rope-climbing way like a farmer on a tractor does (because the airbag will take her hand off).

      • dsemy@lemm.ee (+17/-8) · 5 months ago

        That is every single programme you’ve ever used.

        No software is perfect, but anybody who uses a computer knows that some software is much less complete. This currently seems to be the case when it comes to autonomous driving tech.

        And with each fix applied to every one of them, it’s a situation none of them should ever repeat.

        First, there are many companies developing autonomous driving tech, and if there’s one thing tech companies like to do, it’s re-invent the wheel (ffs, Tesla literally did this). Second, have you ever used modern software? A bug fix guarantees nothing. Third, you completely ignore the opposite possibility: what if they push a serious bug in an update, which drives you off a cliff and kills you? It doesn’t matter if they push a fix 2 hours later (and let’s be honest, many of these cars will likely stop getting updates pretty fast anyway once this tech gets really popular; just look at the state of software updates in other industries).

        • daed@lemmy.world (+11/-5) · 5 months ago

          I understand your issue with these cars: they’re dangerous, and could kill people with incomplete or buggy software. I believe the person you are responding to was pointing out that, even with the bugs, these are already safer than human drivers. That’s already the case when you look at the data rather than the headlines and how things seem.

          Personally, I would prefer to be in control of the vehicle at all times. I don’t like the idea of driverless tech either.

          • RedFox@infosec.pub (+3/-1) · 4 months ago

            Well, has anyone done good statistics to show whether self-driving cars as a whole are more dangerous than regular, distracted humans?

            We can always point to numerous self-driving car errors and accidents, but I am under the impression that, compared to the number of accidents involving people on a daily basis, self-driving cars might be safer even now.

            I’m thinking of how many crashes took place in the time it took me to type this out. I’m also curious about the fatality rate between self or assisted driving vs not.

            I think we tend to be super critical of new things, especially tech things, which is understandable and appropriate, but it would be nice to see some holistic context. I wish government regulators would publish that data for us, to help us form informed opinions instead of having to rely on manufacturers (conflict of interest) or journalists who need a good story to tell, and some clicks.

          • dsemy@lemm.ee (+2/-4) · 4 months ago

            Currently there are many edge cases which haven’t even been considered yet, so maybe statistically it is safer, but that doesn’t change anything if your car makes a dumb mistake you wouldn’t have made and gets you into an accident (or someone else’s car does, and they don’t stop it because they weren’t watching the road).

    • long_chicken_boat@sh.itjust.works (+26) · 5 months ago

      I’m against driverless cars, but I don’t think this type of error can be detected in a lab environment. It’s just impossible to test with every single car model or every real-world situation it will find in actual usage.

      An optimal solution would be to have a backup driver with every car who keeps an eye on the road in case of software failure. But, of course, this isn’t profitable, so they’d rather put lives at risk.

    • nooeh@lemmy.world (+16/-2) · 5 months ago

      How will they encounter these edge cases without real world testing?

    • LesserAbe@lemmy.world (+15/-2) · 5 months ago

      You’re right, there should be a minimum safety threshold before tech is deployed. Waymo has had pretty extensive testing (unlike, say, Tesla). As I understand it, their safety record is pretty good.

      How many accidents have you had in your life? I’ve been responsible for a couple of rear-end collisions, and I collided with a guard rail (no one ever injured). Ideally we want incidents per mile driven to be lower for these driverless cars than for human drivers. Waymos have driven a lot of miles (and millions more in a virtual environment), and supposedly their number is better than human driving, but the question is whether they’ve driven enough, and in enough varied situations, for that to really be an accurate stat.

      • Doof@lemmy.world (+1/-2) · 4 months ago

        I slightly tapped a car on my first day driving, that’s it. No damage. Not exactly a good question.

        Look at how data is collected for self-driving vehicles and tell me it’s truly safer.

        • LesserAbe@lemmy.world (+3/-1) · 4 months ago

          My point asking about personal car incidents is that each of those, like your car tap, show we can make mistakes, and they didn’t merit a news story. There is a level of error we accept right now, and it comes from humans instead of computers.

          It’s appropriate that there are stories about Waymo, because it’s new and needs to be scrutinized and proven. Still, it would benefit us to read these stories with a critical mind, not reflexively thinking “one accident, that means they’re totally unsafe!”, while at the same time not accepting at face value information from companies who have a vested interest in portraying the technology as safe.

    • nivenkos@lemmy.world (+10/-12) · 5 months ago

      That’s how you get technological advancement.

      Bureaucracy just leads to monopolies and little to any progress.

    • twack@lemmy.world (+24/-1) · 4 months ago

      Because Tesla was fixing significant safety issues without reporting them to the NHTSA in a way that would let it track the problems and the source of the issue. The two of them got into a pissing match, and the result is that now all OTAs are recalls. After this, the media realized that “recall” generates more views than “OTA”, and here we are.

      • Dlayknee@lemmy.world (+6) · 4 months ago

        I think it’s slightly more nuanced - not all OTAs are recalls, and not all recalls are OTAs (for Tesla). Depending on the issue (for Teslas), the solution may be pushed via an OTA in which case they “issue a recall” with a software update. They’re actually going through this right now. For some other issues though, it’s a hardware problem that an OTA won’t fix so they issue a recall to repair the problem (ex: when the wiring harness for their cameras was fraying the cables).

        This is 100% from the NHTSA shenanigans, though.

    • Chozo@kbin.social (+5/-2) · 4 months ago

      The fleet of cars is summoned back to HQ to have the update installed, so it causes a temporary service shutdown until cars are able to start leaving the garage with the new software. They can’t do major updates over the air due to the file size; pushing out a multi-gigabyte update to a few hundred cars at once isn’t great on the cellular network.

    • Kbobabob@lemmy.world (+5/-2) · 4 months ago

      What typically happens when a recall is issued for other vehicles? Don’t they either remove and replace the bad part or add extra parts to fix the issue?

      How is removing bad code and replacing it with good code or just adding extra code to fix the issue any different?

      Do you want to physically go somewhere?

      • filcuk@lemmy.zip (+6/-1) · 4 months ago

        Kinda, as the word implies. If it’s a software update, call it that; the car’s not going back to the shop/manufacturer.

          • Jakeroxs@sh.itjust.works (+1) · 4 months ago

            Here’s an example of why I don’t like that they’re called recalls when it’s just a system update: if you have a recall on a food item, is there some way to fix it aside from taking it back (to be replaced) or throwing it away?

            When there’s a security patch released for your phone, do we call it a recall on the phone? Or is that reserved for when there’s a major hardware defect (like the Samsung Note fiasco)?

            • Kbobabob@lemmy.world (+1) · 4 months ago

              I think the difference in the case you mentioned is that with a car they use “recall” because it could be dangerous to keep using it as-is.

              • Jakeroxs@sh.itjust.works (+1) · 4 months ago

                Fair, it just seems like there should maybe be a new word for this era where an OTA update is all that’s needed.

        • ShepherdPie@midwest.social (+1) · 4 months ago

          What if you consider it’s the software/firmware getting recalled and not the vehicle itself? Then it’s all perfectly cromulent.

    • MNByChoice@midwest.social (+3/-1) · 4 months ago

      They often are. Many recalls for other manufacturers are similar. They don’t actually buy back the cars and crush them.

  • tonyn@lemmy.ml (+29/-1) · 5 months ago

    That pickup truck was asking for it I tell ya. He was looking at me sideways, he was.

  • Chozo@kbin.social (+29/-1) · 5 months ago

    After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to “persistent orientation mismatch” between the towed vehicle and the one towing it.

    Having worked at Waymo for a year troubleshooting daily builds of the software, I think they may be trying to test riskier, “human” behaviors. Normally, the car won’t accelerate at all if the lidar detects an object in front of it, no matter what it thinks the object is or what direction it’s moving in. So the fact that this failsafe was overridden somehow makes me think they’re trying to add more “What would a human driver do in this situation?” options to the car’s decision-making process. I’m guessing somebody added something along the lines of “assume the object will have started moving by the time you’re closer to that position” and forgot to set a backup safety mechanism for the event that the object doesn’t start moving.

    I’m pretty sure the dev team also has safety checklists that they go through before pushing out any build, to make sure that every failsafe is accounted for, so that’s a pretty major fuckup to have slipped through the cracks (if my theory is even close to accurate). But luckily, a very easily-fixed fuckup. They’re lucky this situation was just “comically stupid” instead of “harrowing tragedy”.
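
    That theory, sketched purely as a hypothetical (the names and thresholds below are invented, not Waymo’s): an optimistic “the object will have moved by then” prediction is only safe while it stays wrapped in a hard range check that wins whenever the prediction turns out to be wrong.

```python
# Hypothetical sketch of the failure mode described above; invented names and
# thresholds, not Waymo code. The optimistic prediction must stay wrapped in a
# hard range check that overrides it when the object doesn't actually move.

def predicted_clear(obj_range_m, obj_speed_away, eta_s, buffer_m=5.0):
    # Optimistic planner: assume the object keeps moving away while we approach.
    return obj_range_m + obj_speed_away * eta_s > buffer_m

def commanded_speed(desired, obj_range_m, obj_speed_away, eta_s, hard_stop_m=3.0):
    if obj_range_m < hard_stop_m:
        return 0.0      # failsafe: lidar says it's right there, stop regardless
    if predicted_clear(obj_range_m, obj_speed_away, eta_s):
        return desired  # prediction says the gap will open up
    return 0.0

# Frame 1: the predictor wrongly believes the towed truck is pulling away at 3 m/s.
print(commanded_speed(desired=4.0, obj_range_m=6.0, obj_speed_away=3.0, eta_s=2.0))  # 4.0: keeps going
# Frame 2: the truck hasn't actually moved and the range has closed to 2.5 m.
# The hard range check overrides the prediction and stops the car; without it
# (the guessed bug), predicted_clear would still return True (2.5 + 3*1 > 5)
# and the car would keep going.
print(commanded_speed(desired=4.0, obj_range_m=2.5, obj_speed_away=3.0, eta_s=1.0))  # 0.0: failsafe stops the car
```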

    • GiveMemes@jlai.lu (+20/-17) · 5 months ago

      Get your beta tests off my tax dollar funded roads pls. Feel free to beta test on a closed track.

      • Chozo@kbin.social (+26/-4) · 5 months ago

        They’ve already been testing on private tracks for years. There comes a point where, eventually, something new is used for the first time on a public road. Regardless, even with idiotic crashes like this one, they’re still safer than human drivers.

        I say my tax dollar funded DMV should put forth a significantly more stringent driving test and auto-revoke the licenses of anybody who doesn’t pass, before I’d want SDCs off the roads. Inattentive drivers are one of the most lethal things in the world, and we all just kinda shrug our shoulders and ignore that problem, but then we somehow take issue when a literal supercomputer on wheels with an audited safety history far exceeding any human driver has two hiccups over the course of hundreds of millions of driven miles. It’s just a weird outlook, imo.

        • fiercekitten@lemm.ee (+2/-3) · 4 months ago

          People have been hit and killed by autonomous vehicles on public streets due to bad practices and bad software. Those cases aren’t hiccups, those are deaths that shouldn’t have happened and shouldn’t have been able to happen. If a company can’t develop its product and make it safe without killing people first, then it shouldn’t get to make the product.

          • Chozo@kbin.social (+3/-1) · 4 months ago

            People have been hit and killed by human drivers at much, much higher rates than SDCs. Those aren’t hiccups, and those are deaths that shouldn’t have happened, as well. The miles driven per collision ratio between humans and SDCs aren’t even comparable. Human drivers are an order of magnitude more dangerous, and there’s an order of magnitude more human drivers than SDCs in the cities where these fleets are deployed.

            By your logic, you should agree that we should be revoking licenses and removing human drivers from the equation, because people are far more dangerous than SDCs are. If we can’t drive safely without killing people, then we shouldn’t be licensing people to drive, right?

            • fiercekitten@lemm.ee (+2/-2) · 4 months ago

              I’m all for making the roads safer, but these companies should never have the right to test their products in a way that gets people killed, period. That didn’t happen in this article, but it has happened, and that’s not okay.

              • Chozo@kbin.social (+5/-1) · 4 months ago

                People shouldn’t drive in a way that gets people killed. Where’s the outrage for the problem that we’ve already had for over a century and done nothing to fix?

                A solution is appearing, and you’re rejecting it.

  • deafboy@lemmy.world (+17/-2) · 4 months ago

    “made contact” “towed improperly”. What a pathetic excuse. Wasn’t the entire point of self driving cars the ability to deal with unpredictable situations? The ones that happen all the time every day?

    Considering that driving habits differ from town to town, the current approaches do not seem viable for the long term anyway.

    • Argonne@lemmy.world (+3) · 4 months ago

      It’s as if they’re still in testing. This is many years away from being safe, but it will happen.

    • Meowoem@sh.itjust.works (+3/-1) · 4 months ago

      It’s a rare edge case that slipped through because the circumstances that cause it are obscure. From the description it was a minor bump, and the software was updated to try to ensure it doesn’t happen again - and it probably won’t.

      Testing for things like this is difficult, but looking at the numbers from these projects, testing is going incredibly well, and we’re likely to see moves towards legal acceptance soon.

  • indomara@lemmy.world (+18/-3) · 4 months ago

    I still don’t understand how these are allowed. One is not allowed to let a Tesla drive without being 100% in control and ready to take the wheel at all times, but these cars are allowed to drive around autonomously?

    If I am driving my car and I hit a pedestrian, they have legal recourse against me. What happens when it’s an AI, or a company, or a car?

    • kava@lemmy.world (+10/-1) · 4 months ago

      You have legal recourse against the owner of the car, presumably the company that is profiting from the taxi service.

      You see these all the time in San Francisco. I’d imagine the vast majority of the time, there are no issues. It’s just going to be big headlines whenever some accident does happen.

      Nobody seems to care about the nearly 50,000 people dying every year from human-caused car accidents

      • Cosmic Cleric@lemmy.world (+1/-4) · 4 months ago

        Nobody seems to care about the nearly 50,000 people dying every year from human-caused car accidents

        I would actually wager that’s not true. It’s just that the people we elect tend to favor the corporations and look after their interests more than the people who elected them, so we end up being powerless to do anything about it.

        • kava@lemmy.world (+6) · 4 months ago

          Sure, but why do these accidents caused by AI drivers consistently make the news, while we rarely see news about human-caused accidents? It’s because the news reports what is most interesting - not what’s accurate or representative of the real problems of the country.

          • ShepherdPie@midwest.social (+4) · 4 months ago

            Yeah same reason why a single EV fire is national news but an ICE fire is just an unnoteworthy, everyday occurrence.

    • Oka@lemmy.ml (+5) · 4 months ago

      The company is at fault. I don’t think there are laws currently in place that say a vehicle has to be manned on the street, just that it uses the correct signals and responds correctly to traffic, but I may be wrong. It may also depend on local laws.

  • rsuri@lemmy.world (+16/-1) · 4 months ago

    In a blog post, Waymo has revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane.

    See? Waymo robotaxis don’t just take you where you need to go, they also dispense swift road justice.