So about that Uber, which we shall now call Christine…

Oscar Gordon

A Navy Turbine Tech who learned to spin wrenches on old cars, Oscar has since been trained as an Engineer & Software Developer & now writes tools for other engineers. When not in his shop or at work, he can be found spending time with his family, gardening, hiking, kayaking, gaming, or whatever strikes his fancy & fits in the budget.

39 Responses

  1. Road Scholar says:

    I saw this and it adds an important data point to the discussion we had here earlier. What we have here is a Bayesian situation: an algorithm that takes in various data streams and has to output a binary decision. (Actually, multiple algorithms simultaneously looking for different things.) And like any other Bayesian test situation, the algorithm will have a certain likelihood of committing Type I and Type II errors, with the overall goal of tuning it to keep both types of errors within acceptable bounds. False positives make for an unpleasant user experience, while false negatives result in injury and death. At the current stage of development, tuning the algorithm to an acceptable rate of false negatives renders the system unacceptable from a user-experience and consumer-acceptance perspective. Uber is desperately trying to monetize its technology investment and just got ahead of itself, with tragic results.
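Road Scholar’s framing can be made concrete with a toy sketch. Nothing here is Uber’s actual pipeline; the scores and thresholds below are invented purely to show how a single decision threshold trades Type I errors (phantom braking) against Type II errors (missed hazards):

```python
# Toy binary hazard detector: scores above a threshold mean "brake".
# The threshold trades false positives (phantom braking, bad UX)
# against false negatives (missed hazards, injury and death).

def error_rates(hazard_scores, clear_scores, threshold):
    """Return (false_negative_rate, false_positive_rate) at a threshold."""
    fn = sum(s < threshold for s in hazard_scores) / len(hazard_scores)
    fp = sum(s >= threshold for s in clear_scores) / len(clear_scores)
    return fn, fp

# Invented detector scores (0..1) for real hazards and for clear road.
hazards = [0.9, 0.8, 0.75, 0.6, 0.4]
clear = [0.1, 0.2, 0.3, 0.5, 0.55]

# A strict (low) threshold misses nothing but brakes for ghosts;
# a lax (high) threshold rides smoothly but misses real hazards.
fn_low, fp_low = error_rates(hazards, clear, 0.35)    # (0.0, 0.4)
fn_high, fp_high = error_rates(hazards, clear, 0.65)  # (0.4, 0.0)
```

Sweeping the threshold traces out exactly the tradeoff described above: tighten it and the ride gets jumpy; loosen it and real hazards slip through.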

    • Oscar Gordon in reply to Road Scholar says:

      @road-scholar

      Yep. Honestly, while it might make some sense for a company like Uber to be involved in the development of an autonomous vehicle, they shouldn’t be trying to do it themselves. They don’t have the right kind of development experience or corporate culture. Sure, Alphabet isn’t exactly who I would think of to run development of autonomous vehicles either, but they seem to be much more careful and conservative in their approach.

    • Morat20 in reply to Road Scholar says:

      Uber had also moved on to cost-cutting measures — stripping out sensors and other moves that make sense once you’ve proven the system and are now pruning it down to what you really need to run it.

      So Uber was ahead of itself in yet another area, because their self-driving system was nowhere near mature even before this accident.

      And lastly, I’ve seen strangely little on any impact their losing that court case might have had — after all, they jump-started their self-driving car program by poaching a bunch of Google IP, and then lost in court over it. I suspect they had to do some reinventing of the wheel, and I can’t help but wonder how thoroughly they tested their replacements.

    • PD Shaw in reply to Road Scholar says:

      Yeah, but I think this is also a problem:

      According to Uber, the developmental self-driving system relies on an attentive operator to intervene if the system fails to perform appropriately during testing.

      It’s marketed (or will be) as a convenience item (leave the driving to us), but it’s dependent on an attentive driver whom the technology naturally lulls into complacency.

      • Oscar Gordon in reply to PD Shaw says:

        That is what I meant when I said:

        …until you’ve got all the bugs worked out, you annoy the hell out of the operator and you risk a jerky ride (the operator is getting paid, right?).

        If the operator needs to be attentive, then you keep them attentive.

  2. gabriel conroy says:

    That said, I am glad the report notes that the pedestrian was not helping herself by crossing in the dark, wearing dark clothes, and having no side reflectors on her bike. There is plenty of fault to go around.

    That’s probably true, but it sounds off to me. What if the pedestrian had no side reflectors but wore bright clothes, or had side reflectors but wore dark clothes? Would there still be “plenty” of fault on the pedestrian (as well as Uber), or would it be a lesser degree of fault?

    I admit I’m being contentious and I admit that I tend to side with the pedestrian even though I know full well that pedestrians aren’t always right just for being pedestrians. But still, this pedestrian didn’t have a say as to whether Uber would be doing a test. Maybe another car would’ve hit her, but in this case it was the Uber car and not “another car.”

    • Oscar Gordon in reply to gabriel conroy says:

      @gabriel-conroy

      Had the pedestrian done something to make themselves more visible to the approaching car, the operator might have noticed something further out and been able to take action by grabbing the wheel and swerving. Additionally, being more visible might have caused the AI to achieve recognition of the object it had detected sooner, and caused the AI to slow down or evade.

      • Stillwater in reply to Oscar Gordon says:

        Autonomous car AI cannot fail, it can only be failed.

        • Oscar Gordon in reply to Stillwater says:

          Oh no, the AI failed, because it had 6 seconds to do something besides call a cout<< command.

          As I note to Mark, had a human been solely in control, the accident probably still would have happened, so let's not kid ourselves that your average driver would have done better.

          • Stillwater in reply to Oscar Gordon says:

            This is what you wrote: Additionally, being more visible might have caused the AI to achieve recognition of the object it had detected sooner, and caused the AI to slow down or evade.

            AI cannot fail, dude.

            Add: I think you’re my new enemy at OT, Oscar. 🙂 Sorry about that, but I didn’t make you the evil-AI advocate in these parts.

            • Oscar Gordon in reply to Stillwater says:

              Dude, the AI failed. Or, rather, Uber* failed to give the AI what it needed to avoid an accident it had plenty of lead time to avoid.

              My point regarding the culpability of the pedestrian is fleshed out below.

              *The AI is a construct, thus the failure is on those who built it.

          • As I note to Mark, had a human been solely in control, the accident probably still would have happened, so let’s not kid ourselves that your average driver would have done better.

            With dragonfrog’s reservations below, I’m inclined to agree. (I also sign on to the spirit of his comment about civil engineering, with the proviso that I’m ill-equipped to judge that kind of thing.)

            I will say that if Uber hadn’t been testing that thing, the car wouldn’t have been driven at all at that particular time/date. It’s not as if the driver was going to go wherever they were going and just decided to let Uber do it. In that sense, the test caused an accident that might very well not have happened.

            I realize my reasoning can be taken to ridiculous extremes when it comes to assigning fault in things like these. I also realize that part of your own critique of what happened, if I understand it right, is that Uber was doing the test when it wasn’t ready, or when it didn’t have the tools to do it right. So I don’t think I’m saying anything you’re not, even though I am pushing back on the “plenty of fault to go around” framing.

      • I’m primarily complaining about the “[t]here is plenty of fault to go around” way of looking at it, even though there is fault to go around. In that sense, I was complaining mostly about the “plenty.” It seems like a claim that “both sides were to blame, so it’s all good.” You’re definitely not saying that, but that’s the way the wording came across to me. And I was pushing back against that (probably mostly uncharitable) way of reading that wording.

        And I do get the point you elaborated in your comment. The pedestrian definitely played a role in what happened.

    • dragonfrog in reply to gabriel conroy says:

      The urban design. Holy crap the urban design. That is where the “plenty of blame to go around” goes.

      Seriously, go look at the spot where the lady was killed in Google Maps. Look at the satellite view, go through it in street view, turn the viewpoint so you’re looking at it as a driver would and as a person on foot would. Trace out where the paths go, where the built environment has a fence or barrier that says “don’t walk here”, where it has a well-maintained path that says “totally walk here”, where it says “drive fast here, there’s no need to worry about people walking”.

      The city built a beautiful brick footpath through the median, connecting footpaths on either side of the road that led to that spot. To say that the victim was “jaywalking” is technically true, but absolutely everything about what a person on foot saw would lead them to believe they were not jaywalking.

      I cannot even imagine how that built environment came about through simple negligence and incompetence. It is truly so dangerous, and so conflicted in the messages it sends depending on how and from what direction you approach it, that it suggests malice verging on attempted murder by civil engineering.

  3. Mark Van H says:

    That said, I am glad the report notes that the pedestrian was not helping herself by crossing in the dark, wearing dark clothes, and having no side reflectors on her bike. There is plenty of fault to go around.

    Except that a human driver still would have spotted the pedestrian. While it might be good sense for pedestrians to cover themselves in reflective clothing and lights, it doesn’t change the fact that there would not have been an accident without a reckless autopilot program. So the fault is squarely Uber’s.

    And on a related note: if Uber has the ambition to ever operate self-driving vehicles in continental Europe, they had better fix their algorithms. In most countries, whatever the circumstances, in a collision between a car and a pedestrian, the driver is automatically at fault.

    • Oscar Gordon in reply to Mark Van H says:

      If you watch the video from the car, there is no way a human would have avoided that accident. The pedestrian was in shadow with nothing bright or reflective. By the time her shoes were painted by the headlights, it was just over a second to impact.

      The reason Uber screwed up is that the LIDAR system had detected something 6 seconds out, and the AI didn’t do anything about it except (it would seem) flash a message on a screen.
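Six seconds is an enormous lead time at city speeds. A back-of-envelope sketch makes the point; the 40 mph travel speed, 0.5 g braking, and one-second reaction allowance here are illustrative assumptions, not figures from the report:

```python
# Back-of-envelope check of the margin a 6-second detection gives.
# Assumed numbers: ~40 mph travel speed, 0.5 g braking, 1 s reaction time.

MPH_TO_MS = 0.44704  # miles per hour to meters per second
G = 9.81             # gravitational acceleration, m/s^2

def stopping_distance(speed_ms, decel_ms2, reaction_s=1.0):
    """Distance covered while reacting, plus braking distance v^2 / (2a)."""
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)

speed = 40 * MPH_TO_MS                       # ~17.9 m/s
warning_distance = speed * 6                 # detected ~6 s out: ~107 m
needed = stopping_distance(speed, 0.5 * G)   # ~50 m to stop
```

Under those assumptions the car had roughly twice the distance it needed to stop completely, before even considering a swerve. The six seconds weren’t insufficient; they were squandered.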

      • Dark Matter in reply to Oscar Gordon says:

        If you watch the video from the car, there is no way a human would have avoided that accident. The pedestrian was in shadow with nothing bright or reflective. By the time her shoes were painted by the headlights, it was just over a second to impact.

        No, that’s just the view from the Uber dash camera which apparently wasn’t set correctly.

        Some YouTubers posted what the street is actually like at night and it’s VERY well lit with street lamps everywhere. A human totally would have caught it, even with someone wearing black.

        • Dark Matter in reply to Dark Matter says:

          Here we go. The location of the accident is at about 33 seconds in; he took this video to show how well lit Mill Ave is.

          https://www.youtube.com/watch?v=CRW0q8i3u6E

          • Oscar Gordon in reply to Dark Matter says:

            Again, what are they using for a camera, and how is it configured? My cell phone camera won’t give me that kind of video, but my Nikon sure will.

            But I’ll go back to the report. If the investigators at the scene felt that the area was sufficiently illuminated such that what the pedestrian was wearing, and what kinds of lights or reflectors were on the bike were not relevant, why would they mention what the camera saw, unless it was, in their opinion, relevant to the incident?

            I’ll tell you what, at the end of June, I am moving to the Phoenix area. I’ll drive by that area about 10PM some night and tell you my impression of how well lit it is.

            • Dark Matter in reply to Oscar Gordon says:

              I’ll tell you what, at the end of June, I am moving to the Phoenix area. I’ll drive by that area about 10PM some night and tell you my impression of how well lit it is.

              :Thumbs Up:

      • dragonfrog in reply to Oscar Gordon says:

        If you watch the video and assume that the camera’s light sensitivity is similar to a human eye’s, that’s true.

        I have also seen nighttime photos of the same location that make it seem far, far more likely that a human would have seen the victim in time to stop, or at least slow down to a less lethal speed.

  4. PD Shaw says:

    Still no explanation of the mysterious “brick landscaping in the shape of an X.” My best guess is that it was initially created to temporarily merge traffic while one of the bridges was being renovated. Maybe bricked over after it was done. Someone here hammered on this issue initially, because it looks like something intended to be used to cross the road.

    • dragonfrog in reply to PD Shaw says:

      That was me, or at least included me. It looks for all the world like a foot path through the median. If that wasn’t the intent, they did a great job disguising it as a foot path – particularly as it is right at the spot where a long foot path intersects the road.

  5. Doctor Jay says:

    I’m with Oscar’s evaluation that Uber really has no business carrying out this kind of research. I think it’s seductive in that it looks so simple to manage, and maybe just a question of getting enough data to train the ML brains.

    So people are out there cutting every corner in a race to see who gets enough training in first. While I don’t absolve Uber, I note that this is just a manifestation of the daily high-stakes road race that is the tech industry.

    It’s done a lot for me, and for the world, but the tech industry is addicted to speed, and I mean the velocity kind. The amphetamine kind is mostly submerged, if not completely absent. It is a treadmill with a lottery, keeping everyone running just for a chance at a ticket.

    At some companies they at least question that, and wonder if there isn’t something else to life. But not, from what I’ve heard, at Uber.

    • Oscar Gordon in reply to Doctor Jay says:

      Morat is, I believe, on the right track: Uber got caught with their hand in Alphabet’s cookie jar, and now they’re scrambling (probably) to satisfy investors and Wall Street prognosticators, to avoid having their value shot in the knee.

      • Doctor Jay in reply to Oscar Gordon says:

        I used to be an Autonomous Vehicle company like you, but then I took a lawsuit in the knee.

        *****

        Yeah, I think Morat’s point works, but that fits with my point, which is that Uber represents the cocaine-snorting version of Silicon Valley, which believes that if an asteroid lands on your head, it just proves you weren’t being enough of a badass. I mean, why weren’t you landing on that asteroid’s head, instead?

        Not everybody here is like that. But we know those people. We know them well.

  6. Oscar Gordon says:

    Regarding what the camera saw and the pedestrian’s visibility:

    1) What a camera sees and what a human eye sees are, of course, not typically the same thing. A camera can be configured to see less than what we see, or a great deal more. So while @dark-matter may be correct that the dashcam was configured improperly and was not light-sensitive enough, we shouldn’t trust pictures on the internet either, unless we know exactly what camera was being used and how it was set up.

    2) Even if the dashcam was configured improperly, that doesn’t mean a human driver would have seen the pedestrian. Remember, a driver is sitting behind a curved piece of multilayered glass (that might be dirty) set at an angle relative to the field of view. Anyone want to sit down and calculate the angles of reflection and refraction and guess how well a human would see a person crossing beyond the area of illumination while the glare of a street light was above and in front of the car? There is a REASON we tell bikes and pedestrians to use lights and reflectors and to only cross at marked places, and it’s not because drivers are inattentive at night.

    3) Had this been a case of a human driver, I can pretty much guarantee the police would not fault the driver, specifically because the pedestrian was poorly illuminated and/or reflective and not in a designated crossing. If you cross a street at night in dark clothing and without any kind of light or reflective material/devices, and you get hit, you own some of that fault. Nothing compels you to cross that street at that spot*. Nothing prevents you from waiting until the way is sufficiently clear that you can cross safely. If you cross, at night, without something bright or reflective, and assume that the oncoming car will see you and slow down or evade… well, we all know what happens when you assume.

    *However, I continue to agree with dragonfrog that the urban planner who designed that median also owns some of the fault, for creating the appearance of a crossing where none exists. This is the inverse of ‘Desire Paths’.

    • Thanks for this explanation. I still bristle a bit because I think drivers often (how often, I don’t know) aren’t held as responsible as they should be.

      However, I can sign on to this: “If you cross a street at night in dark clothing and without any kind of light or reflective material/devices, and you get hit, you own some of that fault.”