
Self-Driving Cars: In a Crash, Do They Favor Occupants or Pedestrians?

Could something horrifying like this really happen? Absolutely.

Nelson Ireson, Writer | Manufacturer, Photographer

It's time for another short trip into the future, once again to investigate the effects self-driving cars might have on the people expected to use them, and on everyone else. Whether your self-driving car likes you or not, some aspects of implementing self-driving cars are likely to raise issues we, as a society, will have difficulty even addressing coherently, let alone solving. One of those problems is how we tell our cars to behave when a situation has no good outcomes, such as when a self-driving car must choose between striking a pedestrian to protect its occupant(s) or putting its occupant(s) at risk to save the pedestrian.

Yeah, yeah, you've heard this one before. Maybe you have, but probably not quite like this: What if all of the choices aren't left up to the car? What if the car can be told to give customized weights to its various moral concerns? It's not such a crazy idea; in fact, it's not even my idea. It's a plot element of the Amazon Prime show Upload.

In Upload, set in the near future, people have discovered a way to live on after death in a digital afterlife—and they've also figured out self-driving cars. In one of the pivotal scenes of the first episode, the main character's fiancée seeks to protect him by switching his car from "prefer pedestrian" (his preferred mode, being a considerate guy) to "prefer occupant" mode. There's not much explanation, but the names are self-explanatory: in the event of a quandary, the car will choose to protect either the person on foot or the person in the car.

Such a decision obviously requires moral input, so it's not inconceivable that some combination of our litigious society, our polarized legislative system, and corporate risk-avoidance might yield self-driving cars that handle the mundane stuff but leave the big decisions to the meat computers inside. In fact, barring truly unfathomable advances in AI, this may be the only route forward, as it would be impossible to write every circumstance a car might encounter into its programming.
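To make the idea concrete, here's a purely hypothetical sketch of what such a preference toggle might look like as decision logic. Nothing here reflects any real vehicle's software; the mode names are borrowed from Upload, and every function, class, and number below is invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical evasive maneuver the planner could choose, with estimated
# harm scores (0 = no harm, 1 = near-certain fatality) for each party.
@dataclass
class Maneuver:
    name: str
    occupant_harm: float
    pedestrian_harm: float

def choose_maneuver(options: list[Maneuver], mode: str) -> Maneuver:
    """Pick the least-harm maneuver, weighting by the owner's setting.

    'prefer_pedestrian' weights pedestrian harm heavily;
    'prefer_occupant' does the reverse. The weights are arbitrary.
    """
    if mode == "prefer_pedestrian":
        w_occ, w_ped = 1.0, 10.0
    elif mode == "prefer_occupant":
        w_occ, w_ped = 10.0, 1.0
    else:
        w_occ = w_ped = 1.0  # no preference: treat all harm equally

    return min(options, key=lambda m: w_occ * m.occupant_harm
                                      + w_ped * m.pedestrian_harm)

# The intersection scenario from this article, with made-up estimates:
options = [
    Maneuver("brake and continue", occupant_harm=0.05, pedestrian_harm=0.9),
    Maneuver("swerve into cross-traffic", occupant_harm=0.3, pedestrian_harm=0.0),
]

print(choose_maneuver(options, "prefer_pedestrian").name)  # swerve into cross-traffic
print(choose_maneuver(options, "prefer_occupant").name)    # brake and continue
```

The unsettling part isn't the math, which is trivial; it's that flipping one string changes who gets hurt, and in this framing the weights belong to the owner rather than the manufacturer or the law.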

So, what of it? You might be one of the people who, like the fiancée in Upload, have no compunction about flipping the switch to "prefer occupant" mode. Or you might have a kinder heart and keep your self-driving car in "prefer pedestrian" mode. But in either case, you have to wonder what the aftermath of a crash would look like in which that choice actually came into play.

Let's run through both scenarios with a common setup: The self-driving car is on a 50-mph surface street in a major city, approaching an intersection on a rainy day. The light is green in the car's direction of travel, multiple pedestrians are waiting on the corner, and cross-traffic is stopped at the light. The car's occupant is catching up on much-needed sleep during the commute.

Prefer Pedestrian:

As the self-driving car approaches the intersection, a handful of people, not noticing the oncoming car, decide to hurry across the street and out of the rain without waiting for the walk signal. Suddenly faced with a choice between barreling through this small crowd and swerving into the skyscrapers lining the street, the car doesn't hesitate, immediately taking evasive action to avoid the people on foot. In doing so, it swerves not into the skyscrapers but into the cross-traffic lanes, T-boning a car at 50 mph. The jaywalking pedestrians escape uninjured, but the occupant of the law-abiding car dies. The occupant of the subject car, after a rude awakening, survives with minor injuries.

Prefer Occupant:

As the self-driving car approaches the intersection, the same handful of people scamper across the street to escape the rain. The self-driving car takes notice of the pedestrians, and possibly even sounds an alert to the occupant in the cabin (bumpy road ahead!), but given its prior directive, it simply carries on. The occupants of the cars waiting at the light on the cross street (those without privacy mode engaged and not otherwise distracted, at least) are treated to the horrifying sight of five people being struck and fatally launched into the air as the self-driving car that hit them continues on its merry way, its occupant's sleep only momentarily disturbed.

This is obviously a nightmare scenario, but it's far from inconceivable. Equally nightmarish is assigning responsibility for the outcome of either option. In the Prefer Pedestrian case, the law-abiding driver's death certainly wasn't his or her own fault, but should the blame fall on the pedestrians, the occupant of the subject car, or the car itself? Certainly, the pedestrians are the rule-breakers here, and their rule-breaking is what caused the crash. But the pedestrians didn't steer the car into waiting traffic; the self-driving car did, and it did so in accordance with its occupant's stated preference.

In the second case, you might argue even more forcefully that the pedestrians took their lives into their own hands when they decided to jaywalk, taking any blame off of the self-driving car or its occupant, especially since no uninvolved third parties were injured or killed. But, as the Prefer Pedestrian scenario shows, this wasn't the only possible outcome; it was the outcome predetermined by the occupant's preference to save their own skin.

If you're looking for an answer to who's to blame in each scenario, I don't have one for you, and I'm not sure we can come to an answer as a society, either, at least not quickly and easily, and certainly not cheaply. We can't even come to a consensus on whether to wear masks during a pandemic (and we weren't able to a century ago, either), a decision not all that different, morally speaking, from the one presented here.