Murder on the highway

It was a cold winter evening, right in the middle of the festive season, when Gilberto met a girl around his age called Maria. After much pondering, he found the courage to ask her out, and she immediately accepted. A mix of adrenaline and excitement coursed through his body as they agreed on a date for their first outing: the 29th of December.

The anticipated day finally arrived; he picked Maria up in his Honda Civic and drove towards a surprise destination. As the car passed through an intersection in one of the suburbs of Los Angeles, a Tesla Model S ran the red light at high speed. It crashed into their car, killing them on the spot and erasing their dreams forever.

It was a terrible calamity, and an avoidable one, if only Kevin, the driver of the Tesla, had been careful. The police conducted the necessary inquiries and charged him with manslaughter, with prosecutors arguing that his actions were reckless. But in a surprising twist, Kevin is pleading not guilty.

His line of defense is relatively straightforward. He was driving a Tesla Model S, one of the cars with Autopilot built in. Even though the inquiry ascertained that he had his hands on the steering wheel, he claims he was not driving the car. The Artificial Intelligence (AI) of the Tesla Model S was calling the shots. The computer decided to run the red light and crash at full speed into the other car.

Such a tragedy is not the first of its kind; since 2016, more than 27 crashes involving Autopilot have resulted in 11 fatalities. Considering that almost 800,000 vehicles are equipped with Autopilot, the number of accidents seems small. However, this case differs from all the others because Kevin is the first person to be charged with a felony. The court documents do not mention Autopilot even though it was in use. Furthermore, the victims’ families, who are suing both the driver and Tesla in two separate proceedings, claim that the car “suddenly and unintentionally accelerated to an excessive, unsafe, and uncontrollable speed.” The question now is who will bear the responsibility.

Tesla doesn’t seem to want to take any; the company insists that drivers should not misuse the Autopilot function. They must not fall into the trap of automation complacency, whereby they give the AI total control of the vehicle. An official Tesla spokesman even stated that Autopilot should only be used by a fully attentive driver. But that guidance is widely ignored: a recent survey of Tesla owners found that almost half of them feel comfortable treating their vehicles as fully autonomous. And if they didn’t, what would be the point of owning a car with self-driving capabilities in the first place?

So if the driver technically wasn’t driving, and since the manufacturer is not taking any responsibility, who should bear the blame?

One might blame the AI. But the AI is not a legal entity and cannot be formally charged. Furthermore, Autopilot is not a typical software program. Even though a team of AI developers wrote it, the decision-making system of the software evolves over time. This kind of software uses machine learning algorithms, meaning the more experience the car accumulates, the more sophisticated it becomes.

These cars do not rely only on their own experience; they also share the driving experience of other self-driving vehicles. So imagine the 800,000 cars mentioned earlier: they each collect their own driving experiences and share them amongst themselves to improve the driving program. It is estimated that, today, such vehicles boast more than 60 years of continuous driving experience, a figure that grows daily. This makes it very hard to pinpoint who is responsible for this tragedy.

We still need to answer another open question: was this an accident or a murder? An accident can be defined as an unfortunate event that happens unexpectedly and unintentionally, usually resulting in damage, injury, or even death, as in this case. However, if we place the car’s operations under the microscope, it most probably detected the other vehicle, and rather than braking, it decided to press the accelerator. Considering that such vehicles make hundreds of decisions per second, it could have chosen to swerve, but it didn’t. Its decision to smash into the other car suggests premeditation, which is one of the conditions for murder. However, there was no malicious intent behind this decision, so it probably falls under manslaughter (the killing of a human being without malice aforethought).

Of course, these are just conjectures; in the end, it will be up to the court to decide the best course of action to ensure that justice prevails and bring some peace to the victims’ families.

What’s for sure is that this case will not be the last one but the first of many. Hence, all eyes are on the impending judgment, because it will set a precedent and will probably shape the development of self-driving cars in the coming years.
