Would you buy a car that chose to murder you?

You are travelling along a single-lane mountain road in an autonomous car that is fast approaching a tunnel. Just before you enter, a child runs out and trips in front of you.

The car makes a split-second decision based on cold logic: the child has more years of life ahead than you do, so it sacrifices itself, smashing into the wall of the tunnel and saving the child. You are killed instantly.

Would you buy a car that made that decision?
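
Stripped of the drama, the car’s choice boils down to a crude bit of arithmetic: estimate each party’s remaining years and sacrifice whoever has fewer. Here is a minimal sketch of that logic in Python; the life-expectancy figure, the names and the policy itself are all hypothetical, not anything a manufacturer has published:

```python
# Hypothetical sketch of a crude 'longest lifespan wins' policy.
# Nothing here reflects any real manufacturer's code.

AVERAGE_LIFE_EXPECTANCY = 81  # assumed average, in years

def years_remaining(age: int) -> int:
    """Rough expected years left, floored at zero."""
    return max(AVERAGE_LIFE_EXPECTANCY - age, 0)

def choose_casualty(passenger_age: int, pedestrian_age: int) -> str:
    """Sacrifice whichever party has fewer expected years left."""
    if years_remaining(pedestrian_age) > years_remaining(passenger_age):
        return "passenger"   # swerve into the tunnel wall
    return "pedestrian"      # brake hard but hold the lane

# The tunnel scenario: a 40-year-old passenger, a child of 7.
print(choose_casualty(passenger_age=40, pedestrian_age=7))  # -> passenger
```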

In March this year, 63% of Metro readers said they would buy a driverless car. But perhaps they didn’t know that driverless car manufacturers are facing exactly that dilemma: when a collision is unavoidable, should the pedestrian die, or the passenger?

Research suggests around three-quarters of people are in favour of sacrificing the passenger to save more lives.

However, the same research shows that when it is made explicit that you are the passenger who will be killed, people are more likely to opt for the ‘selfish’ option.

Or, as Spock and Kirk famously discussed, ‘The needs of the many outweigh the needs of the few – as long as it’s not my arse on the line.’ Well, something like that, anyway.

You may recognise the dilemma as a version of the Trolley Problem, an ethical brainteaser that has been twisting the minds of philosophers since Philippa Foot first posed it in 1967.

And it’s a Catch-22: if fewer people buy autonomous cars because they are programmed to sacrifice their owners, more people will inevitably die, because human error accounts for more than 90% of accidents in the vehicles we currently drive.

One major problem: who will ultimately play God?

The first driverless cars will still require you to hold a driving licence, and will presumably put the onus on you to take back control in an emergency.

However, tests have already shown, unsurprisingly, that people do not pay attention to the road when they are in an autonomous vehicle.

Eventually, when prices come down, everyone will be able to afford a driverless car. All cars will be meshed, communicating with each other over 5G. It is then that you are likely to lose control, to either the programmers or the Government.

Would you trust these agents to have your life in their hands?

There are also other scenarios to consider.

What about terrorists hacking your car? Or rich people paying to have an override button and better tech to save themselves at the expense of your life? Or people who deliberately jump in front of your driverless car, forcing it to save their lives and sacrifice yours?

And will cars start making moral judgements, weighing one human life against another?

For instance, would the car save the brain surgeon but kill the unemployed man? Would we all have to be ‘rated’, like that creepy Black Mirror episode?

Maybe new road-safety laws will be introduced.

Research suggests people believe that if you are following the rules of society, then you shouldn’t be the one who is killed.

For example, jaywalking could become a crime here, as it is in the States. If a pedestrian flouted the law, the car could make a snap judgement accordingly and sacrifice the transgressor, perhaps after weighing that against some kind of personal rating, as sketched below.
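
To make that concrete, here is a hypothetical sketch combining the two ideas floated above: a Black Mirror-style personal ‘rating’ and a penalty for breaking road laws. Every name, weight and number is invented purely for illustration; no such system exists:

```python
# Hypothetical 'who does the car save?' scoring, for illustration only.
from dataclasses import dataclass

@dataclass
class Party:
    label: str
    rating: float        # imagined social score, 0 to 5
    obeying_law: bool    # e.g. not jaywalking

JAYWALKING_PENALTY = 0.5  # arbitrary multiplier for rule-breakers

def survival_priority(party: Party) -> float:
    """Higher score means the car tries harder to save this party."""
    score = party.rating
    if not party.obeying_law:
        score *= JAYWALKING_PENALTY
    return score

passenger = Party("passenger", rating=3.8, obeying_law=True)
jaywalker = Party("pedestrian", rating=4.6, obeying_law=False)

# The car spares whoever scores higher; the law-abiding passenger wins
# here even though the pedestrian's raw rating is better.
saved = max(passenger, jaywalker, key=survival_priority)
print(saved.label)  # -> passenger
```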

One argument is that other technology will have caught up by the time cars are meshed.

For instance, will we see ejector seats? Or will our cars fill with a breathable foam on impact – like in the film Demolition Man? Will driverless cars be on a grid, with no pedestrians to complicate matters?
