Self-driving cars are slowly becoming a reality. With companies like Tesla and Volvo working tirelessly on self-driving software, their cars are steadily climbing the levels of driving automation. (SAE defines six levels of driving automation, from 0 to 5.)
Although we are still a few years away from a completely self-driving car that can travel from point A to point B without any human intervention, we need to start resolving the legal issues that self-driving cars will bring.
Why are self-driving cars important?
Self-driving cars are important because they have the potential to significantly reduce crashes, prevent injuries, and save lives. According to NHTSA, about 94% of serious road accidents involve human error. So replacing traditional human-driven cars with autonomous self-driving cars could revolutionize transportation all over the world.
Self-driving cars will also save time and increase productivity. Because autonomous cars follow optimized, coordinated routes, they could greatly reduce traffic jams and thus save a lot of time. Passengers could use the travel time to catch up on sleep or finish an assignment, either of which increases productivity.
They would also benefit the environment by cutting the greenhouse gases that cars emit while stuck in traffic. Even for electric cars, smoother autonomous driving would reduce the energy consumed in moving from one place to another, which is good for the environment.
What are some problems with self-driving cars?
There are numerous problems that self-driving cars would raise. Even if we set aside their negative economic impact, such as the loss of driving jobs, plenty of problems still need to be resolved well before self-driving cars become a reality.
Consider a situation: a completely self-driven car is travelling along a road and approaches a curve. As it rounds the curve, it realizes that an old lady is standing in its way. If the car swerves into the other lane, it will hit a 5-year-old kid. And if it tries to avoid both of them, it will hit a wall and kill the passenger. There is no time to brake to a stop. What will the car do?
Will it kill the old lady because she is already old and has lived her life? Will it kill the kid, who was never in danger in the first place? Or will it hit the wall, killing the passenger who trusted the car with his life?
If the company programs the car to always save the pedestrian in such a situation, will anyone ever get into a car knowing that their death is assured if such a situation arises? Conversely, will the company be sued by every pedestrian injured because it programmed the car to save the passenger rather than the pedestrian? Who will be responsible for a death? How will the car decide whose life is more important, and on what parameters?
This is an ethical dilemma that manufacturers need to resolve before a single such car hits the road.
This hypothetical also assumes that cars will be able to process that many permutations and combinations in a very short period of time, which is itself a technical challenge that must be addressed before self-driving cars become a reality.
What legal issues do self-driving cars raise?
Various legal issues need to be addressed before putting self-driving cars on the road.
1. Liability
If a self-driving car hits a pedestrian or another self-driving vehicle, who is going to be liable? Under the laws existing right now, the driver is responsible for an accident. But when fully autonomous cars hit the road, there will be no driver in the car. The current trend with Tesla in the United States suggests that liability for accidents involving autonomous cars may shift towards the manufacturer.
In a case in Texas, a Tesla was involved in a crash. The driver claimed that he had Autopilot engaged and was not driving himself. Even though Tesla's Autopilot is a Level 2 driver-assistance system, which requires the driver to pay attention at all times, the company was sued for the injury caused by its Autopilot software.
If a driver is intoxicated at the wheel of a self-driving car, will he be held liable for the crash even though he is technically not “driving” the car?
And if an accident takes place due to a wrong decision made by the car, to what extent will the manufacturer be liable?
These questions need to be answered by a series of carefully and justly drafted laws.
2. Data Protection
The autonomous cars being developed right now are based on machine learning. In layman's terms, the car is exposed to almost every situation it may encounter on the road and is taught how to respond to each one. The AI therefore processes a huge amount of data in order to drive.
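To make that layman's description concrete, here is a toy sketch, invented purely for illustration and in no way a real driving system: the "car" memorizes a handful of labelled scenarios and, when it meets a new one, copies the response of the most similar scenario it has already seen. All features, numbers, and action labels are made up.

```python
# Toy illustration of "learning from seen scenarios" (1-nearest neighbour).
# Each training example: (distance_to_obstacle_m, obstacle_speed_mps) -> action.
# Everything here is invented for illustration; real systems train on
# millions of sensor recordings, not four hand-written examples.
training_data = [
    ((50.0, 0.0), "brake"),
    ((5.0, 0.0), "emergency_stop"),
    ((100.0, 10.0), "maintain_speed"),
    ((30.0, 2.0), "slow_down"),
]

def predict(scenario):
    """Return the action of the most similar scenario seen during training."""
    def squared_distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, action = min(training_data, key=lambda pair: squared_distance(pair[0], scenario))
    return action

print(predict((6.0, 0.0)))   # closest training scenario is (5.0, 0.0)
```

The point of the sketch is scale: a real self-driving stack needs enormous volumes of recorded driving data to cover the situations above, and it is precisely that data collection which creates the privacy concerns discussed in this section.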
That data may be sensitive, as it can reveal a lot about the person who drives the vehicle. For example, if a car insurance company got hold of it, it could pressure car owners into paying higher premiums because their driving pattern is not "aligned" with the insurer's "standards".
The car may also record the locations the owner visits regularly. All of this is personal data that can easily be misused, and when a car can be used to access such sensitive data, privacy becomes a legal issue.
3. Cybersecurity
One of the biggest dangers that self-driving cars pose is their vulnerability to hacking. A car that drives itself using satellite navigation and network connectivity can also, in principle, be controlled remotely. And anything that can be controlled remotely can be taken over by hackers. There could be chaos worldwide if even one terrorist organization gained that kind of access.
Imagine that, after autonomous cars penetrate the market, Al-Qaeda launches a terrorist attack on New York City from a Master Control Facility (MCF) in Kazakhstan, seizing control of all the cars at once to create chaos. It could be days, if not weeks, before any military could track them down and stop them.
Cybersecurity is one area where companies will have to work hard to protect their cars from malware and hacking attempts. Laws should require control systems to be localized as far as possible, so that attackers in other countries cannot reach them. A kill switch could also be put in place to deactivate every car in an instant if such a situation arises.
4. Insurance
Wherever there is an issue of liability, another issue automatically arises with it: insurance. The current laws leave too much ambiguity and too many open possibilities for self-driving cars.
Suppose an autonomous car hits a pedestrian or a cyclist. Who is liable in this scenario: the passenger who was merely sitting in the car, or the cyclist? There will also be many situations in which autonomous cars drive themselves from point A to point B with no passengers inside. Who is liable if an accident takes place then?
Under the current laws, the driver has to insure the vehicle under a "user-liability model". With manufacturer-built self-driving cars, liability could end up divided among multiple parties. Questions to be resolved include whether car owners will still be required to carry third-party liability insurance, and whether carmakers will be required to carry product liability coverage.
These are the major legal issues that self-driving cars face today. There are many more that did not make this list of the four most important. Self-driving cars are the future; they are eventually going to arrive. So instead of trying to dodge the legal issues, let us face them and start drafting laws now. We are already behind on regulating many technologies; let us not miss this bus too.