Technology has always been developed to provide comfort and safety to humans.
We are seeing a similar trend in the automotive world with the development of autonomous cars. Semi-autonomous cars are already on sale, and almost all the OEMs are trying to develop their own fully autonomous cars for a future that doesn't seem too far away. All of this is being developed with the driver's comfort and safety in mind.
But no technology is ever 100% fail-proof. We have seen several recent examples of self-driving cars crashing, to name a few:
- Google's self-driving car jumped a red light and was involved in a serious crash.
- Tesla's self-driving car suffered a crash last year that killed the driver.
I am not questioning the technology here; self-driving cars certainly have a better crash record than human-driven cars. What I am questioning is this: who is responsible if a self-driving car crashes?
Who takes the blame if a component or software malfunction in the car leads to an accident? The OEM, or the driver?
OEMs like Google and Tesla say they will take responsibility if their cars crash, but in a recent crash Tesla blamed the driver, saying the accident could have been avoided if the driver had been alert and kept his hands on the steering wheel. Yet these companies are designing cars that will be completely self-driven and require zero human input. What then? Companies like GM say that the driver inside the car will also be held responsible to some extent if a crash occurs, since he or she can react to avoid it.
If OEMs take responsibility and compensate for the damages, what happens to car insurance companies? What about damage caused by natural phenomena like hailstorms and typhoons? Who compensates the victim if an autonomous car hits a pedestrian? And since all current traffic laws are written with the idea of a driver behind the wheel, will we have to change the driving laws as well?