Dominik Nemcsok
Everybody makes mistakes. Mistakes are inevitable for human beings, but people are not the only ones who can make them. Nowadays new technical innovations appear every day, and most of these inventions are designed to help people. But we should ask: what if a computer makes a mistake?
When someone makes a mistake, different consequences can follow depending on what kind of mistake has been made. For instance, when a student makes a lot of mistakes on a test, he gets a lower grade, or when somebody decides to sleep for ten more minutes, he might miss the bus. I think these smaller kinds of mistakes have happened to everybody. But sometimes greater mistakes are made. Imagine, for example, that a playground is not maintained properly and a child gets injured. In a case like that, the police investigate and the court decides what should happen to those responsible. What happens to people who make mistakes is straightforward, because the laws and rules of most countries are fair, reasonable, and well-considered.
However, legislatures and governments are usually not able to keep up with the rapidly innovating world. I would like to take self-driving vehicles as an example. Even now there are many cars on the road with very advanced “driver-assistance” features, and even some that are completely self-driving, such as some taxis in Los Angeles. For now, though, the capabilities of these computer-driven vehicles are highly limited. But that brings me back to my earlier question: what if a computer makes a mistake?
What will happen when self-driving cars are driving people around all over the world and a deadly accident happens? Whose fault will it be? What if the car hits a pedestrian? If that were to happen today, with the car’s “driver-assistance” features enabled, it would probably still be the driver’s fault, because when someone turns on this function, the manufacturer makes the driver acknowledge that the car is not fully capable of driving itself and that he must take control of the vehicle whenever necessary. But what if this happened with a vehicle that is one hundred percent self-driving? It is unclear whose fault it would be. Who should be punished, and how? In my opinion, it couldn’t be the driver, because he wouldn’t have any control in a situation like that. But then, should the fault lie with the people who designed the car’s driving software, or with the company that made the car?
I believe everyone thinks about these questions differently, but it is crucially important to be careful with technologies that can be dangerous to anyone and anything. We shouldn’t make mistakes that everyone would regret. It is our responsibility to make the right decisions.