
Assignment # 4

Autonomous Vehicles: Ethical Issues


Autonomous cars raise many important ethical questions.
1. Who's liable when an autonomous car crashes? The driver? Google? The programmer?
2. How should the car be programmed to act in the event of an unavoidable accident? Should it minimize the loss of life, even if that means sacrificing its occupants, or should it protect the occupants at all costs? Should it choose between these extremes at random? The answers matter because they could strongly influence how self-driving cars are accepted by society. Who would buy a car programmed to sacrifice its owner?
3. Is it acceptable for an autonomous vehicle to avoid a motorcycle by swerving into a wall, given that the probability of survival is greater for the passenger of the car than for the rider of the motorcycle?
4. Should different decisions be made when children are on board, since they both have a longer life ahead of them than adults and had less agency in being in the car in the first place?
5. If a manufacturer offers different versions of its moral algorithm and a buyer knowingly chooses one of them, is the buyer to blame for the harmful consequences of the algorithm's decisions?
6. If the software misinterprets a worn-down sign, does the blame fall on the department of transportation for poorly maintained signage or on the company that produced the self-driving software?
7. How should a car's priorities be specified, for example telling it to prioritize avoiding humans over avoiding parked vehicles, or not to swerve for squirrels? (A minimal sketch of one way to encode such priorities follows this list.)
8. The biggest ethical question is how quickly we move. We have a technology that could potentially save many lives, but it is going to be imperfect and it is going to kill. [1]
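These questions hint at an engineering reality: somewhere, the car's priorities must be written down. One common framing (not taken from the cited articles) is a weighted cost function over candidate maneuvers, where the planner picks the option with the lowest expected harm. The Python sketch below is purely illustrative; the obstacle classes, weights, and the Maneuver structure are all invented for this example and do not describe any real vendor's software.

# Illustrative sketch only: choosing among candidate maneuvers with a
# priority-weighted cost function. All classes, weights, and names are
# hypothetical.
from dataclasses import dataclass

# Higher weight = more important to avoid. A squirrel is weighted so low
# that a risky swerve is never worth it.
HARM_WEIGHTS = {
    "human": 1_000_000.0,
    "motorcyclist": 1_000_000.0,
    "occupied_vehicle": 500_000.0,
    "parked_vehicle": 1_000.0,
    "squirrel": 1.0,
}

@dataclass
class Maneuver:
    name: str
    # (obstacle_class, probability_of_collision) pairs predicted for this maneuver
    predicted_collisions: list
    maneuver_risk: float  # intrinsic risk of performing the maneuver itself

def cost(m: Maneuver) -> float:
    """Expected harm: weighted collision probabilities plus intrinsic risk."""
    return m.maneuver_risk + sum(
        HARM_WEIGHTS[cls] * p for cls, p in m.predicted_collisions
    )

def choose(maneuvers: list) -> Maneuver:
    return min(maneuvers, key=cost)

# Example: braking in lane risks hitting a squirrel; swerving risks a parked car.
options = [
    Maneuver("brake_in_lane", [("squirrel", 0.9)], maneuver_risk=10.0),
    Maneuver("swerve_right", [("parked_vehicle", 0.3)], maneuver_risk=50.0),
]
print(choose(options).name)  # -> brake_in_lane

Because the squirrel's weight is tiny next to the intrinsic risk of a hard swerve, the planner brakes in its lane; raising an obstacle class's weight is exactly how one would "tell the car" to prioritize avoiding it.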

Other Ethical Issues:


1. Unemployment: A significant number of people around the world make a living by driving (cab drivers, truck drivers, delivery workers). When self-driving cars become commonplace, will these people lose their jobs?

2. Trolley problem: Would the car choose to let 4 pedestrians die or to kill the driver/pedestrian?
3. Current laws vs. future ones: Do you make autonomous cars conform to current laws and vehicle codes, or establish new ones?
4. Vulnerability: How do you keep autonomous cars safe once the system becomes advanced enough to warrant a car network? What about cybersecurity for the autonomous car? (A minimal sketch of message authentication follows this list.)
5. Expectation that a car is autonomous: Do you still allow manually operated cars when most cars are autonomous? Consider an elevator: people expect that sticking an arm in the door will cause it to reopen. Will this become the norm for cars?
6. Differing laws in different countries/states: Should autonomous vehicles be regulated at an international, national, or state level? Regulating self-driving cars at the state level could significantly delay innovation, since manufacturers would have to conform to many different state-level regulations. [2]
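Item 4's cybersecurity worry becomes concrete once cars exchange messages over a network: a vehicle must be able to reject forged or tampered commands. The sketch below shows the basic tamper-detection idea using Python's standard hmac module with a shared secret; the message format and key handling are invented for illustration, and real vehicle-to-vehicle systems use certificate-based signatures rather than a single shared key.

# Illustrative sketch: authenticating messages on a hypothetical car network
# with a shared-secret HMAC. This only demonstrates tamper detection; it is
# far simpler than any real V2V protocol.
import hmac
import hashlib

SHARED_KEY = b"demo-key-not-for-production"

def sign(message: bytes) -> bytes:
    """Attach an HMAC-SHA256 tag so the receiver can detect tampering."""
    tag = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    return message + b"|" + tag.hex().encode()

def verify(packet: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    message, _, tag_hex = packet.rpartition(b"|")
    expected = hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest().encode()
    return hmac.compare_digest(expected, tag_hex)

packet = sign(b"brake_ahead:severity=high")
assert verify(packet)                               # authentic message accepted
assert not verify(packet.replace(b"high", b"low"))  # tampered message rejected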
As science fiction begins to turn into reality, it is likely that Isaac Asimov's famous laws of robotics will enter the ethical discussion surrounding self-driving cars (a sketch of the laws as a priority ordering follows the list):
1. A robot may not injure a human being or, through inaction, allow a human being to come
to harm.
2. A robot must obey the orders given to it by human beings, except where such orders
would conflict with the first law.
3. A robot must protect its own existence as long as such protection does not conflict with
the first or second laws. [3]
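Read as software, the three laws form a strict priority ordering: a lower law may be satisfied only if every higher law already is. The sketch below encodes that ordering as a sort key over candidate actions; the Action attributes and the example scenario are invented for illustration, not drawn from Asimov or the cited sources.

# Illustrative sketch: Asimov's three laws as a strict priority ordering over
# hypothetical candidate actions.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_human: bool        # would this action injure a human?
    allows_human_harm: bool  # would it, through inaction, let a human come to harm?
    obeys_order: bool        # does it follow the human's instruction?
    preserves_robot: bool    # does the robot survive it?

def law_rank(a: Action) -> tuple:
    """Sort key: the First Law dominates the Second, which dominates the Third.
    Python compares tuples element by element, so earlier entries win."""
    return (
        a.harms_human or a.allows_human_harm,  # avoid First Law violations first
        not a.obeys_order,                     # then avoid disobedience
        not a.preserves_robot,                 # then avoid self-destruction
    )

def choose(actions: list) -> Action:
    return min(actions, key=law_rank)

# A car ordered to keep going would rather disobey than hit a pedestrian.
options = [
    Action("continue_as_ordered", harms_human=True, allows_human_harm=False,
           obeys_order=True, preserves_robot=True),
    Action("emergency_stop", harms_human=False, allows_human_harm=False,
           obeys_order=False, preserves_robot=True),
]
print(choose(options).name)  # -> emergency_stop

Because tuples compare element by element, any First Law violation outweighs every consideration of obedience or self-preservation, mirroring the hierarchy of the laws.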

References:
1. Will Knight, "How to Help Self-Driving Cars Make Ethical Decisions," MIT Technology Review, July 29, 2015. Available at: https://www.technologyreview.com/s/539731/how-to-help-self-driving-cars-make-ethical-decisions/ (Accessed: 25 Sep 2016).
2. Ethics, Google's Autonomous Vehicle. Available at: http://googlesautonomousvehicle.weebly.com/ethics.html (Accessed: 25 Sep 2016).
3. Asimov, Isaac (1950). I, Robot.

