Read the passage below and choose the most appropriate answer for the questions that follow.
Passage II
The 'trolley problem' used to be an obscure question in philosophical ethics. It runs as follows: a trolley, or a train, is speeding down a track towards a junction. Some moustache-twirling evildoer has tied five people to the track ahead and another person to the branch line. You are standing next to a lever that controls the junction. Do nothing, and the five people will be killed. Pull the lever, and only one person dies. What is the ethical course of action?
The excitement around self-driving cars, though, has made the problem famous. A truly self-driving car, after all, will have to be given ethical instructions of some sort by its human programmers. That has led to a miniature boom for the world's small band of professional ethicists, who suddenly find themselves in hot demand.
In a paper just published in Nature, a team of psychologists and computer scientists describe a different approach. Rather than asking a small band of philosophers for their thoughts, this team, led by Edmond Awad of the Massachusetts Institute of Technology (MIT), decided instead to ask the general public. They created the "Moral Machine", a website which presents visitors with a series of choices about whom to save and whom to kill. In one, for instance, a self-driving car suffers a brake failure ahead of a pedestrian crossing. If it carries on in a straight line, a man, a woman and two homeless people of unspecified sex will be run down. If it swerves, the death count will be the same, but the victims will be two women and two male business executives. What should the car do?
The strongest preferences, expressed by respondents from all over the world, were for saving human lives over animal ones, for saving many rather than few, and for prioritising children over the old. There were weaker preferences for saving women over men, pedestrians over passengers in the car, and for taking action rather than doing nothing. Criminals were seen as literally sub-human, ranking below dogs in the public's priority list, but above cats.
Preferences differed between countries. The preference for saving women, for instance, was stronger in places with higher levels of gender equality. The researchers found that the world's countries clustered into three broad categories, which they dubbed "Western", covering North America and the Christian countries of Europe; "Eastern", including the Middle East, India and China; and "Southern", covering Latin America and many of France's former colonial possessions. Countries in the Eastern cluster, for instance, showed a weaker preference for sparing the young over the elderly, while the preference for humans over animals was less pronounced in Southern nations. Self-driving cars, it seems, may need the ability to download new moralities when they cross national borders.
Among the following, who would be the equivalent of the person pulling the lever in the 'trolley problem'?
It is mentioned in the second paragraph, 'A truly self-driving car, after all, will have to be given ethical instructions of some sort by its human programmers.' Hence the equivalent of the person pulling the lever in the 'trolley problem' would be the human programmers who give the ethical instructions.
Though the excitement around self-driving cars has led to a miniature boom for professional ethicists, it is the human programmers, who give ethical instructions of some sort to the self-driving cars, who face the equivalent of the trolley problem.
CEOs of multinational car manufacturing firms and driverless-car owners who use their cars to travel abroad are not mentioned anywhere in the passage.
Hence D is the correct answer.