Instructions

Read the passage below and choose the most appropriate answer for the questions that follow.
Passage II
          The 'trolley problem' used to be an obscure question in philosophical ethics. It runs as follows: a trolley, or a train, is speeding down a track towards a junction. Some moustache-twirling evildoer has tied five people to the track ahead and another person to the branch line. You are standing next to a lever that controls the junction. Do nothing, and the five people will be killed. Pull the lever, and only one person dies. What is the ethical course of action?

          The excitement around self-driving cars, though, has made the problem famous. A truly self-driving car, after all, will have to be given ethical instructions of some sort by its human programmers. That has led to a miniature boom for the world's small band of professional ethicists, who suddenly find themselves in hot demand.

          In a paper just published in Nature, a team of psychologists and computer scientists describe a different approach. Rather than asking a small band of philosophers for their thoughts, this team, led by Edmond Awad of the Massachusetts Institute of Technology (MIT), decided instead to ask the general public. They created the "Moral Machine", a website which presents visitors with a series of choices about whom to save and whom to kill. In one, for instance, a self-driving car experiences brake failure ahead of a pedestrian crossing. If it carries on in a straight line, a man, a woman and two homeless people of unspecified sex will be run down. If it swerves, the death count will be the same, but the victims will be two women and two male business executives. What should the car do?

          The strongest preferences, expressed by respondents from all over the world, were for saving human lives over animal ones, preferring to save many rather than few and prioritising children over the old. There were weaker preferences for saving women over men, pedestrians over passengers in the car and for taking action rather than doing nothing. Criminals were seen as literally sub-human, ranking below dogs in the public's priority list but above cats.

          Preferences differed between countries. The preference for saving women, for instance, was stronger in places with higher levels of gender equality. The researchers found that the world's countries clustered into three broad categories, which they dubbed "Western", covering North America and the Christian cultural countries of Europe; "Eastern", including the Middle East, India and China; and "Southern", covering Latin America and many of France's former colonial possessions. Countries in the Eastern cluster, for instance, showed a weaker preference for sparing the young over the elderly, while the preference for humans over animals was less pronounced in Southern nations. Self-driving cars, it seems, may need the ability to download new moralities when they cross national borders.

Question 49

Regulatory approval of which of the following preferences of car-navigation software is likely to face the most uncertainty in a 'Southern' country with high levels of gender inequality?

Solution

In Southern countries, the passage states, the preference for humans over animals is less pronounced. Moreover, this particular country has high levels of gender inequality; since the preference for saving women is stronger in places with higher gender equality, any preference between the genders here would also be weak.
So for the software to face the most uncertainty in this region, the two potential victims should be a human and an animal, and the choice should also involve victims of opposite genders.
The option that best describes such a situation is Option B.

