The passage below is accompanied by a set of questions. Choose the best answer to each question.
As software improves, the people using it become less likely to sharpen their own know-how. Applications that offer lots of prompts and tips are often to blame; simpler, less solicitous programs push people harder to think, act and learn.
Ten years ago, information scientists at Utrecht University in the Netherlands had a group of people carry out complicated analytical and planning tasks using either rudimentary software that provided no assistance or sophisticated software that offered a great deal of aid. The researchers found that the people using the simple software developed better strategies, made fewer mistakes and developed a deeper aptitude for the work. The people using the more advanced software, meanwhile, would often “aimlessly click around” when confronted with a tricky problem. The supposedly helpful software actually short-circuited their thinking and learning.
[According to] philosopher Hubert Dreyfus . . . . our skills get sharper only through practice, when we use them regularly to overcome different sorts of difficult challenges. The goal of modern software, by contrast, is to ease our way through such challenges. Arduous, painstaking work is exactly what programmers are most eager to automate—after all, that is where the immediate efficiency gains tend to lie. In other words, a fundamental tension ripples between the interests of the people doing the automation and the interests of the people doing the work.
Nevertheless, automation’s scope continues to widen. With the rise of electronic health records, physicians increasingly rely on software templates to guide them through patient exams. The programs incorporate valuable checklists and alerts, but they also make medicine more routinized and formulaic—and distance doctors from their patients. . . . Harvard Medical School professor Beth Lown, in a 2012 journal article . . . warned that when doctors become “screen-driven,” following a computer’s prompts rather than “the patient’s narrative thread,” their thinking can become constricted. In the worst cases, they may miss important diagnostic signals. . . .
In a recent paper published in the journal Diagnosis, three medical researchers . . . examined the misdiagnosis of Thomas Eric Duncan, the first person to die of Ebola in the U.S., at Texas Health Presbyterian Hospital Dallas. They argue that the digital templates used by the hospital’s clinicians to record patient information probably helped to induce a kind of tunnel vision. “These highly constrained tools,” the researchers write, “are optimized for data capture but at the expense of sacrificing their utility for appropriate triage and diagnosis, leading users to miss the forest for the trees.” Medical software, they write, is no “replacement for basic history-taking, examination skills, and critical thinking.” . . .
There is an alternative. In “human-centred automation,” the talents of people take precedence. . . . In this model, software plays an essential but secondary role. It takes over routine functions that a human operator has already mastered, issues alerts when unexpected situations arise, provides fresh information that expands the operator’s perspective and counters the biases that often distort human thinking. The technology becomes the expert’s partner, not the expert’s replacement.
It can be inferred that in the Utrecht University experiment, one group of people was “aimlessly clicking around” because:
"The researchers found that the people using the simple software developed better strategies, made fewer mistakes and developed a deeper aptitude for the work. The people using the more advanced software, meanwhile, would often “aimlessly click around” when confronted with a tricky problem. The supposedly helpful software actually short-circuited their thinking and learning."
The above excerpt gives the findings of the Utrecht University experiment. The two study groups (one assisted by simple software and the other by more sophisticated software) show contrasting behaviours. When confronted with a tricky problem, the group with the advanced software would often click around the screen aimlessly. This shows the users’ dependence on the software: rather than trying to develop a strategy for the problem themselves, they expected the software to solve it for them.
Option A: Nowhere in the excerpt is the competency of the users questioned; what was being tested was the effect of their dependence on the software. Thus, this option cannot be inferred.
Option B: The users expected the software to help them with the tricky problems; this dependence is what the phrase “aimlessly click around” describes in the above excerpt. Thus, this is the correct option.
Option C: The phrase “aimlessly click around” was not used to contrast the strategies adopted by the two study groups; hence, this is not the correct option.
Option D: This, too, cannot be inferred from the above excerpt.
Thus, the correct option is B.