Such fluency is sometimes a reliable guide to understanding, but it’s also easy to fool. Just presenting a problem in a font that’s harder to read can decrease fluency and trigger more effortful processing, with the surprising consequence that, for example, people do better at logical deduction and tricky word problems when the problems are presented in a less readable font. It follows that smarter and more efficient information retrieval via machines could foster dumber and less effective information-processing in human minds.
Consider another example: The educational psychologist Marcia Linn and her collaborators have studied the “deceptive clarity” that can result from complex scientific visualizations of the kind that technology in the classroom and online education are making ever more readily available. [Marcia C. Linn et al., “Can Desirable Difficulties Overcome Deceptive Clarity in Scientific Visualizations?” in Successful Remembering and Successful Forgetting: A Festschrift in Honor of Robert A. Bjork, Aaron Benjamin, ed. (New York: Routledge, 2010).]
Such clarity can be deceptive because the transparency and memorability of the visualization is mistaken for genuine understanding. It isn’t just that when the visualization is gone the understanding goes with it, but that genuine understanding was never achieved in the first place.
People suffer from illusions of understanding even absent fancy technology, but current trends toward faster, easier, and more seamless information retrieval threaten to exacerbate rather than correct any misplaced confidence in what we truly comprehend. These trends also threaten to undermine some natural mechanisms for self-correction. For example, people are often more accurate in assessing their own understanding after explaining something to someone else (or even to themselves) or after a delay. Removing social interaction and time from informational transactions could therefore have costs when it comes to our ability to track our understanding.
Are technological advances and illusions of understanding inevitably intertwined? Fortunately not. If a change in font or delay in access can attenuate fluency, then a host of other minor tweaks to the way information is accessed and presented can surely do the same. In educational contexts, deceptive clarity can be partly overcome by introducing what psychologist Robert Bjork calls “desirable difficulties,” such as varying the conditions under which information is presented, delaying feedback, or engaging learners in generation and critique—all of which disrupt a false sense of comprehension. And some of the social mechanisms that help calibrate understanding, such as explanation, can be facilitated by the marriage of information technologies and social media.
But avoiding illusions of understanding, and the intellectual hubris they portend, will require more than a change in technology—it will require a change in our expectations and behavior. We have to give up the idea that fast and easy access to information is always better access to information.
THE END OF HARDSHIP INOCULATION
ADAM ALTER
Psychologist; assistant professor of marketing, Stern School of Business, NYU; author, Drunk Tank Pink: And Other Unexpected Forces That Shape How We Think, Feel, and Behave
When psychologists ask people to tackle a new mental task in the laboratory, they begin with a round of practice trials. As the novelty of the experience wears off, participants develop a tentative mastery over the task, no longer wasting their limited cognitive resources on trying to remember which buttons to push, or repeatedly rehearsing the responses they’re expected to perform. Just as vaccines inoculate people against disease, a small dose of practice trials prepares participants for the rigors of the experiment proper.
The same logic explains how children come to master the mental difficulties confronting them as they grow into adulthood. Trivial hardships inure them to greater future challenges that might otherwise defeat them but for the help of these experiential scaffolds. A child who remembers his mother’s phone number is thereafter better equipped to memorize other numerical information. Another who routinely performs mental arithmetic in math class develops the skills needed to perform more complex mental algorithms. A third who sits bored at home on a rainy day is forced to devise new forms of entertainment, meanwhile learning the rudiments of critical thinking.
Unfortunately, these crucial experiences are declining with the rise of lifestyle technologies. The operation of iPhones and iPads is miraculously intuitive, but their user-friendliness means that children as young as three or four can learn to use them. Smartphones and tablets eradicate the need to remember phone numbers, perform mental calculations, and seek new forms of entertainment, so children of the 21st century never experience the minor hardships attending those tasks. They certainly derive other benefits from technology, but convenience and stimulation are double-edged swords, also heralding the decline of hardship inoculation. Today’s children might thus be poorly prepared for the more difficult tasks that will meet them as time passes.
What’s particularly worrying is not that today’s children will grow up cognitively unprepared but what the trend portends for their children, grandchildren, and so on. The “ideal” world—the one that looks more and more like the contemporary world with each passing generation—is a world that fails to prepare us to memorize, compute, generate, elaborate, and, more generally, to think. We don’t yet know which cognitive capacities will be usurped by machines and gadgets, but the range will widen over time, and the people who run governments, businesses, and scientific enterprises will be the worse prepared for having skipped this vaccination.
LARRY SANGER
Cofounder of Wikipedia & founder of Citizendium
We should be worried about online silos. They make us stupid and hostile toward one another.
Internet silos are news, information, opinion, and discussion communities dominated by a single point of view. Examples are the Huffington Post on the left and National Review Online on the right, but these are only a couple of examples—and not the worst, either. In technology, Slashdot is a different kind of silo—of geek attitudes.
Information silos in general are nothing new and not limited to the Internet; talk radio works this way, churches and academia are often silos, and businesses and organizations study how to avoid a silo culture. But Internet communities are particularly subject to a silo mentality because they are virtually instant: They have no history of competing diverse traditions and they are self-selecting, thus self-reinforcing. The differences between online communities tend to be stark. That’s why there are so many silos online.
It shouldn’t be surprising that silos are fun and compelling for a lot of us. They make us feel like we belong. They reinforce our core assumptions and give us easily digestible talking points, obviating the need for difficult individual thought. They appeal to our epistemic vanity and our laziness.
That’s one of the problems. Silos make us overconfident and uncritical. Silos worry me because critical knowledge—the only kind there is, about anything difficult—requires a robust marketplace of ideas. Silos give too much credence to objectively unsupportable views that stroke the egos of their members; in a broader marketplace, such ideas would be subjected to much-needed scrutiny. Silos are epistemically suspect. They make us stupider. They may be full of (biased) information, but they make us less critical in our thinking and thereby lower the quality of our belief systems.