
It’s starting to catch on, the idea that different learning styles are bunk. A paper that I had cited only as forthcoming evidence in this old post of mine has now been published, and it argues that there’s simply no persuasive evidence that there’s such a thing as different learning styles.

(Now, as a parenthetical aside, just because there’s another study with these kinds of findings doesn’t mean that the idea of different learning styles will disappear. That idea has been accepted widely, so it’ll be hard to eradicate it. The seeming anecdotal evidence (“I knew this kid, Johnny, who couldn’t learn till his teacher put the information in Mayan pictograms…and then he became an A student!”) will be taken to outweigh simple scientific fact. So people will believe in learning styles in spite of evidence to the contrary. And it’s such a convenient excuse: if a kid underperforms, well, chalk it up to bad teaching that doesn’t take into account Susie’s learning style. Any idea with that kind of legs is going to be with us for a long time.)

Anyway, let me get to my favorite part of the article:

But individual learning is another matter, and psychologists have discovered that some of the most hallowed advice on study habits is flat wrong. For instance, many study skills courses insist that students find a specific place, a study room or a quiet corner of the library, to take their work. The research finds just the opposite. In one classic 1978 experiment, psychologists found that college students who studied a list of 40 vocabulary words in two different rooms — one windowless and cluttered, the other modern, with a view on a courtyard — did far better on a test than students who studied the words twice, in the same room. Later studies have confirmed the finding, for a variety of topics.

The brain makes subtle associations between what it is studying and the background sensations it has at the time, the authors say, regardless of whether those perceptions are conscious. It colors the terms of the Versailles Treaty with the wasted fluorescent glow of the dorm study room, say; or the elements of the Marshall Plan with the jade-curtain shade of the willow tree in the backyard. Forcing the brain to make multiple associations with the same material may, in effect, give that information more neural scaffolding.

“What we think is happening here is that, when the outside context is varied, the information is enriched, and this slows down forgetting,” said Dr. Bjork, the senior author of the two-room experiment.

To which I say, well of course! If a student studies the same information twice, more learning will occur. It’s elementary, my dear Watson.

Of course, a student has to study the information once in order to study it again. For some students, that’s going to be where the entire exercise breaks down.

And then again, the article is careful to remind us:

None of which is to suggest that these techniques — alternating study environments, mixing content, spacing study sessions, self-testing or all the above — will turn a grade-A slacker into a grade-A student. Motivation matters. So do impressing friends, making the hockey team and finding the nerve to text the cute student in social studies.

Apparently even science can’t create motivation ex nihilo, so until it can, it would appear we’re stuck with having some good students, some average, and some not-so-good.


Because science (and business, for that matter) is doing it now, too.

From an article by Geoffrey Harpham, Director of the National Humanities Center down in beautiful North Carolina:

Autonomy, singularity, creativity–each of these terms names both a long-standing concern of the humanities and a set of contemporary projects being undertaken in the sciences.

Many such projects–from the relatively familiar such as stem-cell research and the Human Genome Project to the more exotic, such as attempts to upload the component parts of consciousness into a computer, bioinformatics, and advanced nanotechnology–appear to have serious implications for our basic understanding of human being. These projects may well force us to modify our understanding of traditional moral and philosophical questions, including the definition of and value attached to such presumptively nonhuman concepts as “the animal” and “the machine.”

Humanists, who have been only partially aware of the work being done by scientists and other nonhumanists on their own most fundamental concepts, must try to overcome their disciplinary and temperamental resistances and welcome these developments as offering a new grounding for their own work. They must commit themselves to be not just spectators marveling at new miracles, but coinvestigators of these miracles, synthesizing, weighing, judging and translating into the vernacular so that new ideas can enter public discourse.

They–we–must understand that while scientists are indeed poaching our concepts, poaching in general is one of the ways in which disciplines are reinvigorated, and this particular act of thievery is nothing less than the primary driver of the transformation of knowledge today. For their part, those investigating the human condition from a nonhumanistic perspective must accept the contributions of humanists, who have a deep and abiding stake in all knowledge related to the question of the human.

We stand today at a critical juncture not just in the history of disciplines but of human self-understanding, one that presents remarkable and unprecedented opportunities for thinkers of all descriptions. A rich, deep and extended conversation between humanists and scientists on the question of the human could have implications well beyond the academy. It could result in the rejuvenation of many disciplines, and even in a reconfiguration of disciplines themselves–in short, a new golden age.

Harpham’s call for a conversation between scientists and humanists about the nature of “the human” isn’t quite as far-fetched as it might seem, since he, as Director of the NHC, has been running these kinds of talks for several years now. Some of the results have found their way into print in a special issue of Daedalus. So the question for other humanists is, I suppose, what science have you read lately?

Because somebody has to figure out what to do with e-readers.

I’ve tended to be a bit negative about e-reading. Partly because I just like holding a book, and partly because many studies seem to suggest that it significantly changes the reading experience, which I’m pretty happy with as it is, thanks.

Given that I approach this subject negatively, I was surprised to find myself fascinated by Adrian Johns’s talk “As our Readers Go Digital.” Johns makes a provocative case for the benefits of using e-readers. He does this by relating such uses back to the development of print–his field of expertise–with a nifty reading of a passage from Milton’s “Areopagitica”:

First, the advent of this dislocated, ‘universal’, skeptical reading practice – along with the places and formats that sustained it – provoked constructive responses as well as calls for simple repression. The problem was to uphold cultural order, and to do that one must confirm “good” reading in the new environment of rivalry. You see the problem addressed first in John Milton’s famous tract on censorship, Areopagitica. Much of Milton’s tract was devoted to defining the responsibilities of reading in this febrile world. Consider his marvelous image of London:

Behold now this vast city, a city of refuge, the mansion house of liberty….The shop of war hath not there more anvils and hammers waking to fashion out the plates and instruments of armed justice in defense of beleaguered truth than there be pens and heads there, sitting by their studious lamps, musing, searching, revolving new notions and ideas wherewith to present, as with their homage and their fealty, the approaching reformation; others as fast reading, trying all things, assenting to the force of reason and convincement. What could a man require more from a nation so pliant and so prone to seek after knowledge? What wants there to such a towardly and pregnant soil but wise and faithful laborers to make a knowing people, a nation of prophets, of sages, and of worthies? … Where there is much desire to learn, there of necessity will be much arguing, much writing, many opinions; for opinion in good men is but knowledge in the making.

The question was how to frame what Milton called a “generous prudence” that could furnish common ground for sparring readers beyond all institutions. It did not exist when Milton was writing. By the time of, say, Addison and Steele, less than a century later, it did: there were widely accepted norms of good and bad, expert and inexpert reading – in coffeehouses, of periodicals. Coffeehouse culture had become civilized, learned. As Thomas Hobbes said, even science itself came to find a home in the media and practices of this environment.

Johns sees an analogy between the creation and development of the famous Enlightenment public sphere and the prospects ahead of us in the digital age. He calls on those of us who care about judicious reading both to maintain those values and to instill them in students. This seems to me a very important point amid the wash of fears and panic-mongering about the new information onslaught:

This matter of readerly responsibility is, I think, important in a generalized sense. It may even be the key to the 3 Ps’ role in digital culture. We, like Milton, live at a moment when knowledge has left institutions and is wandering through strange and ill-defined spaces. We too face fears of skepticism, credulity, and a world where the civility of reading – its role in civic life – is up for grabs. In early modern Europe, it was only when a civility of reading was developed – partly through the revival of older skills, partly by a new forensics of the book – that corrosive credulity and skepticism were checked and a “public sphere” could come into being. Much of the Enlightenment depended on that achievement. What we have in prospect is another historic shift of the same order of magnitude. We too need to establish foundations for it: we need standards for creditable reading. As in Casaubon’s day, they will surely come from a merger of the old and the new, and some element of them will be forensic. That is, they will involve source criticism – both textual and algorithmic, including a comprehension of metadata – and historical – the history of the book, for example – at the same time.

Someone needs to articulate this. The fact that books have “left the building” leaves us inside the building with a more important task than ever. But it is something that calls for real change in the institutional and professional structures we inhabit – change of a kind not often enough appreciated. The roles of librarian and faculty need to be interpermeable again (as they were in Milton’s day). It isn’t enough for librarians to become “informationists,” as those at Johns Hopkins reportedly have. When places, practices, and publics are all up for grabs, to forge them anew we all need to become something more – something for which the seventeenth century had a much better word: intelligencers. Intelligencers were the skilled mediators of Milton’s day – people like Samuel Hartlib, Henry Oldenburg, Marin Mersenne, and Nicolas-Claude Fabri de Peiresc – who made possible the maturing of rancor, credulity, and skepticism into a relatively polite and cosmopolitan public sphere. We need more like them now.

I am not at all sure that academia is well equipped to produce these intelligencers; none of those I have just named came from universities. But if not us, who?

This turn to the ethics of e-reading, and to the need to invent new ethical standards to accommodate it, seems to me much needed. Ethical questions, and more broadly human questions, tend to be given little if any attention in discussions of e-reading technology. I recently blogged about an example here, where the questions being asked are about speed-reading rather than comprehension. Or, in other words, about the results that are easiest to communicate in our current public sphere. How many of us know how fast we read? Probably not many, but all of us have some ideas about what we’ve read and why.

To quote a favorite film, Miller’s Crossing, “I’m talkin’ about friendship. I’m talkin’ about character. I’m talkin’ about – hell. Leo, I ain’t embarrassed to use the word – I’m talkin’ about ethics.”

A new study concludes that students’ high ratings of their teachers don’t reflect how much they’ve actually learned from them:

How does a university rate the quality of a professor? In K-12 education, you have standardized tests, and those scores have never been more widely used in evaluating the value added by a teacher.

But there’s no equivalent at the college level. College administrators tend to rely on student evaluations. If students say a professor is doing a good job, perhaps that’s enough.

Or maybe not. A new study reaches the opposite conclusion: professors who rate highly among students tend to teach students less. Professors who teach students more tend to get bad ratings from their students — who, presumably, would just as soon get high grades for minimal effort.

The study finds that professor rank, experience and stature are far more predictive of how much their students will learn. But those professors generally get bad ratings from students, who are effectively punishing their professors for attempting to push them toward deeper learning.

I mean, sure, students recognize quality when they see it. Most of the time. But that leads to a question of principles: when you do recognize superior teaching–an excellent, challenging, time-consuming class–do you risk the B or C in order to learn, when you know you can get an easy A elsewhere? It takes a serious dedication to one’s principles to go that route. And that’s not even counting the people who just want to pay their four years’ worth of dues as quickly and cheaply as possible and get out.

The study is interesting for its longitudinal attention. Apparently students do worse in challenging classes but better in subsequent ones, and better in easy classes but worse in subsequent ones. Like, duh. This is the kind of thing that makes me create a new category of blog post: “science or common sense?”