You are currently browsing the category archive for the ‘Reasons To Study Literature’ category.

Because it makes you smarter than coursework in “business, education, social work, and communications” does:

Students majoring in liberal arts fields see “significantly higher gains in critical thinking, complex reasoning, and writing skills over time than students in other fields of study.” Students majoring in business, education, social work and communications showed the smallest gains. (The authors note that this could be more a reflection of more-demanding reading and writing assignments, on average, in the liberal arts courses than of the substance of the material.)

This article reports on the findings of a brand-new book, Academically Adrift: Limited Learning on College Campuses, in which two highly respected sociologists studied how much students learn in college. The findings show that many students simply fail to learn anything of much significance in college, but that liberal arts majors show the greatest gains. And this jibes with what I’ve heard from local businesses, who prefer to hire liberal arts majors because they can think, communicate, and adapt quickly to new expectations. I wonder whether that preference is becoming a larger, national trend.

(Now, the one or two of you who used to read this blog might complain that I haven’t updated in two months. And you’d be right–I didn’t just fumble, I dropped that ball and kicked it right into the hands of the other team. I had a lot on my plate and simply failed to keep up here. But you can expect new posts more frequently from here on.)

Because politics is complicated, and if you want to live in a democracy you have to try to figure it out.

But America is famously anti-intellectual, so we’ve somehow wound up in the situation we’re in right now, aptly summed up by Eugene Robinson:

The nation demands the impossible: quick, painless solutions to long-term, structural problems. While they’re running for office, politicians of both parties encourage this kind of magical thinking. When they get into office, they’re forced to try to explain that things aren’t quite so simple — that restructuring our economy, renewing the nation’s increasingly rickety infrastructure, reforming an unsustainable system of entitlements, redefining America’s position in the world and all the other massive challenges that face the country are going to require years of effort. But the American people don’t want to hear any of this. They want somebody to make it all better. Now.

In plain English, this state of affairs is called “wishful thinking.” It doesn’t work that well for my four-year-old kid, and it isn’t going to work that well for you, either.

The world is a complicated place. No one is going to figure it out for you, but they will figure it out for themselves and use that knowledge to take advantage of you. And the danger comes, as I’ve always seen and heard it described, when too many people surrender their powers of intellect to a few demagogues and power brokers. That’s when really bad things like revolutions and tyrannies and dictatorships happen. See history for examples.

Now, I find myself once again suggesting that studying literature is the way to help us avoid these kinds of bad outcomes. And surely, if there are any political scientists or financiers or historians reading this blog, they’re already sharpening their argumentative knives to prove that it’s their discipline, not literature, that’s really practical when it comes to dealing with these problems. And once again, I’ll simply note that literature is classically defined as “literary productions as a whole; the body of writings produced in a particular country or period, or in the world in general” (OED).

As you’ll learn in your literature classes, it’s the context–the historical context, the political context, the intellectual context, the debates that one book responds to and then creates afterward as other people read it–that really counts. Reading one book is almost as bad as reading no books. The point is to start reading, to become a skilled and informed reader, and to keep reading–until you know everything that’s relevant.

So that you don’t become a know-nothing:

It would be nice to dismiss the stupid things that Americans believe as harmless, the price of having such a large, messy democracy. Plenty of hate-filled partisans swore that Abraham Lincoln was a Catholic and Franklin Roosevelt was a Jew. So what if one-in-five believe the sun revolves around the earth, or aren’t sure from which country the United States gained its independence?

But false belief in weapons of mass destruction led the United States to a trillion-dollar war. And trust in rising home values as a truism as reliable as a sunrise was a major contributor to the catastrophic collapse of the economy. At its worst extreme, a culture of misinformation can produce something like Iran, which is run by a Holocaust denier.

It’s one thing to forget the past, with predictable consequences, as the favorite aphorism goes. But what about those who refuse to comprehend the present?

Political debate is conducted largely through words. So how else are you going to learn how to handle words–how to parse through an argument and discover whether it’s based on faulty reasoning–how to articulate your own opinions, responses, and rebuttals?

OK, to be fair, you can learn some of these things in other humanities fields, like philosophy or political science. But one of the definitions of literature is “literary productions as a whole; the body of writings produced in a particular country or period, or in the world in general” (OED). Therefore philosophy and political science are literature, at least in the broad sense. Double-majoring in something practical, like political science or English, and learning how to read, speak, and listen, is a helpful complement to the theoretical knowledge you pick up during all those ivory-tower business classes–you know, the ones conducted in a lecture hall, with none of the pressures or politics of an actual workplace.

So you should double-major in a humanities field–unless, I suppose, you have an ingenious plan for doing business sans communication.

In one of the last essays Tony Judt wrote before he died, he explains why words are irreplaceable:

Cultural insecurity begets its linguistic doppelganger. The same is true of technical advance. In a world of Facebook, MySpace and Twitter (not to mention texting), pithy allusion substitutes for exposition. Where once the internet seemed an opportunity for unrestricted communication, the commercial bias of the medium – “I am what I buy” – brings impoverishment of its own. My children observe of their own generation that the communicative shorthand of their hardware has begun to seep into communication itself: “People talk like texts.”

This ought to worry us. When words lose their integrity so do the ideas they express. If we privilege personal expression over formal convention, then we are privatising language no less than we have privatised so much else. “When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean – neither more nor less.” “The question is,” said Alice, “whether you can make words mean so many different things.” Alice was right: the outcome is anarchy.

In Politics and the English Language, Orwell castigated contemporaries for using language to mystify rather than inform. His critique was directed at bad faith: people wrote poorly because they were trying to say something unclear or else deliberately prevaricating. Our problem is different. Shoddy prose today bespeaks intellectual insecurity: we speak and write badly because we don’t feel confident in what we think and are reluctant to assert it unambiguously (“It’s only my opinion …”). Rather than suffering from the onset of “newspeak”, we risk the rise of “nospeak”.

I am more conscious of these considerations now than at any time in the past. In the grip of a neurological disorder, I am fast losing control of words even as my relationship with the world has been reduced to them. They still form with impeccable discipline and unreduced range in the silence of my thoughts – the view from inside is as rich as ever – but I can no longer convey them with ease. Vowel sounds and sibilant consonants slide out of my mouth, shapeless and inchoate. The vocal muscle, for 60 years my reliable alter ego, is failing. Communication, performance, assertion: these are now my weakest assets. Translating being into thought, thought into words and words into communication will soon be beyond me and I shall be confined to the rhetorical landscape of my interior reflections.

No longer free to exercise it, I appreciate more than ever how vital communication is: not just the means by which we live together but part of what living together means. The wealth of words in which I was raised were a public space in their own right – and properly preserved public spaces are what we so lack today. If words fall into disrepair, what will substitute? They are all we have.

(P.S. I’m also pleased to hear someone else call out the kind of political “discourse” at the top of the page as the product of “know-nothings.” I’ve been calling these kinds of willfully ignorant people “know-nothings” since at least the 2008 election.)

Because science (and business, for that matter) is doing it now, too.

From an article by Geoffrey Harpham, Director of the National Humanities Center down in beautiful North Carolina:

Autonomy, singularity, creativity–each of these terms names both a long-standing concern of the humanities and a set of contemporary projects being undertaken in the sciences.

Many such projects–from the relatively familiar such as stem-cell research and the Human Genome Project to the more exotic, such as attempts to upload the component parts of consciousness into a computer, bioinformatics, and advanced nanotechnology–appear to have serious implications for our basic understanding of human being. These projects may well force us to modify our understanding of traditional moral and philosophical questions, including the definition of and value attached to such presumptively nonhuman concepts as “the animal” and “the machine.”

Humanists, who have been only partially aware of the work being done by scientists and other nonhumanists on their own most fundamental concepts, must try to overcome their disciplinary and temperamental resistances and welcome these developments as offering a new grounding for their own work. They must commit themselves to be not just spectators marveling at new miracles, but coinvestigators of these miracles, synthesizing, weighing, judging and translating into the vernacular so that new ideas can enter public discourse.

They–we–must understand that while scientists are indeed poaching our concepts, poaching in general is one of the ways in which disciplines are reinvigorated, and this particular act of thievery is nothing less than the primary driver of the transformation of knowledge today. For their part, those investigating the human condition from a nonhumanistic perspective must accept the contributions of humanists, who have a deep and abiding stake in all knowledge related to the question of the human.

We stand today at a critical juncture not just in the history of disciplines but of human self-understanding, one that presents remarkable and unprecedented opportunities for thinkers of all descriptions. A rich, deep and extended conversation between humanists and scientists on the question of the human could have implications well beyond the academy. It could result in the rejuvenation of many disciplines, and even in a reconfiguration of disciplines themselves–in short, a new golden age.

Harpham’s call for a conversation between scientists and humanists about the nature of “the human” isn’t quite as far-fetched as it might seem, since he, as Director of the NHC, has been running these kinds of talks for several years now. Some of the results have found their way into print in a special issue of Daedalus. So the question for other humanists is, I suppose: what science have you read lately?

Because somebody has to figure out what to do with e-readers.

I’ve tended to be a bit negative about e-reading. Partly because I just like holding a book, and partly because many studies seem to suggest that it significantly changes the reading experience–and I’m pretty happy with that experience just the way it is, thanks.

Given that I approach this subject negatively, I was surprised to find myself fascinated by Adrian Johns’s talk “As Our Readers Go Digital.” Johns makes a provocative case for the benefits of using e-readers. He does this by relating such uses back to the development of print–his field of expertise–with a nifty reading of a passage from Milton’s “Areopagitica”:

First, the advent of this dislocated, ‘universal’, skeptical reading practice – along with the places and formats that sustained it – provoked constructive responses as well as calls for simple repression. The problem was to uphold cultural order, and to do that one must confirm “good” reading in the new environment of rivalry. You see the problem addressed first in John Milton’s famous tract on censorship, Areopagitica. Much of Milton’s tract was devoted to defining the responsibilities of reading in this febrile world. Consider his marvelous image of London:

Behold now this vast city, a city of refuge, the mansion house of liberty….The shop of war hath not there more anvils and hammers waking to fashion out the plates and instruments of armed justice in defense of beleaguered truth than there be pens and heads there, sitting by their studious lamps, musing, searching, revolving new notions and ideas wherewith to present, as with their homage and their fealty, the approaching reformation; others as fast reading, trying all things, assenting to the force of reason and convincement. What could a man require more from a nation so pliant and so prone to seek after knowledge? What wants there to such a towardly and pregnant soil but wise and faithful laborers to make a knowing people, a nation of prophets, of sages, and of worthies? … Where there is much desire to learn, there of necessity will be much arguing, much writing, many opinions; for opinion in good men is but knowledge in the making.

The question was how to frame what Milton called a “generous prudence” that could furnish common ground for sparring readers beyond all institutions. It did not exist when Milton was writing. By the time of, say, Addison and Steele, less than a century later, it did: there were widely accepted norms of good and bad, expert and inexpert reading – in coffeehouses, of periodicals. Coffeehouse culture had become civilized, learned. As Thomas Hobbes said, even science itself came to find a home in the media and practices of this environment.

Johns sees an analogy between the creation and development of the infamous Enlightenment public sphere and the prospects ahead of us in the digital age. He calls those of us who care about judicious reading to both maintain those values and instill them in students. This seems to me a very important point in the wash of fears and panic-mongering about the new information onslaught:

This matter of readerly responsibility is, I think, important in a generalized sense. It may even be the key to the 3 Ps’ role in digital culture. We, like Milton, live at a moment when knowledge has left institutions and is wandering through strange and ill-defined spaces. We too face fears of skepticism, credulity, and a world where the civility of reading – its role in civic life – is up for grabs. In early modern Europe, it was only when a civility of reading was developed – partly through the revival of older skills, partly by a new forensics of the book – that corrosive credulity and skepticism were checked and a “public sphere” could come into being. Much of the Enlightenment depended on that achievement. What we have in prospect is another historic shift of the same order of magnitude. We too need to establish foundations for it: we need standards for creditable reading. As in Casaubon’s day, they will surely come from a merger of the old and the new, and some element of them will be forensic. That is, they will involve source criticism – both textual and algorithmic, including a comprehension of metadata – and historical – the history of the book, for example – at the same time.

Someone needs to articulate this. The fact that books have “left the building” leaves us inside the building with a more important task than ever. But it is something that calls for real change in the institutional and professional structures we inhabit – change of a kind not often enough appreciated. The roles of librarian and faculty need to be interpermeable again (as they were in Milton’s day). It isn’t enough for librarians to become “informationists,” as those at Johns Hopkins reportedly have. When places, practices, and publics are all up for grabs, to forge them anew we all need to become something more – something for which the seventeenth century had a much better word: intelligencers. Intelligencers were the skilled mediators of Milton’s day – people like Samuel Hartlib, Henry Oldenburg, Marin Mersenne, and Nicolas-Claude Fabri de Peiresc – who made possible the maturing of rancor, credulity, and skepticism into a relatively polite and cosmopolitan public sphere. We need more like them now.

I am not at all sure that academia is well equipped to produce these intelligencers; none of those I have just named came from universities. But if not us, who?

This turn to the ethics of e-reading, and to the need to invent new ethical standards to accommodate it, seems to me much needed. Ethical questions, and more broadly human questions, tend to be given little if any attention in discussions of e-reading technology. I recently blogged about an example here, where the questions being asked are about speed-reading rather than comprehension. Or, in other words, about the results that are easiest to communicate in our current public sphere. How many of us know how fast we read? Probably not many, but all of us have some ideas about what we’ve read and why.

To quote a favorite film, Miller’s Crossing, “I’m talkin’ about friendship. I’m talkin’ about character. I’m talkin’ about – hell. Leo, I ain’t embarrassed to use the word – I’m talkin’ about ethics.”

Because of the sheer difficulty of trying to do it on your own, Good Will Hunting-style.

From a recent interview Vice Magazine did with Harold Bloom:

VICE: If a person wants to seriously approach literature on their own, outside of academia, it’s very difficult.

BLOOM: Without a real teacher, an authentic teacher, a real mentor, it’s very difficult for anyone to get started.

This was certainly the case for me. I trace it back to two teachers, actually. I was always a big reader, but the first taught me to tackle serious twentieth-century books like Joyce and Borges. The second taught me the importance of the classics and the Renaissance, especially Marlowe and Shakespeare. Somehow I wound up specializing in eighteenth-century literature.

Because it makes everyday life more interesting: once you start thinking about the things around you, you find yourself asking “what the hell?” a whole lot more often, and that’s ultimately pretty amusing.

An example: I was just eating chips out of a bag, with fresh hot salsa. On the back of this Tostitos bag is written:

EVERY 3 MINUTES a woman in the U.S. is diagnosed with breast cancer.

In 2010, Frito-Lay will donate $1 MILLION to Susan G. Komen for the Cure, to fund education materials for early detection and prevention.

IN 3 MINUTES

LEARN how to administer a breast self exam by visiting www.komen.org.

SPREAD the word about the benefits of early detection to your friends and loved ones.

ENJOY a sensible, satisfying snack!

Now you may be wondering whether my sense of humor is black enough or insensitive enough to be amused by breast cancer. Not at all: my grandmother had a rough time with breast cancer (though she ultimately survived it). If you click on the link above, you’ll find yourself at the website listed on the back of the bag, because it’s worth getting the word out on something this important.

Is a million bucks of charity from a multinational corporation like Frito-Lay a paltry sum? Yes, but I’m not going to take issue with that, either. I’m going to guess that they lay out a lot more than that in charitable donations each year.

By this point I’m sure you’ve guessed that step three is the part that made me think, “What the hell?” In how many different ways can something be a non sequitur? First of all, snacking has nothing to do with breast cancer. Second, I bet there’s no woman alive who, even after reading the back of this bag, runs through those three steps: breast self-exam, getting the education out there, Frito-Lay snack.

And why does it have to be a “sensible” AND “satisfying” snack? C’mon, Frito-Lay: faced with our own mortality, can’t we head out and knock back a few beers over some buffalo wings? Because, like my grandmother, God rest her soul, even if you survive the cancer you’re going to find life terminal. In the face of those odds, I think the average woman needs to live a little. Maybe she needs some ice cream, if beer and wings ain’t her thing, you know? Whatever. And contrary to popular belief, she who dies skinniest does not win.

So why you gotta bring us DOWN, Frito-Lay? Oh, right, because you’re The Man, and The Man is always about keeping us down, one sensible, satisfying snack at a time.

And this is what it’s like to be a person who has spent most of his adult life studying literature. Doesn’t it sound fun?

Because the best mathematicians have decided that it’s a necessary adjunct to mathematical study, and what’s more important to the modern world than mathematics?

When Perelman was fourteen, Rukshin spent the summer tutoring him in English; he accomplished in a few months what generally took four years of study. Perelman had to fulfill the English requirement to get into Leningrad’s Specialized Mathematics School Number 239. As Gessen writes, these mathematical high schools owe much to Andrei Kolmogorov, arguably the most important Soviet mathematician of the twentieth century and a figure who straddled the divide in Soviet mathematics mentioned above.

Kolmogorov, who did seminal work in probability, complexity theory, and other subjects, was something of an anomaly. A prolific mathematician, he was also passionately interested in education and devised an imaginative secondary school curriculum featuring mathematics first of all, but also classical music, sports, hiking, literature, poetry, and activities intended to foster male bonding. In the schools that he inspired, his disciples promoted Greek and Renaissance values and tried to protect their students from Marxist indoctrination. Eventually Kolmogorov was denounced as an agent of Western influence in the Soviet Union, but his ideas still permeated School 239 when Perelman studied there.

Because it’s needed.

I’m going to let Michael Elliott, one of my former professors in graduate school, explain why–in 60 seconds, no less:

Critical thinking skills!

Now, mind you, I tend to roll my eyes when my colleagues defend the study of literature by trotting out the hoary old wisdom that it teaches critical thinking skills. I mean, of course it does. The world is a complex place, and you need to know some literature to participate in it fully. But the same also goes for science and math and history and political theory and so on. You need some of each, if you can get them.

(An aside: the reason Voltaire can be so disappointing is his overreliance on the power of critical thinking. In his hands, that often meant that what was intended to be critical thinking tended to become an exercise in witty misdirection that left the core issues untouched. See pretty much anything Stanley Fish has ever written for further examples of this.)

A lot of people who take great pride in being critical thinkers haven’t ever turned their critical faculties on their own opinions, as anyone who works in academe can tell you. Skepticism is a difficult taskmaster, after all.

So critical thinking doesn’t really make much of a case for literature in particular, because so much more than just literature goes into good critical thinking. But read Jane Galt here for a bit of exploration into the value of a humanities education for producing clever, witty people even in tech fields:

The interesting thing about the technology field is that while it’s mostly engineers and computer science majors, there’s a large minority of people who were drawn in from other fields. And what’s really interesting is that the most brilliant people I worked with during my time as a network engineer were humanities majors, something I’ve also heard other people say. Brilliant technically, I mean….

Why should that be? I don’t know. To the extent that it is true other than anecdotally, I suspect that part of it is that someone who can get a master’s in English lit from Columbia and become a really good programmer is just someone who’s really, really smart. But they also tended to be people who came up with the solutions other people hadn’t thought of — the outside-the-box people.

What makes the humanities (separate from the arts) important is that they take the areas where we have insufficient data (or too much) and try to abstract useful principles from them….

The universe does not offer us always and everywhere the opportunity to hypothesize, test, and peer review. Most of the time we have to make binding decisions on incomplete information. By searching for information in the spaces where hard data is not available, the humanities give us the tools to address those decisions….

But that doesn’t mean that it can’t be immensely challenging and rewarding, nor that it can’t provide us with valuable insights into the human condition. I am disturbed that the majority of the country doesn’t grasp basic principles of scientific thought, but I’m equally disturbed that the majority haven’t any idea or interest about their own history or how their government works, much less in picking up Shakespeare or even Dickens. And why not? Because it’s hard, that’s why. Just like any other discipline, understanding history or reading great works of literature from past centuries requires you to put in a lot of legwork building up your vocabulary of cultural and linguistic information before you can really get into the works. It’s boring. You stuff your brain with facts; you read Gulliver’s Travels for the second time hating every word. Then one day you have a eureka moment: two facts connect themselves in your mind in some way you’ve never thought of before. The internal logic of the eighteenth century penetrates your brain, and you laugh out loud at something Swift has said. Those things are important. They are the only way that we can enlarge our knowledge of human action beyond the limited scope of our own lifetimes — and as any scientist will tell you, the larger the data set, the better. They also tell us about ourselves in an intimate way that physics won’t. A life without art strikes me as, in some way, deeply unexamined.

So if you’re an engineer who hasn’t picked up a book since reading “The Godfather” in high school, and you thought you were getting off easy, think again. The reason you haven’t is that you’re just as lazy as the English majors you make fun of.

The problem with our education is deeper than a lack of science. It’s a lack of breadth. We allow our students to wander off into little corners and only talk to others who share a fairly narrow range of interests. And we do so because doing otherwise is too hard. Hard on the professors, who have to force learning into the brains of students who aren’t used to it and don’t like it; hard on the parents, who will see a lot more variance in their children’s grades than they like; and hard, of course, on the students, who often seem to bitterly resent any effort to actually make them learn anything. In which number I was probably included in my younger days. But we should try anyway, if only so that we’ll have more areas of potential dinner conversation than the latest episode of Survivor.

The call here, in fact, is not really for literature on its own, but for the value of literature in a well-rounded education. That’s an idea I am willing to get behind 110%. After all, it wouldn’t hurt my feelings if English majors took a bit more math. (I took math up through differential equations in college, thankyewverrahmuchly.) If nothing else, it would help when it came time to settle up the check.