Oh the irony...

[a draft of a talk that I'm giving at this conference. Feedback is always welcome.]

This talk was originally going to be a brief exposition of how I teach my graduate students to write analytical papers, the slant being that I privilege the process of writing over the product of writing in a variety of ways. Few educators would disagree, I imagine, that teaching students how to research and write is important, particularly at the graduate level. Many of these students will pursue careers in academia where the “publish or perish” mentality is still alive and well. However, as I was thinking through what I wanted to talk about I realized that I had been ignoring a fundamental question, namely: why is it important to teach students how to write the “old-fashioned” way in a culture that’s dominated by Web 2.0 technologies?

For years I had been disappointed in the kinds of papers that students in my analysis of twentieth-century music class were turning in. Each student had to write a “standard” graduate-level research paper: 10-15 pages, roughly 10 sources—no Wikipedia!—that critically engaged a piece of music using one of the analytical approaches we discussed in class. This is not an exercise in regurgitating and reorganizing facts; the assignment calls for critical analysis, not a book report or an author biography.

This is what we do in graduate school, right?

I started to read Andrew Keen’s book The Cult of the Amateur at the same time I was preparing this talk. Keen laments the demise of record stores, bookstores, and human expertise in general as the Internet, its algorithms, and—most disturbingly—its amateurs commandeer knowledge and culture. While I’m not quite the prophet of doom that Keen is, I do see a fundamental problem that needs to be addressed. In a culture that valorizes both collaboration (which I think is a good thing) and the “noble amateur” (which I think is a bad thing), why is it important to teach students how to research and write a “traditional” paper? The humanities in particular seem to value single-author research papers with copious footnotes that document every bit of material—quoted literally or otherwise—from another source. Proponents of Web 2.0 culture seem to laud precisely the opposite. An article in the journal Nature from August 2010 lists over 57,000 co-authors who generated data by playing a video game. In 2007, Penguin Books experimented with a user-generated wiki-novel cleverly called A Million Penguins:

Nearly 1500 individuals have contributed to the writing and editing of A Million Penguins, contributing over 11,000 edits making this, in the words of Penguin’s Chief Executive, ‘not the most read, but possibly the most written novel in history’. 75,000 people have visited the site and there have been more than 280,000 page views.

Perhaps most shocking is the vision of Kevin Kelly, who argues that all books ever written should become one “liquid” book that is constantly evolving and changing, reflecting the predilections of anyone, anywhere, at any time. If this is the way the world is going, what is the point of teaching students how to write a research paper?

One argument used by proponents of Web 2.0 culture is that truth is not fixed—objectivity is dead and subjectivity is in; the text is not important, it’s what you bring to the text. Truth is merely what the majority of people agree to be true: if, tomorrow, we collectively decide that the sky is not blue but green, then, well, the sky is green. The malleability of Web 2.0 is simply a reflection of this postmodern worldview, right? This argument does not hold up, because truth has always been in flux. For centuries, people believed that the Earth was flat and that the sun revolved around it. Now we know better; it’s entirely possible that truths we hold today will be overturned at some point as well. Change simply seems to take place at a much more rapid pace now, thanks in no small part to computers, the Internet, cellular phones, and other technologies.

Second, we now have more knowledge of widely varying quality available to us than ever before, and most of it can be accessed in an instant from anywhere. Gone are the gatekeepers—the publishing houses, record label executives, librarians, journalists, and others—whose expertise we relied on to separate the wheat from the chaff. In the age of Web 2.0, each of us is now expected to sort through this heap of knowledge on our own and to determine what is useful and what is not, what is true and what is false. This takes an inordinate amount of time, which is ironic, because time is precisely the valuable commodity that technology is trying to help us conserve. Research and critical writing teach students how to evaluate source materials and how new knowledge emerges from synthesis: they create the next generation of gatekeepers.

Third (and finally?), much of the information that is readily available on the Internet is common knowledge, not expert knowledge. I might even go a step further—walking in Keen’s shoes—and argue that much of the information available online is not even common knowledge, it’s popular knowledge. Many sites, including search engines like Google and Bing (which may be the same thing, come to find out) and aggregators like Digg and del.icio.us, rank top hits or stories by popularity, not by any measure of authority. The only reason we have common knowledge or popular knowledge at all is that expert knowledge has trickled down into the collective wisdom. Our society will never progress if common knowledge is simply recycled over and over; we need to continue to generate new knowledge—expert knowledge—to advance. Teaching research and writing skills provides our students with the tools to generate this expert knowledge.

All of this Web 2.0 technology does have a positive side: more people are writing, in the form of blogs, wikis, Tweets, Facebook notes, and the like (see Nicholas Carr's The Shallows), and knowledge is available instantly, anywhere. This is a good thing. Perhaps, then, it’s not writing that we need to emphasize but critical thinking, which is becoming more and more important in our information-saturated society. The “old-fashioned” research paper, with its thesis statement, footnotes, and single author, remains one of the most effective ways to teach students how to think critically.
