Death of the Author?

I've thought this over a bit today and basically agree with Matt Yglesias: the claim Trip Gabriel reports in this NYT article, that the ethos of the internet is prompting kids to plagiarize more than they used to, is pretty dubious. Here's the core of Gabriel's article:

Professors used to deal with plagiarism by admonishing students to give credit to others and to follow the style guide for citations, and pretty much left it at that.

But these cases — typical ones, according to writing tutors and officials responsible for discipline at the three schools who described the plagiarism — suggest that many students simply do not grasp that using words they did not write is a serious misdeed.

It is a disconnect that is growing in the Internet age as concepts of intellectual property, copyright and originality are under assault in the unbridled exchange of online information, say educators who study plagiarism.

Digital technology makes copying and pasting easy, of course. But that is the least of it. The Internet may also be redefining how students — who came of age with music file-sharing, Wikipedia and Web-linking — understand the concept of authorship and the singularity of any text or image.

The anxiety in Gabriel's article seems misplaced; the stakes of plagiarism are pretty low: students are basically only "hurting themselves" by cheating on their homework, as the proverb goes, and it's not like the papers are up for publication. These cheaters are not David Shields or Jonathan Lethem. The idea that students suddenly don't understand the concept of authorship reminds me of the worst nightmares of the fuddy-duddy professors who would fulminate about Barthes and Foucault and "this so-called textuality" when I was a graduate student. Where was the proper respect for Genius?

Gabriel interviews anthropologist Susan Blum, who seems like this species of worrywart.

In an interview, she said the idea of an author whose singular effort creates an original work is rooted in Enlightenment ideas of the individual. It is buttressed by the Western concept of intellectual property rights as secured by copyright law. But both traditions are being challenged.

“Our notion of authorship and originality was born, it flourished, and it may be waning,” Ms. Blum said.

She contends that undergraduates are less interested in cultivating a unique and authentic identity — as their 1960s counterparts were — than in trying on many different personas, which the Web enables with social networking.

Obviously she hasn't heard Mark Zuckerberg lecture about integrity and a single online identity. I'd also be surprised to find that students are uninterested in authenticity and unique identity and are seeking to merge with the multitude in a gesture of postmodern antisubjectivity. Self-broadcasting media and Web 2.0 seem to emphasize the value of a unique identity, not dissolve it.

Students, I suspect, don't take attribution seriously because the work they are being asked to do is not serious to them. They don't have much of a sense of scholarship as a collective enterprise, or of what they do in college as scholarship. In gen-ed classes, they know they are mostly just marking time and doing busywork. They are right to think that plagiarism is not "a serious misdeed" that is somehow different from any other form of academic dishonesty. To pretend otherwise is to serve the ideological bidding of the lords of intellectual property.

The implication of plagiarism hysteria is that scholarship is a process of claiming ownership of proprietary information, an exceedingly unnatural attitude that students have always needed to be indoctrinated into, particularly if they want an academic career. This usually involves a series of ritualized genuflections in the form of citations of the recognized masters of a particular discipline as part of a student's professionalization into the academy.

Yglesias notes that the Web prioritizes the association of data with its metadata -- song files with the artists, etc. -- and thus generally organizes information so that it is easier to deduce where it originated if you are so inclined. It's never been easier to catch cheaters, he points out, something that was true even when I last taught college courses, in 2001.

I am inclined to think that the ubiquity of material available for appropriation and the ease of cutting and pasting itself explains most of the alleged rise in plagiarism. In my experience, most students who were inclined to cheat were way too lazy to retype passages out of a book.






© 1999-2017 All rights reserved.
Popmatters is wholly independently owned and operated.