A few years ago, while procrastinating on Facebook, I happened upon a status update, written by an artist I admire and with whom I have collaborated, that included the phrase “could care less.” If you’re anything like me, you have just cringed; you’re painfully aware that the correct phrase is “couldn’t care less,” and your sense of moral right-and-wrong has just been offended. (I am not exaggerating; more on both pain and morality later in this blog post.) I wonder, however, whether you would have done what I did: correct my friend’s usage with a quick comment. “Don’t you mean,” I offered, “that you couldn’t care less?” The reply I received, via private message, was immediate and sharp: “I didn’t know you’d joined the Grammar Police. I’ll be sure to let the boys down at Central Command know you’re doing your job.” My friend was not happy. I apologized profusely.
I have never, to my knowledge, made a similar mistake; I learned from that exchange to curb the expression of any corrective impulses I might have. Those impulses, however, have not gone away. Whenever I encounter, for example, a there/their/they’re mistake, or the phrase “for all intensive purposes,” or the non-word “addicting” in place of “addictive,” or any one of a thousand-odd other errors, I experience a profound desire to both edit the writing and discredit the writer. This has been true of me for as long as I can remember, and I do not expect it to change any time soon, if at all.
Like (almost) everyone who works in the arts, I’m a liberal.* Politically and socially, I’m about as far left as one can be; I hold opinions that enrage some of my conservative friends and family members and that even make a few moderates I know raise an eyebrow now and then. On matters related to language, however, I have always been very conservative, which is to say that I’ve resisted the too-easy introduction into our general lexicon of phrases that have not yet stood the test of time, eschewed the use of numbers as words (“2” instead of “to” or “too”) on Twitter, declined to send grammatically incorrect text messages, and refused for the most part to replace fuck (a perfectly fine word) with the f-bomb. I’ve considered it part of my responsibility, as a writer, to protect and defend the language, and (as my Facebook friend will attest) I have sometimes been zealous in the effort. Without a well-defended language, I have told myself by way of rationalization, we would not have civilization. Those stakes have always seemed high enough to merit a bit of extra effort.
Of late, however, I’ve begun to think in very different terms. My conversion began when I realized that I could not actually recall ever deciding, consciously, to begin using emoticons, which I have always considered cheap. I use them, primarily, in email messages, Facebook status updates, and tweets. I could remember loathing them, but I couldn’t remember getting over that discomfort. The evidence that I’d done so, however, was overwhelming. A recent search of my email for the term “:)” returned 156 messages. I had to face facts.
————-
Very little of my graduate education in poetry, I am sad to say, remains with me. The nine months I spent in the Writing Seminars at Johns Hopkins were largely a busy, forgettable blur: yogurt-and-fruit lunches with Peter Sacks; packed poetry readings in Gilman Hall; long hours teaching literature I hadn’t selected and didn’t always admire. One particular two-hour conversation, however, in a seminar led by the immensely gifted Allen Grossman — a man who always seemed to me to be more of a rabbi or a shaman than a professor — changed me intellectually forever. In the seminar, Professor Grossman delivered his widely acclaimed lecture on the perplexing cross-cultural dissemination of the song “London Bridge Is Falling Down,” an intellectual tour de force I would never have forgotten even if those had been the only words he spoke that day. He then proceeded, however, to discourse extemporaneously on a very surprising secondary subject: WAYNE’S WORLD.
Yes, the film starring Mike Myers and Dana Carvey. What had captured Professor Grossman’s attention wasn’t the film, however; he was interested in what he called the Wayne’s World grammatical construction: the addition of the word “not” at the end of a sentence as a way of negating everything that preceded it. For example, consider this small exchange from the film:
BENJAMIN: “Well, I’ll explain it to him that it’s not a choice. It’s in his contract.”
RUSSELL: “Oh. Well, Wayne will understand that right away… NOT!”
At the height of the character’s popularity in the 1990s, you might recall, the Wayne’s World grammatical construction had entered the popular lexicon. (Curiously, it first appeared in print in 1893.) Everyone, it seemed, was adding “not” to the ends of sentences. In his off-the-cuff lecture, Professor Grossman speculated that people would continue to do so… and, as it happens, he was right. What impressed me about his lecture, though, wasn’t his attempt at prognostication; it was his comfort with the fact that grammar wasn’t actually stable. He seemed delighted, rather than concerned, that a new sentence structure might have evolved. Those high stakes I was anxious about? The reliance of civilization on the stability of the English language? He was anxious about them, too — not.
In retrospect, I think he realized what I later came to understand as well: that as writers, we aren’t only responsible for protecting the language. We must also renew and enrich it. By expanding linguistic possibility, we create the ability to think new thoughts. Conserving the culture we’ve created, in fact, might even be less important than dreaming the culture forward. Especially for a progressive like me.
————-
What do I do, then, with the fact that I still feel a distinct psychological ache when I encounter broken syntax? I’ve come to understand the sensation as a kind of evolutionary relic. My brain, like everyone’s, is adapted to detect deviations from expected patterns and, in response, trigger a flood of brain chemicals designed to put me on alert. This is a well-documented neuroscientific phenomenon; see V.S. Ramachandran’s The Tell-Tale Brain for a thorough (and mostly accessible) look at the literature. It’s believed to be biologically related, perhaps unsurprisingly, to our sense of morality: our impulse to distinguish right and wrong. This is why finding a grammatical error feels, at least to me, like uncovering a small lie or theft. That innate reaction is hard to ignore… but I am beginning to learn how to do it.
What has been helping me, more than anything, is another insight derived from evolution… and this one is quite humbling, as I suspect you will shortly agree. We are all quite accustomed to thinking of language as a human creation: the equivalent of a car, say, or a house. Something we invented and created and can, if we so desire, modify at any time. Our current understanding of language and the brain, however, is becoming very different. Neuroscientists are now starting to argue that we really ought to think of our brains as a kind of distributed environment in which grammar and vocabulary live: the same way animals live in discrete sections of rain forest. We do not have the language we made, therefore; we have the language that has adapted to survive in the territory available to it. We are not in control: evolution is, and its processes are bounded by the limits of our minds.
If, therefore, the phrase “could care less” has propagated throughout the English language, no one is at fault (let alone morally bankrupt) for repeating the error. It’s more accurate to say that the “wrong” phrase is simply better suited to the environment than the “right” one. (A Google search reveals eight times as many results for “could care less” as for “couldn’t care less.”) Does that trouble you? It troubles me, whether I want it to or not, but I have no idea what we might do about it. Even if the 15% of us who use the phrase “couldn’t care less” correctly ALL enlisted full-time in the Grammar Police, we still wouldn’t be able to capture every offender. The task is hopeless.
I’m also beginning to believe, finally, that the task isn’t advisable, either. I like evolution, after all. I trust it. It got us here. (“Everything is the way it is,” after all, as biologist and classical scholar D’Arcy Thompson said, “because it got that way.”) Besides, it’s clearly not stoppable; one might as well try to suppress the tides. Might we possibly take some control of how language adapts? Guide it or steer it somehow, even a little bit? If pressed, I’ll admit I’m enough of a transhumanist to believe that eventually the human species will begin to take control of its own biological evolution. I have no idea, though, how one might go about it in practice. The very idea is… well, let’s just say that we don’t have the grammar or vocabulary to flesh it out. Yet.
In the meantime, I have finally come to accept that a bit more chilling out with regard to how others write and speak is probably warranted. Time to surrender my gun and my badge, I think, and go (largely) off duty. In time, perhaps, I might even learn to stop worrying and love the f-bomb, so to speak… though it sure is still tempting, even right now, to end this blog post with one surprisingly durable three-letter word that has managed, against all odds, to take root in my change-resistant brain: not.
————-
Oh, I feel your pain — and have also long tried to accept the mutability of language over time; when no less than the NYT allows “a real trooper” in its puzzle, what else can you do? Changes in construction, though, bother me a whole lot less than changes born of error, i.e. “could care less.” And I don’t think I will ever, ever be able to stomach the “him and I”-type errors that are actually born of hypercorrection. But opening my mouth about these things? That I’m working on, and probably always will…
Grant me the serenity to etc. That’s what I have to remind myself. All the time!
The best corrective to my, er, corrective instincts was linguistics training in grad school. “Language is as language does” is very convincing when you are confronted with the history of the language and the utterly arbitrary connections between meaning and the sounds and symbols we use to convey it.
That is really a worthy perspective. If you get a chance, though, take a look at this: http://en.wikipedia.org/wiki/Bouba/kiki_effect Growing evidence that those connections may not be so arbitrary after all.
Being married to a man like my husband has lessened the twitches I used to get when I heard or read incorrect grammar or phrases. Michael is as smart as they come, but his genius lies in his ability to build and fix things. I have always marveled at how differently he sees the world than I do. Mike is always looking at the physical world around him, either admiring what he sees or figuring out how to make it look better. So when he says “heighth” or doesn’t know the meaning of a word I feel has been in my lexicon my entire life, I cut him some slack. Some people aren’t wired for words, language, grammar, etc. They speak a language and create poetry in other ways.
Being married is terrific in so many ways, isn’t it? 🙂 (NOTE: emoticon alert!)
My wife is far more verbose than I am. She uses six words where I might use two. And that, too, has helped to remind me: there isn’t one right way of communicating.