Sunday, February 20, 2011

Speaking In the Mode of Vagueness, Just Like Squirrel Lady

Clark Whelton
What Happens in Vagueness Stays in Vagueness

The decline and fall of American English, and stuff

Winter 2011 // City Journal

I recently watched a television program in which a woman described a baby squirrel that she had found in her yard. “And he was like, you know, ‘Helloooo, what are you looking at?’ and stuff, and I’m like, you know, ‘Can I, like, pick you up?,’ and he goes, like, ‘Brrrp brrrp brrrp,’ and I’m like, you know, ‘Whoa, that is so wow!’ ” She rambled on, speaking in self-quotations, sound effects, and other vocabulary substitutes, punctuating her sentences with facial tics and lateral eye shifts. All the while, however, she never said anything specific about her encounter with the squirrel.

Uh-oh. It was a classic case of Vagueness, the linguistic virus that infected spoken language in the late twentieth century. Squirrel Woman sounded like a high school junior, but she appeared to be in her mid-forties, old enough to have been an early carrier of the contagion. She might even have been a college intern in the days when Vagueness emerged from the shadows of slang and mounted an all-out assault on American English.

My acquaintance with Vagueness began in the 1980s, that distant decade when Edward I. Koch was mayor of New York and I was writing his speeches. The mayor’s speechwriting staff was small, and I welcomed the chance to hire an intern. Applications arrived from NYU, Columbia, Pace, and the senior colleges of the City University of New York. I interviewed four or five candidates and was happily surprised. The students were articulate and well informed on civic affairs. Their writing samples were excellent. The young woman whom I selected was easy to train and a pleasure to work with. Everything went so well that I hired interns at every opportunity.

Then came 1985.

The first applicant was a young man from NYU. During the interview, he spiked his replies so heavily with “like” that I mentioned his frequent use of the word. He seemed confused by my comment and replied, “Well . . . like . . . yeah.” Now, nobody likes a grammar prig. All’s fair in love and language, and the American lingo is in constant motion. “You should,” for example, has been replaced by “you need to.” “No” has faded into “not really.” “I said” is now “I went.” As for “you’re welcome,” that’s long since become “no problem.” Even nasal passages are affected by fashion. Quack-talking, the rasping tones preferred by many young women today, used to be considered a misfortune.

In 1985, I thought of “like” as a trite survivor of the hippie sixties. By itself, a little slang would not have disqualified the junior from NYU. But I was surprised to hear antique argot from a communications major looking for work in a speechwriting office, where job applicants would normally showcase their language skills. I was even more surprised when the next three candidates also laced their conversation with “like.” Most troubling was a puzzling drop in the quality of their writing samples. It took six tries, but eventually I found a student every bit as good as his predecessors. Then came 1986.

As the interviews proceeded, it grew obvious that “like” had strengthened its grip on intern syntax. And something new had been added: “You know” had replaced “Ummm . . .” as the sentence filler of choice. The candidates seemed to be evading the chore of beginning new thoughts. They spoke in run-on sentences, which they padded by adding “and stuff” at the end. Their writing samples were terrible. It took eight tries to find a promising intern. In the spring of 1987 came the all-interrogative interview. I asked a candidate where she went to school.

“Columbia?” she replied. Or asked.

“And you’re majoring in . . .”

“English?”

All her answers sounded like questions. Several other students did the same thing, ending declarative sentences with an interrogative rise. Something odd was happening. Was it guerrilla grammar? Had college kids fallen under the spell of some mad guru of verbal chaos? I began taking notes and mailed a letter to William Safire at the New York Times, urging him to do a column on the devolution of coherent speech. Undergraduates, I said, seemed to be shifting the burden of communication from speaker to listener. Ambiguity, evasion, and body language, such as air quotes—using fingers as quotation marks to indicate clichés—were transforming college English into a coded sign language in which speakers worked hard to avoid saying anything definite. I called it Vagueness.

By autumn 1987, the job interviews revealed that “like” was no longer a mere slang usage. It had mutated from hip preposition into the verbal milfoil that still clogs spoken English today. Vagueness was on the march. Double-clutching (“What I said was, I said . . .”) sprang into the arena. Playbacks, in which a speaker re-creates past events by narrating both sides of a conversation (“So I’m like, ‘Want to, like, see a movie?’ And he goes, ‘No way.’ And I go . . .”), made their entrance. I was baffled by what seemed to be a reversion to the idioms of childhood. And yet intern candidates were not hesitant or uncomfortable about speaking elementary school dialects in a college-level job interview. I engaged them in conversation and gradually realized that they saw Vagueness not as slang but as mainstream English. At long last, it dawned on me: Vagueness was not a campus fad or just another generational raid on proper locution. It was a coup. Linguistic rabble had stormed the grammar palace. The principles of effective speech had gone up in flames.

In 1988, my elder daughter graduated from Vassar. During a commencement reception, I asked one of her professors if he’d noticed any change in Vassar students’ language skills. “The biggest difference,” he replied, “is that by the time today’s students arrive on campus, they’ve been juvenilized. You can hear it in the way they talk. There seems to be a reduced capacity for abstract thought.” He went on to say that immature speech patterns used to be drummed out of kids in ninth grade. “Today, whatever way kids communicate seems to be fine with their high school teachers.” Where, I wonder, did Vagueness begin? It must have originated before the 1980s. “Like” has a long and scruffy pedigree: in the 1970s, it was a mainstay of Valspeak, the frequently ridiculed but highly contagious “Valley Girl” dialect of suburban Los Angeles, and even in 1964, the film Paris When It Sizzles lampooned the word’s overuse. All the way back in 1951, Holden Caulfield spoke proto-Vagueness (“I sort of landed on my side . . . my arm sort of hurt”), complete with double-clutching (“Finally, what I decided I’d do, I decided I’d . . .”) and demonstrative adjectives used as indefinite articles (“I felt sort of hungry so I went in this drugstore . . .”).

Is Vagueness simply an unexplainable descent into nonsense? Did Vagueness begin as an antidote to the demands of political correctness in the classroom, a way of sidestepping the danger of speaking forbidden ideas? Does Vagueness offer an undereducated generation a technique for camouflaging a lack of knowledge?

In 1991, I visited the small town of Bridgton, Maine, on the evening that the residents of Cumberland County gathered to welcome their local National Guard unit home from the Gulf War. It was a stirring moment. Escorted by the lights and sirens of two dozen fire engines from surrounding towns, the soldiers marched down Main Street. I was standing near the end of the parade and looked around expectantly for a platform, podium, or microphone. But there were to be no brief remarks of commendation by a mayor or commanding officer. There was to be no pastoral prayer of thanks for the safe return of the troops. Instead, the soldiers quickly dispersed. The fire engines rumbled away. The crowd went home. A few minutes later, Main Street stood empty.

Apparently there was, like, nothing to say.

7 comments:

  1. It's amazing how your culture can change and few people even notice. I was not surprised to see that free will was reintroduced by a teenage actor in a recent film. It was a big topic in the '50s, and the thought leaders were men from the 16th and 17th centuries. I assume TV has something to do with it, as the seeds of this trend were planted well before the internet or even the PC arrived.

  2. Isn't, like, uhhm, well, you, umm, you know, like, free will, like, umm, will that is free, a big, I mean like, uhh, you know, humongous issue in the debate between the evolutionary psychologists and those against them? See, for example, http://www.tnr.com/book

    My comment on that is as follows:

    Linker characterizes Bering's atheistic, evolutionary argument as being that humans, amongst all animals, are uniquely self-conscious. They adaptively moderate their behavior under the gaze and judgment of others. That self-conscious moderation was key to human survival. From there it wasn't such a big adaptive leap to projecting all that survival-facilitating gazing and judging onto an overarching watching and judging deity, in whose judgment we make ever greater strides to conform to what we project he expects from us. No God, says Bering (à la Linker), but rather a "mere adaptive illusion."

    For Linker, such reasoning is circular, typical of evolutionary psychology, which, he says, is marked by "presupposing what it seeks to prove." Linker says that what's insufferable about Bering "is his utter indifference to the likely psychological and social consequences of the truths that he understands himself to be revealing." Apparently, according to Linker, Bering thinks that our unique human self-consciousness, our self-suppression under the gaze and judgment of the other, is most unfortunate and debilitating:

    ...The truth is that chimps live their lives without the “crippling, inhibiting psychological sense of others watching, observing, and critically evaluating them.” But “humans, unfortunately, are not so lucky....

    Linker qualifies this assertion by noting that Bering also says that self-scrutiny under the gaze of the other is so deeply embedded in us that we are unlikely ever to shed it:

    ...Bering also indicates, after all, that our evolutionary inheritance so strongly predisposes us toward theological and moral thinking that a fundamental change in our behavior is unlikely, no matter how many books evolutionary biologists write and promote...

    Now we come to the essence of Linker's critique. Bering says that the fact that our moral and religious intuitions are mere adaptive illusions does not make us any less for cleaving to and obeying these "mere adaptive illusions." Linker says, contrarily, that if these intuitions are indeed mere illusions with which we delude and deceive ourselves, then we ought to be judged ridiculous laughingstocks. That does not make Bering wrong, Linker says, only wrong about how we should react to the "truth" of what he says.

    This all seems to be a rehearsal of the issue of how we ought to conceive of ourselves. For me it is really a non-issue, because there is a tendency to conflate two distinct ways of accounting for ourselves: the first, the scientific account of what constitutes us, drives us, makes us tick; the second, our understanding of ourselves as consciousnesses with will and choice, not predetermined, mechanistic beings. There is no reason why one perspective needs to exclude the other. We don't understand ourselves without the first; we dehumanize ourselves without the second.

  3. Not sure that some animals are not self-conscious; ever see a dog react to "bad dog"? They just lack the cerebral cortex to brood or imagine much about it. I agree that just because we are aware of certain things (facts?) doesn't mean we can do or want to do much about them. That is Paul Haggis's question: he does not understand how he could remain a Scientologist for as long as he did. Which makes one wonder about all the new religions that were invented over the last century compared to the seemingly normal religions invented over the millennia. People, not just kids, "say the darndest things," to quote the famous TV host. Read the piece by Lawrence Wright in the New Yorker. Group pressure and our need to belong and ability to believe are really amazing. Denial is a two-edged sword. Religion does continue to decline in advanced societies but clings on and emerges to strike a blow for our snake brains and darker sides. Is it ironic that the Muslim (Christian/Hindu/Jewish, etc.) Brotherhoods all have charitable functions that serve as disguises for their true, ruthless, us-first ideologies?

  4. My last two did not really link to the first about kid-speak; like, uhh, what I mean is, kids want to sound like their peers. Then something like TV starts to introduce new ways of speaking, compared to the ways of our parents, for example. Not sure whether this argument has legs, but suddenly kids had input on how to be cool that did not originate in their local communities. Or maybe mass education led to a dilution of the quality of inputs. No doubt a lot of factors contributed to the phenomenon besides a sudden dilution of the quality of the genes creating the generation that came of age in 1985. Like, something happened, man, besides pot smoking, and TV/radio and records had to be part of it.

  5. Rereading what I wrote and seeing the mistakes and poorly turned phrases, and then rereading the original about the squirrel lady, I conclude that part of all this is that most of the junk we spend our time seeing/watching/reading, etc., is unedited! The volume of crap needed to fill all the channels 24/7, and all of the blogs, speeches, articles, etc., means that yes, there must be a lot of nonsense. Not even publishers can afford enough editors. A lot of what is published and/or even said would benefit from a few glasses of wine. Nietzsche reminds us: "The most fundamental form of human stupidity is forgetting what we were trying to say in the first place." Now what was it?

  6. Larry J.: I need a better moment to respond to your comments. I'll let you know when I have.

  7. Reflection is a good thing. Appropriate to the original subject and too often ignored by yours truly.
