January 1, 2017

At a loss for words



From our overstocked archives

Sam Smith, 2010 - One of the beats I have found myself covering over the past few decades has been cultural, political and ecological entropy, enervation, eradication, and extinction. Beyond the cataclysmic - such as climate change or the end of the First American Republic - there has also been dysfunctional devolvement in little corners of former joy, such as the decline in music of melody, interesting chords, dynamics and rhythmic variety. Even Rolling Stone's 2010 list of the 50 best songs found only 4% of them from the past 30 years but 58% from the 1960s.

Still, I have harbored childlike faith that certain things still mattered even if the evidence suggested otherwise. One of these illusions has been that words still have a purpose beyond selling something, creating a vaudevillian slapstick response, expressing anger or causing some other exaggerated and often insincere emotional display.

But the other day I found myself forced to face reality. The editor of a newspaper I have long admired, Britain’s Guardian, gave a lengthy speech that included words of high praise for a new literary genre:

It is, said Alan Rusbridger, an “amazing form. . . a highly effective way of spreading ideas. . . changes the tone of writing.” It harnesses “the mass capabilities of humans” to provide information that is "new, valuable, relevant or entertaining” and encourages "people who can say things crisply and entertainingly.”

I am sorry to report that he was speaking of Twitter.

Rusbridger was no less enthusiastic about the Internet as a whole:

“I want to discuss the possibility that we are living at the end of a great arc of history, which began with the invention of moveable type. There have, of course, been other transformative steps in communication during that half millennium – the invention of the telegraph, or radio and television, for instance – but essentially they were continuations of an idea of communication that involved one person speaking to many.

“That's not dead as an idea. But what's happening today – the mass ability to communicate with each other, without having to go through a traditional intermediary – is truly transformative.”

There was in his speech no hint that he had noticed that, since the Internet was born, countries like Britain and the U.S. have moved demonstrably to the right, away from freedom of speech and toward an increasingly state- and corporate-defined language and thought. Was the Internet to blame? Perhaps not, but certainly and sadly it has proved no great engine of political or intellectual liberation.

And what about the atomization of individuals that has occurred during the computer age? Teenagers don't even drive as much anymore because many are just as happy simply texting each other. Is that better than sitting around in a room gabbing with one another? It hardly seems a transformative step in communications.

To be sure, I use Twitter regularly. And far from being anti-technological, I was one of the first of my crowd to own a phone answering machine. I was co-editor of my high school's first photo offset yearbook. I was one of the first reporters in Washington to use a battery-operated tape recorder; the Review was part of the 1960s photo offset alternative press revolution, and it went on the Web in 1995, less than five years after the Web's creation. I sometimes attribute my longevity in the trade to the fact that technology has improved steadily in inverse proportion to my own deterioration.

But along the way, I learned something important: not all technology is your friend. Some of it makes your life better, some is just fun to play with, and some pretends to be something that it really isn’t. Sort of like people, when you come to think of it.

One of the things that I find odd about Twitter - and its slightly more wordy cousin, Facebook - is that when I first started playing with computers, such as an 8k Atari, I found that they were mainly useful for killing time. Once you tired of games and started dreaming of something truly useful, you were flummoxed by minimal memory and other technical limitations.

That soon changed. In fact, the remarkable history of the computer as a consumer item has been a progression from simple games to a once unimaginable infinity of choices.

That is, until Mark Zuckerberg came along. One of the things both fascinating and frustrating about Twitter and Facebook is that they have many of the same limitations as early computers, only these now stem not from undeveloped technology but from overdeveloped corporate intent.

As one Facebook user complained:

There is a messaging limit
There is a search limit
There is a friend limit
There is a group limit
There is a poking limit
There is a wall posting limit

There is a 420-character limit on Facebook status updates, and the only virtue to be found in this is that it is three times Twitter's limit.

Rusbridger may consider this a step forward in human development, but it essentially reduces everything to the length of a thirty-second commercial.

Besides, how transformative would our history have been if we had had to rely on the likes of: 

"AbeLin32 - Four score and seven years ago our fathers brought forth on this continent, a new nation, conceived in Liber. . . ."

Or: 
"JayHover - I am the Lord your God, who brought you out of the land of Egypt, out of the house of slavery; Do not have any ot. . . ."

Typically, nothing that tantalizing turns up on Facebook or Twitter when I check them. Instead I stare at the incomprehensible, the obscure, the self-indulgent, and the morose - lightly seasoned with the informative, the funny and the profound.

Sometimes the latter makes the former worth wandering through, but on most days I find myself getting fatigued before I can even click on "Older Posts."

On the other hand, I daily check around 2000 headlines on Google Reader, selected from publications of interest, and I never surrender the task, being ever enticed by the chance that a major story lies concealed just one page down.

The reason, I suspect, is that Google Reader is a catalog of that most venerable predecessor of Facebook and Twitter: the headline - succinct literary candy luring the reader into further exploration.

What the headline writer had for lunch, how the house painting is coming along, or how well she just did at Three Towers Solitaire is not part of the headline trade, for which generations of readers have unknowingly been blessed.

And it is not accidental. The good headline writer thinks first about the readers, ensnaring them into an article through succinct data, compressed ideas or even bad humor.

Thus the recent headline about NASA's loss of film from the first moon landing: "Small Step For Man; Giant Gaffe For NASA."

Sometimes it can accidentally get out of hand, as when a Florida newspaper ran a head about a kidnapping: "Police Hold Tongue In Widow's Snatch."

One of my own best contributions was when I was editing a paper on Capitol Hill and wrote a story about reaction to a local movie theater that had just moved into the porn business: "Neighbors Want Chitty Chitty Bang Bang, Not Nitty Gritty Gang Bang."

I also went through a period of writing light verse - again an act of voluntary verbal compression - so I am not opposed on principle to space limitations. But I have also learned that rationed writing is actually far more difficult than unregulated rambling, and having 500 million people doing the latter on the same site can be a little hard to take. To make it work, you have to approach it as haiku and not as a random dump of one's brain.

According to a report by the market research firm Pear Analytics, Twitter is composed of 40% pointless babble, 38% conversation, 9% matters of pass-along value, 6% self-promotion, 4% spam and 4% news. That's a lot of words and bad images to rummage through to find something worthwhile.

(Social networking researcher danah boyd has argued that the "pointless babble" should really be called "social grooming" or "peripheral awareness" by those wanting "to know what the people around them are thinking and doing and feeling, even when co-presence isn't viable.")

While Twitter and Facebook (or danah boyd, for that matter) are not helping us use words well, neither are they the only cause of our problems.

For example, a few years ago I wrote:

Consider this 2005 report from Lois Romano in the Washington Post: "Only 41 percent of graduate students tested in 2003 could be classified as 'proficient' in prose -- reading and understanding information in short texts -- down 10 percentage points since 1992. Of college graduates, only 31 percent were classified as proficient -- compared with 40 percent in 1992."

Or this 2004 report from the NEA: "A Survey of Literary Reading in America reports drops in all groups studied, with the steepest rate of decline - 28 percent - occurring in the youngest age groups. The study also documents an overall decline of 10 percentage points in literary readers from 1982 to 2002, representing a loss of 20 million potential readers. The rate of decline is increasing and, according to the survey, has nearly tripled in the last decade. . .

"Literary reading declined among whites, African Americans and Hispanics. . . By age, the three youngest groups saw the steepest drops, but literary reading declined among all age groups. The rate of decline for the youngest adults, those aged 18 to 24, was 55 percent greater than that of the total adult population. . .

"Reading also affects lifestyle, the study shows. Literary readers are much more likely to be involved in cultural, sports and volunteer activities than are non-readers. For example, literary readers are nearly three times as likely to attend a performing arts event, almost four times as likely to visit an art museum, more than two-and-a-half times as likely to do volunteer or charity work, and over one-and-a-half times as likely to attend or participate in sports activities. People who read more books tend to have the highest level of participation in other activities."

The New English Review adds:

"Carol Iannone, citing a National Endowment for the Arts survey, reports 'For the first time in modern history less than half the adult population of the United States had read even a bit of poetry, fiction or drama in the entire year. While in 1982, almost 57 percent of Americans were literary readers' - those who read literature on their own, not for school or work - that percentage had shrunk to less than 47 percent in 2002.'"

In short, regardless of Facebook or Twitter, America is losing its literacy.

It is perhaps some comfort to remember that literacy has been widely enjoyed for only a short portion of human history. According to one study, for example, the illiteracy rate in France dropped from nearly 70% in the early 18th century to less than 10% in the late 19th century - a period of approximately 160 years. In the hundred years before 1970, the American illiteracy rate dropped from 20% to 1%.

And while the results were even better elsewhere - in Sweden, the ability to read approached 100% by the end of the 18th century - that ability often didn't include writing. As late as 1841, 33% of men and 44% of women in England signed marriage certificates with a mark because they didn't know how to write their names.

In a happier place and a happier time we would treat our few centuries of growing literacy as something to cherish, expand and pass on. Instead, as in too many other ways, we seem to be slipping backwards. And even the editor of the Guardian is cheering us on, declaring the mundane to be magic, the trite to be tumultuous and our future to rest on an ability to slash ideas and information down to 140 characters, with a few dots of ellipsis added as a concession to the hopelessly verbose.

On the whole, I'd rather watch YouTube.
