
Posts Tagged ‘writing’

“I don’t think I have ever been quoted as well by a reporter before.”

I like to think that I’m immune to compliments, but this comment from someone I interviewed earlier this week got through my defenses. I interpret it as saying that I reported what the person said accurately.

Or, to be exact, my reworking of what the person said was a close reflection of their thoughts. Because, of course, no journalist quotes an interview subject word for word – unless, that is, they want to portray the subject as an incoherent half-wit. If you have transcribed as many interviews as I have, you’ll know that even the most fluent speaker can be made to look rambling and dull by quoting every little pause, space-filler, and change of direction in thought. To make the story read better, all journalists routinely edit quotations. If they are also ethical journalists, they do so while making sure that they preserve the sense of what the subject said.

At any rate, the comment pleased me, because accurately reflecting what someone has to say is a skill on which I pride myself. When I pitch a story to my editors, I rarely have a fixed opinion on the subject, except when I’m writing a commentary. Instead, I want to write the story because I’m interested in learning more about the subject. My opinion emerges as I research the story and talk to different people; on those occasions when I do have an opinion on a subject, I frequently alter it as I develop the story.

This habit does little to soothe the nerves of potential interviewees who ask what my perspective on the subject we’ll discuss happens to be, or ask in advance for the questions I want answered. If I were being completely honest, I’d have to explain that, in most cases, I don’t have the least idea what the perspective of the story will be. Similarly, while I jot down topics I want to cover, I rarely prepare specific questions. When I do, the resulting article is never an example of my best work. Instead, I develop my questions while listening to the interviewee. But these explanations, I suspect, would not be believed by the suspicious. They’d be sure I had a hidden agenda. The more I explained, the more paranoid they would become.

All the same, they’re the truth. Although I taught in an English Department when post-colonialism was the prevailing critical theory, I’ve never been a believer in completely subjective truth. At the risk of sounding naive, I believe, if not in objective truth, then in the effort to find it. I’m well aware that my bias creeps into everything I write, regardless of my intentions, but I don’t believe that my perspective is endlessly interesting, so I try to balance it with the opinions of those I talk to.

That’s not to say that I don’t have a defined viewpoint by the time I finish an article – although I do try to subdue the expression of it, because I happen to think that a gently delivered truth that guides readers to the conclusions I want them to reach is more effective than a thundering oration. But if I want to persuade people to accept my outlook, I want to develop my points as accurately as possible so that they are easier to accept logically.

So, yes, I do try to report the gist of what other people say. It is both part of my code of ethics and part of my style of discussion to do so. No doubt I often fail, both ethically and stylistically, but such are my ideals – and I’m warmed by the thought that someone has noticed.


After writing professionally on the web for several years, I’m no stranger to careless readers and wonky comments. I learned long ago that not only can you not count on everyone to read your thoughts carefully, but that some subjects, such as Microsoft’s intentions towards free software, cause many people’s critical faculties to go on holiday. But none of the subjects I’ve written about provoke as much blind reaction as the suggestion that grammar should be descriptive rather than prescriptive.

I was reminded of this fact recently when I noticed that my article “Tech-writers, Grammar, and the Prescriptive Attitude” had not survived a redesign of the Techwr-l site where it was originally posted. Because I still get requests for it, I asked Deb and Eric Ray, who maintain the site, to send me a copy of the published version, and posted it on my blog. The request also prompted them to repost the essay on the original site. In the days since, I’ve been fielding a comment or two per day from both sources, most of them sent privately.

Some people approve of the sentiments in the article, but perhaps half are outraged. They set out to correct my thinking by pointing out that, without consistency, language ceases to communicate. Their assumption seems to be that only official English has consistency. I answer that all forms of English have their own rules – for instance, a double negative, which is considered wrong in official English, is perfectly understandable within the context of some Afro-American dialects (and, I might add, Old English). So far, none of these conversations have continued beyond this point.

Another conversation about the article begins with someone asking if I’m saying that we (by which the reader usually means sophisticated, literary folk like themselves) should go along with mispronunciations or incorrect uses, usually in sarcastic tones that suggest that, of course, I mean no such thing. I reply that I do mean exactly that: when a pronunciation or usage reaches a certain degree of popularity, it becomes standard usage.

To this comment, my correspondent usually replies that it is the duty of the literate to fight against such barbarisms. My response is that you are, of course, free to use any language that you care to, but, if you imagine that your example is going to inspire an outbreak of proper usage, then you think far too much of yourself. The most anyone can do is avoid vague usages in the name of clarity and personal style – and, at that point, another conversation peters out.

And these are just the most common ones. I’ve been accused of advocating complete chaos, of insisting on poetic language at the expense of clarity, and all sorts of other stances that have nothing to do with what I wrote, no matter how you construe my words. Often enough, people insist that I believe something that I frankly state that I do not believe. Apparently, many people – particularly those who work with words – have so much invested in their self-image as initiates into the secrets of proper grammar that any suggestion that their knowledge is as useless as heraldry immediately robs them of their ability to read and analyze text.

Personally, I find the idea that we have not one but dozens of versions of English exciting and challenging. It means that, as a writer, I have more to explore than I can possibly learn in one lifetime. But that, unfortunately, seems to be a minority viewpoint.

A good thing that I thrive on being a contrarian, or I might find the hostile responses disheartening.


A correspondent tells me that Boycott Novell’s Free Software Credibility List gave me a rating of three on a six-point scale (I could link, but I don’t want to give the site any more hits than I have to). Until hearing this news, I didn’t know about the list, because, so far as free software is concerned, I only read news sites and blogs with either technical knowledge or expert commentary. Usually, too, I make a habit of not commenting negatively in public on anyone with a claim – no matter how remote – to being a journalist. At the very least, I generally don’t mention them by name. However, since my informant seemed to think I should be upset, I’m making an exception here.

To be honest, I am more amused than angry about a list whose silliness is exceeded only by the self-importance of its owners. I mean, how does any journalist, no matter how skilled a word-slinger, get the same rating as Stallman, the founder of the free software movement? Yet several do. And why are authorities like Eben Moglen off the list?

I also notice that, at least in some cases, the list seems a direct reflection of how closely a journalist’s opinion corresponds with Boycott Novell’s, rather than any criteria that might be mistaken for objectivity. Robin Miller, the senior editor at Linux.com, is apparently denigrated because he took a group tour of the Microsoft campus a couple of years ago (I’m sure the fact that he presided over a podcast in which a Boycott Novell writer performed poorly has nothing to do with his ranking). Other writers seem to rate a 4 or 5 largely because they stick to technical matters and, rarely talking about philosophy or politics, say nothing for Boycott Novell to dissect for suspect opinions.

Strangely, the Boycott Novell cadre didn’t rate their own reliability, although whether that is because they are assumed to be the only ones who rate a perfect six or because the ranking doesn’t include negative numbers, I leave as an exercise for the reader.

From the link attached to my name, my own ranking seems based on the fact that I accepted that a comment signed with a Boycott Novell writer’s name really was by him; when he said it wasn’t, I accepted the claim and he suggested that I was owed “some apologies.” Yet, apparently I’m permanently branded as being only marginally trustworthy because of this minor incident. I suspect, though, that the writer’s belief that I lumped the Boycott Novell writers into the category of conspiracy theorists has more to do with my ranking than anything else.

But these foibles don’t disturb me unduly. Far from being upset, I’m glad of the list, because it gives me a goal. If I write consistently hard-hitting articles in which I dig carefully for facts, build a flawless chain of reasoning, and tell the truth no matter how uncomfortable the consequences, then maybe – just maybe – in a few years Boycott Novell will reward me with the ultimate accolade of a zero ranking. Then I’ll know that I have truly arrived.

And that is all that I intend to say on this subject. Ever.


(Note: I wrote the following article six years ago for the Techwr-l site. For a long while, it was my most-requested article. Recently, I noticed that a site makeover had taken the article down. By now, it may be posted again, but I thought I’d reprint it here, although it’s very long by the standards of blog postings. Although addressed to technical writers, almost everything in it applies to writers and writing in general.)

Most technical writers are confused about grammar. On any day on the TECHWR-L list, basic questions are asked: “Is ‘User’s Guide’ or ‘Users’ Guide’ correct? Maybe ‘Users Guide?'” “Should ‘web’ be capitalized when used to refer to the World Wide Web?” “Which is right: ‘A FAQ’ or ‘an FAQ?'” Many of these questions become the major thread on the list for a day or two, generating far more debate than they’re worth.

The confusion isn’t so much about the grammatical points themselves. It’s about the nature of grammar in general. Apparently, many tech writers do not see grammar as a set of conventions to help them write clearly. Instead, to judge by the wording of the questions and responses, they see grammar as a set of unchanging rules that can provide definitive answers in every situation.

Some are afraid to break the rules of grammar and risk being denounced as incompetent. A handful, smugly sure that they know the rules, use their rote learning of the rules as an ad hominem attack, nitpicking at typos and small errors to discredit writers without disproving their viewpoints. Most sit in the middle, haunted by the ghosts of childhood grammar classes until they can hardly tell on their own authority whether they are writing well or not. But underlying all these reactions is an attitude that rules are rules, and cannot be broken.

This attitude is usually known as a prescriptive approach to grammar. It assumes that grammar exists mainly to tell us how to speak or write properly–not well. It is an attitude that tech writers share with almost everybody in the English-speaking world. It is a form of conditioning that begins in kindergarten and continues through high school and even into college and university. It undermines nearly everyone’s confidence in their ability to communicate, especially on paper. Yet it is especially harmful to professional writers for at least three reasons:

  • It grotesquely exaggerates the importance of grammar. Although competence in grammar is sometimes proof of other writing skills, it stresses presentation over content. Even worse, it stresses correctness over precision, conciseness, or clarity.
  • It binds writers to viewpoints that are not only arbitrary and obsolete, but, in some cases, far from their own opinions.
  • It undermines writers’ confidence and their ability to make decisions about how to communicate effectively.

Why are we burdened by this attitude? How does it affect us? The easiest way to answer these questions is to look at the origins of the prescriptive attitude and the alternatives to it. Only then can we begin to grasp how we can live without it.

The Rise of Prescriptive Grammar

Prescriptive grammars are the products of the Enlightenment. Earlier grammars such as William Bullokar’s in 1586 and Ben Jonson’s posthumous one were also prescriptive, but intended for language students. The first prescriptive pronouncements for native English speakers date to the Seventeenth and Eighteenth Centuries.

This is the start of the great era of describing and recording. In every subject from biology to Egyptology, educated men struggled to write accounts so thorough that no other one would ever be needed. It was also a time that looked both backwards and forwards to Classical Rome. It looked back in the sense that ancient Rome was seen as the height of civilization. It looked forward in two senses: England perceived itself as a second Rome, and Latin was the language of international science.

The first prescriptive comments were very much in the spirit of their times. Their compilers hoped for definitive grammars that would fix the form of English once and for all, and provide a source for settling grammatical disputes. Since Latin was the language of the world’s most sophisticated civilization, the closer this fixed form of English was to Latin, the more sophisticated English would be. Moreover, given the belief that culture had been degenerating since the fall of Rome, most grammarians automatically equated any change in the language with decay and degeneration.

John Dryden, the poet and playwright, was among the first to make prescriptive pronouncements. His main contribution to prescriptive grammar was to suggest that prepositions should never end a sentence. The reasons for this prescription are that Latin sentences rarely end in prepositions, and that the word “preposition” clearly indicates that this part of speech should go before (“pre”) the noun it is associated with. Dryden criticized Shakespeare and Jonson for not following this rule, and scrupulously edited his own writings until they conformed to it.

Dryden also hoped to fix the form of English, which was still rapidly changing. Together with the diarist John Evelyn and other authors, Dryden called for an English version of l’Académie française–the body of scholars and writers that oversees the creation of the official French dictionary and is the arbiter of correct French. Dryden’s plea for an English Academy was echoed later by Daniel Defoe and Jonathan Swift. The idea was being seriously considered by Queen Anne when she died in 1714. But with the accession of the German-speaking George I, the question shifted from the monarch helping to purify English to teaching the monarch English, and the idea was dropped.

Deprived of royal assistance, English men of letters did their best to improve the language on their own. In the mid-Eighteenth Century, a number of grammars were published, as well as Samuel Johnson’s famous dictionary, whose Preface spells out its prescriptive purposes with both succinctness and dry wit. All these works were heavily prescriptive, although Joseph Priestley did include some comments about the importance of common usage in deciding what was proper.

The most influential of these grammars was Robert Lowth’s “Short Introduction to English Grammar,” published in 1762. Criticizing almost every major English writer from Shakespeare to Pope, Lowth made most of the prescriptive statements that people still follow today. His prescriptions include:

  • Two negatives make a positive, except in constructions such as “No, not even if you paid me.”
  • Never split an infinitive.
  • Never end a sentence in a preposition.
  • “Ain’t” is unacceptable in formal English.

These ideas have several sources: An attempt to model English on Latin (and therefore to arrive at a universal grammar that underlay all languages), a desire to be scientific, and Lowth’s personal preferences. For instance, Lowth would not accept split infinitives because Latin infinitives are one word and cannot be split. Similarly, two negatives make a positive because they do so in mathematics. The ban on “ain’t,” though, seems entirely Lowth’s idiosyncrasy. None of these prescriptions, however, took any notice of how people actually spoke or wrote. For example, despite Lowth, “ain’t” continued to be used by Queen Victoria and the upper classes until the start of the Twentieth Century.

Although Lowth later became Bishop of London, his ideas on proper usage would probably have remained obscure if they had not been borrowed for classroom use. In 1781, Charles Coote wrote a textbook grammar based on Lowth’s Grammar, adding his own preference for “he” and “his” as the indefinite personal pronoun (as in “everyone is entitled to his opinion”). A few years later, Lindley Murray borrowed from Lowth to write a series of textbooks for a girls’ school. Murray’s textbooks became so popular that they quickly established themselves as the standard English texts in American schools throughout much of the Nineteenth Century. With modifications, Murray’s “English Grammar” and Lowth’s Grammar have been the basis for textbook grammars ever since. With their unyielding rules, these textbooks have given at least eight generations of English-speakers the prescriptive attitude that inhibits people today.

The Biases of Prescription

In their language and their purposes, prescriptive grammars make a strong claim to objectivity. A. Lane was typical of the first grammarians when he wrote in 1700 that the purpose of grammar was to teach people to speak and write “according to the unalterable Rules of right Reason.” Similarly, in writing his dictionary, Samuel Johnson humorously referred to himself as a “slave of Science.” These claims are still echoed when modern defenders of the prescriptive attitude assume that the rules of grammar are value-free. Yet a closer look at prescriptive grammar reveals distinct biases. Some of these were openly admitted by the first grammarians. The problem is that some of these biases are no longer current. Others are demonstrably false.

The early grammarians’ claims to be scientific or precise concealed a strong personal bias. This bias was by no means a conspiracy–it was simply natural self-expression. Grammarians were highly educated, or the subject would hardly come to their attention at all. Since education was a privilege, they were either well-to-do or extremely talented. Since the higher levels of education were barred to women, they were male. And, as might be expected from the subject, they were all intensely literate.

While this background made the first grammarians supremely qualified for literary and scholarly work, it also carried a certain arrogance. The first grammarians showed scant interest in how English was used outside their own class and social circles. All of them assumed an unwarranted authority in their subject, appointing themselves as the arbiters of the language without any justification except their willingness to serve. Robert Lowth, for example, did not hesitate to include his own idiosyncrasies as grammatical rules. In much the same way, Samuel Johnson saw it as his “duty” to purify English usage through his dictionary. This same attitude continues in the prescriptive attitude today.

Moreover, the first grammars were a direct reflection of their authors’ biases and education. The problem is, the education of the Restoration and Enlightenment included assumptions that we would question today. For example:

  • Rome is the model for all things: In fact, Latin is a poor model for English. Although both languages are Indo-European, the relation is indirect. Even the heavy influence of French, a Latin-derived language, via the Norman Conquest, does not make English’s structure much closer to Latin. At its core, English is a Germanic language, and if the first grammarians had used Dutch, Swedish, or German as a model, they would have had no precedent for objecting to split infinitives or double negatives. However, the Germanic origins of English were poorly understood in Britain during the Seventeenth and Eighteenth Centuries. At any rate, the first grammarians would probably have considered Germanic models too crude to replace the polished perfection of Latin.
  • Writing is the basis for English grammar: Like other grammarians, Samuel Johnson assumes that the principles of grammar should be taken from the written form of the language. Since Johnson was a writer himself, this assumption is understandable. However, modern linguistics regards written usage as simply one of many types of English, none of which is more valid in the abstract than any of the others. If anything, the spoken language is usually given greater priority today, partly because it tends to be the source of innovation, and partly because it reveals how people use the language when they are not trying to write correctly or formally.
  • Change is degeneration: When the narrator of “Gulliver’s Travels” visits Laputa, he is shown a vision of the Senate in Ancient Rome, then the modern English Parliament. The Senators look like demigods and heroes, the Members of Parliament scoundrels and ruffians. In writing this passage, Jonathan Swift reflects a belief widespread in his time that the world was continually getting worse. This fallacy is the exact opposite of the modern one of equating all change with progress.

Applied to the English language, Swift’s view has little basis in fact. Admittedly, English has simplified itself over the centuries by dropping most noun declensions and verb conjugations, but that has not made it less useful for communication. Nor, despite the delicate shudder of prescriptive grammarians, does the shift in meaning of “cute” from “clever” to “attractive” in the early Twentieth Century or of “gay” from “happy” to “homosexual” in mid-Century weaken a language with as many synonyms as English. When clumsy or unclear constructions do arise (such as the use of “not” at the end of a sentence), their impracticality generally ensures that they are brief fads.

At any rate, what is degenerate and what is progressive is often a matter of opinion. To J.R.R. Tolkien, the Anglo-Saxon scholar and author of The Lord of the Rings, the hundreds of words added by Shakespeare and his contemporaries are a corruption that ruined English forever. Yet to most scholars, these coinages are an expression of a fertile inventiveness and part of the greatest literary era ever known.

  • London English is standard English: Like most English writers, the grammarians and their publishers centered around London. The language of the upper classes in the Home Counties had already become Standard English by the Fifteenth Century, which is why many people have heard of Chaucer and few people have heard of (much less read) “Sir Gawain and the Green Knight,” a brilliant poem by one of Chaucer’s contemporaries, written in an obscure North Country dialect. That is also why Chaucer and Shakespeare poke fun at other dialects–they already had the idea that some forms of English were better than others. In basing their work on the English spoken around London, the grammarians were simply working in a long-established tradition.

Far from seeing their biases as contradicting their claims to scientific objectivity, the grammarians openly proclaimed their goal of saving English from itself. In fact, by the standards of the time, proclaiming their educational and cultural assumptions was a means of asserting their ability to be objective on the matter.

Of all the early grammarians, the one most keenly aware that prescriptive grammars were biased was Noah Webster, the writer of the first American dictionary. Working on his dictionary between 1801 and 1828, Webster was not content simply to record how words were used. Instead, he was also concerned with producing a distinctly American language. To this end, Webster not only included words such as “skunk” and “squash” in his dictionary, but also introduced American spellings, such as “center” instead of “centre.” In addition, he encouraged the spread of a uniquely American pronunciation by consistently placing the stress on the first syllable of the word. In other words, Webster attempted to deliberately manipulate the use of English in the United States for patriotic reasons. Whatever anyone thinks of those reasons, Webster’s efforts are one of the best proofs that prescriptive grammars are not as value-free as many people imagine.

The same is true today in the debate over whether “they” can be used as the indefinite pronoun instead of “he/his” (“Everyone is entitled to their opinion”). On the one hand, traditionalists who insist on “he/his” are perpetuating the male bias of the first grammarians. On the other hand, reformers who favor “they” are trying to remake the language in their own world view. It is not a question of objectivity on either side. It is simply a question of which world view will prevail.

The Problem of Change

But the major problem with the prescriptive attitude is that it resists the fact that languages are continually changing. If a community newspaper constantly includes editorials urging people to drink less, the amount of concern suggests a local drinking problem. In the same way, the constantly expressed wish to set standards for the language reflects the massive changes in English at the time that the first grammarians worked. Although the rate of change was probably slower than that of the Fifteenth and Sixteenth Centuries, English was still changing much faster between 1650 and 1800 than it does today.

Some of the changes that occurred or were completed in this period include:

  • The disappearance of dozens of Old English words like “bairn,” “kirk,” and “gang” (to go). Many of these words survived in Northern and Scottish dialects for another century, but became non-standard in written English.
  • The addition of dozens of new words. Some were deliberately coined by scientists, such as “atom.” Some were borrowed from the regions that England was conquering, such as “moccasin” or “thug.”
  • The replacement of “thou” and “ye” with “you” in the second person. These older forms survived only in poetry.
  • The standard plural became “s” or “es.” Only a few exceptions such as “oxen” survived.
  • The loss of all personal endings except the third person singular in most verbs (“I read,” “he reads”). Some of the older forms ending in “th” survived longer in poetry.
  • The loss of inflection in most adjectives.
  • The regularization of past tenses to “ed” or, occasionally, “t” (“dreamed” or “dreamt”).

Many of these changes are easy to overlook today because popular editions of texts from this period routinely modernize the spelling. However, the sheer number of changes makes clear that the early grammarians were fighting a rearguard action. If they have helped to slow the rate of change in the last two centuries, sometimes they have also accelerated it; the loss of many Northern words, for example, is probably partly due to the standardization on Home Counties English. Yet, despite these efforts, English continues to change as the need arises. Many of these changes come from the least educated parts of society–those least likely to be influenced by the prescriptive attitude.

Today, all prescriptive grammarians can do is resist changes as long as possible before accepting them. This constant retreat means that most prescriptive grammars are usually a couple of decades behind the way that the language is actually used in speech and contemporary publications.

The Descriptive Alternative

While prescriptive grammars were finding their way into the schools, an alternative approach to the study of language was being developed by linguists. Imitating the naturalists of the Eighteenth Century, linguists began to observe the pronunciation, vocabulary, grammars, and variations of languages, and began cataloging them in ways that suggested how they related to each other. In 1786, Sir William Jones established that most of the languages of India and Europe were related to each other. By 1848, Jacob Grimm, one of the Brothers Grimm of fairy-tale fame, had detailed in his History of the German Language how English, Dutch, German, and the Scandinavian languages had descended from languages like Gothic. Content to observe and speculate, the early linguists developed what is now called the descriptive approach to grammar.

The descriptive approach avoids most of the distractions of prescriptive grammars. Today, most linguists would probably accept the following statements:

  • Change is a given. In fact, a working definition for linguistics is the study of how languages change. Linguists can offer a snapshot of how a language is used, but that snapshot is valid for only a particular place and time.
  • No value judgement should be placed on changes to a language. They happen, regardless of whether anyone approves of them or not.
  • The fact that one language is descended from a second language does not make the first language inferior. Nor does it mean that the first language must be modelled on the second.
  • Linguistics does not claim any special authority, beyond that of accurate observation or verifiable theory. Claims are open to discussion and require validation before being accepted.
  • No form of a language is given special status over another. Regardless of who speaks a variation of a language, where it is spoken, or whether it is oral or written, all variations are simply topics to be observed. For example, when Alan Ross and Nancy Mitford coined the phrases “U” and “non-U” for the differences between upper and middle class vocabularies in Britain, they did not mean to suggest that one should be preferred (although others have suggested that deliberately using a U vocabulary might be a way to be promoted).
  • Variations of a language may be more or less suitable in different contexts, but none are right or wrong. The fact that you might speak more formally in a job interview than at a night club does not mean the language of a job interview is proper English, or that the language of the night club is not.
  • Proper usage is defined by whatever the users of the language generally accept as normal.
  • A language is not neutral. It reflects the concerns and values of its speakers. For example, the fact that every few years North American teenagers develop new synonyms for drinking and sex reflects teenagers’ immense preoccupation with the subjects. The fact that they also develop new synonyms for “slut” reflects their sexual morality as applied to girls. A similar viewpoint is known in linguistics as the Sapir-Whorf hypothesis.

To those conditioned by prescriptive grammars, many of these statements are unsettling. For instance, when I suggested on the TECHWR-L list that proper usage was determined by common usage, one list member went so far as to call the view “libertarian.”

Actually, these statements are simply realistic. Despite two centuries of classroom conditioning, average users of English have never been overly concerned about the pronouncements of prescriptive grammar. Instead, people continue to use English in whatever ways are most convenient. If the prescriptive usage is not widely used in practice, it may even sound odd, even to educated people. For example, to an ear attuned to the spoken language, the lack of contractions in an academic essay or business plan may sound stilted. In fact, an entire written vocabulary exists that is almost never used in speaking. The descriptive approach simply acknowledges what has always been the case. In doing so, it frees users from the contortions of prescriptive grammar, allowing them to focus on communication–where their focus should have been all along.

Professional Writers and Grammar

Writing well, as George Orwell observes in “Politics and the English Language,” “has nothing to do with correct grammar and syntax.” If it did, then two centuries of prescriptive grammar in the classroom should have resulted in higher standards of writing. Yet there is no evidence that the language is used more skillfully in 2001 than in 1750. The truth is that prescriptive grammar and effective use of English have almost no connection. A passage can meet the highest prescriptive standards and still convey little if its thoughts are not clearly expressed or organized. Conversely, a passage can have several grammatical mistakes per line and still be comprehensible and informative. Prescriptive grammars are interesting as a first attempt to approach the subject of language, but today they are as useless to writers as they are to linguists. So long as writers have a basic competence in English, prescriptive grammar is largely a distraction that keeps them from focusing on the needs of their work.

By abandoning prescriptive grammar, writers shift the responsibility for their work to themselves. In practice, this shift means making choices that are not right or wrong in the abstract, but, rather, useful in a particular context or purpose. For example, instead of agonizing over whether “User’s Guide” or “Users Guide” is correct, writers can choose whichever suits the situation. They can even flip a coin, if they have no better means of deciding. In such cases, which choice is made is less important than using it consistently throughout the document to avoid confusion. Even then, writers may decide to be inconsistent if they have a good reason for being so.

Similarly, the decision whether to use a particular word or phrase is no longer a matter of referring to a standard dictionary or somebody else’s style guide. Instead, writers have to fall back on the basics: Will the intended audience understand the word? Is it the most exact word for the circumstances? Does it convey the image that the company wants to present? In the same way, while the need for clarity and a factual tone makes complete sentences and unemotional words useful choices in typical manuals, in a product brochure, sentence fragments and words heavy with connotation are more common. Writers may still want to summarize their decisions in a corporate style guide, but the style guide will be based on their own considerations, not the rules that someone else tells them to follow.

None of which is revolutionary–except that, under the prescriptive attitude, irrelevant purposes are often inflated until they become more important than a writer’s practical concerns.

That is not to say that taking a descriptive approach to grammar means writing in the latest slang. Nothing could date a document faster, or be more intrusive in a technical manual. Nor does it mean abandoning technical vocabularies that are known to the audience or that make explanations easier. If anything, a descriptive approach demands a much greater awareness of the language than a prescriptive one. Instead of learning the single correct version of the language, writers who take a descriptive approach need to be aware–probably through constant reading–not only of dozens of different versions, but of how each version is changing.

If necessary, writers can use descriptive grammars such as journalistic style guides to help them. On the whole, however, the descriptive approach leaves writers where they should have been all along, deciding for themselves what helps their documents to achieve their purposes. The only difference is that, under the descriptive approach, they are fully aware of their situation. If they say anything unclear or stupid, they can no longer hide behind tradition.

Prescriptive grammar is useful for teaching English as a second language, but it has little value for the practicing writer. Clinging to it may provide emotional security, but only at the expense of making writing harder than it needs to be. The culture-wide devotion to it will not be changed in a moment. But conscientious writers can at least change their own habits, and make life easier for themselves. And, from time to time, they can even laugh some worn-out, crippling concept — such as not ending a sentence in a preposition, or not splitting an infinitive — into the recycle bin where it belongs.

(With apologies to George Orwell.)


Since I’m a Canadian, Memorial Day doesn’t mean much to me. Our May long weekend is Victoria Day, which often falls the weekend before. From the times I’ve been travelling in the United States on the Memorial Day long weekend, it seems to involve a lot of parade drill from everyone from octogenarians to people in wheelchairs – and, as a sport, parade drill is low on the list for breakneck action and usually looks faintly ridiculous to my outsider’s eye. But I do remember one unforgettable Memorial Day, when we visited the fantasist Avram Davidson at the veterans’ hospital in Bremerton, Washington.

As you may know if you have any taste for literary fantasy, Avram was one of the great fantasists and humorists of the 20th Century. But, as sometimes happens with great writers, he was not very skilled at taking care of himself. Through poverty, he had developed the habit of living in small towns where rents were cheaper, and, according to him, moving on when he had exhausted the local library.

In the last couple of years of his life, this habit had brought him to Bremerton. When his long-neglected health began to fail, he landed in the local veterans’ hospital, thanks to his service in World War II as a hospital corpsman with the Marine Corps in the Pacific. There, from the narrow confines of his room, he fought a running battle with the bureaucrats of the hospital and of Veterans Affairs, none of whom were used to dealing with patients who were not only highly intelligent but who had a high degree of the curmudgeon and the anarchist in their mental makeup.

Perhaps it was a campaign in this ongoing battle that prompted Avram to invite everyone he knew within a day’s travel to the hospital’s Memorial Day celebration, just to annoy his opponents. Or maybe Avram’s famous generosity, so long denied by his poverty, seized on the celebration as an overdue way to treat his friends and repay them for their visits. He could, too, have been restless within the limitations of his life, and worried that he might not have long to live.

Knowing Avram, the invitation was probably extended for all these reasons. But, whatever his motivations, the invitation went out, and we drove down from British Columbia that morning with all the excitement with which the inhabitants of Hobbiton must have tramped over to the party field to celebrate Bilbo’s birthday.

The trip was memorable as the only one we ever took south of the border in which the American customs guard did not interrogate us on the strength of our rustbucket Maverick. In those days (and possibly still, for all I know), customs jobs were veteran-preferred postings. The second that the guard heard that we were visiting the veterans’ hospital, he smiled and waved us through without another question.

When we got there, we found that the hospital had laid dozens of tables out on the lawn. The celebration was in full swing, but we had no difficulty finding Avram. He was sitting as far away from the bandstand as possible, surrounded by a dozen people, holding court in his wheelchair and telling stories about recent and past events.

At this late date, I don’t remember everything everyone said. But I do remember that, when someone noted that a tavern sat just beyond the hospital grounds, Avram said that many of the patients would go to any length to get to the hard liquor served at the tavern. When the hospital tried to discourage the custom by planting a pole in the gap of the fence, so that wheelchairs couldn’t squeeze through it, wheelchair patients would drag themselves along the fence, inch by painful inch, to get to the tavern. On Friday nights, he said, they looked like insects spread across a windshield as they clung there.

At some point, too, another guest took out a letter he had been asked to forward to Avram six months before. To my surprise, it was from me – I had completely forgotten the incident.

The stories and jokes went on, many told by Avram, but with others contributing their share as well. A band arrived and played the usual American patriotic songs. We continued talking, oblivious to the occasional glares from other visitors. We lined up for food, and the hospital staff glared at our numbers and said nothing. The celebrations ended, and staff started to clean up, until only our table was left standing, and still we talked. We didn’t care. I don’t know how Avram’s other visitors were feeling, but I felt as though I had stumbled into a London coffee house on an evening when Samuel Johnson was holding forth, and I didn’t want the evening to end. If we hadn’t had a ferry to catch and a two-hour drive on the other side, we might have stayed until midnight.

As things happened, that was the last I saw of Avram. He was dead less than a year later, having left the veterans’ hospital for a basement suite in town just before the bureaucrats could throw him out. I understand that he was not found until a couple of days after he died, and I don’t like to think about his final moments alone.

Instead, I prefer to think of him as I last saw him when I looked back across the grass. He looked tired, but he was obviously in his element, telling stories and laughing at what other people said – a master storyteller even in his leisure.


In the past, I’ve described bloggers as amateur journalists. Those who are good enough and ambitious enough eventually find paying gigs and become professional. Broadly speaking, that’s still true, but I now think that’s incomplete. Where a professional journalist is constrained to follow a code of ethics in doing reviews, bloggers only need to follow their consciences. And, for some, their consciences are not enough.

As a professional journalist, I am required by my editors to follow a well-recognized set of guidelines in dealing with my subject matter. If I write about an organization to which I have connections, I’m supposed to disclose that connection, if only at the end of the story. If I receive a piece of proprietary software (not that I ever get much, since I cover free and open source software), I either return it or throw it away when I’m finished with the review. Similarly, if I receive hardware (again, I don’t get much; due to the vagaries of the tariffs imposed by Canada Customs, few companies are willing to ship from the United States to Canada), I return it to the sender when I’m done.

This basic code of ethics isn’t always comfortable. It means, among other things, that I don’t take out a membership in the Free Software Foundation, even though I support that organization’s goals, because I might be tempted to pull my punches should a time ever come when I need to criticize freely. But I try to follow it because part of what I sell is a truthful voice. Unless I make an effort to keep that voice, what I write is useless.

Probably, the editors I sell to regularly wouldn’t fire me if I knowingly lapsed from these standards. But they would reprimand me the first time, and would probably stop buying my work if I continued in the ethical lapse. They have their own credibility to consider, and buying tainted work doesn’t enhance it. And, at the risk of sounding priggish, I accept these standards as natural and, if not ideal, then at least the best that can be followed to retain integrity.

Imagine my shocked innocence, then, when I discovered that some bloggers do not consider themselves similarly restrained (I won’t name them; I have no wish to pick a fight, and the names don’t matter as much as the behavior). At least one well-known blogger openly advertises on his front page how much he charges to blog about a product. Other bloggers accepted samples of moderately priced merchandise in exchange for writing about them. Then, when the advertising agency that connected them with the manufacturers changed the rules on them but continued to invite them to participate in such campaigns, they were conscience-free enough to complain of maltreatment and spamming. Others also complained about spamming by the same advertiser, but expressed wishes that they could have qualified to take part in such a campaign.

To say the least, these people live in a very different ethical universe than I do – and, by extension, than other professional journalists do. And, much as I hate to say it (since they all seem decent enough people when I’ve met them socially), their definitions of acceptable behavior make everything they write unreliable. Unless they announce that they’ve changed their ways, how can I know that what they write is an honest opinion, and not a bought one? Even if they’re writing on an innocuous subject, I’ll always wonder if their opinions are tainted.

Am I being too rigid here? Nobody else seems to be bothered by such behavior, so why should I be? Maybe my self-mocking description of myself as a modern Puritan has more truth than I realized.

All the same, I keep thinking of the comedian Bill Hicks’ comment about people who do product endorsements: “Do a commercial, and you’re off the artistic roll call. Every word you say is suspect, you’re a corporate whore. End of story.”


(The following is a recreation and expansion of the talk – or maybe “rant” is a better word – that I gave at the Tazzu WordPress Camp on April 30. The title was Rastin Mehr’s choice, but I decided to keep it for the sake of irony.)

I’m a little surprised to be here tonight. Two years ago, the last thing I thought I’d be doing was blogging.

Back then, I thought that bloggers were self-important amateurs. When I looked at the topics for blogging conferences, I was reminded of academic seminars, and it all looked so serious and earnest that I wanted to shake the nearest blogger and say, “For God’s sake, will you get over yourself? Why don’t you just shut up and write?”

For me, blogging was like vanity publishing, or playing tennis with the net down: You could do it, but wouldn’t you always wonder if you were good enough to make it on your own?

Yes, I know there are a handful of bloggers who are respected for their in-depth coverage of a subject and who have essentially become professional journalists. Pamela Jones of Groklaw springs to mind. But these bloggers probably would have been well-known anyway, and had they gone the traditional routes to recognition, on the way they might have shed some of the amateur self-indulgence that often still mars their work.

As for the majority of bloggers, they’re never going to be recognized, and they’re never going to monetize their blogs in any way. In fact, even most of those who succeed in living off their blogs are probably only going to do so by focusing on marketing at the expense of content – if not their integrity.

Yet here I am today, a blogging addict. I still haven’t changed my opinion of most blogs, but, despite my reservations, I now believe that even the worst of them has value.

Why I blog

My own reasons for blogging are probably peculiar. I started because, while I am a professional journalist who covers free and open source software, there are other subjects that I want to write about. Mostly, I stay away from free software subjects, although I know that I can get thousands of hits a day if I discuss them. But I can do the same and get paid for it, so I have no great interest in increasing my audience.

Still, for a professional (which really is just a name for an exhibitionist with respectable outlets for their proclivities), writing implies an audience, no matter how small. In fact, philosophically speaking, a writer without an audience can hardly be said to be a writer at all. Even Samuel Pepys, the famous secret diarist, seems to have developed the idea of a future readership as he went on. So, if I’m going to write, I do want a few people to react to it, if only a handful.

For me, writing a blog entry is a warmup for my paid work, or a way to bleed off excess energy when I’m done for the day. It’s a place where I can experiment with structure and subject matter, and learn about the short personal essay as an art form. Sometimes, I even use it as a sandbox for subjects that I later write a paid article for, its content enriched by the feedback from commenters.

But all these are idiosyncratic reasons. Why do I think blogging holds value for anyone?

Reasons for blogging

My answer begins with my past occupation as a university composition instructor. I used to ask students to keep a journal during the semester with a minimal number of entries, to be graded simply on whether it was done or not done. Early on in my thinking, I realized that, if I were still teaching, I would have graduated to asking students to keep blogs. The trendiness of blogging would encourage them in a way that private journals never could.

The reasons I assigned a journal also apply to blogs. Unless you are doing an entry-level manual job, the ability to write clearly is always going to give you an edge in your profession. The medium of your writing, whether it’s paper or a computer file, doesn’t matter. And if you want to write well, the only way to do it is to keep in practice. You wouldn’t expect to play a guitar well or run ten kilometers easily if you only tried once every three weeks, so why would you imagine that writing is any different?

More importantly, writing is an ideal way to explore your thoughts. I think it was the American writer William Faulkner who said he wrote to learn what he thought on a particular subject, and that idea is in tune with my own experience. It’s only after I stop researching a subject and start thinking about how to structure an article that I know my opinion on most of what I write about. When an interviewee asks me what the point of an article will be, most of the time my only honest answer would be, “I don’t know. I haven’t written it yet.” So, if my own experience holds true for others, writing is a way to self-knowledge. Through the act of writing, you can understand both your subject and yourself better.

Even more importantly, writing is one of the creative pursuits with the lowest barriers to entry. Admittedly, blogging requires access to some relatively expensive hardware, but a computer is cheap compared to, say, a painter’s supplies or a dancer’s outfits. If you have to, you can even blog from a public library terminal, reducing your costs to next to nothing. And if you believe, with Abraham Maslow, that everyone has a basic need for creativity – well, how can you argue with a trend that gives everyone who wants it a means of self-expression?

All this, and blogging is fun, too. For some, it’s a way to keep in touch with their friends. And for those who, in the words of Ray Wylie Hubbard, “are condemned by the gods to write,” doing so becomes nothing short of addictive. And if you are an addict (“Hello, my name is Bruce, and I’m a writing junkie”), then you know that nothing quite compares. Personally, I’ve always appreciated the response that science fiction writer Isaac Asimov made when asked if he would rather make love or write: “I can write for twelve hours a day.”

In these commercial, supposedly hard-headed days, such reasons for valuing something may seem slight. And it’s true – blogging has more to do with a liberal education than with going to law school or getting your MBA. For most of those who blog, the activity is not going to pay off, definitely not in the short term and almost certainly not in the long term. Get used to it.

Yet contrary to the conventional wisdom, choosing to do something without the potential for a return can be neither stupid nor naive. When you’re talking about something like blogging, it means you have your priorities straight, and you know the intrinsic worth of what you’re doing.

I have no claim to wisdom or influence, but, if I did, I’d urge bloggers to stop taking themselves so seriously and just enjoy what they are doing. If you’re blogging, you’re helping yourself to think better, and you can have fun while you do so. I mean, what more joy do you need? In my experience, money comes and goes, but personal growth stays with you forever.


Every day, thousands of news releases are emailed. And, every day, thousands of news releases are deleted unread or only partially read — all because their writers don’t bother to make their news sound important.

Let me explain. Not counting duplicates, I receive several hundred news releases a week. However, I only read about 30-40. I can discard most of them because their distributors haven’t bothered to target their work, and the releases have nothing to do with free software or GNU/Linux. But I also discard many of the rest without reading beyond the first paragraph because they fail to make me care about their news. If the PR people can’t focus enough to make their news interesting, why should I waste more than a minimal amount of time reading their releases?

That may sound harsh. Yet, without ruthless tactics, I would hardly have time to do anything except read releases. Nor did I (or would I) ask for most of the releases I receive.

Besides, I am hardly alone. If anything, I probably receive fewer releases than many computer journalists. A public relations writer who doesn’t know this reality is ignorant of one of the basic facts of their trade. So, really, it is only common sense that they should do what they can to emphasize the relevance of their news, especially when the task is fairly simple.

I always say that, to write successful PR, you need to assume that everyone in your audience has attention deficit disorder. They see so many releases that they’re easily bored. A PR writer’s job is to break through that lack of attention so that journalists will read the details and be roused to do a story based on the news.

The best way to attract attention would be to write a custom release for every long-term connection. However, that’s hardly practical (although targeting your release is, despite the modern PR writer’s fondness for spamming techniques). But, with a little effort, a writer can craft a release that keeps recipients reading.

If you want to attract interest in a release, the place to start is with the head – which should also probably be the subject line if you send the release in an email. Far from being the after-thought that many PR writers seem to make it, the head should be a pithy summary of the news and why it matters. It should not be – as so many PR writers make it – something as bland as “News release from MyPR.”

In fact, it should not just be a bare statement of fact, no matter how specific. For instance, instead of “Jack Parker becomes company CTO” try “Company refocuses on core values by appointing Jack Parker CTO.” The first head sounds irrelevant, while the second explains how the news might affect the company.

A head is usually less than a dozen words, but if you’ve sweated over those words the way you should, you won’t need a sub-head. Many long-time writers will actually tell you not to bother with a sub-head, because it’s usually just one more chance to lose the reader. However, if your news is especially complex, those few extra words might help keep readers’ attention.

However, most of the time, you’ll want to get directly into the lede. Like the opening of a short story, your first sentence should be the hook you use to catch readers’ attention. You can use the rest of the first paragraph to expand on the gist of what you have to say, but if readers flounder on the first sentence, many of them won’t read the rest of the paragraph, let alone the release.

One thing you should not do is squander the first sentence on length and cliches. Yes, you want the lede to summarize your news and its importance. But it won’t fulfill that goal if it’s a compound-complex sentence that sprawls over ten lines; even the most sympathetic reader will have trouble following it to the end.

Nor do you want to lose readers by describing your client as a “world leader” in its field, or by using any other cliche that they have heard thousands of times before. Cliches deaden attention, accomplishing the exact opposite of what you should be trying to do.

Once past the head and the lede, you can relax a bit. However, keep the release short for all but the most monumental news, and put a few quotes in to break up the bald recital of facts. But remember that the quotes should be people talking like people, not like an animated dictionary. Like a cliche, lame prose is just going to lose the reader.

Don’t worry, either, about giving a company bio until the end. Anything more than a clause half a dozen words long will only complicate your basic message unnecessarily. The only reason anyone will want to know more about the company is that they are going ahead with a story based on your release. Providing a corporate bio is a courtesy you do journalists, not something that will help you drum up interest in your story.

The idea that a news release should explain why the information it carries is important sounds obvious. Apparently, though, the idea has never occurred to the majority of people working in public relations. Perhaps they are so busy writing a release that pleases their boss or client that they don’t stop to think that they are being paid to offer their expertise as well as please. Or, perhaps, they think the importance of their news is self-evident; the fact that their company has a new point release of a product has kept everyone in the office working overtime for weeks, so why shouldn’t the rest of the world be concerned?

I suspect, though, that many PR writers simply find mass mailouts easier than taking the time to craft a release that journalists will read. Spam, after all, is easy, and effective writing hard. But it is only by effective writing that the composers of news releases can even hope to have their efforts read. Otherwise, they may as well not even bother.

Read Full Post »

“I’d like to write more often for your organization,” a sometime contributor to Linux.com wrote to the editors the other day. “However, I was hoping you’d have some advice for someone like me who suffers from writer’s block. Sometimes I’ll come up with a topic; other times I struggle for ideas, then I read other articles on Linux.com and think to myself, ‘Why didn’t I think of that?’”

By the time I logged on in the morning, Lisa Hoover had already given a comprehensive answer, to which I could only add a few points (I’m on Pacific time, so I log in long after everyone else in North America, although I sometimes get there before the Indian editors). Lisa posted her reply in the Linux.com forums, and I urge anyone interested to read it.

Meanwhile, here are my suggestions, which include some reworkings of Lisa’s as well. I’m talking about free and open source software, but I think that, with a few changes of context, most of the points would apply equally to any kind of journalism:

  • Know the field you’re writing about. In this case, that means keeping up to date on the other basic news sites, such as Linux Today, LWN, and FS Daily. It also means checking out information on new software from FreshMeat, and keeping track of what’s happening with major organizations in the field, such as the Free Software Foundation or The Linux Foundation. Often, these sites give just the bare-bones announcement of events, so there is almost always room to go deeper. Moreover, once you have extensive knowledge, you’ll start to see connections – and, therefore, possible stories – between different pieces of information.
  • Subscribe to mailing lists and forums in areas that you’d like to write about, and join local meetups of people with similar interests. Their problems and interests will provide endless stories and, occasionally, a piece of breaking news.
  • When you read a how-to, try it out. What information does it leave out? What information in it is now outdated? Could the information be presented more coherently? Is there a related topic that goes unmentioned? Could your personal experience add anything to the instructions? Answering any of these questions can lead to an article.
  • Question what you read. If someone makes a claim about a particular piece of software, go see for yourself. If someone is quoted, contact them to expand on their comments. The more you know about the field, the more you are likely to question. For instance, a few months ago, I got a story when a software project’s members were quoted as holding an opinion that I knew was probably wrong.
  • Read bloggers and columnists in the field. Note their opinions, and see if you can come up with a counter-argument (for the record, I write a lot of blog entries using this technique, especially when the subject is career advice).
  • As you get to know your chosen field, you will become familiar with the truisms that everyone repeats. Play the contrarian, and see if you can come up with a valid argument that qualifies or overturns conventional wisdom. An example is my article for Datamation, “It’s time to get over Microsoft,” which suggested that free software is now strong enough to have no need to fear its traditional nemesis. Of course, I received plenty of criticism, but I still feel that the point needed to be made.
  • Everyone has a story, and so does every group. I’ve never yet met someone who was boring when talking about what matters to them, so get in touch and tell those stories.
  • Watch for common problems that people have, either in online forums or in your everyday life. Lists of resources or steps for overcoming these problems are articles that editors will love, because they’ll continue to be read for months after they’re published.
  • Make lists. For instance, in the last 3 years, I’ve written “11 tips for moving to OpenOffice.org,” “9 characteristics of free software users,” and at least a dozen more. Lists are an excellent way to make use of random observations and thoughts.
  • Think of what’s appropriate to the season. For instance, last Christmas, Linux.com carried articles about gifts for geeks, and non-profits to which people might want to donate before the end of the year to get a tax break. For Valentine’s Day, the site carried suggestions for how to mark the day using free software. In the past, other articles were published to mark the anniversaries of the OpenOffice.org and Debian projects.
  • Think about your own experience in the field, whether with your home computers or at the office. Often, what you’re doing with your computer will make a good how-to article, especially for beginners. For instance, I got at least half a dozen stories from customizing my new laptop last summer.
  • Contact companies and experts, asking for more information about new software or new policies. If you see something interesting in the way of hardware, ask about getting a review unit.
  • Network like crazy, not only with movers and shakers, but also with PR experts and ordinary developers. This advice is sound no matter what you’re doing, but, in journalism, it pays greater and greater dividends as you continue to write, because people will contact you when they think they have a possible story. I don’t know how it works for other journalists, but I now get 2-3 stories and another 2-3 possible leads per month – a substantial reduction of my need to generate ideas. Three years ago, when I started, I got none. And, increasingly, those stories are scoops, given to me because people feel that I’ve written about them or their colleagues with some fairness or insight in the past. Of course, many of these contacts have their own agenda, but usually that agenda is only to get publicity, so you rarely have to worry about preserving your independence.

You see the common thread? Consistently generating ideas means that a part of you is always hunting for stories – always analyzing, as you go about your business, the story potential of whatever you encounter.

If my experience is anything to go by, once you have this habit, your problem won’t be coming up with ideas. It will be choosing which stories you want to write in the limited time that you have in the day.

Read Full Post »

Probably the best piece of advice I’ve heard for writers comes from screenwriter William Goldman, the writer of The Princess Bride (you know, the book and the movie with the immortal line: “Hello, my name is Inigo Montoya. You killed my father. Prepare to die.”). When writing a script, Goldman says in Adventures in the Screen Trade and Which Lie Did I Tell?, you need to discover what he calls the spine of the story – that is, the impression you want to leave with the audience, or what the story is about besides the bare events. If you’re an English major, you could say he is talking about the main theme. But, whatever you call it, the advice holds true for both fiction and non-fiction.

The point is easiest to understand if you think in terms of fiction. Imagine that you are writing one of those time-honored stories in which a young man rises from obscurity to become rich and influential. It is not enough simply to narrate the events of his life; if that is all you do, then the result will be like one of those endless cell phone conversations teenagers seem to have at the top of their voices when you’re trapped with them on a bus (“Then he says, and I say, then he goes . . .”), until you want to scream with boredom.

Instead, you need to understand how you want the audience to view the events. Will the story be about how the young man is unable to shed conventions until he finds himself trapped by his own success? Or will it be about personal courage and having the strength to realize your dreams? Either of these perspectives could be a legitimate spine, and each could apply to the same sequence of events. But without discovering the spine, you won’t know what to emphasize, or even which metaphors you need to tell the story.

This need explains why telling a real person’s story is notoriously difficult to do well. Very few people’s lives have a spine – even a well-known person’s life contains a lot of living for the moment and random incidents. You need to find a perspective from which to tell a person’s life, and usually it’s easier to find meaning in a small portion of a life rather than the whole thing.

Goldman is talking about fiction, of course – specifically, writing movie scripts, although his comment applies equally well to short stories or novels. I’ve found it useful when writing the handful of stories I’ve published professionally. However, as I’ve slowly struggled to learn journalism over the last few years, I’ve realized that his advice applies equally well to features and news items.

It’s not surprising, really, because articles are narratives, too. Take, for example, a simple news release. A company issues a news release because it has a story it wants told – it has a new product, it has hired a new executive, or maybe it has a comment on industry news. The publicist’s job is to find the perspective that the company wants on the news, while a journalist’s is to find the perspective that makes the story worth the attention of the audience. The publicist who has found the spine of the story works hard to make sure that journalists accept the perspective offered, while journalists – if they have any integrity – try to discover the spine for themselves.

At least, that’s the way it should be. In reality, many publicists and journalists never discover what the spine of a particular story should be, either because they are in a rush, or lazy, or just plain ignorant of their roles. A publicist without a spine sends out a boring release that no one wants to read, technically fulfilling the needs of their client or employer, but in truth doing them no favor at all. Similarly, a journalist who doesn’t bother to find the spine either tells a disconnected story or, worse, shows a lack of integrity by simply accepting the one that the publicist offers. You can find hundreds of such releases and stories on any given day but, unless the news is so major that it tells itself, none of them are of any value to the audience.

In both fiction and non-fiction, finding the spine takes time. Yet the effort is always worth making. Not only is the search a matter of integrity, but writing without the spine is infinitely harder, and far more likely to yield rambling or mediocre results – and to be excruciatingly boring and painful to produce.

Read Full Post »
