Instant Change?

The term “instant change” is, in fact, so far as societies are concerned, an oxymoron, because meaningful change in any society is anything but instant, and is almost always agonizingly painful for significant segments, if not all, of that society. Yet here in the United States, we’ve just witnessed five months of political primary election contests in which all the candidates have promised “change,” and in which the apparent winner of the Democratic presidential primary is the one who promised the most radical, and yet most painless, change.

Needless to say, I’m skeptical. Not about the need for change, but about all the rhetoric and implications suggesting that radical changes will be comparatively easy and painless. Now… consider that John Adams and others among the Founding Fathers pushed [and failed] to abolish slavery in 1776. Some eighty-five years later, the United States was ripped apart by the Civil War, at least in part, if not in large part, over the issue of slavery. Despite the Emancipation Proclamation and the 13th, 14th, and 15th Amendments that ensued, full legal civil rights were effectively denied to blacks until the Supreme Court struck down school segregation in Brown v. Board of Education in 1954. That proved insufficient, and in 1964 a more far-reaching Civil Rights Act was passed, followed by the Voting Rights Act of 1965. And more riots followed. Admittedly, we have changed radically with respect to legal rights for blacks since the United States was founded, but such radical change was anything but swift or painless.

Some change in any society is good, but the lessons of history suggested to the Founding Fathers that most change, especially popularly based change whipped up by demagogues and political opportunists, was not. They felt that order was more conducive to liberty than the ability to change societal structures quickly. So our government was designed with all manner of checks and balances, primarily to ensure that no change could be instantly railroaded through. Some of those checks and balances have been changed, and while some people would claim “eroded” is the better term, the fact remains that radical change cannot be implemented both legally and quickly.

Some would also claim, not without reason, that the current Administration has made radical changes in personal liberties, but the legality of many of those changes remains untested, and some have been curtailed. That said, would a new Administration really wish to employ similar methods to force change? If so, such an Administration would not be changing anything, but merely using the same structure for differing ends, perpetuating a loss of liberty to obtain its goals. If not, then radical change will be time-consuming and expensive, as it always has been.

In the meantime, what of all those voters who endorsed quick and painless change? Will they be so enthusiastic as time passes, as endless votes and amendments pile up, as the costs for implementing those changes further increase their taxes or decrease services in other areas?

Of course, the quick and simple [and wrong] answer to those questions is that all we have to do is decrease government waste. The problem is that one person’s “waste” is another person’s livelihood. For example, we pay what I believe are excessive farm subsidies, but cutting those subsidies will be painful to those who receive them, and they will protest and harass their representatives and present all manner of arguments to prove that the subsidies are good programs. Bridges and roads to small communities are expensive, and many are certainly not “cost-effective,” but those communities often cannot pay for such improvements, and a bridge described as “one to nowhere” in Washington is certainly one to somewhere out in the state that wants it built.

National health care is a goal that’s been cited repeatedly, but exactly who will pay for the services to the 47 million uninsured Americans? Even a rock-bottom [and unrealistically low] premium of $200 a month, plus health care expenditures averaging a mere $1,000 a year for each of those uninsured Americans, would require taxpayers or employers or some combination of the two to come up with an additional $160 billion annually. If we’re talking about a government program, that works out to over $1,000 more in income or payroll taxes per taxpaying family per year, unless we cut some other government programs by the same amount. If the program is supposed to be funded by employers, how many more jobs will vanish, the way they have in the auto industry, over just that issue? And if the program is funded by increasing taxes on the “rich,” that won’t work unless the “rich” are defined as any couple making over $150,000, a number that takes in millions of people who definitely believe they’re anything but rich.
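For what it’s worth, the arithmetic behind that $160 billion figure checks out, using only the essay’s own hedged numbers [47 million uninsured, a $200 monthly premium, and $1,000 a year in care per person]:

\[
47{,}000{,}000 \times \left(\$200 \times 12 + \$1{,}000\right) = 47{,}000{,}000 \times \$3{,}400 \approx \$160 \text{ billion per year}
\]

Spread across roughly 100 million taxpaying households [a round number assumed here purely for illustration, not a figure from the essay], that comes to about $1,600 apiece per year, consistent with the “over $1,000 per taxpaying family” above.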

As I indicated earlier, I’m not against change, but I am against rhetoric and hype that suggests change is automatically wonderful, painless, and free. Change is always more expensive than anyone realizes, especially to those who fail to understand that point. Just look at the changes in the USA today as a result of quick and easy promises to fight terror… and the fact that we’ve spent over a trillion dollars, lost civil liberties and thousands of lives, and still haven’t succeeded.

Change — do you really think it’s ever quick, easy, cheap, and painless? Or do you assume that someone else will end up paying for it?

The [Restricted/Slanted/Inaccurate/Incomplete/Mis-] Information Society

There’s been an overwhelming amount of material written about how people today, especially in the United States, live in the “Information Age.” And we do… but the vast majority of that information is anything but what it seems on its face, and, often, lacks significant facts that might change the entire meaning of what was initially presented. Now, some cynics will ask, “And what else is new?”

The answer to that question is: The volume, complexity, and increased power of information are what’s new, and those aspects of information make all the difference.

Although it should come as no great surprise to anyone who follows political news, a recent book by a former press secretary to President Bush describes in detail how the current administration manipulated the news through inaccurate, slanted, and misleading information. The official White House response seems to be that the President will try to forgive his former aide. Forgive the man? That suggests that Bush believes it was wrong to reveal the White House’s informational shenanigans, and that personal loyalty is far more important than truth. This viewpoint isn’t new to the Presidency, but the degree to which it’s being carried appears to be.

One of the aspects of the mortgage banking and housing sector meltdown that’s also been downplayed is the incredible amount of false, misleading, and inaccurate information at all levels. Large numbers of homeowners were lied to and misled, and many were simply unable to wade through the paperwork to discover what was buried in the legalese. The mortgage securitization firms misled the securities underwriters. The information issued by the underwriters misled the securities traders, and in the end, with all the misinformation, it appears that almost no one understood the magnitude of the problem before the meltdown.

We’re seeing, or not seeing, the same problem with recent economic statistics, particularly those measuring inflation. For decades, the most common indicator of the rate of inflation was the change in the Consumer Price Index (CPI), which measures price fluctuations in a market basket of goods. In 2000, however, the Federal Reserve shifted its preferred measure to the Personal Consumption Expenditures Price Index, or PCE, and policymakers increasingly cite the “core” version of that index, which strips out food and energy on the grounds that their prices are “too volatile,” and which is described as better able to measure “core inflation.” That means that, although the price of crude oil has more than tripled in the past seven years, and food prices are rising significantly, neither shows up in the core measure… and the government is telling us that inflation is only a bit over two percent, while, measured by the old CPI, it’s at least four percent, nearly double the “official” figure.
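Taking “a bit over two percent” as, say, 2.2 percent [an assumed value, purely for illustration], the gap works out to

\[
\frac{4.0\%}{2.2\%} \approx 1.8,
\]

that is, the old CPI measure runs nearly double the official core figure.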

Misleading or restricted information certainly isn’t limited to the federal government, either. One of the Salt Lake City papers noted that a local public school health teacher was suspended for discussing sex education material not in the approved curriculum. Her offense? She factually answered student questions about such topics as homosexuality and masturbation, which angered a group of parents. Interestingly enough, the students protested her suspension with a rally and signs bearing such statements as “We’re the Guilty Ones. We Asked the Questions.” In the good old USA, we still have school boards restricting what can be taught or read based not on what is factual or accurate, but on religious beliefs.

The multibillion-dollar advertising industry consistently manipulates images and facts to create misleading impressions about various products, as do the majority of politicians and political parties, not to mention the plethora of interest groups ranging from the far right to the far left, each of which tends to insist that its facts are the correct ones. Needless to say, those with resources and money are the ones whose facts tend to get seen and used the most.

Years ago, the sociolinguist Deborah Tannen observed that there is a gender difference in the use of information. According to her work, in general, men tend to use information to amass and maintain power, while women tend to use it to build networks and draw people to them. That’s one reason why many men refuse to ask for directions: it’s an admission of failure and powerlessness.

Could it be that one reason the United States so abuses information is that information has become the principal tool for obtaining power in a still-patriarchal, male-dominated society? I may be stretching matters a bit, but I’m not so certain that I’m all that far off, given how critical information is to almost every aspect of American society.

The underlying problem is that, in a mass media culture, even one with theoretical First Amendment protections, the “truth” doesn’t always come out. It often appears only when someone has enough money and influence to get it on the airwaves, or when some diligent individual spends hours digging for it.

And then, how can the average individual, even one who is highly educated, determine the accuracy of such “counter-information,” particularly when such a large proportion of the information available to Americans has come to be false, slanted, inaccurate, misleading, or incomplete?

The ancient Romans had a saying — Quis custodiet ipsos custodes? — that asked, “Who watches the watchmen?” Perhaps we should consider asking, “Who scrutinizes the information on which we act?”

Or are we already too late? Were H. G. Wells and Orwell all too accurate in their prophecies?

Sexism, Ageism, and Racism — Just Manifestations of Human Placeism?

The past half-year’s round of presidential primary contests in the United States has raised cries of sexism, racism, and even ageism, hardly surprising when the three leading candidates are, respectively, a woman, a black man, and a man who would be the oldest ever elected to a first presidential term. My wife and I were discussing this when she made the observation that all three “isms” are really just different forms of “placeism.”

By that, she meant that sexism against women is really just a manifestation of the idea that a woman’s place is, variously, in the home, raising children, or even just plain barefoot and pregnant… and that a woman who aspires to be president, or a corporate CEO, or a noted surgeon is, heaven forbid, leaving her culture-required or God-decreed “place.”

Likewise, a black man who aspires to be president is also out of place, because, for many people, whether they will admit it or not, a black’s place is one of subservience to Caucasians. And, of course, an older man’s place is in a rocking chair, on a golf course, or doing some sort of volunteer good works.

Such “places,” while certainly tacitly accepted and reinforced to some degree in most cultures across the globe, have their basis not in fact, but in custom. For generations, if not centuries, bias against people “of color” [and this also refers to Asian prejudices against Caucasians, Bantu prejudices against Bushmen, Chinese biases against all outsiders, as well as Caucasian prejudices against blacks or American Indians] has been based on the assumption that whoever was defined as being “of color” was genetically “inferior.” Now that the human genome has been largely sequenced, it’s more than clear not only that there is no overriding genetic difference in terms of “race,” but that the genetic variation among people of the same “race” is often greater than the difference between those of one skin color and another.

The same argument applies to age. Senator McCain is far younger than a great number of world leaders who accomplished significant deeds at ages far beyond the senator’s present age. But in our youth-oriented society, someone who is old is regarded as out of place, with values and views at variance with popular culture, as well they may be, for with age can come a perspective lacking in the young. And, yes, with age for some people comes infirmity, but that infirmity is a matter of individual factors, not of some physical absolute decreeing that, at a pre-set age, one is automatically old and unable to function. As with all the other placeisms, ageism is effectively an attempt to dismiss someone older as out of place, with the unspoken implication that the oldster is somehow unsuitable because he or she refuses to accept the “customary” place.

All such placeisms are rooted in prejudicial customs and flower into full distastefulness and unfairness when people hide behind the unspoken prejudice of tradition, religion, or custom and remain either unwilling or unable to judge people as individuals.

The results of a study reported in the May 31st issue of The Economist also shed new light on “placeism” with regard to women. The study compared the tested mathematical and verbal abilities of male and female students across a range of countries and cultures. The researchers concluded that, in those cultures where women had the greatest level of social, economic, and political equality, women’s test scores in math were equal to those of men, and their advantage in verbal skills was far greater, even larger than the current gap in countries such as the United States, where women already outshine men. In short, if men and women are treated as true equals with regard to rights and opportunity, on average the women will outperform the men in all mental areas. Could it just be that men understand that, and that instinctive understanding is why, in most cultures, men want to keep women “in their place?”

Heavens no! It couldn’t be that, could it? It must be that women are just so much better suited to the home or, if in the public arena, to supporting men, just as blacks are far better in athletic endeavors because their genes make them better in sports and less able in politics and business, and just as all old people have lost all judgment the moment they’re eligible to join AARP or collect Social Security checks.

That’s right, isn’t it? After all, there’s a place for everything, and everyone has his — or her — place, and we know just where that should be, don’t we?

F&SF Writers: Popularity and Influence

Literary critics like to write about the importance of an author and his or her work, but they seldom put it quite that way. They write about themes and styles and relationships and relevance, but, most of the time, when they write about an author, they’re only guessing as to whether that author will really have a lasting influence over readers and culture and whether anything the author has written will resonate or last beyond the author’s lifespan.

Because critics seldom seem to consider history, although they’ve doubtless read about it, readers tend to forget little things, such as the fact that Shakespeare was NOT the pre-eminent playwright of his time, and that Beaumont and Fletcher were more celebrated in their day; indeed, Beaumont was interred in Westminster Abbey more than a century before the Bard was even memorialized there. Rudyard Kipling won the Nobel Prize for literature, but little of what he wrote is still widely read today, except for The Jungle Book, Just So Stories, and a handful of poems.

Publishers and booksellers tend to care less about potential influence than about sales, or popularity. And, of course, our current media culture is all about instant popularity. So… in the field of fantasy and science fiction, the media tend to focus on mega-sellers like Harry Potter or The Wheel of Time. Certainly, both series have sold well and inspired many imitators, but how well will they fare over time in influencing readers and the overall culture?

Will either approach J.R.R. Tolkien? Or for that matter, Edgar Allan Poe or Mary Shelley?

Tolkien was both popular and influential, so much so that a great many of today’s popular fantasy writers are not influential at all. They’re merely imitators, trading in pale similarities: trolls, orcs, faerie, variations on European feudalism, and the same kind of vaguely defined magic that Tolkien employed. These writers have sold a great number of books, but exactly what is their influence, except as extensions of the approach that Tolkien pioneered?

Poe could be said to have pioneered the horror genre, with a relevance and an influence great enough that movies based on his stories have been made and re-made more than a century after his death. Mary Shelley’s Frankenstein has long outlasted her considerable output of scholarly and other works and is perhaps the model for the nurture/nature conflict horror story.

What works of today’s F&SF writers will outlive them?

As has been the case with all cultures, while all of us who write would like to think that it will be our works that survive, in almost all cases, that won’t be so. That realization may well be, in fact, why I intend to keep writing so long as I can do so at a professional level. That way, if my works fall out of favor, I won’t be around to see it. And if they don’t, well, that would be an added bonus, even if I wouldn’t know it.

Still… what factors are likely to keep a book alive?

Some of them are obvious, such as an appeal to basic human feelings with which readers can instantly identify. Other factors, such as style, are far more transient. Shakespeare’s work, with its comparative linguistic directness, has fared far better than the work of writers whose style was considered more “erudite.” And with our mass-media-simplifying culture, I have great doubts that the work of writers whose appeal to critics is primarily stylistic will long endure. Works that explore ideas and ideals and how they apply to people are more likely to last, but whose works… I certainly couldn’t say.

For all that the critics write, with their [sometimes] crystal prose, I have to wonder just how many of them have accurately predicted or will be able to determine which works of today’s authors will still be around — and influential — in fifty years… or a century.

What’s a Story?

Recently, I was asked, as I am occasionally, very occasionally, to judge a writing contest. It was an extremely painful experience. Now, in past years, one of the more agonizing aspects of going through manuscripts was dealing with the rather deplorable grammar and spelling. Clearly, spell-checkers and grammar checkers have had an impact, because the most egregious grammatical errors have largely vanished. The less obvious errors of grammatical and syntactical misuse remain, as do errors in pronoun reference, among others.

What struck me the most, however, was the almost total lack of story-telling. In years past, I read awfully written and ungrammatical work, but a large percentage of the submissions were actual stories.

This, of course, leads to the question: what is a story? For most people, trying to define a story is like the reply reputedly given by Supreme Court Justice Potter Stewart when he was asked to define pornography: “I can’t define it, but when I see it, I know it.” That sort of definition isn’t much help to a would-be writer. So I went back to my now-ancient Handbook to Literature and checked the definition:

…any narrative of events in a sequential arrangement. The one merit of a story is its ability to make us want to know what happened next… Plot takes a story, selects its materials not in terms of time but causality; gives it a beginning, a middle, and an end; and makes it serve to elucidate character, express an idea, or incite to an action.

Robert Heinlein once defined a story this way: “A story is an account which is not necessarily true but which is interesting to read.”

Put more directly, in a story, the writer has to express events so that they progress in a way that makes sense, while hanging together and making the reader want to continue reading.

Almost all of the stories I read were anything but interesting, and not just to me, but to a jury of first readers, none of whom could recommend any. So the first readers, concerned that they might be missing something, passed all of them on to me. Unhappily, their judgment was correct. But why?

In considering these stories, I realized they all shared several faults. First, while almost all had a series of events, there was no real rationale for those events, except that the writer had written them. In real life, there is, as the definition above notes, a certain causality. It may be the result of our actions or the actions of others, or even of nature, but events do follow causes, notwithstanding the views of some quantum physicists. A story, at least occasionally, should give a nod to causality, either through background or the words or actions of the characters. After a reader finishes the story, he or she should be able to say why things happened, or at least feel that how they happened was “right” for the story.

Second, all too many of the stories shifted viewpoints, even verb tenses, almost from sentence to sentence. This is a trend that has been growing among younger writers over the years, and I think it’s probably the result of our video culture, with its rapid camera cuts and multiple plot lines. But what works, if imperfectly, on a video screen doesn’t translate to the printed page, because a reader doesn’t have all the visual and tonal cues provided by video. The words have to carry the action and the emotions, and when those words are absent or scattered among a number of characters, the reader is going to have trouble following and identifying with anyone.

Third, almost none of the stories showed any real understanding of human character and motivation, yet one of the unspoken reasons most readers read is the characters, or the glimpses of characters. Again, I suspect that this lack of understanding stems in large part from a video entertainment culture that focuses on action to the exclusion of character. I’ve noticed this change in other ways as well, because many younger readers have great difficulty picking up on subtle written clues to character in novels. I’ve seen more than a few comments about books, my own as well as those of other authors, decrying a lack of characterization, while older and more experienced readers often praise the same books for their depth of characterization. Because I’m not of the younger generation, I can only guess, but it appears to me that when younger writers write, they may imagine such characteristics, but they neglect to write them down, believing that other readers will imagine as they do, even without any written clues. Needless to say, each of us imagines differently, and without cues, many readers may not imagine at all, which leads to a lack of interest.

In the end, a story has to contain all the words, phrases, description, and causality necessary to carry the reader along. Or, as one man put it years ago, “If it doesn’t say it in black and white, it doesn’t say it.”