Archive for the ‘General’ Category

The [Restricted/Slanted/Inaccurate/Incomplete/Mis-] Information Society

There’s been an overwhelming amount of material written about how people today, especially in the United States, live in the “Information Age.” And we do… but the vast majority of that information is anything but what it seems on its face, and, often, lacks significant facts that might change the entire meaning of what was initially presented. Now, some cynics will ask, “And what else is new?”

The answer to that question is: The volume, complexity, and increased power of information are what’s new, and those aspects of information make all the difference.

While it shouldn’t be any great surprise to anyone who follows political news, the recent book by a former press secretary of President Bush describes in detail how the current administration manipulated the news by the use of inaccurate, slanted, and misleading information. The official White House response seems to be that the President will try to forgive his former aide. Forgive the man? That suggests that Bush believes it was wrong to reveal the White House’s informational shenanigans, and that personal loyalty is far more important than truth. This viewpoint isn’t new to the Presidency, but the degree to which it’s being carried appears to be.

One of the aspects of the mortgage banking and housing sector melt-down that’s also been downplayed is the incredible amount of false, misleading, and inaccurate information at all levels. Large numbers of homeowners were lied to and misled, and many were simply unable to wade through the paperwork to discover what was buried there in the legalese. The mortgage securitization firms misled the securities underwriters. The information issued by the underwriters misled the securities traders, and in the end, with all the misinformation, it appears that almost no one understood the magnitude of the problem before the meltdown.

We’re seeing, or not seeing, the same problem with recent economic statistics, particularly those measuring inflation. Until 2000, the most common indicator of the rate of inflation was the change in the Consumer Price Index (CPI), which measures price fluctuations in a market basket of goods. In 2000, however, the government shifted its preferred gauge to the Personal Consumption Expenditures Price Index, or PCE, and began emphasizing the “core” version of that index, which excludes food and energy on the grounds that their prices are “too volatile,” and which was described as better able to measure “core inflation.” That means that, although the price of crude oil has more than tripled in the past seven years, and food prices are rising significantly, neither affects “core inflation”… and the government is telling us that inflation is only a bit over two percent, while, measured by the old CPI, it’s at least four percent — at least 40% higher than the “official” figures.
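The gap between those two measures compounds over time. Here is a minimal sketch of the arithmetic, using 2.3 percent and 4.0 percent purely as illustrative stand-ins for the two figures above, not as official CPI or PCE data:

```python
# Illustrative only: how two hypothetical inflation rates compound.
# The 2.3% and 4.0% rates are stand-ins for the figures in the text,
# not official statistics.

def compounded_price_level(annual_rate: float, years: int) -> float:
    """Price level after `years` of constant annual inflation,
    starting from a base level of 1.0."""
    return (1 + annual_rate) ** years

official = compounded_price_level(0.023, 10)   # the "core" measure
old_style = compounded_price_level(0.040, 10)  # the CPI-style measure

print(f"After 10 years at 2.3%: {official:.3f}")   # ~1.255
print(f"After 10 years at 4.0%: {old_style:.3f}")  # ~1.480
print(f"Cumulative gap: {(old_style / official - 1) * 100:.1f}%")  # ~17.9%
```

In other words, a difference that sounds like a point or two a year quietly becomes nearly a fifth of lost purchasing power over a decade.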

Misleading or restricted information certainly isn’t limited to the federal government, either. One of the Salt Lake City papers noted that a local public health teacher was suspended for discussing sex education material not in the approved curriculum. Her offense? She factually answered student questions about such topics as homosexuality and masturbation, which angered a group of parents. Interestingly enough, the students protested her suspension with a rally and signs with such statements as “We’re the Guilty Ones. We Asked the Questions.” In the good old USA, we still have school boards restricting what can be taught or read based not on what is factual or accurate, but based on religious beliefs.

The multibillion dollar advertising industry consistently manipulates images and facts to create misleading impressions about various products, as do the majority of politicians and political parties, not to mention the plethora of interest groups ranging from the far right to the far left, each of which tends to state that its facts are the correct ones. Needless to say, those with resources and money are the ones whose facts tend to get seen and used the most.

Years ago, the sociolinguist Deborah Tannen observed that there is a gender difference in the use of information. According to her work, in general, men tend to use information to amass and maintain power, while women use it to build networks and draw people to them. That’s one reason why many men refuse to ask for directions — it’s an admission of failure and powerlessness.

Could it be that one reason why the United States so abuses information is that information has become the principal tool for obtaining power in a still-patriarchal and masculine dominated society? I may be stretching matters a bit, but I’m not so certain that I’m all that far off when information is so critical to almost every aspect of American society.

The underlying problem is that, in a mass media culture, even one with theoretical First Amendment protections, the “truth” doesn’t always come out. It often appears only when someone has enough money and influence to get it on the various airwaves or when some diligent individual spends hours digging for it.

And then, how can the average individual, even one who is highly educated, determine the accuracy of such “counter-information,” particularly when such a large proportion of the information available to Americans has come to be false, slanted, inaccurate, misleading, or incomplete?

The ancient Romans had a saying — Quis custodiet ipsos custodes? — that asked, “Who watches the watchmen?” Perhaps we should consider asking, “Who scrutinizes the information on which we act?”

Or are we already too late? Were H. G. Wells and Orwell all too accurate in their prophecies?

Sexism, Ageism, and Racism — Just Manifestations of Human Placeism?

The past half-year’s round of presidential political primary contests in the United States has raised cries of sexism, racism, and even ageism, hardly surprising when the three leading candidates are, respectively, a woman, a black man, and the oldest man ever to seek the presidency for a first term. My wife and I were discussing this when she made the observation that all three “isms” are really just different forms of “placeism.”

By that, she meant that sexism against women is really just a manifestation of the idea that a woman’s place is, variously, in the home, raising children, or even just plain barefoot and pregnant… and that a woman who aspires to be president, or a corporate CEO, or a noted surgeon is, heaven forbid, leaving her culture-required or God-decreed “place.”

Likewise, a black man who aspires to be president is also out of place, because, for many people, whether they will admit it or not, a black’s place is one of subservience to Caucasians. And, of course, an older man’s place is in a rocking chair, on a golf course, or doing some sort of volunteer good works.

Such “places,” while certainly tacitly accepted and reinforced to some degree in most cultures across the globe, don’t have a basis in fact, but in custom. For generations, if not centuries, bias against people “of color” [and this also refers to Asian prejudices against Caucasians, Bantu prejudices against Bushmen, Chinese biases against all outsiders, as well as Caucasian prejudices against blacks or American Indians] has been based on the assumption that whoever was defined as being “of color” was genetically “inferior.” Now that the human genome has been largely sequenced, it’s more than clear that not only is there no overriding genetic difference in terms of “race,” but the variations among people of the same “race” are often greater than the differences between those of one skin color and another.

The same argument applies to age. Senator McCain is far younger than a great number of world leaders who accomplished significant deeds at ages far older than the senator presently is. But in our youth-oriented society, someone who is old is regarded as out-of-place, with values and views at variance with popular culture, as well they may be, for with age can come a perspective lacking in the young. And, yes, with age for some people comes infirmity, but that infirmity is based on individual factors and not on a physical absolute that, at a “pre-set” age, one is automatically old and unable to function. As with all the other “place-isms,” ageism is effectively an attempt to dismiss someone who is older as out of place with the unspoken implication that the oldster is somehow unsuitable because he or she refuses to accept the “customary” place.

All such placeisms are rooted in prejudicial customs and flower into full distastefulness and unfairness when people hide behind the unspoken prejudice of tradition, religion, or custom and remain either unwilling or unable to judge people as individuals.

The results of a study reported in the May 31st issue of The Economist also shed new light on “placeism” with regard to women. The study surveyed the tested abilities of older male and female students in mathematical and verbal skills across a range of countries and cultures. The researchers concluded that, in those cultures where women had the greatest level of social, economic, and political equality, women’s test scores in math were equal to those of men, and their verbal advantage was far greater — even greater than the current gap in countries such as the United States, where women already outshine men. In short, if men and women are treated as true equals with regard to rights and opportunity, on average the women will outperform the men in all mental areas. Could it just be that men understand that, and that instinctive understanding is why in most cultures men want to keep women “in their place?”

Heavens no! It couldn’t be that, could it? It must be that women are just so much better suited to the home or, if in the public arena, to supporting men, just as blacks are far better in athletic endeavors because their genes make them better in sports and less able in politics and business, and just as all old people have lost all judgment the moment they’re eligible to join AARP or collect Social Security checks.

That’s right, isn’t it? After all, there’s a place for everything, and everyone has his — or her — place, and we know just where that should be, don’t we?

F&SF Writers: Popularity and Influence

Literary critics like to write about the importance of an author and his/her work, but many of them seldom put it quite that way. They write about themes and styles and relationships and relevance, but, most of the time, when they write about an author, they’re only guessing as to whether an author will really have a lasting influence over readers and culture and whether anything written by that author will resonate or last beyond the author’s lifespan.

Because critics seldom seem to consider history, although they’ve doubtless read about it, readers tend to forget little things like the fact that Shakespeare was NOT the pre-eminent playwright of his time, and that Francis Beaumont was interred in Westminster Abbey more than a century before the Bard was even honored with a memorial there. Rudyard Kipling won the Nobel Prize for literature, but few today read anything he wrote, except for The Jungle Book, Just So Stories, and a handful of poems.

Publishers and booksellers tend not to care as much about potential influence, but about sales — or popularity. And, of course, our current media culture is all about instant-popularity. So… in the field of fantasy and science fiction, the media tends to focus on the mega-sellers like Harry Potter or The Wheel of Time. Certainly, both series have sold well and inspired many imitators, but how well will they fare over time in influencing readers and overall culture?

Will either approach J.R.R. Tolkien? Or for that matter, Edgar Allan Poe or Mary Shelley?

Tolkien was both popular and influential, so much so that a great many of today’s popular fantasy writers are not really influential at all. They’re merely imitators, trading in pale similarities: trolls, orcs, faerie, variations on European feudalism, and the same kind of vaguely defined magic that Tolkien employed. These writers have sold a great number of books, but exactly what is their influence, except as extensions of the approach that Tolkien pioneered?

Poe could be said to have pioneered the horror genre, with a relevance and an influence great enough that movies have been made and re-made more than a century after his death. Mary Shelley’s Frankenstein has long outlasted her considerable output of scholarly and other works and is perhaps the model for the nurture/nature conflict horror story.

What works of today’s F&SF writers will outlive them?

All of us who write would like to think that it will be our works that survive, but, as has been the case in every culture, in almost all instances that won’t be so. That realization may well be, in fact, why I intend to keep writing so long as I can do so at a professional level. That way, if my works fall out of favor, I won’t be around to see it. And if they don’t, well, that would be an added bonus, even if I wouldn’t know it.

Still… what factors are likely to keep a book alive?

Some of them are obvious, such as an appeal to basic human feelings with which readers can instantly identify. Other factors, such as style, are far more transient. Shakespeare’s work, with its comparative linguistic directness, has fared far better than the work of writers whose style was considered more “erudite.” And with our mass-media-simplifying culture, I have great doubts that the work of writers whose appeal to critics is primarily stylistic will long endure. Works that explore ideas and ideals and how they apply to people are more likely to last, but whose works… I certainly couldn’t say.

For all that the critics write, with their [sometimes] crystal prose, I have to wonder just how many of them have accurately predicted or will be able to determine which works of today’s authors will still be around — and influential — in fifty years… or a century.

What’s a Story?

Recently, I was asked, as I am occasionally, very occasionally, to judge a writing contest. It was an extremely painful experience. Now, in past years, one of the more agonizing aspects of going through manuscripts was dealing with the rather deplorable grammar and spelling. Clearly, spell-checkers and grammar checkers have had an impact, because the absolutely worst grammatical errors have largely vanished. The less obvious errors of grammatical and syntactical misuse remain, as do errors in referential pronouns, among others.

What struck me the most, however, was the almost total lack of story-telling. In years past, I read awfully written and ungrammatical work, but a large percentage of the submissions were actual stories.

This, of course, leads to the question — what is a story? For most people, trying to define a story is like the reputed reply of a Supreme Court justice who was asked to define pornography: “I can’t define it, but when I see it, I know it.” That sort of definition isn’t much help to a would-be writer. So I went back to my now-ancient Handbook to Literature and checked the definition:

…any narrative of events in a sequential arrangement. The one merit of a story is its ability to make us want to know what happened next… Plot takes a story, selects its materials not in terms of time but causality; gives it a beginning, a middle, and an end; and makes it serve to elucidate character, express an idea, or incite to an action.

Robert Heinlein once defined a story this way: “A story is an account which is not necessarily true but which is interesting to read.”

Put more directly, in a story, the writer has to express events so that they progress in a way that makes sense, while hanging together and making the reader want to continue reading.

Almost all of the stories I read were anything but interesting, and not just to me, but to a jury of first readers, none of whom could recommend any. Suspecting that they might be missing something, the first readers passed all of the stories on to me. Unhappily, they hadn’t missed a thing. But why?

In considering these stories, I realized they all shared several faults. First, while almost all had a series of events, there was no real rationale for those events, except that the writer had written them. In real life, there is, as the definition above notes, a certain causality. It may be the result of our actions or the actions of others, or even of nature, but events do follow causes, notwithstanding the views of some quantum physicists. A story, at least occasionally, should give a nod to causality, either through background or the words or actions of the characters. After a reader finishes the story, he or she should be able to say why things happened, or at least feel that how they happened was “right” for the story.

Second, all too many of the stories shifted viewpoints, and even verb tenses, almost from sentence to sentence. This is a trend that has been growing among younger writers over the years, and I suspect it’s the result of our video culture, with its rapid camera cuts and multiple plot lines. But what works, if imperfectly, on a video screen doesn’t translate to the printed page, because a reader doesn’t have all the visual and tonal cues provided by video. The words have to carry the action and the emotions, and when those words are absent or scattered among a number of characters, the reader is going to have trouble following and identifying with anyone.

Third, almost none of the stories showed any real understanding of human character and motivation, yet one of the unspoken reasons why most readers read is for the characters, or the glimpses of characters. Again, I suspect that this lack of understanding stems in large part from a video entertainment culture that focuses on action to the exclusion of character. I’ve noticed this change in other ways, as well, because many younger readers have great difficulty in picking up on subtle written clues to character in novels. I’ve seen more than a few comments about books, my own as well as those of other authors, decrying the lack of characterization, while older and more experienced readers often praise the same books for their depth of characterization. Because I’m not of the younger generation, I can only guess, but it appears to me that when they write, while they may imagine such characteristics, they neglect to write them down, believing that other readers will imagine as they do, even without any written clues. Needless to say, each of us imagines differently, and without cues, many readers may not imagine at all, which leads to a lack of interest.

In the end, a story has to contain all the words, phrases, description, and causality necessary to carry the reader along. Or, as one man put it years ago, “If it doesn’t say it in black and white, it doesn’t say it.”

Questions of Change

Science fiction and fantasy have always dealt, at least ostensibly, with change: how the future might be shaped by technology, aliens, biotech, or whatever, or how our world or others might be if some form of workable magic existed. In a world where change is ongoing and seemingly accelerating, we tend to forget that for much of human history change was either slow or non-existent. And it wasn’t just a question of technology. The Ptolemaic Egyptians had a rather interesting array of technological gadgets, and even those were nothing compared to what had already been developed in China. The Roman Empire implemented Greek technology but added little except concrete, central heating, and plumbing, despite conquering a large section of the “known” world. So why did technology lead to change and ever more change in post-Renaissance Europe and virtually none in earlier prosperous societies?

Africa is clearly the cradle of Homo sapiens, and where tool-making began, yet after the Egyptians, the Nubians, and perhaps the Carthaginians, in a sense nothing changed, and societies in Africa declined, both in cultural and technological terms. Why?

Today, after several centuries of comparatively rapid change, and despite outward appearances, the pace of change is again slowing. About the only significant change in space exploration and travel over the last forty years has been the advance in communications and imaging, so that we can see more of the solar system and the universe in far greater clarity. We still can’t get anywhere significantly faster, and, in fact, we’ve done less human traveling in space. Do prettier pictures of space represent a real change, or just an illusion of change?

Despite Einstein and atomic power, we’re still essentially using an improved model of the first atomic power plant. That’s after fifty years of accelerators, tokamaks, and other gadgets designed to discover more about the nature of matter and energy, and we don’t seem much closer to practical fusion power than we were a generation ago. The fastest commercial air travel is slower than it was two decades ago. We have a better understanding of medicine and better medical procedures, but much of our own population and most of the rest of the world can’t afford the costs of availing themselves of such medical improvements. Will such costs eventually choke off real change in the medical procedures available to most people?

According to some test scores, American students are smarter and improving in their knowledge of various subjects, and certainly there are more students, in both absolute and percentage terms, who are completing high school and college. Yet the high-level functional literacy rate of college graduates and post-graduate degree holders continues to decrease, and the absolute performance of males is declining relative to that of females. The United States, despite a century or more of effort to eliminate sexual discrimination, is one of the few western industrial nations that has never had a female head of state and, unless matters shift dramatically, won’t even have had a major-party presidential nominee who was female. The U.S. is also the most overtly religious of the major western industrial nations. Does that religious background militate against significant real change in the gender power balance? And perhaps in other aspects of society?

Both Democratic Party candidates have called for “change,” but for what sort of change? I don’t see a call for re-invigorating our space program, or for more research in basic science, or for real and fundamental change in our approach to education, or anything approximating real change. What I see is an emphasis on changing who controls government and resources and who benefits from them, and that’s not the same thing… is it? Really?

The Future of False Hope

Those of us who write science fiction and fantasy are often considered to be people who enable escapism through our writing. Certainly, I’d dispute that, particularly given what I write. But…even if the charge happened to be true, which it’s not, we writers would hardly be the only ones in U.S. society institutionalizing escapism.

The other day a husband and wife who are acquaintances told me how upset they were by the university commencement address given by a Nobel laureate because the scientist had laid out rather directly and bluntly some of the challenges that the next generation would have to face, in particular those involving energy supplies and global warming. They both felt that a commencement address should be inspiring and uplifting, and “not a real downer.”

On the one hand, I can see their point. Hitting bright young graduates between the eyes with the cold water of realism is not exactly encouraging, when commencement is considered “their” day.

On the other hand, times have changed. Many long years ago, when I was in high school, educators made a practice of pointing out one’s short-comings in more than graphic detail, day after day, while suggesting that major improvements in attitude, effort, and skills were the only way to avoid a life of failure and lack of accomplishment. And when one got to college, the “standard” entry address to college freshmen was: “Look to your left; look to your right. By the end of the year, one of you won’t be here.” In those days, there was a draft and a war in Vietnam, and for young men, at least, not being there meant a good chance of being somewhere else — a place distant, hot, damp and dangerous. And more than a few students didn’t make it through the curriculum. Those that did finally got to hear an excessively optimistic speech about how they would go forth to conquer the world… or at least their chosen profession.

Today, except in a comparative handful of institutions, education tends to be all about encouragement and reward for often negligible accomplishments. For all the talk about tightening standards and the like, the functional literacy of American university graduates continues to decline, even while the grades given — and received with little gratitude — have continued to inflate. Given the recent financial crises, it’s also clear that fewer Americans seem to know enough basic mathematics to understand how to calculate the impact of a mortgage payment on their monthly budget… or even what a budget might be.

So… we’ve moved from a more realistic system of education, where the commencement addresses were always falsely encouraging, to an educational regime that tends to exude false hope and low standards, but where commencement addresses are occasionally sobering. Personally, as a curmudgeon and cautious optimist, I think the old system prepared more students for the real world… and back then false hope was limited to an occasional commencement address and not dispensed throughout an entire course of studies.

The Vastness Illusion

Recently, especially in dealing with subjects like near-earth-objects or global warming, I’ve come across more and more people whose reaction to these subjects is conditioned by or based on what I’d call the “vastness illusion.” I’m not talking about unintelligent individuals, either, but people who have been highly successful in business, academia, and in other fields requiring education, skills, and experience.

Put simply, the vastness illusion is the belief that the earth, and especially our solar system, is so vast that nothing we as human beings do could possibly affect it in any measurable fashion.

Like many illusory beliefs held by humans over history, there’s a grain of truth behind the vastness illusion. In fact, there’s nothing that a given individual — unaided by technology and the efforts of others — can do that will make a measurable impact on our world. For better or worse, however, there are six billion humans now living on the face of the planet, and those six billion people and their technology, both high and low, do have a significant impact on the world and, in particular, on its climate.

Those six billion people rely on some 3.3 billion cattle, sheep, and goats for milk, meat, wool, and other products, and those billions of head of livestock require food, most of it derived from grazing. Presently, over half the world’s grass and rangelands are at least moderately degraded as a result of the more than doubling of livestock production over the past century. Human activities, mainly those associated with agriculture, have increased annual methane emissions from less than 80 million metric tons in 1860 to over 500 million metric tons at present. Methane persists in the atmosphere for an average of 12 years, and it is a greenhouse gas that helps warm the planet.

The six billion people and their activities are also adding 30 billion tons of carbon dioxide, another greenhouse gas, to the atmosphere every year, and the majority of that CO2 remains in the atmosphere for close to a century. Both these greenhouse gases have feedback effects on the water vapor that is and has always been the largest greenhouse gas in terms of impact. Even a slight increase in global temperature results in more water vapor. So while the advocates of the “vastness” theory point out that CO2, methane, and other greenhouse gases are “marginal” in their direct contribution to global warming, they tend to ignore their considerable feedback impact on water vapor, which is anything but marginal.

Admittedly, the earth’s atmosphere is indeed vast, but human technology and human numbers multiply our effects on the world, in a fashion analogous to compound interest. A percentage point here and another one there, and millions have trouble making their house payments. Well… the same is true of human impacts on our planet… except that if we lose a climate conducive to maintaining our present human cultures, we lose a great deal more than a few million houses, and it’s a different kind of arrogance to insist that our activities have no impact.
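The house-payment analogy can be made concrete with the standard fixed-rate amortization formula. The loan amount and the two rates below are hypothetical, chosen only to show what a single percentage point does:

```python
# A rough sketch of how one percentage point changes a monthly mortgage
# payment. The $250,000 loan and the 6%/7% rates are hypothetical.

def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard fixed-rate amortization: P * r / (1 - (1 + r)**-n),
    where r is the monthly rate and n the number of payments."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

low = monthly_payment(250_000, 0.06, 30)   # ~$1,498.88
high = monthly_payment(250_000, 0.07, 30)  # ~$1,663.26
print(f"At 6%: ${low:,.2f}; at 7%: ${high:,.2f} "
      f"(+{(high / low - 1) * 100:.1f}% every month)")
```

A single point adds roughly eleven percent to every payment for thirty years — exactly the sort of quiet compounding described above.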

The earth is over four billion years old, and yet, in the last few centuries, we’ve managed to consume between a third and half of the fossil fuels created over that long span… and the earth is too vast for us to have any impact? We’ve hunted scores of species out of existence, and we can make no difference? The levels of carbon dioxide in the atmosphere are the highest in more than 650,000 years, without any large or sustained unusual natural occurrences to account for them; the last eleven years have been among the 13 warmest of the past century and a half… possibly the longest “warm streak” in thousands of years, if not longer; and the northern polar ice cap has been shrinking steadily for forty years and is now at its smallest extent and thickness in thousands of years, if not longer.

Yet, there are those who insist that the earth is too vast for us to have any measurable impact. What sort of impact do they want before they’re convinced? All of Florida under water? Starvation of billions because of climate shifts? Or would anything matter, because they believe that we’re essentially helpless to affect matters one way or the other?

I suppose that’s comforting, in a way, because it means we can do anything we want without having to be held accountable. Just claim that earth is too vast for us to be responsible, as well as being so vast that we can’t change or affect any major challenge that nature hurls at us. And, of course, that means admitting that, as a species, we’re merely hostages to fate, unable to direct our destiny, poor lost souls, depending on chance or deities to rescue us from disaster. But then, since neither chance nor deities have had a very good record in that department, if the majority of Homo sapiens cast their lot with those who claim earth is too vast for us to affect matters, they’re essentially condemning the rest of us to great privation, and possibly even marginalization or extinction as a species — and sooner rather than later.

Not only does that make for lousy government and cultural direction, it’s also a terrible plot for either science fiction or fantasy.

Death of an Anecdotal Species?

We of the species Homo sapiens may not exactly deserve the “sapiens” label; Homo anecdotus, or something similar, might be more accurate. We react to what we see and what we hear, and we tend to believe the stories others tell rather than facts, mathematics, or statistics.

When I was with the U.S. Environmental Protection Agency, there was such a furor over hazardous waste sites that, effectively, almost the entire political staff of the Agency was canned, including the Administrator, and the Secretary of the Interior departed as well. While I thought then, and still do, that the issue was badly bungled by the Administration, and that’s putting it mildly, they did have a certain point in believing that people were overreacting. That was because people could see the hazardous waste sites and the handful of children and others who suffered damaged health, as well as the contaminated neighborhoods.

HOWEVER… in perspective, as shown by a later series of studies, the “Superfund” hazardous waste sites were far from the most dangerous environmental concerns. Exposure to household radon was five to twenty times more deadly each year, as was asbestos exposure, which has resulted in more than 10,000 deaths annually. Cancer deaths from smoking exceed 300,000 annually, and automobile accidents account for some 45,000. Yet the Superfund political upheaval resulted in Congressional action headed toward impeaching the head of EPA, the resignations of both the Interior Secretary and the EPA Administrator, and the conviction of an assistant administrator for perjury before Congress.

Another example of this anecdotalism is exemplified by people who refuse to fly because they feel driving is safer. For them, the anecdotal example of the infrequent air crash in which 300 people die has a greater impact than the fact that most people are far more likely to die in an automobile accident than in a plane crash.

On a far larger scale, take the issue of cometary or asteroidal impacts on the earth. Based on what was seen, i.e., anecdotal evidence, scientists originally estimated that a “space rock” large enough to create a catastrophic impact on earth, such as the one thought to have wiped out the dinosaurs, struck roughly once every million years. Then more digging and satellite photography analysis uncovered more craters, and the estimate was revised to something like once every 100,000 years. Then, several years ago, several scientists made the rather obvious observation that the craters that had been discovered were all where we could see them — on land — but that the earth’s surface is something like seventy percent water. More investigation and correlation with historical and climate records revealed several more near-catastrophic water impacts over the past 10,000 years.

Then, recent astronomical discoveries have revealed that the population of near-earth objects [NEOs] big enough to wipe out cities or larger sections of the planet now numbers more than a thousand, and that their orbits aren’t nearly so stable as was originally surmised. Yet NASA, the U.S. space agency that might be considered to have a certain concern about space-related potential disasters, blithely informed Congress several years ago that any really reliable survey of NEOs would cost $1 billion, about seven percent of its annual budget — or one percent if spread over seven years — and that NASA had no intention of spending money on what is clearly a real threat. Nor does NASA even have a draft contingency plan for what it might do if one of those objects were discovered to be on a collision course with earth, even though some respected astronomers have now estimated that the chances of a city-destroying [or worse] object hitting earth in any given century are about one in ten. In short, since we haven’t seen anything like this recently, except that something did explode above Siberia a century ago that we still can’t fully explain, it can’t be as real as the need to pinch pennies for other projects that don’t bear on the survival of our entire species, as well as a few thousand others.
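For what it’s worth, the budget percentages and impact odds quoted above are easy to check. The sketch below uses only the figures in the paragraph — a $1 billion survey said to be seven percent of NASA’s annual budget, and a one-in-ten chance per century — so the roughly $14 billion annual budget is a back-calculation from those figures, not an official number.

```python
# Back-of-the-envelope check of the NEO survey figures.
# Assumption: the annual budget is whatever makes $1 billion
# equal seven percent of it (~$14.3 billion, back-calculated).
survey_cost = 1.0e9
annual_budget = survey_cost / 0.07

share_one_year = survey_cost / annual_budget
share_per_year_over_seven = (survey_cost / 7) / annual_budget
print(f"Share of one year's budget: {share_one_year:.0%}")                  # 7%
print(f"Per-year share over seven years: {share_per_year_over_seven:.1%}")  # 1.0%

# "One in ten per century": chance of at least one city-destroying
# impact over longer spans, assuming each century is independent.
p_century = 0.1
for centuries in (1, 5, 10):
    p_at_least_one = 1 - (1 - p_century) ** centuries
    print(f"Over {centuries} centuries: {p_at_least_one:.0%}")
```

Compounded over ten centuries, a one-in-ten-per-century risk becomes roughly a two-in-three chance of at least one such impact — which puts the penny-pinching in perspective.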

The anecdotal mind-set may function adequately in a hunter-gatherer society, but just as we’ve largely given up chipped flint hammers for better tools, isn’t it time to go beyond the anecdotal mind-set, one that’s clearly limited to what we can see, and use a wider and deeper perspective?

Because, over time, if we don’t, earth will see the end of our anecdotal species.

New… and True… and Trite

I happened to come across a reader’s comments about the Spellsong Cycle, most of which boiled down to the fact that he liked all my books — except those, because they were “trite.” I mean, after all, writing about sexism and stereotypes is just so old and trite, and the idea of magic being wielded through song on a logical and technical basis is almost as trite as well. Except… outside of Alan Dean Foster and Louise Marley, I haven’t seen any other decent, in-depth, and logical treatments of vocal music as the basis of magic. It’s very rare, as Louise Marley herself has said upon occasion, and as both a noted novelist and a professional opera singer, she does have a bit of expertise in those fields.

That leaves the issue of novels dealing with sexism as perpetuating “trite” stereotypes, something so old and last-century, or even so nineteenth-century. If anyone thinks that sexism is that out-of-date, they’re living in a greater fantasy than anything I’ve ever written. A few examples follow. A highly qualified gynecological oncologist [female] who runs a division at a top medical school is paid less than a younger colleague [male] with far fewer academic and occupational qualifications, fewer publications, and less surgical expertise. Female full professors at any number of colleges and universities — with equivalent or greater time in rank and professional qualifications — are, on average, paid at roughly the level of male associate professors in the same disciplines. A similar discrepancy occurs in the ranks of business executives [when one can even find senior female executives who have managed to break through the glass ceiling]. What is interesting about all this is that these days, if you look at university graduates and post-graduates, women are winning a wide majority of the academic honors, with the exception of a few areas of science.

I’d also note the large number of political pundits who are calling for Senator Clinton to drop out of the race for the Democratic presidential nomination. As a long-time Republican, if of the Teddy Roosevelt stripe, I can claim a certain distance… but I would note that in my own twenty-odd years of political involvement I never saw anyone even broach that sort of suggestion to a male candidate. After all, it’s only right that a real man fights it out to the last, isn’t it?

Obviously, with six daughters and a wife all in professional fields, I have a wealth of insights and information from which I can draw, in addition to the statistics that are available to all — and which are largely ignored and minimized.

Now… one of the roles that F&SF fills in our society is to explore ideas and issues and problems, and it’s one of the few writing fields that does so consistently. I’d be the first to agree that readers certainly don’t have to read what they don’t like… and they don’t. Some readers have indicated that they stay away from my work that deals too directly with real-world issues. I can understand that. There are times when I certainly don’t want to deal with them. But issues tend to keep coming up until they are addressed.

After all, some of the Founding Fathers, among them John Adams, suggested that the slavery issue wasn’t going away — and it didn’t. Nor did the civil rights issues that followed. Nor will the issues raised by the current Administration in instigating a war and in suppressing civil liberties in the name of “security.” Nor will the problems raised in a society where almost any working woman has to do more and do it better than her male peers in order to even come close to them in terms of compensation.

Is sexism a long and enduring problem? Absolutely. Does that make it “trite?” Not in the slightest.

A reader can certainly complain about anything, and an author has to take complaints with enough grains of salt to fill all the shakers in my house. But… don’t tell me or anyone else that a real social problem is “trite.” You can tell me that the plot’s lousy, that you don’t want to read about women and their problems, or that the kind of fantasy you really want to read has to have more testosterone in it. You can claim my style’s weak, that the book’s too long or too short, or that the song lyrics should have been better. But when a reader claims that a real and unsolved social issue is trite… that’s a pretty good explanation in itself why that issue hasn’t been resolved… and why I’ll continue to raise the issue at least periodically.

Health Care… and the Future

The April 28th issue of the Wall Street Journal carried an article that would have been considered science fiction some thirty years ago — and James Gunn was one of the writers who addressed it then. Now it’s reality. Major non-profit hospitals are demanding payment up-front for expensive treatments when significant portions of the cost of treatment aren’t covered by insurance.

I suspect that the initial reaction of most people will be along the lines of “that’s uncaring and cruel.” But the problem isn’t uncaring health professionals or even heartless insurance companies, although I have my doubts that the accountants and actuaries operating most insurance operations have anything remotely resembling heart or compassion. The problem is that to deal with life-threatening diseases and conditions that were an automatic death sentence fifty years ago, medicine has become high-tech and expensive, even when pared down to cut-rate costs. Another problem is the cost of malpractice insurance: in some specialties, malpractice insurance is the largest single expense for a physician, sometimes costing more than the doctor takes home for himself or herself.

Several years ago, my wife shattered her leg and ankle in a freak hiking accident on a very moderate trail. For a complicated, but relatively common surgery and a plate and screws in her leg, the total cost was almost as much as the average annual American worker’s yearly income. That was for something that is comparatively simple in medical terms. Other medical procedures that deal with life-threatening conditions are far more expensive. Cancer surgery and treatments appear to start at over $100,000 and climb rapidly. When somewhere over 40 million Americans don’t have any form of health insurance, wide-spread use of “pay-before-treatment” is effectively a death sentence for those who cannot find a hospital willing to treat them without a healthy deposit, and the numbers of hospitals who will do so — or that can afford to — is rapidly shrinking.

Non-profit hospitals have seen their unpaid bills pile up. Some have unpaid bills totaling $30 million to $50 million annually, up from a tenth of that two or three years ago. They’ve also discovered that collecting on such bills is often impossible. After all, if you don’t make the house payment or the car payment, the lender can foreclose or repossess. What sort of threat can a hospital make? It can refuse future treatment, but it can’t take back the treatment already given.

If the hospitals don’t collect on these bills, then people who can pay their bills — and their insurance companies — will pay more. That has already raised insurance costs and out-of-pocket costs for the financially able, and is likely to fuel future cost increases as well as make health insurance more expensive and less affordable for working Americans. If the government ends up picking up the losses, taxpayers end up paying the bill. And not all of the increased costs go to the doctors, nurses, and technicians; they also fund research, more and more elaborate equipment, and insurance.

There’s another fact that complicates matters more. Statistics released last week show that, for the first time, life expectancies are declining in the poorer U.S. counties. While statistics are not readily available, I suspect that in metropolitan areas the group that may suffer the most is not necessarily those labeled as poor who receive government assistance and Medicaid, but those who earn just enough not to qualify for such assistance. For the past half-century, most Americans have taken health care pretty much as a given, but now, for a growing number, it’s not. Equally to the point, regardless of all the political rhetoric, there not only isn’t a simple solution, there may not be one that allows more than basic health care for most Americans. That may well result in the kind of future that Joe Haldeman suggested in one version of The Forever War, where virtually no medical care was available for the extremely elderly. Given the nature of advanced medical treatments and the resources required, it appears more and more likely that the most advanced medical care will only be universally available to the affluent, just as Gunn forecast over forty years ago… unthinkable as that was then, and certainly still is.

All Hail…

This afternoon, Saturday, May third, right after the completion of the 134th Kentucky Derby, the filly Eight Belles, who finished second, broke both front ankles and collapsed. The injuries were so severe that the runner-up had to be euthanized on the spot. NBC Sports, which covered the event, spent less than two minutes dealing with the tragic death of the filly, instead concentrating through the remaining 30 minutes of the telecast on interviews with the winning jockey, trainer, and owners, and showing at least three recaps of the race.

To me, that symbolized a certain emphasis that has overtaken the United States, and possibly the entire modern technological age — the focus on winning to the near-total exclusion of anything else. I’m not taking anything away from Big Brown, the winning horse. But he will live to race another day and probably survive to a ripe old age in stud in some green pasture. For Eight Belles, there are no other days.

For Eight Belles, all that remains, at best, is a hurried grave, if that, and a fleeting memory of a gallant race.

I’ve already heard words to the effect that her race and death were a metaphor for the efforts of women to achieve some sort of equality in society — a gallant race in which they come off in second place, followed by death. Is that harsh? Perhaps… but I’m not so certain that it’s all that extreme.

And I’m absolutely convinced that the NBC coverage pattern is all too typical of the media, and possibly our entire societal focus — all honors and praise to the winner, no matter how he won, and but a fleeting mention of all the other gallant struggles that didn’t end in success. And then all the so-called pundits wonder why life seems to have gotten cheaper by the year, why business and politics have become ever more cut-throat, why reality TV gets higher and higher ratings, and why “gentler” sports and pursuits, the arts, and even reading seem to fade.

Or, as I’m doubtless misquoting someone, “Winning isn’t everything; it’s the only thing.”

All hail, great media caesars, for those who die and are forgotten are about to salute you.

Of Sacred Poets and Sacredness

Years ago, Isaac Asimov wrote one of his columns in The Magazine of Fantasy and Science Fiction on the subject of the role of “sacred poets” — the idea that poetry immortalizes and dramatizes in a way no other aspect of human culture does. He actually took the term “sacred poet” from the Latin poet Horace, who had used it in pointing out that there were other heroes besides those immortalized in Homer’s Iliad, but they had lapsed into nothingness because they lacked a “sacred poet.” Asimov also made the point that even bad poetry has resulted in creating immortality, while often creating a false impression of history, such as in the case of Longfellow’s poem about Paul Revere’s ride, which leaves the impression that Revere was the hero who warned the Massachusetts colonists about the British, when in actuality Revere never completed the ride and the colonists in Concord were actually warned by Samuel Prescott. Yet most Americans who know anything about this part of American history remember Revere, not Prescott.

Rhythmic words, especially when coupled with music, indeed can have a powerful effect, but such “sacred” songs also require something beyond well-chosen rhymed words and music. They require knowledge and understanding of the events portrayed by the words and music. The more popular religion-based sacred songs rest on scripture and doctrine, but the more secular “sacred” songs [a juxtaposition that seems strange, but accurate in the sense described by Horace and Asimov] are based on history.

Thus, the Iliad is merely a long epic poem to those American students who even know anything about it, while it was effectively a “sacred poem” to the Athenians of Greece in the fourth century B.C. “The Star Spangled Banner” is a sacred song to most Americans, in addition to being the national anthem of the United States, but what is often forgotten is that it did not actually become the official national anthem until 1931, some 117 years after it was composed during the bombardment of Fort McHenry in the War of 1812. It became the national anthem because it was a “sacred” song that linked history to the national emblem — the flag — not a “sacred” song because it was the national anthem.

Because the continued impact of sacred songs and texts depends not only on words and possibly music, but on knowledge, they may fade into obscurity when that knowledge is lost, disregarded, or minimized by later generations. Songs such as “Blowin’ in the Wind” or “One Tin Soldier” were close to “sacred” songs for the young people of the Vietnam era, but they quickly faded. Today, it appears that there aren’t any replacements, not even of that nature.

What is also interesting is that the Iliad, as a sacred poem, was essentially book length. Such “sacred” songs as “America the Beautiful,” “The Star Spangled Banner,” and “The Battle Hymn of the Republic” are far shorter. The lyrics of the Vietnam-era songs were about the same in length, but were simpler and more repetitive. What people seem to remember — as a group, not as individuals — today seems to be confined to slogans, advertising slogans in particular.

Could it be that the death of “sacred” songs, texts, and poets will lie in the inability of people to listen to anything of length or complexity? Or will it lie in a cynicism that suggests that there’s little worth in “sacred” texts, regardless of the fusion of text, rhythm, and music? Or will such poems, songs, and texts just be replaced by consumeristic slogans?

The Instant News… and Its Implications

Whether it’s Headline News, Bloomberg News, Fox News, or AOL… everywhere there’s instant news… and where there’s not instant news, there are instant comment shows, or failing that, instant action dramas. But the instant news exemplifies the trend… and the problems. The other day, in a moment of weakness, I happened to be actually watching satellite TV and came across a well-known sports commentator who was pitching an instant sports news network or program with words to the effect that this instant sports news venue [whatever it was] was a must for the young and hip, and that only those over fifty waited for the regular news to learn what was happening.

My first thought, as quickly as I turned off the system and regretted the impulse that had led me to even consider that there might be anything interesting being broadcast, particularly on a Saturday, was to wonder why anyone HAD to know the sports “news” that quickly. Then there was the secondary thought about how much of the news, these days, is really so vital that one can’t wait for the next day’s newspaper. But then, our society is all about, as one commercial my wife called to my attention put it, “I want it all, and I want it now.” So I suppose instant news of all sorts is just another aspect of that attitude.

Still… for all the growth and popularity of all these forms of instant news, it seems to me that either very few people realize the implications behind this demand for instant information, or those who do feel that protesting what seems like a popular tsunami of support is futile.

So… here are the implications as I see them. First, as we already know, all these varieties of media “news” have become entertainment, not a source of real information, and whatever information is contained tends to be so condensed, slanted, or incomplete as to be either inaccurate or misleading. There’s a headline about how a substance increases cancer risks by sixty percent, but nowhere does the story point out that the risk for most people for that kind of cancer is something like 1/20th of one percent. Hazardous waste sites and nuclear power plants are touted as great health risks, when guns, falls, substance abuse, and automobile accidents are all hundreds, if not thousands, of times more dangerous.
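The relative-versus-absolute-risk point can be made concrete with the illustrative numbers in the paragraph above; the sixty-percent increase and the 1/20th-of-one-percent baseline come from the example there, not from any actual study.

```python
# Relative vs. absolute risk, using the essay's illustrative numbers:
# a headline "60% increase" applied to a baseline risk of 1/20th of 1%.
baseline = 0.0005          # 0.05% baseline risk
relative_increase = 0.60   # "increases cancer risks by sixty percent"

new_risk = baseline * (1 + relative_increase)
absolute_increase = new_risk - baseline
print(f"Baseline risk:      {baseline:.3%}")           # 0.050%
print(f"Risk after increase: {new_risk:.3%}")          # 0.080%
print(f"Absolute increase:  {absolute_increase:.3%}")  # 0.030%
```

A sixty percent relative increase sounds alarming; an absolute increase of three hundredths of a percentage point, rather less so. That gap is exactly what the condensed headline hides.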

Second, because the media focuses on sensationalism in one form or another, meaningful news that impacts most Americans is ignored until it becomes a sensational disaster. The problems with adjustable rate mortgages and securities derivatives weren’t exactly a great secret. They just weren’t worth exploring as news until they created hundreds of billions of dollars in losses and started costing tens of thousands of Americans their homes.

Third, it perpetuates caricaturing as a media art-form, creating images of individuals in the news that may well be at variance with who they are or what they have done… or failed to do. This has always been a mass media problem, and some of the most notable examples are the way in which Hitler used the media in Germany, the American media’s creation of an image of JFK that bore little resemblance to the actual man and his considerable lack of achievement as president, and the media’s depiction of Gerald Ford as a clumsy physical bumbler when Ford was in fact perhaps one of the most graceful and athletic presidents. In our present electronic age, especially, because of the mass media time-limits and the capabilities of technology, anyone presented in this format becomes an instant caricature.

Fourth, the emphasis on the current, new, and instant creates a pressure to act and react on inadequate information, and, as the Founding Fathers knew [which was why they structured our government to preclude hasty action and reaction], hasty actions almost always result in bad decisions and less than desirable repercussions. Yet today, the entire media culture presses people to decide “now.” Check your credit card balance instantly so that you can decide how much you can charge for that new wide-screen television. Vote your preferences online for the candidates — political or American Idol, it makes no difference. It’s only entertainment.

Finally, as a result of the above, the entire idea of “news” as having a special or intrinsic value is devalued, and it becomes harder and harder for the average person to find the information that they need and should have without digging deeper and harder than ever before — exactly at a time when those who should learn more don’t want to and those who would like to know more have less and less time to explore it.

If these pressures remained in the electronic media, that would be bad enough, but they’re not. They’re also now exerting a considerable impact on the publishing industry. I can recall, years ago, reading the introductory chapters of James Michener’s books. Frankly, I really didn’t care much for the novels, but I found the popularized history and background fascinating, and that led to my reading more and more non-fiction in those areas.

One of the fastest-growing print entertainment areas is the anime/manga subgenre or cross-genre. I don’t have a problem with anime/manga per se, but I have great problems when I go into a bookstore and see carrels of books being replaced by what amounts to graphic novels, because, regardless of what the anime aficionados may claim, when real books are being replaced by grown-up comic books, the intellectual capacity of the culture isn’t headed in the right direction. It’s just another form of the over-visualized and over-simplified.

In the end, thinking requires a depth of information and time to consider. Instant news, instant entertainment, and instant reaction are all being pushed by the media in order to get people to instant-buy, but this rush to instant-everything denies any real depth of information and denigrates thoughtful consideration of facts and issues. And, if the trends continue, they’ll also water down, if not destroy, the thoughtful side of the print fiction market.

And the thought of losing future readers to instant sports or celebrity news tends to irritate me… and probably more other writers than would care to admit it.

There Must Be a Reason

Most current American fiction, by its very nature, and especially science fiction and fantasy, generally tends to repudiate the “absurdist” movement of the French existentialists of the mid-twentieth century. Does this repudiation, both directly and through its indirect influence on other media, actually perpetuate the very question that the existentialists raised, as well as help fuel the high degree of religious belief in the United States? Now that I have at least a few readers stunned…

I’ll doubtless end up grossly oversimplifying, but since I don’t wish to write the equivalent of an English Ph.D. dissertation, we’ll go for a modicum of simplicity. Sartre and Camus and others of the absurdist school tended to put forth the proposition that, in essence, life had no intrinsic meaning, that it was “absurd,” and that, as illustrated in Camus’s L’Etranger, the only real choice one had in life was what to do with one’s life, i.e., whether to take a meaningful step to end it or to let life continue meaninglessly.

The question is, simply enough: “Does an individual life have intrinsic worth or meaning by the mere fact of existence?” The absurdist view would tend to imply that it doesn’t. The deeply religious Christian view is that every single life has meaning to the Deity.

While I can’t claim, and won’t, to have read even a significant fraction of the something like 30,000 new adult fiction titles published every year, at times I have read a large fraction of what’s been published in the F&SF field, and I can’t recall more than a handful of books that discussed or considered the absurdist premises intelligently, or more than a tiny fraction in which the characters acted as though life had no intrinsic meaning. In some, a disturbing fraction, I have to admit, the only intrinsic meaning of many lives was to be available to be slaughtered by the heroes or the villains, but a certain sense of value was still placed on the lives of even the most worthless.

Is this comparative authorial lack of interest in the possible meaninglessness of life bad? Not necessarily. In LeGuin’s The Left Hand of Darkness, however, one of her characters makes the statement that to oppose something is to maintain it. I’d suggest, following LeGuin’s words, that the continued cheerful and unthinking assumption that life has intrinsic Deity-supported meaning leaves all too many readers and people wondering if that is really so… and why they should believe it.

For whatever wonder may be generated, though, very little finds its way to the printed fiction page.

I will offer one observation and clarification. Many, many authors speculate on the meaninglessness of a given character’s chosen life path, but that isn’t the same as whether life has an intrinsic meaning to or within the universe. In fact, I could even claim that the realization of or belief in a meaningless occupation or set of acts affirms the idea that life is meaningful in a cosmic sense — an application in a backwards way of LeGuin’s words.

Yet… on one side we have a universe some sixteen billion light years across in all directions with some 100 billion galaxies, each with between 50 and 100 billion stars, with the believers in intrinsic meaning claiming that each life has a special meaning. And there’s almost no one on the other side?

Well… maybe there are many on the other side, but outside of Richard Dawkins and those few like him, I’m not seeing all that many, and I’ve certainly not read about many heroes or heroines who look up into the night sky and consider the odds on whether life has that kind of meaning. Almost a century ago, Alfred North Whitehead observed that when one wishes truly to understand a society, one should examine the basic assumptions of that culture, those so basic that no one has ever scrutinized them. I’d submit that one of those assumptions underlying western European-derived culture is that there is a God-given meaning to each life, and that the fact that the absurdist proposition died away so quickly suggests that this assumption remains strong… and largely unexamined.

I tend to deal with this issue, as I suspect a few other writers do, at the second remove, by having my characters act along the lines of: If there is a God/prime mover, then we should do the best we can because that’s what’s expected; if there’s not, it’s even more important that we do our best because we’re not getting any divine support.

But I do wonder if we’ll see many popular atheist/absurdist heroes or heroines anytime in the near future.

Standing Ovations & "Discrimination"

Many years ago, when all my grown children were still minors, one of them wanted to know why I seldom said that anything they did was good. My answer was approximately, “You’re intelligent and talented, and you’ve had many advantages. I expect the merely good from you as a matter of course. If you do better than that, then I’ll be the first to let you know.” Perhaps I was too hard on them, but that was the answer I’d gotten from my father. My answer clearly didn’t crush them, or at least they survived the devastation of not having a father who praised everything, because they’ve all turned out to be successful and productive, and they seem to be reasonably happy in life.

As some of my readers know, I’m married to a professional singer who is also a university professor and opera director. She has made the observation that these days almost any musical or stage play, whether a Broadway production in New York, a touring Broadway production, a Shakespeare festival play, or a college production, seems to get a standing ovation… unless it is so terrible as to be abysmal, in which case the production merely gets enthusiastic applause. The one exception to this appears to be opera, which seldom gets more than moderately enthusiastic applause, even though the singers in opera are almost invariably far better performers than those in any stage musical, and they don’t need body mikes, either. Maybe the fact that excellence still has a place in opera is why I’ve come to appreciate it more as I’ve become older and more and more of a curmudgeon.

My wife has also noted that the vast majority of students she gets coming out of high school these days have almost all been told through their entire lives that they’re “wonderful.” This is bolstered, of course, by a grade inflation that shows that at least a third of some high school senior classes have averages in excess of 3.8.

In a way, I see the same trend in writing, even while I observe a loosening of standards of grammar and diction and the growth of improbable inconsistencies in all too many stories. I’ve even had copy-editors who failed to understand what the subjunctive happens to be and who believed that the adverb “then” was a conjunction [which it is most emphatically not]. Matt Cheney notwithstanding, “alright” is not proper English and shouldn’t be used, except in the dialogue of someone who has less than an adequate command of the language, but today that means many, many characters could use it.

At the same time, I can’t help but continue to reflect on the change in the meaning of the word “discrimination.” When I was growing up, to discriminate meant to choose wisely and well between alternatives. A person of discrimination was one of culture and taste, not one who was prejudiced or bigoted, but then, maybe they were, in the sense that they were prejudiced against those aspects of society that did not reflect superiority and excellence.

But really, does everything merit the equivalent of a standing ovation? Is excellence measured by accomplishment, or have we come to the point of awarding standing ovations for the equivalent of showing up for work? Can “The Marching Morons” of Cyril M. Kornbluth be all that far in our future?

More Writing About Writing

To begin with, I have to confess I’m as guilty as anyone. About what? About writing about writing, of course. Now… for some background.

When I began to consider being a writer, I thought I was going to be a poet, and I did get some poems published in various small poetry and literary magazines. And then, there was this escalating altercation in Southeast Asia, and I ended up piloting helicopters for the U.S. Navy and didn’t write very much. When I got out of the Navy, I started writing market research reports dealing with the demand for industrial pneumatic accessories by large factories. Then I wrote a very bad mystery novel, awful enough that I later burned it so that it could never be resurrected. Only after all that did I attempt to write science fiction, and after close to ten years of hit-or-miss short-story submissions, with only about half a dozen sales while I was working full-time at my various “day jobs,” I finally got a rejection letter from Ben Bova which told me to lay off the stories and write a novel. And I did, and I sold it, and I’ve sold, so far, every one I’ve written since. Now… all this history is not meant as bragging, or not too much; rather, it’s to point out that virtually all the writing I did for almost forty years was either occupational-subject-related or poetry or fiction that I hoped to see published — and even more hopefully, sold for real money and not copies of magazines and publications.

All that changed a year ago, when I started blogging… or more specifically, writing about writing or about subjects that bear on writing, if sometimes tangentially. Instead of writing fiction for publication, I’m writing close to the equivalent of a book a year… about writing. I’m certainly not the only one out there doing this. In fact, I’m probably one of the later arrivals in this area.

But I can’t help wondering, no matter how often my publicist has said that it’s a good idea, if there’s something just a bit wrong about writing about writing, instead of just writing. What’s happened to our culture and our society when readers seem to be as interested, or more interested, in writing about writing than in the writing itself? And why are so many younger writers going to such lengths in their blogs to attract attention?

At least one well-known publisher has noted that no publicity is all bad, but is this sort of thing all that good? Or is it not all that good, but necessary in a society that seems to reward shameless self-promotion as vital for success?

Who could say… except here I am, along with hundreds of others, writing about writing.

The Future of the Adversarial Society

Some twenty years ago, when I was a consultant in Washington, D.C. [i.e., beltway bandit], a chemicals, paint, and coatings company came up with an environmentally safe way to get rid of their hydrocarbon leavings [still bottoms]. They wanted to transport and sell them to a steel company, which would then use them in its smelting process. This had the advantage of first, destroying the semi-toxic waste in a safe fashion that did not harm the environment, and second, providing a cheaper source of usable carbon for carbon steel. Not only that, but the steel furnaces were far hotter than commercial hazardous waste incinerators. To me, it seemed perfectly reasonable. Needless to say, this environmentally beneficial trade-off never occurred.

Why not? Because the U.S. EPA wanted to make sure that the process was 100% regulated, and that meant that the steel company would have had to apply for a hazardous waste disposal permit and submit itself to another layer of extremely burdensome federal regulation. Even then, U.S. steelmakers were having trouble competing, and more federal regulation would have compounded the problem. So, instead of having a cheaper source of carbon and a cleaner environment, the steelmaker paid more for conventional carbon sources, while the chemical company had to pay money to have its still bottoms incinerated in an approved hazardous waste incinerator. This didn’t help the American economy or the environment very much.

Unfortunately, I can now understand the combination of reasons why this happened… and why it continues. Most industrial companies haven’t historically acted, frankly, in the best interests of the population and the environment as a whole. That’s understandable. Their charter is to make money for the corporation and its shareholders, and one of the underlying and unspoken assumptions has historically been that corporations will do so in any way that is legal and will not besmirch their reputations. Likewise, because most corporations haven’t exactly been trustworthy or all that responsible about the larger issues, government bureaucrats haven’t been all that willing to trust them without imposing restrictions.

And exactly how did we get to this point?

First is the fact that, no matter what most people in the United States say, they essentially believe in a world of limitless resources. Somehow, in some way, they believe, ingenuity and technology will keep things going, and there’s no real shortage, and if there is, it’s caused by government regulations or business greed. Second, we believe that competition is the way to ensure efficiency and lower prices. Third, we don’t trust government.

The problem is that all these beliefs are partial truths. There are great resources, but not unlimited ones. Competition indeed spurs lower prices, but it also encourages cut-throat practices and continued attempts by those who produce goods and provide services to transfer costs to others. Pollution transfers costs to the public, as do deforestation, strip mining, and a host of other activities. And government is certainly an institution to be wary of… but it’s the only institution that has the power to rein in out-of-control giant corporations, or on the local level, lawbreakers.

So… we have a society that is basically adversarial. Even our legal system is designed more like a stylized trial by combat than a means of finding truth or justice. How often does the better attorney transcend the “truth”? We’ve just seen a case where a pair of attorneys kept silent for years even when they had evidence that an innocent man was unfairly convicted. Why? Because our adversarial system would have disbarred them: revealing that evidence would have meant they were not fully representing the interests of their client.

So long as there are “excess” resources, an adversarial society can continue, but how long will a United States, with 5% of the world’s population, be able to continue to consume 26% of world resources? The Wall Street Journal just reported that literally billions of dollars’ worth of fuel is being wasted at U.S. corporations because cooperative waste-reduction and energy-efficiency initiatives keep falling afoul of adversarial attitudes between different divisions, differing regulatory agencies, and differing executives. At the same time, over the past five years, the price of energy has tripled… and that doesn’t count the costs of the energy-related war in Iraq, or the recent Russian announcement that Russian oil production has peaked and is declining.

Yet… are we seeing any changes? If anything, it appears as though our society is becoming even more adversarial, and that leads to a last question.

At what point does an adversarial society self-destruct?

SF and Future Business

The other day, as I was considering the origins of war, some observations came to me. When I thought over history and what I know, I realized — again — that most wars have economic origins, regardless of their widely identified or proximate causes. Helen didn’t have the face that launched a thousand ships, regardless of what Homer sang and others later inscribed. The Mycenaeans were after the lucrative Black Sea and Asia Minor trade dominated by Troy.

But that led to a second observation — that very little science fiction or fantasy actually deals with the handmaiden of economics, that is, business itself, or even delves into the business rationales that explain why so many business tycoons cultivate political connections. Charles Stross’s The Clan Corporate deals with alternate-world mafia-style types who mix special abilities, alternate worlds, murder and mayhem with business, and more than a few books cast corporate types as various types of villains. While I know I haven’t read everything out there, it does seem that books that deal with business itself are rare. One of the classics is Pohl and Kornbluth’s The Space Merchants, and two of my own books — Flash and The Octagonal Raven — deal heavily with business, but I can’t recall any others offhand.

Considering just how involved businesses have been in the disasters and wars of the nineteenth, twentieth, and twenty-first centuries, it’s rather amusing that so few SF authors have taken on the challenge of dealing with business directly. Is it so impossible? Or is business just dull? Let’s see. One of the strongest factors contributing to the Civil War wasn’t slavery, but the desire of indebted southern planters to repudiate their debts to New York bankers. Because of the influence of U.S. business types in Hawaii, a U.S. warship in Honolulu effectively supported the pseudo-revolution that overthrew the independent Hawaiian monarchy and turned Hawaii into a U.S. territory. The need for a shorter route for U.S. shipping prompted the U.S. to foment and encourage, and then support militarily, an independent Panama… and made U.S. construction and domination of the Panama Canal possible. Most of the industrialized world collaborated to put down the Boxer Revolt in China because they didn’t want existing trade agreements — and profits, including those from the opium trade — destroyed. Japan effectively started its part of WWII in order to gain resources for Japanese business, and Hitler was successful not just because of popular support, but because his acts restored German business. And, of course, despite knowledge of what was going on in Germany, during the early part of WWII, a number of U.S. companies were still in communication with their German counterparts and subsidiaries. More than a few industrial firms in the U.S. were opposed to an early pullout in Vietnam, and interestingly enough, the Texas-based firms prospered greatly, especially after Kennedy’s assassination. Now we have a war in Iraq, which occurred as oil demand continued to grow in the U.S. and after Iraq had given indications that it wanted to base its oil sales on the euro and not on the dollar. And those examples are barely the tip of the iceberg.

So… is business really that dull? We have exposé after exposé about what happens, and each year it seems to get more sordid… yet comparatively few authors seem to want to extrapolate into the future. Or is that just because they feel they could not possibly imagine anything wilder and more corrupt than what has already happened?

Simplistics in Writing and Society

As I have listened to the various candidates for president trot out their ideas and policies, and as I see and hear the public and media responses, I’m not just disturbed, but appalled. Beyond that, I also have to wonder how long intelligent fiction will remain economically viable. As it is, from what I can see, intelligent writing, which considers and reflects on matters in more than “seven steps” or “five tools” or “the church/government/corporation/male sex is the root of all evil” or “the more violence/sex/both the better” is already fast becoming limited to a small part of F&SF or non-fiction.

We live in a complex world, and it’s not getting any simpler, but there’s an ever increasing pressure on all fronts to make it seem simpler by blaming the “bad guys.” Now, who the bad guys are varies from group to group and individual to individual, depending on personal views and biases, but these “bad guys” all have one thing in common. They aren’t us.

Gasoline prices are rising. So let’s blame the multinational corporations and the Arabs for their greed… and, of course, the U.S. government for giving tax breaks to oil producers. Along the way, everyone seems to ignore the fact that the United States remains the third largest producer of crude oil in the world, behind Saudi Arabia and Russia, and that we produce twice as much oil as does Iran and four times as much as Iraq at present. But… with five percent of the world’s population, we’re consuming something like 26% of annual world production. Why are all those tax breaks there? Because producing oil in the U.S. is far more expensive than elsewhere, and without those tax breaks U.S. oil production would decline even more. Does anyone consider what a few million 12-mile-per-gallon SUVs represent?

We have over 41 million Americans without health insurance, and guaranteed pension plans for Americans are declining faster than the government can count. Why? Might it just have something to do with the fact that we Americans are always looking for the lowest-priced goods and services, and that the costs of health insurance and guaranteed pension benefits are the principal reasons why foreign car manufacturers can produce lower-priced, higher-quality vehicles than the U.S. Big Three?

Housing prices, despite the current collapse, are still astronomical compared to fifty years ago, but how many people really look at the fact that the average new house is twice the size of the average new post-WWII dwelling… and has more than twice the conveniences and contains a two- or three-car garage?

Immigration is another case in point. Building a 700-mile fence isn’t going to stop immigration. It might detour or slow it, but Americans want too many of the services immigrants provide, and we don’t want to pay exorbitant prices for them, no matter what we say publicly. Of course, there’s also the rather hypocritical aspect that everyone in America today is either an immigrant or the descendant of one — and that includes Native Americans. The latest studies indicate that the European immigrants of the 15th and 16th centuries brought the diseases that killed off close to eighty percent of the continent’s then-indigenous population. So… it was all right for our so-upright ancestors to seek a better life, but these people today shouldn’t have that opportunity?

As an economist, I could go on and on, with example after example, but these examples are just illustrations of a general mind-set. The current political mood is: “We want change.” The real translation of that is: “We really don’t want to consider how we got here, but please get us out without making us think about how we got ourselves into this mess, and, by the way, don’t make us pay for it.”

Unfortunately, this also carries over into writing, and particularly into fiction. Is it any wonder that the Harry Potter books have swept the world, but particularly the English-speaking world? In a stylized way, they recall certainties of a past time and offer a dash of short-term hard work and magic to solve the problems at hand. The success of The Da Vinci Code offers another example of blaming ills on a mysterious church-related conspiracy. We have conspiracy and spy and thriller books and movies, all pointing to relatively simplistic villains who aren’t us.

Yes, as I discussed previously, for a writer to be successful, he or she must entertain, but why have so many writers retreated to or succumbed to the allure of the simplistic? Novels can certainly entertain without being simplistic, and without purveying gloom, doom, and despair, but there’s always the question of how many readers will buy the more thoughtful and thought-provoking work. I’ve certainly had readers who have written to say that they just weren’t interested in my “deeper” work, and I’m certain I’m not alone. I know several best-selling writers who began by writing work that I felt was thoughtful, intriguing, and entertaining, not to mention fairly well written. They don’t write such work any more, and they make a great deal more money from what they do write.

Given the pressures of society toward the simplistic, how long will those writers who have not given in to the allure and rewards of the overly simplistic be able to hold out against such pressures… and even if they do, how many readers will they be able to attract?

Thoughts on Writing Success

Jim Baen and Eric Flint, as well as other fiction writers and editors, have made statements to the effect that every writer and publisher is competing for a reader’s “beer and movie money.” While not always literally true, their underlying point is all too accurate. A successful fiction writer has to leave his or her readers feeling that their time and funds were well spent.

That’s obvious enough, but is there any single great and glorious formula for success in achieving that end? Not exactly, because there are as many types of successful writers as there are types of readers willing to pay for books. As a result, we have writers who range from those who produce what can most charitably be described as “mindless entertainment” to those who write books so involuted and complex that often a single book is all that they ever publish.

Years ago, a well-known news magazine used to publish a chart on which the bestsellers were listed, along with a red or green arrow. The red arrow pointed down and the green one up, and the arrows represented the consensus of a span of reviewers. What I found interesting was that the vast majority of bestsellers almost invariably had red arrows after the title. While I tended to agree with the arrows, beyond that my perceptions certainly didn’t agree with those of the reviewers in all cases.

These days, for whatever reason, I tend to agree with reviewers in the F&SF field even less than I did twenty years ago, and I usually didn’t agree all that often even then. That may brand me as a curmudgeon, and someone who was one even before I was old enough to claim that title by virtue of age, but I think the reason was simple enough. It had to do with the “suspension of disbelief.” I’ve never had that much trouble suspending my disbelief about plausible future high-tech gadgetry or even about magic — if the author is logical and consistent in describing and using such gadgetry and magic, but I’ve always had real problems when authors have characters and societies which act and react in ways contrary to basic human nature — and one of the historic problems with science fiction has been its excessive emphasis on the technical in ways often at odds with how societies work. Readers will easily and often point out that Dyson rings or the like need steering jets [or whatever], but will swallow far more easily economic, social, and political systems that could never work, usually because they’re at great variance with human nature.

In an overall sense, my writing reflects my views in this area and how I approach writing. In my opinion, this is as it should be, at least for me. As for editors, that’s another question, and one I’m not about to touch here.

All that said… books sell because the stories they tell and the way in which they’re told appeal to various types of readers. Some authors appeal primarily to readers whose make-up falls within clear preference lines. Others don’t. And there’s a temptation for newer writers to “aim” their works directly at a given type of reader.

To that, I say, “Don’t.” Especially if you’re new to writing professionally and if you want to have an identity and stay around for a while. I’m not saying there aren’t writers who are good at targeting markets. There are. Some of them are even quite successful, but far fewer are successful than one might imagine. Why do I say this? Because any written work of any length reveals as much about the writer as do the story and the characters. If a writer’s style, structure, and views are consistently and widely at variance with the stories he or she is telling, sooner or later, in most cases, one of two things is likely to occur. Either the writer will burn out because he or she is fighting his or her nature, or the readers will drift away because of the dichotomy between the overt actions and characters and the conflicting subtexts.

And what of those few who can write “anything,” and do? More power to them, but I certainly wouldn’t want to be one of them. Not for a million years… or dollars, and I suspect those who read and like my work might understand why, and for those who don’t… it doesn’t matter.