Archive for the ‘General’ Category

Character-Driven or Plot-Driven?

Right now, from what I can tell, there seems to be a bit of an emphasis, by some who think themselves experts on F&SF, on the need for more “character-driven” fiction. Then again, perhaps this has always been true. Whether it’s a resurgent emphasis or a long-standing one is irrelevant. It’s wrong. Dead wrong.

Now, before you scream for my head, I’d also like to say that dominance of or emphasis on plot-driven or device-driven or any other form of “driven” fiction is also wrong. The best fiction should always be an intertwined blend of character, plot, setting, and style.

If all a serious/experienced reader notices is one of those elements, whether it’s the characters, the plot, the setting, or the style, the work is not all it could or should be. I use the term “serious/experienced reader,” however, most advisedly, because we all have preferences, and we praise those books most highly that reflect our likes. Some readers want most of all to know the characters better and see what they will do when faced with both adversity and success. Others are most intrigued with the plot and how matters will work out. Others concentrate on the world-building or the setting, and for others the way in which the words are used is of paramount importance.

I’ve seen this one-aspect-focus with respect to my own work, where one reader will praise a book for its style, while another will denigrate the style, where another will praise the characterization, and another will declare the characters cardboard cutouts. Part of this results, of course, from each individual’s background, because words and phrases which are evocative and filled with both connotations and implications for one reader may convey nothing to a reader with a dissimilar background or tastes. Generally speaking, but not always, or exclusively, readers with wide-ranging tastes and experience pick up a wider range of what an author may convey… or they may understand all too well that the author’s presentation is merely slick superficiality.

“Character” doesn’t exist in a void, independent of the setting or the action, and both impact how character is revealed. In one novel I wrote years ago, there is a scene where a character has learned that a woman he loved has died in combat. He does not moan or say a word to anyone. He takes a throwing knife and keeps flinging it at a target until the target is mostly splintered wood and his hands are bloody. Yet some felt that this character was cardboard because he said nothing. Characters reveal who they are in various ways, but always more by their actions than their words. In another book of mine, the main character lies early in the book. He does not reveal that he lies, even to himself at the time. The words he utters are far less important than the fact that he has spoken them. He says nothing about it, nor does he reflect on those words. Other than that, he is most honorable in his actions, yet that lie reveals more about what he feels than any other single act in the book… and almost no readers have caught it, even though the lie is totally in character and vital to the conclusion.

And, just for the record, in my view, the only way in which the choice of words in a novel truly reflects the various characters is by the dialogue, those words spoken by each person. All the rest of those beautiful — or not so beautiful — words reflect on the setting.

Does this mean I’m against beautiful words — or lovely flowing sentences? No. It means I’m against sentences that are “beautiful” for the sake of being beautiful, just as I’m against flamboyant characters for the sake of having flamboyant characters or against miraculously crafted settings for the sake of the settings. In short, I’m in favor of what works best for the story at hand, not for what might be termed literary special effects.

Hack Work?

The other day I came across a blog that questioned how a number of well-known F&SF writers could physically produce the amount of work that they do. The blogger’s obvious and pat answer was that they could because “they’re hacks, and their readers have minimal expectations.” He then went on to mention some well-known mainstream authors who are prolific… but stated that these mainstream authors were quality writers. The blog had a clear implication that genre authors who write quickly must be hacks, unlike prolific mainstream authors.

As H.L. Mencken was reputed to have said, and as I recall, “For every difficult and involved question, there is an answer that is clear, simple… and wrong.”

Not only was the blog’s conclusion an insult to the genre writers, but it was also an insult to their readers.

The writers in question [who will remain nameless, because this is not exactly about them, but about preconceptions] have won more than forty “literary” awards, including the Hugo. Between them, so far as I was able to determine, their books have received more than 30 starred reviews from “mainstream literary” sources such as Booklist, Publishers Weekly, and Kirkus. Several of their works have been named as “books of the year” by Kirkus and Booklist. Some have even won awards from Romantic Times.

Yet this blogger [it would be an insult to professionals to term him a writer] could only term these successful genre authors “hacks” because of the number of books they wrote in the speculative genre. I’d call them professionals, who have worked long and hard at their craft and who have been able to please both fans and literary critics. Pleasing both is far from easy.

Yet there remains a preconception that any writer who is prolific must be a hack, because good writing must be agony and take forever. I’m sorry. It doesn’t work that way. I’ve seen terrible novels that took the writer ten years or more to produce and good novels that a talented writer produced in less than a year. A good novel is a good novel, regardless of how long it took to write it, and the same is true of a bad novel.

As for time… think about it this way. There are 52 weeks in the year. Assume a writer only works five days a week like many people [this isn’t true, but assume it is], and that he or she sits before the computer or pad of paper or old-fashioned typewriter seven hours a day [an hour off for lunch and other sundries]. If that author writes one hundred words an hour, or 1.7 words a minute, at the end of a year, he or she will have written something like 175,000 words. This is not exactly breakneck speed. It’s also why I don’t have much patience with so-called professional authors who complain that they can’t produce a book more than every other year.

Now, obviously, that’s just for purposes of illustration, because there’s a need for such matters as research, editing, and lots of rewriting. Still… if that writer speeds up to three words a minute, that leaves a full five months of the year for rewriting, research, and “inspiration.”
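For what it’s worth, the arithmetic is easy to check. Here’s a minimal sketch in Python, using only the assumptions already stated [the five-day week, the seven-hour day, and the two writing rates]:

```python
# Back-of-the-envelope check of the writing-pace arithmetic above.
# Assumptions [all from the text]: 52 weeks/year, 5 days/week, 7 hours/day.

HOURS_PER_YEAR = 52 * 5 * 7               # 1,820 writing hours a year

# At 100 words an hour [about 1.7 words a minute]:
slow_total = HOURS_PER_YEAR * 100         # 182,000 words -- "something like 175,000"
                                          # once holidays are rounded off

# At 3 words a minute [180 words an hour], how long does that same
# year's worth of words actually take?
words_per_day = 3 * 60 * 7                # 1,260 words a day
days_needed = slow_total / words_per_day  # ~144 working days
working_days_per_month = 5 * 52 / 12      # ~21.7
months_writing = days_needed / working_days_per_month  # ~6.7 months

print(f"{slow_total:,} words a year at 100 words an hour")
print(f"~{months_writing:.1f} months at 3 words a minute, leaving "
      f"~{12 - months_writing:.1f} months for research and rewriting")
```

Run it, and the five-plus months left over fall straight out of the assumptions.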

On the other side of the “numbers mean hacks” issue are the readers. Yes, there are thousands, if not hundreds of thousands, of readers who are only looking for a story that will pull them in, and there are plenty of authors who can do that. But there are also thousands and thousands of readers who are looking for more than just a “quick read.” This latter group of readers can be quite critical, as I well know, and they don’t continue to support authors who don’t meet their expectations. Those expectations are not based on how many books an author publishes, but how well he or she writes what is published.

And, as I will repeat, quality is often independent of quantity, especially in our field, something that the blogger I’ve referenced didn’t seem to understand. Judge the books, not their numbers, nor the field in which they’ve been published.

Procrastination, Stupidity, or Species Suicide?

An asteroid appears likely to hit the planet Mars. Several years ago, a large comet struck Jupiter, and its fragments created disturbances in the Jovian atmosphere that could have encompassed much of earth. Geologists have discovered the remnants of massive impact craters on earth itself, from collisions that totally restructured the environment and the atmosphere, not to mention life itself.

Another impact such as these could well threaten, if not destroy, life as we know it on earth. Does anyone care? Really care?

In 1968, the movie 2001: A Space Odyssey came out, and in it, Kubrick postulated space stations with tourists, space travel within the inner solar system, and an expedition to Jupiter. That was almost forty years ago, and despite all our advances in technology and computers, we haven’t even been back to the moon since 1972 — 35 years ago.

We have the basic technology to ensure the future of our species, and, with relatively minor improvements, to remove the threat to our planet from such asteroid or cometary impacts. And… what have we done? We’ve cut back on NASA and space research. And frankly, a number of the scientists haven’t helped much when they point out that unmanned missions are more cost-effective for gathering data. They doubtless are, but data isn’t likely to help us much if we need a large and powerful space drive to move an asteroid or plant a colony somewhere other than on an earth about to be devastated by some cosmic catastrophe.

That catastrophe will occur. The only question is when. The problem is that we’re a short-term culture facing an inevitable long-term problem, and our outlook is becoming more and more short-term year by year.

Look at the reaction to global climate change… or even to how many Americans continue to smoke, or drive while impaired, whether by cellphones or intoxicants. At the same time, we’ve glamorized making money and short-term fleeting fame to the point where fewer and fewer American students pursue advanced scientific studies and careers, and then we limit the access to foreign students who would do so, and who have consistently done so to our own benefit in the past.

As a society as a whole, the United States has become less and less interested in anything long-term, anything truly ethical [and I’m not talking about religion, which, unfortunately, ranges from a few deep and ethical believers to a mass of seekers of quick salvation], and far more interested in the quick acquisition of assets and things, the proliferation of entertainment options, interactive video or internet games, or who controls Iraq and Iran, or which theology should be dominant in what culture and society.

Long-term issues, like global catastrophe and environmental degradation, just don’t have much appeal. Admittedly, such issues have never appealed to most people, trying to survive day-to-day, but there were, from time to time, elites and educated individuals who did care. Where are they now, and what is the public reaction to them?

That reaction, it seems to me, is mostly along the lines of: I don’t believe you, and, besides, even if something does happen, it won’t be in my lifetime, and that means it’s not my problem.

And we’re supposed to be a sapient species?

Gimmick or Tool?

I recently read a reader’s review of one of my books that complained that I’d used the same “device” in several Recluce books — a use of order/chaos and drugs that suppressed memories. Earlier, other readers complained that surely, in a high-tech future, there would be more fantastic weapons than space torps. These “reviewers” then concluded, on this basis, that the books were repetitive.

My first reaction was, “Come off it, idiots!” My second was, “Why do you bother reading when you obviously don’t understand much about human nature and culture… and clearly don’t want to?” My third reaction was to write this blog to attempt to clarify something that has come up more than a few times, not only in regard to my writing, but in regard to the work of more than a few other writers.

Let’s start out with one basic point that I’ve discussed before, and that Heinlein pointed out in print more than 35 years ago. There are no new plots. There are only differing ways of addressing the eternal basic plots.

The second point is that human beings use tools. We develop them; we use them; we keep using them so long as they work. Hammers have been in existence for as long as we have historical evidence, and for at least some 50,000 years, if not longer. They meet a need, and they aren’t going away.

Now… how does this apply to F&SF? It’s so simple that I’m almost embarrassed to put it in print, but it’s also so simple and basic that more than a few readers obviously haven’t thought about it. When a writer creates a fantasy world and its subcultures, assuming that these cultures are populated by beings with human or humanlike characteristics, these beings will use tools, techniques, and the like for replicable results. They will continue to use them so long as they work, or until they are supplanted by something else which they find better. That means that they will hone and use the “magic talents” that they possess that are useful. They will not throw them away or forget about them unless they are not useful. Thus, fantasy series that are true to societal nature will in fact — and should — present various techniques and tools used over and over again by those who can.

Likewise, these tools — whatever they may be in whatever books by whatever authors — will always be used in furtherance of human motives along one or more of the basic plots in human literature.

New gimmicks merely for the sake of introducing new gimmickry to avoid reader “boredom” are not only fraudulent, but bad writing. They may provide momentary excitement, like a sugar high, or other highs, but there’s not much behind it. And like those addicted to other highs, readers who continually desire new gadgets, gimmicks, and twists can seldom fully appreciate much beyond them.

Now… those who desire the continually “new” will and do argue against writing too many books in a given fantasy universe, but I consider that about as valid as saying writers should stop writing mainstream fiction because people use weapons to get their way in all cultures or because bribery is endemic, or asking why people all travel by one of the limited means of transport in a given culture.

By the same token, hewing to the “traditional” for the sake of the traditional and because the unfamiliar is unacceptable is just as much a fault. Neither new for the sake of new nor tradition for the sake of tradition makes for good writing.

Certain Blessings

At least in western European cultures, we have entered the holidays, and much has been written about how the time has changed from a period of spiritual rejoicing to unbridled materialism, if a materialism leavened by those who still endeavor to do good and by that small minority that always does its best, regardless of season.

In that mixed light, I’d like to reflect on speculative fiction. Although I can scarcely claim to be impartial, given my occupation, I do believe that speculative fiction, certainly at its best, and even at its worst, does convey some blessings upon this troubled world, and, if more people read it, would convey even greater blessings. Am I saying I like all that’s printed in the field? Heaven forbid. I’m not certain I even like or agree with the majority of it. But what speculative fiction does that no other form of literature or entertainment [for the most part] does is speculate on cultures, ideas, likes, dislikes, prejudices, technologies, governments, sexuality and its variations, and much, much more. By doing so, the field offers readers the chance to think about things before they happen. Admittedly, most of what appears in print won’t happen, and much of it couldn’t happen, for various reasons. But that doesn’t matter. What does matter is that the ideas and the reactions and actions of characters to those ideas and places and events give readers not only an intellectual view of them, but a view with emotional overtones.

The emotional overtones are especially important because, for most people, an idea or a possibility has no sense of reality without an emotional component involving a feeling of how it impacts people. What speculative fiction does at its best is to involve readers with new ideas and settings in a context that evokes a range of feelings.

So often, when people or nations are confronted with a perceived danger, fear reigns, and thoughtful consideration is overwhelmed, if not submerged. And unscrupulous leaders and demagogues prey on that fear to enhance their own power and prestige. The most deadly fear is fear of the unknown. Speculative fiction explores the unknown, and the more people who read it and understand it, the smaller that sphere of the unknown becomes, and the less prone to political manipulation those readers become. To some degree, this is true of all fiction, but it is more true of speculative fiction.

And that is, I believe, one of the blessings the genre conveys, and one of which we who write it should always be mindful.

Truths and Untruths

The other day, as I was driving from one errand to another, I was listening to an NPR radio talk show where two independent budget analysts were discussing the federal budget and taking listener calls. One caller wanted to know why Congress didn’t stop all that wasteful foreign aid and use it to deal with the Social Security and Medicare problems. When the analysts both tried to point out that foreign aid is less than one percent of federal outlays [and they were absolutely correct], the caller insisted that they were wrong and that the government was giving foreigners money from other accounts hand over fist. Now, I spent nearly twenty years in and around the federal government, and I left Washington, D.C., some eighteen years ago. I started out as a legislative and economic analyst for a congressman, and I heard the same arguments and complaints about all that wasted foreign aid back then. Those arguments were numerically and statistically wrong in the 1960s and 1970s… and they’re wrong today.

Polls reveal that Americans believe that as much as ten to fifteen percent of federal spending goes to foreign aid, if not more. We’re talking about almost forty years of people believing in this total untruth. Why?

Despite the war in Iraq, the consistent trend in federal spending since WWII has been to spend a smaller and smaller percentage of the federal budget on defense [and foreign aid] and more and more on various domestic programs… and a majority of the American people still don’t know this, or the fact that domestic programs now comprise roughly 75% of federal spending and defense spending just over 20%.

Various groups of people, of varying sizes, believe in other “facts” that are not in fact true, including, but not limited to, the beliefs that the moon landings were a hoax, that the United States is a democracy [for those interested, it’s technically a form of representative federal republic], that Social Security taxes are invested, that the line you’re not standing in always moves faster, that North America was a barely inhabited wilderness at the time of Columbus, and that the world was created in 4004 B.C. [or thereabouts]… or [pick your own example].

Moreover, if you ever attempt to explain, rationally or otherwise, why such “facts” are not so to those who deeply believe in them, you risk indignation, anger, or even great bodily harm.

And many well-meaning souls will say in defense of those believers, “Everyone is entitled to his or her own beliefs.”

To what degree? Is a man who “believes” that the federal income tax is unconstitutional free not to pay his taxes? Does he deserve the same benefits as do other citizens? Is the soldier who enlists free to refuse to fight in a war he or she doesn’t believe in?

On another level, what happens to public policy making and politicians when large groups of their constituents believe in such facts and demand more domestic programs and lower taxes because they “believe” that there’s enough in the budget for those programs so long as foreign aid and waste are eliminated? Or when one group believes that abortion is murder and starts murdering doctors who practice it and another group believes it’s a woman’s right to control her own body and they start attacking politicians, financially, verbally, and otherwise, who insist on opposing abortion at all costs?

Just what is a “truth,” and how far can one go ethically in supporting it? And what does society do when that “truth” is an untruth? Or when large segments of the population believe in opposing “truths” and are willing to go to great lengths in support of their particular truth, as is the case in Iraq and other nations around the world, and as appears to be a growing trend in the United States?

Who’s Really in Charge?

In an earlier blog post, I intimated that at least some of those who espouse feminism in politics or science fiction were not so much interested in changing the structure of society as changing which sex had the socially dominant position. This leads to a related question: In any society, who’s actually in control?

Despite all the political scholars, the media talking heads who pontificate on the subject, the professional politicians, and the academics on both the left and the right and elsewhere, all of whom claim something along the lines of “Whoever it is that’s in charge, things would be better if we were,” the answer is far from that simple.

Today, most polls suggest that the war in Iraq is unpopular with more than half the U.S. population. Yet we live in what is technically termed a representative democratic republic, and those representatives seem unwilling or unable to bring the war to a halt. Less than a third of the population approves of either the President or the Congress, and yet both the President and the members of Congress have been elected democratically, albeit by an actual minority of qualified electors.

Those merely slightly less cynical than I would claim that “apathy” is really in charge, but I can only find it chilling that with each expansion of the electorate, two trends have persisted, if not accelerated. The first is that the intelligence of the average member of Congress has increased dramatically while the quality of decision-making has deteriorated equally dramatically. The second is that the number and scope of pork-barrel, earmarked, federally-funded projects have skyrocketed.

Could it just possibly be that the expansion of the electorate might just have resulted in a political system where ever-brighter politicians use increasingly sophisticated technology and techniques to pander to the wishes of a majority of their constituents, regardless of the long-term consequences or the overarching national considerations?

Could it be that the majority of those voting are actually in charge? How could that be? Surely, the astute citizens of our great land would not continue to vote into office politicians whose principal interest in maintaining position and office translates into an ever-increasing drive to funnel federal bacon into their states and districts, to the detriment of larger national interests. Surely, the desire to do right could not degenerate into merely doing whatever is necessary to perpetuate one’s self in office… could it?

Thoughts on “Good” Writing

After more than thirty years as a published professional author, I’ve seen more than a few statements, essays, comments, remarks, and unprintable quotations about writers and writing, and, as I noted in an earlier blog, I’ve seen the proliferation of lists of “bests.”

Just recently, Brian Aldiss published an essay in the Times of London that pointed out how neglected and overlooked so many good speculative fiction writers happen to be.

But… is what constitutes “good writing” merely a subjective judgment?

At the risk of alienating almost everyone who writes and who reads, I’ll go out on a limb and say that I don’t think so. I firmly believe that there are certain basics to good writing that, if we had the tools, which we do not, as of yet, could be measured objectively. But since those tools have yet to make an appearance, I’ll merely offer some subjective and scattered observations.

Some aspects of writing can already be measured objectively, such as basic grammar. When subjects and verbs do not agree, the writing is bad. When punctuation is lacking, the writing is certainly suspect. When six different readers come up with six totally disparate meanings for a passage, the writer’s skill is most probably lacking.

Beyond such basics, however, writers, English professors, reviewers, and editors can argue vociferously. Some believe that style is paramount, and that beautiful sentences, impeccably crafted, with each word sparkling like a gem in its own precisely placed setting, are the mark of good writing. Certainly, well-crafted sentences are indeed the mark of a good writer, but when the sentences take over from the meaning, the emotional connotations and overtones, and the plot, those beautiful sentences become purple prose, no matter how well-crafted.

Still others advocate the stripped-down Hemingwayesque style of short direct and punchy sentences and actions. My personal feeling, which I’ve discovered is shared by very few, is that in the best writing neither the reader nor the reviewer notices the writer’s style and sentences, because story and style become one. Put another way, the style becomes transparent in allowing the reader to fully experience the story. When the way in which a story is told is noticed more than the story itself, the writing is not as good as it could or should be.

Others cite originality in plot and the need for every book by an author to have a different plot. This particular fixation seems far more prevalent in F&SF; certainly mystery and romance readers don’t seem to mind the same basic plot time after time, and more than a few “great” writers have used a limited number of basic plots. In fact, Heinlein noted that there were only three basic plots.

Even today, there are editors who believe that any novel that is written in any tense or person other than third-person past tense cannot possibly reach the highest level of literary and artistic perfection. Unlike them, I believe that the choice of tense and person should be dictated by the story itself and represents an integral part of the novel or story, and that the default third-person, past tense is only a general guideline and certainly not part of a set of objective criteria for excellence in writing.

Endings clearly vary from genre to genre. Certainly, very few “great” mainstream novels have happy or up-beat endings, while very few fantasy novels leave the main characters as miserable — or as dead or dysfunctional, if not both — as do those mainstream novels. The implication from the “literary” critics seems to be that a novel cannot be good or considered as great unless it leaves the reader lower than a snake’s belly, while the fantasy critics tend to believe that a book cannot be good unless the supply of nifty magic “stuff” is endlessly innovative and unless the hero or heroine suffers and triumphs over hardships and difficulties so massive and entrenched that the efforts of entire societies had theretofore proved insufficient to surmount them. [And I confess that, once or twice, I have succumbed to this weakness, and I do hope that I will possess the fortitude to resist the temptation to go forth and do the same in the future.]

The human condition, in general, tends toward optimism in a world whose behavior tends to reinforce the reality of pessimism. For that reason alone, my personal feeling is that “good” writing should encourage and represent realistic hope.

The Instant Society… and the Rise of Stress and the Decline of Forethought

Final examinations are nearing at Southern Utah University, and student stress is building to incredible levels, as it does near the end of every semester these days.

Every day, my wife, who is a full professor at S.U.U., is deluged by students who are “so stressed” that they’re having trouble coping. They have great trouble dealing with the term papers, the projects, the juries, the performances, and the examinations that all come due in the last week of the semester. Now… such requirements aren’t exactly new. They’ve been a part of collegiate curricula literally for generations, and my wife and other professors continually warn students not to procrastinate and attempt to get them to think ahead. But very few of them do, and this generation seems to have far more difficulty in dealing with the situation than any previous generation. Yet the world that awaits them beyond school is filled with deadlines and pressures, and eliminating or reducing such pressures from college, as some institutions are apparently attempting to do, hardly seems a good way to prepare students for “real” life.

Why? Is it just that they’re more verbal about the pressures? No… I don’t think so. There are too many other indications that they actually do feel stressed out. But why? Why should these college students be so stressed? They have the highest standard of living of any group of students in history and the most opportunities. When I was their age, the country was in turmoil, and there were riots about the Vietnam War, and a goodly percentage of young men faced the draft or military service in the service of their “choice” before the draft claimed them for the Army. When my parents were students, it was the middle of the Great Depression, and Germany was turning to Nazism, and World War II loomed. When their parents were students, the era of the Robber Barons was in full swing, and the nation was heading into World War I.

The vast majority of problems faced by today’s students are internal, arising out of their own chosen life-style and habit patterns. Yes, there is a drug problem, but they don’t have to use or abuse; that’s a matter of choice. Even war, for them, is a matter of choice, given that we have an all-volunteer armed services. HIV, AIDS… those too are essentially a matter of choice, except in very rare cases. Whether one gets into the “right” university or graduate school is not a matter of survival, unlike being conscripted for WWI, WWII, Korea, or Vietnam. And while the “right” school may confer greater opportunities, those opportunities don’t come down to actual survival, but to a higher level of income and prosperity.

Yet “stress” and college counselors abound, and most students seem to complain about being “stressed out.”

I’d submit that this wide-spread epidemic of stress is the result of our “instant society.” Back before the age of computers, doing something like a term paper required a certain amount of forethought. Papers, strangely enough, were far longer then, and required more research, with extensive footnotes and bibliographies. Typing them required more time, and anything more than punctuation revisions could not be made without retyping the entire page. Tables had to be carefully measured and hand-typed. Graphs were hand-drawn. What can be done in minutes today on a computer took hours and then some.

Today’s students are used to getting everything “instantly.” When I was a student, unless you were wealthy, telephone calls required either lots of quarters and a pay phone [now nearly obsolete] or a recipient who would accept the charges. That necessitated at least some forethought. Today, it’s just flip open the cellphone and call. There was exactly one fast food restaurant in the town where my alma mater is located, and it was a long walk from campus, and the college grill closed at 10:00 p.m. And late-night or Sunday shopping for paper or supplies… forget it.

Now… I’m not praising the “good old days.” I’m just saying that they were different, and that difference required a basic understanding that you couldn’t do everything at the last moment, because very little in society was “instant.” Even so, some students procrastinated… and flunked out. Today, they can procrastinate, and technology sort of allows them to throw something together… but it’s often a mess… and they end up stressed out.

No matter what anyone says, it just doesn’t occur to most of them to plan ahead. Why should it? Between watered-down high school curricula where last minute preparation usually suffices, especially for the brighter students, and a society that caters to instant gratification on all levels, very few of them have ever had to plan ahead in terms of dealing with day-to-day work and studies.

They’re intelligent; they’re incredibly quick at some things, like video and computer games and tasks and internet searches. What they aren’t good at is foreseeing the convergence of the mundane into a barrier that can’t be surmounted at the last minute. Nor are they all that good at seeing beyond the immediate visual superficiality and assessing how what they see may play out in the long run.

So… we have stressed-out students, many of whom will turn into adults who will end up even more stressed out when it turns out that neither technology nor the instant society has an instant solution for their lack of forethought… when they truly have run out of time.

The Commentator Culture

Last weekend, as with almost every weekend this fall, the college football pundits were proven wrong once more as Oklahoma upset Missouri and West Virginia lost. The commentators were wrong. All this got me to thinking about just that — commentators.

We have sports commentators, who are “experts” on everything from bowling, golf, and football to anything that appears on some form of television — and that’s anything that’s professional, in addition to the collegiate “money” sports. We have financial commentators. We have political commentators. We have news analysts and commentators. We have religious commentators. We even have F&SF reviewers and commentators.

Yet all too many of these commentators are really just dressed-up versions of Monday morning quarterbacks, with explanations of why things happened after they already did. Pardon me, but anyone with a certain amount of intelligence and knowledge about a field ought to be able to explain what did happen. But how many of them, particularly outside of sports, have that good an average in predicting what will happen?

Besides, what about the old idea of thinking for one’s self? Doesn’t anyone think out their own views — by themselves — any more?

While it’s always been obvious that a certain percentage of any population is unable to formulate coherent and logical opinions about much of anything, I have to wonder whether many are even trying these days. Oh, I’m certain that people retain that capability, but with instant polls on everything from whether anyone agrees with what Celebrity X is doing to who leads in what Presidential primary state or whether the results of the Hugo voting are superior to the results of the World Fantasy Awards or whether some other writers and books really deserved the “award,” we’re inundated with commentary and interpretation of news, polls, and events, so much so that it’s often hard to find a complete set of facts by which one might, just might, have the opportunity to make a judgment based on facts, rather than on commentary.

It almost seems that, in more and more fields, commentary is replacing facts and news about the events, as if readers and viewers could not be bothered with learning the facts and deciding by themselves. I know that I have to take and read more and more periodicals, often more obscure ones, just to find information. Even news stories in the local papers are filled with speculations and commentaries on why something happened, so much so that it’s difficult, if not sometimes impossible, to discover the facts.

I’m dating myself, but I really miss the attitude of Jack Webb on the old Dragnet, when he’d say, “Just the facts, sir, just the facts.”

That’s one reason why I’ve been so pleased with the unpredictability of the college football season. At least somewhere, real life is destroying the false image of the infallibility of “professional” commentators.

Writers: Is It Overused “Theme” or Truthful Observation?

Over the years, I’ve noticed that various readers and reviewers have remarked on the fact that I seemed obsessed with the “theme” of power, and sometimes the “theme” of gender and sexual politics. Other writers get identified with these or other “themes,” and usually, but not always, the noted identification carries the implication that the writer under discussion should get on with it and stop pounding at that theme.

But… is there a distinction between observation of human nature and a theme that underlies human behavior? Or is it just a matter of reader and reviewer opinion? Is it a repetitive and unnecessary theme when the reader or reviewer doesn’t want to accept the observations, but merely life-like when they do?

For better or worse, before I became a full-time writer, I spent almost thirty years in the worlds of the military, business, and government and politics, and in these worlds I received a thorough education in how power is used and abused in all fashions by human beings. As many others before me have noted, and as doubtless many others after me will note, very few people really understand and know how to use power effectively, and even fewer use it for what might be called the “greater good.” This is not a “theme.” It’s an observed fact, and if I include fictionalized versions and variations on what I’ve observed, as an author, I’m being true to human nature.

This issue applies to other aspects of writing science fiction and fantasy as well.

In the Spellsong Cycle, Anna continues to use the same tactics, often in battle after battle. So do various others of my characters in other books, and some readers have complained that this was “unrealistic,” that such tactics wouldn’t continue to work. In combat, effective tactics are based on the abilities of the combatants, the weapons at hand, the geography, and various other limited factors. The range of effective tactics is indeed limited, and tactics are used effectively over and over again. This is why military strategists study ancient and modern campaigns. In addition, weapons change their form, but their functions change slowly over time, and sometimes not at all over centuries. Today, the function of the vast majority of modern weapons is the same as two centuries ago — to apply various destructive and explosive devices to the most vulnerable aspects of the enemy. We’ve gone from musket balls to cluster-bombs and RPVs, but the function remains the same. Even in science fiction, this observation holds true.

Likewise, another human constant holds true — the slowness of human beings, especially in groups, to learn from experience. Even after WWI, the armies of most industrialized nations, including the U.S., still retained cavalry units — with horses — despite the clear knowledge that mounted cavalry was ineffective and counter-productive against such weapons as the machine gun. Castles took a long time to vanish after the development of artillery. Yet I can’t count the number of times I’ve had readers complain about — and even some reviewers comment on — why one side or the other doesn’t learn how to cope with something after one or two battles. Borrowing from another medium… Duhhh!

After a certain amount of experience, I learned that fights of all kinds consist of short and violent action, punctuated by far longer periods of comparatively little action. As a beginning Naval aviator, I was told that flying was “99% boredom and one percent sheer terror.” In a sense, it’s true. Most time in the air is spent getting to a place where intense action occurs or is undertaken before you return. Some missions are designed to have no action; you’re either gathering information or waiting on station in the event something might happen. Yet far too many books depict only the action and all action… and more action. To me, that’s incredibly unrealistic.

Yes, fiction has to offer entertainment, and no one wants to read, and I certainly don’t want to write, something as boring as a moment-by-moment adaptation of boring reality. By the same token, not taking into account the crux of human nature and human brilliance and stupidity — and at least some of the waiting in between — can only result in the written version of a high-speed video game.

I don’t write those, but they do get written, and that’s part of the marketplace. I don’t mind that, either, believe it or not, but what I do mind is when readers and reviewers with a “video-game” mindset criticize those authors who are trying to enlighten and educate, as well as entertain, because their books are more true to life. Some themes are true, both in life and fiction, and ignoring them is one reason why conflicts like Vietnam and Iraq, or the Middle East, or… [pick your favorite politico-military morass] have occurred and will continue to happen.

Overused theme or time-tested observation? In the end, it still depends on the reader’s viewpoint.

Living Forever — Fact, Faith, or F&SF?

The other day, my wife made an interesting observation. She asked, “If so many people believe in Heaven and an afterlife, and Heaven is so wonderful, why is everyone trying to live forever?” At that, I got to thinking about the associations and corollaries. According to the polls and statistics, the United States is the most “religious” nation in the world. And from what I read and can determine, we’re also the nation that spends by far the most money on medical research and procedures to keep older people young and to extend life-spans. We’re also the nation where talk of practical immortality and agelessness holds great sway, where the singularity will supposedly lead to agelessness, if not immortality. The entire issue of immortality has been one of the staples of both science fiction and fantasy from the beginning, with the immortal land of faerie or such works as Zelazny’s This Immortal.

Yet, if the true believers are right, what’s the point? Heaven is obviously a far better place than here on earth. If it weren’t, how could it be Heaven? So why are we spending billions to keep the most elderly barely alive, if that, when they could be in a better place… that is, if you’re a believing and practicing Christian or Muslim? And why have so many books and stories centered on immortality?

Now, I’m not disparaging medicine or medical research. People shouldn’t have to suffer horrible diseases or die of infections or be paralyzed for life or otherwise incapacitated when medicine can cure them or improve their life or condition. Yet the plain fact of medicine is that, in the United States, the vast majority of medical care and expense goes to those who are in their last year of life, and far, far less money in research and treatment goes to children and infants.

If those dying of old age are going to a better life anyway, wouldn’t it make much more sense to spend more of that medical funding on finding cures for children’s ailments… or providing better nutrition and preventative care for the young?

But then, do all those true believers really believe in Heaven and the afterlife? It’s often been said that actions speak louder than words and that people put their money in what they believe… and they read that which interests them. If that’s so, all the medical scrambling to extend lives and find immortality might suggest a certain, shall we say, shallowness of belief. Even hypocrisy, perhaps? Or, too, perhaps they do indeed believe in an afterlife, and subconsciously don’t want to face the theological nether regions reserved for those whose actions are less than charitable and worthy.

Either way, I find it food for thought. Exactly why does a society with so many true believers support medical age-extension and the quest for physical and earthly immortality anyway? And why is there now such an increase in books about immortal vampires and werewolves and the like? Are the two trends connected… and if they are… how do they square with the fact that the fastest growing religions are those which are best described as fundamentalist evangelical… with the attendant belief in an afterlife?

Genre Chaos

This past week saw yet another group of reviewers post their “best books of the year” listings, and there will be more yet to come. In issues of Locus yet to come, at least four or five respected gurus will list their choices. Why, if an author can’t get something somewhere into something labeled as best, he or she obviously hasn’t been trying hard enough.

But what constitutes “the best”? According to my outdated Funk & Wagnall’s, “best” means “of the highest quality.” This doesn’t help much, because also according to that same dictionary, “quality” is defined as “the degree of excellence,” and “excellence,” in turn, is defined as “superior quality.” When you get a definitional circular argument in meaning such as this, it’s a fairly good indication of subjectivity. “Excellence” or “quality” falls into that category that might be described as, “I can’t really quantify or objectively explain why this is good, but by [the appropriate deity] I know excellence when I see it.”

Compared to what? To other books just like it? To all fiction? To a selected body of work based on the subjective criteria of the reviewer?

As all too many readers of speculative fiction know, a number of writers of “mainstream” fiction, or thrillers, or romances, or mysteries have adopted SF themes in their work, the majority, sadly to say, often badly, if not totally ineptly. The critique has often been that, first, yes, they were writing science fiction and, second, they did it badly. The real critique should have been more direct — they wrote bad books.

This basic issue of quality has been obscured by the “dictates of the marketplace” and aided and abetted by the growth of the book chain superstores [yes, yet another great sin laid at the feet of the evil empires of book marketing]. In their zeal to sell as many books as possible as easily as possible, clearly in competition with the comparative mindlessness of broadcast/satellite/multimedia entertainment, the publishers and the book chains have broken fiction into genres and sub-genres, and sub-sub genres. We have whole sections of bookstores devoted to media-spin-off-teenage vampire series or Star Trek spin-offs, or… the list is long and getting longer.

Yet for all this splintering of fiction into genres or sub-genres — perhaps better identified as niche marketing opportunities that require less thought and consideration by would-be readers — what we’re seeing is a lower and lower percentage of the population remaining as serious readers. One explanation for this is that reading is “merely” entertainment, and with the proliferation of other venues of entertainment — video and online gaming, satellite and cable television, multiplex theatres, DVDs, year-round broadcast sports, etc. — the proportion of readers is bound to decline. I certainly can’t argue with that, because it’s exactly what’s appeared to have occurred.

When I was growing up, back in the dark ages when television meant three network channels and one independent local TV station, reading was effectively subdivided into non-fiction, fiction, and magazines and comics. Comics were for kids, and reading only magazines suggested a certain lack of intellectual perseverance, which may have been why there were book clubs where people really read the books and discussed them. And… more people of all ages used to read books.

All the multiplicity of fiction genres, and their accompanying awards, to my way of thinking, puts more and more emphasis on following genre or sub-genre rules than on writing a good, intriguing, entertaining, well-written, and logically internally consistent work. Yet, every time I look around, it seems as though there’s another set of awards, based on another media offshoot, or another genre or sub-genre. As all these awards have proliferated, and as the book marketing empires segregate books, often quite artificially, a smaller and smaller percentage of the general population reads. Amid all the genre-chaos and confusion, a few handfuls of authors succeed in establishing themselves as “brands,” which is one of the few ways in which a writer can transcend the limitations of genre-niche-marketing taken to extremes. Others work on media or gaming tie-ins. The rest… well, the numbers are showing there are more and more writers being published, and most of them are selling fewer copies of individual titles than their predecessors of a generation earlier. Yet the multiplicity of awards continues to proliferate.

But, no matter, if I get an award for the best novel dealing with alternate history of an established fantasy series universe [if it’s my own universe, anyway], based on the greatest logical constructions of fantasy improbabilities, I’ll take it… graciously and gratefully, and my publicist will probably find a way to get it on the cover of my books after that.

The Wrong/Incomplete Data

Several years ago, an acquaintance made a comment that almost caused me to take his head off. He said, “Your wife has a really cushy job. She doesn’t even leave for work until 9:30 every morning.” I refrained from homicide and tried to explain that, first, because she is both a college professor and an opera director, as well as a performer, she seldom got home before nine or ten o’clock at night, and usually it was later, far later; that she worked four out of five weekends at the university; and that overtime compensation was non-existent. He replied by pointing out that she only had to work nine months out of the year. I just shook my head and walked away, because that wasn’t true, either. Generally, she only gets paid for nine months, but she works between eleven and twelve months a year — admittedly “only” about forty hours a week in the summer to catch up on what won’t fit in the year, to research and often write the shows for the coming year, to conduct job searches, and to write the required scholarly articles. And for all that, with all of her graduate work and international expertise, and as a tenured full professor, she makes far less money than almost any of our offspring — only one of whom has more degrees.

I’m not writing this to say how down-trodden professors are — I do know some who truly skate by, although they’re a tiny minority, and that could be yet another example — but to offer the first instance of what might be called “data abuse.”

The second example is that of the Mars probe that crashed several years ago, because its systems clashed. One system had been programmed for “English” measurements, the other for metric. A third example is NASA itself, and the fact that manned space exploration has actually declined in scope and in accomplishments ever since the Apollo missions of more than 30 years ago.

A fourth example is the issue of school voucher programs, a proposal that was just defeated in Utah. Proponents argued that providing vouchers for roughly $3,000 a year per student for those who wished to go to private schools would actually allow more money for those students who remained. Mathematically, this would have been true, but the most salient points were minimized and never addressed in all the sound-bite coverage. First, even if every student received the maximum voucher amount, on average families would have to come up with an additional $4,000 per student. Exactly how many families making less than the Census Bureau’s “middle-class” income of $42,000 are going to be able to come up with an additional $8,000 in after-tax income [assuming two children in school]? Currently, only about 15% of all private school students receive financial aid, and that means that schools cannot afford to grant significant additional aid, not without raising tuition. Second, a great many communities in the state have no private schools at all. Third, the program did not provide additional funding to pay for the voucher program, but would have diverted it from existing [and inadequate] public school funds. So, in effect, the voucher program would not have benefited low-income students, or most middle-class students, but, for the most part, would have subsidized the tuition of those who could already afford such schools. Certainly, the program would have done little for the public school system, even though the supporters claimed that it would have.
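For those who like to see the numbers laid out, here is a minimal sketch of that arithmetic. The implied average tuition is my own inference from the $3,000 voucher and the $4,000 shortfall described above, not a figure from the campaign itself:

```python
# Rough sketch of the voucher arithmetic above. The implied tuition is
# inferred from the text [$3,000 voucher + $4,000 family shortfall];
# it is not an independently sourced figure.

VOUCHER = 3_000            # maximum voucher per student, per year
FAMILY_SHORTFALL = 4_000   # additional after-tax cost per student
implied_tuition = VOUCHER + FAMILY_SHORTFALL   # $7,000

children = 2
family_cost = FAMILY_SHORTFALL * children      # $8,000 in after-tax income

MIDDLE_CLASS_INCOME = 42_000   # Census Bureau figure cited above
share = family_cost / MIDDLE_CLASS_INCOME      # ~19% of pre-tax income

print(f"Implied private tuition: ${implied_tuition:,}")
print(f"Cost for two children: ${family_cost:,} "
      f"(~{share:.0%} of a ${MIDDLE_CLASS_INCOME:,} pre-tax income)")
```

Nearly a fifth of pre-tax income, and more of after-tax income — which is the point the sound-bite coverage never addressed.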

Another example is the “core” inflation version of the Consumer Price Index, which is supposed to measure the rate of price inflation, and is the index used by government to measure how inflation affects consumers. Several years ago, however, the changes in the prices of food and energy were removed because they were too “volatile.” Yet 67% of all petroleum products go to transportation, and the majority of that goes into the tanks of American cars. So, although we have seen a price increase of almost 60%, as measured by the cost of a barrel of oil, over the past year or so, that increase doesn’t appear as part of core inflation measurements. Thirty-three percent of all the petroleum we use goes into making industrial products, such as rubber, plastics, and chemicals. But those costs are reduced by “hedonics,” or implied quality improvements. If your new car has better disc brakes or cruise control, or automatic stability control, the CPI auto component for durable goods is adjusted downward to reflect the quality improvement. The only problem is that the price paid by the consumer doesn’t go down, but up, yet the statistics show a decline in the durable goods index.
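To make the hedonics point concrete, here is a toy illustration. The car prices and the seven percent quality valuation are invented for the example; they are not actual BLS figures or methodology:

```python
# Toy illustration of hedonic quality adjustment. All numbers are
# invented for the example; this is not actual BLS data or methodology.

last_year_price = 20_000.0   # sticker price of last year's model
this_year_price = 21_000.0   # sticker price of this year's model

# Suppose the statisticians value the improved brakes, cruise control,
# and stability control at 7% of the car's value.
quality_adjustment = 0.07

# The index treats the buyer as getting "more car" for the money,
# so the effective [quality-adjusted] price is deflated.
adjusted_price = this_year_price / (1 + quality_adjustment)

index_change = adjusted_price / last_year_price - 1   # what the CPI sees
buyer_change = this_year_price / last_year_price - 1  # what the buyer pays

print(f"Index sees:  {index_change:+.1%}")   # about -1.9%
print(f"Buyer pays:  {buyer_change:+.1%}")   # +5.0%
```

The index registers a price decline of almost two percent, while the check the buyer writes is five percent larger.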

These are all examples of what I’d loosely term “using the wrong data.” At times, as in the case of the Mars probe, such usage can be truly accidental. At other times, as in the case of my acquaintance, such incorrect data usage is because the user fits a prejudice into existing data and doesn’t really want to seek out conflicting and more accurate data.

In other cases, as exemplified by the NASA budget, other data, chosen to exploit other political priorities, take precedence. And, as illustrated by the voucher issue or the CPI measurements, all too often those with a political agenda have no real interest in using or examining the full and more accurate range of data.

What is often overlooked in all of these cases, however, is that in none of them did those involved use “incorrect” data. The figures used were accurate, if often selective. Yet in political and policy debates, in inter-office, intra-office, or departmental budget and resource-allocation tussles, even in conversation, what people focus on all too often is whether the numbers are accurate, rather than whether they’re the numbers that they should be considering at all. Seeking accuracy in irrelevant data isn’t exactly a virtue.

It’s not just whether the data is accurate, but whether it’s the right data at all.

More on the Hugos

Several people have contacted me about my proposal for a Hugo for Betty Ballantine, and one pointed out that the World Science Fiction Society Constitution limits what Hugos can be given, and further stated that the special award given to Betty in 2006 was probably the only practical kind of recognition possible.

After reviewing the WSFS Constitution, I will note that section 3.3.15 states:

Additional Category. Not more than one special category may be created by the current Worldcon Committee with nomination and voting to be the same as for the permanent categories. The Worldcon Committee is not required to create any such category; such action by a Worldcon Committee should be under exceptional circumstances only; and the special category created by one Worldcon Committee shall not be binding on following Committees. Awards created under this paragraph shall be considered to be Hugo Awards.

I note the last sentence: Awards created under this paragraph shall be considered to be Hugo Awards.

Now… was Betty Ballantine’s special award in 2006 a Hugo under the rules? I honestly do not know, but, given the comments I’ve received, it doesn’t appear to be, and more than a few life-time professionals in the field have declared that what Betty received is not a Hugo.

I still believe that Betty deserves a Hugo, but in studying the WSFS Constitution, I discovered what I believe to be a serious fault, and the fact that Betty has not received a Hugo is just one example of that fault.

The fault is simple, but basic and obvious. There is no single standing and permanent award for achievement over a body of work, whether in writing, editing, art, or publishing. Every single award is for work appearing in the previous year. Now, for authors who have a substantial body of work and who have not received a Hugo, there is a chance that at some point a “late-in-career” book will receive a nomination and a Hugo, one that it probably does not merit, in order for the voters to recognize, if belatedly, someone who has been overlooked in the annual popularity contest. The same is true of artists and, under the revisions involving editors, of editors as well.

Wouldn’t it be far better simply to create an on-going Hugo for life-time achievement, the way the World Fantasy Convention has [horror of horrors], than to keep ignoring those whose contributions may have been less spectacular in any given year, but whose overall achievements dwarf those of many one-time Hugo award winners?

If the WSFS does not wish to address this, then perhaps the Constitution should be amended to read — “The Hugo awards reflect only popularity among a limited number of readers in the previous year and do not attempt to reflect continued and sustained excellence by members of the speculative fiction community.”

Is this an issue that members of the WSFS wish to address, one way or another, or is everyone happy with the continuation of the annual popularity polls and the ignoring of long-standing contributions to the field?

A Hugo for a True F&SF Pioneer

When I was at the World Fantasy Convention earlier this month, I had the privilege of having breakfast with Betty Ballantine, whom I had never met before. Even at 88, she’s sprightly, with a cheerful and feisty wit, and after that breakfast, I realized that only a comparative handful of people truly know or understand the contribution that Betty, along with her late husband Ian, made to western literature and publishing, and particularly to science fiction and fantasy.

Betty and Ian began importing mass-market paperbacks from the United Kingdom in 1939 before helping to form Bantam Books and then launching their own firm, Ballantine Books. Prior to their efforts, there were virtually no paperback books in the United States beyond those the Ballantines themselves imported. Ballantine Books became one of the earliest publishers of original science fiction books, publishing such authors as Arthur C. Clarke, Anne McCaffrey, and H.P. Lovecraft. They even published the first “authorized” edition of J.R.R. Tolkien’s works. By their efforts, they effectively lifted science fiction and fantasy from the pulp magazines to paperback books and created a commercially viable genre that in turn laid the groundwork for the media take-offs of such television shows as Star Trek and movies such as Star Wars, not to mention such later bestsellers as The Wheel of Time and Harry Potter.

Of course, one of the reasons why Betty was at the convention was that she had been selected to be the recipient of a Lifetime Achievement award from the World Fantasy Convention. But, as noted by many, it did seem rather strange, in retrospect, that this woman, who has done so much for both science fiction and fantasy, has never been honored with a Hugo — the most recognized popular award in speculative fiction.

While I understand that L.A. Con IV did offer a “special committee” award to Betty Ballantine in 2006, a special committee award is almost a slap in the face for someone to whom every speculative fiction author and reader owes so much.

All too often, those who pioneered and made something possible are forgotten in the glare of the successes of other people, successes that the pioneers made possible. That’s particularly true today, when fame is more fleeting than ever and celebrity so often overshadows true achievement. Sometimes, after they’re dead, such visionaries and pioneers are remembered and memorialized, but while that’s great for posterity, it really doesn’t show much appreciation for the real living person, and Betty certainly deserves that appreciation.

So… what about a Hugo for Betty Ballantine in Denver next year? A real Hugo, voted on by all those whose reading was made possible and affordable by Betty, and by those whose writing, cinematic, and video achievements might never have come to be without her efforts?

And… for the record, and the skeptics, I’ve never been published by any imprint even vaguely related to those created by Betty… and I strongly doubt I ever will be. I just happen to think it’s a good idea.

The Under-Recognized Passion… and Its Future

Most of us, when someone mentions passion, think of sex, at least at first. But an article in New Scientist got me thinking about another passion that is far stronger and far less recognized than sex — greed.

In January 1820, a transplanted German who had taken the British name of Frederick Accum published a book, Treatise on Adulterations of Food, and Culinary Poisons. The book provided an exposé of how those in London’s food trade adulterated their wares and poisoned their consumers. Accum named names and spared no one, illustrating how bakers used gypsum and pipe clay in bread, how lemonade was flavored with sulfuric acid, how new wines were aged with sawdust, and how phony green tea was created with poisonous copper carbonate.

And what was the reaction to Accum’s book? It sold out, and then there were anonymous threats against him. Those who didn’t like what he wrote followed him around until he was observed ripping several pages containing formulae from a book in the Royal Institution library. He was immediately charged with theft, and his reputation was attacked and destroyed, all for the sake of profit, however obtained. Although the charges were dismissed, Accum was forced to return to Germany. Not until thirty years later did the British medical journal The Lancet and Dr. Arthur Hill Hassall address the problem, and Parliament finally passed the Food Adulteration Act in 1860. It took far longer in the United States, until after the muckraking of the early 1900s.

You think that’s all in the past? Flash forward to today.

We have had the experience of cheap pet food from China being contaminated, and almost every week some food manufacturer is recalling something. And the problem goes well beyond food.

Enron built a phony trading room in order to further its energy shell game, and then left all the shareholders and employees holding the bag. Similar shenanigans occurred with WorldCom and Global Crossing. And what about all the sleazy mortgage brokers who sold naive homeowners mortgages they wouldn’t be able to afford once the “teaser” rates vanished? Or the payday lenders who charge effective interest rates of 100% and more?
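
For anyone who doubts that last figure, here is a back-of-the-envelope sketch. The loan size and fee are invented for illustration, not drawn from any actual lender’s terms, but they show how a modest-looking flat fee annualizes:

```python
# Hypothetical payday-loan arithmetic; all terms are invented.

principal = 300.0      # amount borrowed
fee = 45.0             # flat fee for a two-week loan ($15 per $100)
term_days = 14

periods_per_year = 365 / term_days    # roughly 26 two-week periods
period_rate = fee / principal         # 15% per two weeks

# Simple annualized rate (the figure usually quoted as APR):
apr_simple = period_rate * periods_per_year

# Compounded effective annual rate, if the loan were rolled over
# period after period:
ear_compound = (1 + period_rate) ** periods_per_year - 1

print(f"Simple annualized rate: {apr_simple:.0%}")         # about 391%
print(f"Compounded effective annual rate: {ear_compound:.0%}")  # far higher still
```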

Even in “legitimate” commerce, greed has its place, from the hedge fund traders who make hundreds of millions of dollars for shifting paper… a number of whom just lost hundreds of millions, if not billions of dollars… to the airline industry.

As just one example, airlines have scheduled 61 flights to depart from New York’s JFK International Airport between 8:00 A.M. and 9:00 A.M. every morning. There’s the small problem that existing systems and technology only allow for 44 departures in that hour. The Federal Aviation Administration has suggested either (1) charging airlines more for “prime” take-off slots or (2) limiting the number of flights per hour. The Air Transport Association, representing the major carriers, finds both of these options unacceptable and states that the FAA needs to adopt new GPS-based and high-tech radar control systems. The FAA probably will have to do this sooner or later, but there’s a small problem. It’s called funding. The airlines don’t want to pay for improving a system that’s already highly subsidized by the taxpayers; neither does Congress; and neither do passengers.
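
The arithmetic of that overscheduling compounds through the morning. Here is a minimal sketch using the two figures above; the demand I assume for the later hours is invented purely for illustration:

```python
# Back-of-the-envelope queueing arithmetic. The 8 A.M. figure (61
# departures scheduled against a 44-per-hour capacity) comes from
# the post; later-hour demand is an assumed example.

capacity_per_hour = 44
scheduled = [61, 50, 48, 46]   # 8, 9, 10, 11 A.M. hours

backlog = 0
for hour, demand in enumerate(scheduled, start=8):
    # Flights that can't depart this hour spill into the next one.
    backlog = max(0, backlog + demand - capacity_per_hour)
    print(f"{hour}:00 hour: {demand} scheduled, "
          f"{backlog} departures still queued at the hour's end")
```

Even with demand easing after the first hour, the queue never clears; each hour’s excess rides on top of the last, which is why a morning overbooked by seventeen flights turns into delays all day.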

What else is greed besides not wanting to supply honest goods — in this case, on-time departures — for a reasonable price? Instead of trying to solve the problem, the airlines and the politicians will ensure we’ll get more delays because everyone wants a service more cheaply than it can be provided… and that’s also a form of greed.

Oh… and by the way, in 1820, the last section of Accum’s Treatise concluded by recommending that “the painting of toys with colouring substances that are poisonous, therefore, ought to be abolished.”

So why are we still seeing children poisoned by lead paint, almost 200 years later? And why will this still be a problem fifty or a hundred years or more into the future?

Tell me again why greed isn’t stronger than sex. Except… sex sells more books, and I keep trying to ignore that, because sex is transitory, and greed isn’t.

The "Singularity" or "Spike" That Won’t Be

Over the past decade, if not longer, more than a few futurists have predicted that a decade or so from now, modern technology will change human society on a scale never before seen or imagined, from implementing the linked society envisioned in Gibson’s Neuromancer to wide-scale nanotech and practical AIs.

It won’t happen. Not even close. Why not? First, because such visions are based on technology, not on humanity. Second, because they’re based on western European/North American cultural chauvinism.

One of the simplest rules of implementing technology is that the speed and breadth of implementation are inversely proportional to the cost and capital required. That’s why we don’t have personal helicopters, technically feasible as they are. It’s also why, like it or not, there’s no supersonic follow-on to the Concorde. And it’s why iPods and cellphones are ubiquitous, and why there are many places in the third world where cellphones are usable but landlines are limited or non-existent.

A second rule is that while new technology may well be more energy efficient than older technology, its greater capabilities result in greater overall energy usage, and energy is getting ever more expensive. A related human problem is that all the “new” technology tends to shift time and effort from existing corporate and governmental structures back onto the individual, sometimes onto higher-paid professionals. For example, the computer has largely replaced secretaries and typists, which means that executives and attorneys spend more time on clerical work. Interestingly enough, both the hours worked/billed and the rates of pay for junior attorneys are way up. Another example is how financial institutions at all levels are pushing their customers to “go paperless.” I don’t know about everyone else, but I need hard copies of a number of those documents. So if I “go paperless,” all it means is that I spend my own time, energy, and paper to print them out.

In short, technology is expensive, and someone has to pay for it, and it’s doubtful that we as a world have the resources to pay for all that would be required to create the world of the spike or singularity.

Another factor involved in tying all one’s bills and payments to automated systems is that one loses control — as my wife and I discovered in trying to unscramble all the automated payments her father had set up. After his death, in some cases, it was impossible to even discover where the payments were going. A number of companies kept charging for services he obviously didn’t need and siphoning money from his bank account, despite the fact that he was dead. It took the threat of legal action and the actual closure of some accounts to get the banks to stop honoring such automatic withdrawals.

Technology has also enabled a greater range of theft and misrepresentation than was ever possible before the internet and computers.

The other factor is cultural. The idea of a spike or a singularity assumes that everyone on the planet wants to be plugged in, all the time, and on call continuously, while working harder and harder for the same real wages in employment positions that seem increasingly divorced from what one might call the real physical world. While those in the upper echelons of the professions and management may find this useful, even necessary, exactly how are the vast numbers of service workers employed at Wal-Mart, McDonald’s, Home Depot, and the like going to afford such services when they’re far more worried about basic health care?

Am I saying the world won’t change? Heavens, no. It will change. More people will in fact have cellphones, and, like it or not, it’s possible that they’ll replace location-fixed telephones for the majority of the population. Portable devices such as the iPhone will change entertainment, and fewer books will be printed and read, and more of what will be read, either in print or on screen, will be “genre” fiction, how-to, or religion. Published poetry and “mainstream literature” will decline further. More and more “minor” lawbreaking will be detected by technology in industrialized societies. “Major” lawbreaking may even be treated and handled by some form of cranial implant and locator devices. Various forms of environmentally less damaging power generation will doubtless be adopted.

But for even a significant minority of the world’s population, or even that of the USA, to engage in a “post-singularity” world will require that more and more other people take care of support services, such as real-world, real-time child care, medical services, and the physical production, transportation, and distribution of food. And don’t tell me that we’ll have duplicators for food. That’s most unlikely, because making such devices nutritionally practical would require analytical and formulation technology that we won’t have, not to mention a large “stockpile” of the proper sub-ingredients and, of course, a great deal more energy at a time when energy is becoming ever more expensive.

That doesn’t even take into account the cost and technological requirements for medical services and maintenance… and that’s a whole other story.

Economics and the Future of Biotech

Recently, I exchanged several emails with a newer writer, David Boultbee, on the subject of plants genetically engineered to remove toxins from land and water, and the exchange got me to thinking. A number of years ago, when I was a full-time environmental regulatory consultant, several cities were experimenting with ways in which growing plants could be used to filter and purify sewage and waste water, including removing heavy metals and various types of organic and bacterial contamination.

That was twenty years ago, and there’s been surprisingly little progress in this area, particularly given the need. That brings up the question of why such progress is so slow… and the answer, I believe, is quite simple. It’s not a question of biology or even of development costs, but of the structure of our economic system.

Growing plants in large concentrations effectively constitutes agriculture. These days, agriculture is largely unprofitable on anything but a large scale, and the greatest amount of profit doesn’t usually lie in producing and selling the raw material, but in the distribution and end-point sales. That’s why orange growers, almond growers, and others form grower cooperatives that attempt to control the product all the way from production to final [or next-to-final] sales.

Now… even if a genius biologist does produce an oilseed plant with a huge amount of refinable oil, where does the profit lie? With the refiner and distributor, who must earn profits competitive with other industries in order to attract the capital necessary to build an enormous infrastructure. And in what industries do the highest profits lie? In those that produce small goods with low production costs, high demand, and an existing market.

Agricultural products seldom fit that market. Take wheat. It’s practically ubiquitous, world-wide, and while different varieties have been developed for different uses and climates, within those climates any competent farmer can grow it. The entire U.S. farm subsidy program was developed because too much of too many agricultural products was being grown, driving prices so low that too many farmers went bankrupt, to the point that, as noted above, only large farms — or specialty farms — remain profitable.

So… what happens if the biologists develop miracle plants? Before long, the entire world has them, and they cost less, and the profit margin is low — and they’ve either replaced products that had a higher profit margin or displaced pollution control technology that does. And whole industries lose substantial profits. You can see why certain industries just might not be exactly supportive of really effective, large-scale, and widespread biotech. Biotech is just fine for making new high-margin pharmaceuticals, but fungible energy supplies or pollution control remedies, those are a different matter.

This isn’t a new story in human history. Way back when, sometime before, say, 200 B.C., there was a plant that grew in the Middle East, well-documented in more than a few writings, paintings, and even sculptures. Taken in some oral form, it was apparently a reliable contraceptive. It became extinct before the Christian era. Why? Because, although it filled a social need, a desperate one for women in poor societies who felt they could not afford more children, no one could see a profit in growing or preserving it. Now, whether this plant was as effective as the various writings claim isn’t really the point. The point is that people thought it was, and yet there was no profit in cultivating it, and thus it was harvested and used until there were none left.

So… I have grave doubts that we’ll see many biological solutions to our energy and environmental problems until someone can figure out a way to make mega-profits out of any new biological developments.

Sometimes We Get it Right

Although we science fiction writers like to claim that we predict or foreshadow the future in our work, historically our record isn’t really as great as we’d like to think, for a number of reasons, some of which I’ve discussed in previous blogs.

Arthur C. Clarke predicted communications satellites and the like very early and effectively, something like 60 years ago, but he also predicted we’d have cities on the moon and be able to travel to Jupiter by 2001. That was six years ago, and the way things are going, it may be sixty before any of that occurs — if it does at all. In The Forever War, Joe Haldeman predicted that we’d have interstellar travel by now. Isaac Asimov did all right in anticipating the hand-held computer/calculator [as he said, he even got the colors of the display for the first calculators right], but we’re nowhere close to his pocket-size fusion generators, intelligent humanoid robots, or even affordable automatic irising doors. Most of my incorrect speculations lie in my early short stories, and I’m content to let them remain there in obscurity. I tend not to have made as many incorrect speculations in recent years, not because I’m necessarily brighter than other writers, but because all of my SF novels are set far enough in the future that enough time has not yet passed to reveal where I may have been wrong. Writing the near future is indeed a humbling experience, and I prefer not to be humbled in that fashion.

For one reason or another, many of the past staples of science fiction have never come to be. We don’t have wide-scale use of personal hovercraft or helicopters, and likely never will. Despite quantum mechanics and entangled electrons, it’s doubtful that we’ll ever have instant doors or transporters to other locales, even on earth. And for all the speculations about genetic engineering [or natural mutations] that will bring agelessness or immortality to us, research to date seems to suggest that while life spans can be extended and physical health as we age greatly improved, there are several biological stone walls to attaining great age, let alone immortality, one of which is that greater cellular regenerative capacity appears to be linked to greater carcinogenic propensity. As for a cloned copy of you — or me — that’s not going to happen anytime soon, either, if ever, because recent research appears to indicate that even identical twins aren’t truly identical, due to prenatal conditions, genetic “expression,” and other factors.

Against this backdrop, I am pleased to announce that astronomers have just discovered a billion-light-year-long void in the universe, a space absolutely devoid of normal matter, without stars or galaxies. A full report will appear in a future edition of the Astrophysical Journal. For those of you who have read The Eternity Artifact, you will understand my pleasure at having one of my speculations proved right. At this point, however, since the locale is more than 6 billion light years away, there is no way to ascertain whether the reason for this void is as I postulated in the book. But… I did put it in print almost three years before the void was discovered.

“Coincidence” or not, sheer undeserved good fortune or not, I’ll take consolation in having at least one of my far-fetched speculative postulates confirmed.