Archive for the ‘General’ Category

Feminism, Social Change, and Speculative Fiction

The other day I received an interesting response to my blog about the impact of social change in science fiction on readership. The respondent made the point that she felt, contrary to my statements, that fantasy had more social change depicted in it because at least there were more strong female characters in fantasy. Depending on which authors one reads, this is a debatable point, but it raises a more fundamental question. Exactly what are social change — and feminism — all about, both in genre literature and society?

The other day there was an interesting article in the Wall Street Journal, which reported on a study of the performance of mutual fund management teams. The study concluded that the results from funds managed by all-male teams and those managed by all-female teams were essentially the same. The funds managed by mixed-gender teams, however, reported significantly less profitable returns. The tentative rationale offered for these results was that mixed-gender teams suffered “communications difficulties.” Based on my years as a consultant and additional years as an observer of a large number of organizations, I doubt that “communications” are exactly the problem. In mixed-gender organizations, where both sexes have some degree of power and responsibility, I have noted that, almost inevitably, men tend to disregard women and their advice/recommendations to the degree possible. If their superior is a woman, a significant number tend to try to end-run or sabotage the female boss. If the superior is male, the organization is handicapped because women professionals’ suggestions tend to get short shrift, and half the good ideas are missing, either because they’re ignored or because women tend to stop offering them after a while. Maybe one could call that “communications difficulties,” but, as a male, I’d tend to call it male ego and insecurity.

What does this have to do with feminism in speculative fiction? A great deal, it seems to me, because merely changing who’s in control doesn’t necessarily change the dynamics below the top. This is one of the issues I tried to highlight in my own Spellsong Cycle, as well as in some of my science fiction. In “Houston, Houston, Do You Read?” the solution proposed by James Tiptree, Jr. [Alice Sheldon] was to eliminate the conflict by eliminating males. As a male, I do have a few problems with that particular approach.

In Sheri S. Tepper’s The Gate to Women’s Country, the males get to choose to be “servitors” to women or warriors limited to killing each other off, while the “violence” gene [if not expressed in quite those terms] is bred out of the male side of the population.

Ursula K. Le Guin addressed the dynamics of gender and societal structure in The Left Hand of Darkness, suggesting, it seems to me, that a hermaphroditic society would tend to be just as ruthless as a gender-polarized one, if far more indirect, and not so bloodthirsty in terms of massive warfare.

In the end, though, the question remains. In either fiction or life, is feminism, or societal change, about a restructuring of the framework of society… or just about which sex gets to be in charge?

Notes to Would-Be Reviewers

Heaven — or something — save us writers from the amateur reviewers, and some professionals, who pan a book with phrases similar to “trite plot” or “worn-out character type” or “overused plot device,” “all too typical young hero,” “standard PI,” etc., ad infinitum.

Far be it from me to be the one to say that all books all writers write are good. They aren’t. Nor will every book I write appeal to those who read my work. It won’t, and probably shouldn’t. But… those of you who are reviewers or who aspire to be reviewers, please, please, don’t display your ignorance by basing your judgments on “worn-out” character types or “overused plots.”

As Robert A. Heinlein noted in his “Channel Markers” speech to the U.S. Naval Academy more than 35 years ago, there are NO new plots. There are only a limited number of basic plots. As a result, there are no overused or trite plots. There are writers who handle plots badly, for a myriad of reasons, just as there are writers who handle them well. There are writers whose characters do not fit the plots, but the problems don’t lie in the “plot.” They lie in how the plot was or was not handled.

Almost every plot Shakespeare used in his plays was cribbed from somewhere else or someone else, but his work remains “fresh” and “original” after more than four centuries because of the way in which he handled those very common plot elements.

The same type of analysis applies to characters. Certain archetypes or types appear and reappear in novels, not because they’re tired or the authors are lazy, but because they’re necessary. If one writes a courtroom drama, there will be good attorneys and bad attorneys and brilliant attorneys. There may even be marginally competent attorneys and evil ones, but there won’t be moronic ones, because morons can’t pass the bar. Mercenaries will almost always be ex-military types, because that’s where one gets that kind of experience. Private investigators will almost always be ex-police or ex-military, or possibly disbarred attorneys, for the same reasons. In fantasy, knights should almost always be wealthy, older retainers of the wealthy who have worked their way up from common armsmen, or professional military, because in any half-realistic society, those are the only ways to gain the resources and experience. Pilots need to have a high degree of training and education and good reactions — and good judgment, because they’re in charge of rather expensive equipment and lives.

All too often both critics and social reformers tend to forget that stereotypes arise for a reason. They’re real. There are “good cops” and “bad cops.” And whether one likes it or not, if you see a large minority male in gang-like attire emerging from an alley and heading in your direction at night, discretion is indeed the better part of valor, stereotype or no stereotype. The same is true of the sharp-dressing WASP male who wants to sell you a large bridge for the smallest of sums. Obviously, stereotypes and archetypes can be and are overused, but slavish avoidance of such is as much a contrivance as overuse.

Likewise, try not to criticize a writer because he or she writes a particular kind of book. I don’t see reviewers trashing mystery writers, or “literary” writers, or romance writers because they write the same type of book time after time. One can note that the writer continues to write a particular type of book — but if you say that, make sure that’s all that writer writes. You can certainly point out that the writer didn’t handle it as well as in the past — or that the writer improved, but don’t trash it because you wanted the writer to write something different.

So… if you want to review… go ahead. Just try to do it with a touch of professionalism and understanding.

F&SF Short Fiction

Recently, Stephen King wrote an essay that appeared in The New York Times suggesting, at least as I read it, that one of the reasons for the decline of short fiction is that all too many short works of fiction are written for the editors and the critics, and not necessarily for the readers. Among the host of those who have commented, Scott Edelman, the editor of SciFi Weekly, has just written a response pointing out that, while it wasn’t King’s intention, King has effectively told readers that there are so few good short fiction stories that all the good ones are in his anthology and that readers really don’t have to look any farther.

Both King and Edelman are correct in noting that the short fiction market is “broken.” After all, eighty years ago, F. Scott Fitzgerald was paid so much for the stories he sold to popular magazines that just two story sales in a year earned him more than the average annual earnings of either doctors or U.S. Congressmen — and he sold far more than two stories a year. Even then it took money to live in Paris.

There are some gifted short fiction writers in F&SF, and so far as I know, not a one of them can make a living purely off short fiction. By some counts, more than a thousand short speculative fiction stories are published annually. This sounds impressive, unless you know that around a thousand original speculative fiction novels are also published every year, and novels pay quite a bit more. The sales of the major F&SF print magazines have been declining for years, and until the advent of Jim Baen’s Universe last year, the rates paid for short fiction were low and essentially static.

It’s also a well-known, if seldom-stated, fact that the majority of F&SF magazines are edited as much to promulgate and further the editorial preferences of the editors as to appeal to the full range of potential readers.

Jim Baen was well aware of these facts, and so is Eric Flint. That, as I understand it, was the reason they created Jim Baen’s Universe, the online magazine. In fact, Eric once told me that his goal was not to publish stories designed to win awards, but to publish outstanding stories that would entertain and challenge readers, and that he felt too many editors had lost sight of that goal. So far as I’ve been able to determine, Universe has a higher rate scale for writers than any of its F&SF competitors, and Eric and Mike Resnick are obviously working hard to create a magazine that will boost the F&SF short fiction market and increase reader interest.

Yet, interestingly enough, neither King nor Edelman ever mentioned Universe, and how it came to be, and Edelman certainly ought to have been aware of it. Why didn’t he mention it? I don’t know, but I do know that it’s a part of the debate/issue that shouldn’t be ignored.

Science Fiction… Why Doesn’t Society Catch Up?

As I noted in passing in my earlier blog, various “authorities” have suggested for close to twenty years that one of the reasons science fiction readership has dropped off — and it has, at least in relative terms as a percentage of the population, and possibly in absolute terms as well — is that all the themes that were once the staple of science fiction are now scientifically possible and have often been realized. We have put astronauts in orbit and sent them to the moon, and the reality is far less glamorous than the “Golden Age” SF writers made it seem. We have miniaturized computers of the kind that only Isaac Asimov forecast in work published around 1940. We have lasers — and so far they don’t work nearly so well as the particle beams in Clarke’s Earthlight or the lasers in 2001. We’ve created a supersonic passenger aircraft and mothballed it.

These reasons all sound very plausible, but I’m not so certain that they’re why SF readership has dropped off and why fantasy readership has soared. Earlier, I also explored this in terms of the “magic society,” but my personal feeling is that there is also another reason, one that has to do with people — both readers and the people and societies depicted in much current SF… and that includes mine, by the way.

Socially, human beings are incredibly conservative. We just don’t like to change our societies and domestic arrangements. Revolutions do occur, but just how many of them really end up radically changing society? When MacArthur “restructured” Japanese society after WWII, the economic and political bases were changed dramatically, but the domestic and social roles remained virtually unchanged for another forty years. It really wasn’t until the 1990s that significant numbers of young Japanese women decided they didn’t want to follow the roles laid out by their mothers. And corrupt as the Shah of Iran may have been, one of the largest factors leading to his overthrow was that he was pressing to change social and religious structures at a rate faster than his people could accept.

While at least some of us in the United States like to think that we’re modern and progressive, has anyone noticed that “traditional” marriage is making a come-back? It’s making so much of a come-back that gays and lesbians want the benefits and legal structure as well. Despite the growth of the number of women in the workplace, women still do the majority of domestic chores, even when they’re working the same hours as their husbands, and the vast majority of CEOs and politicians are still male.

Now… what does this have to do with SF readership?

I’d submit that there’s a conflict between what’s likely technically and what’s likely socially, and social change will be far slower than predicted. In fact, that’s already occurred.

When my book Flash was published several years ago, one of the reviewers found it implausible that private schools would still exist some 200 years in the future in North America. I’d already thought about this, but the fact is that the traditional school structure goes back over 2,000 years. The structure works, if it’s properly employed, as many, many private schools and some charter schools can prove, and with 20 centuries of tradition, it’s not likely to vanish soon.

Yet more than a few books suggest the wide-spread growth of computerized learning, radical new forms of social engagement, and the like. Much of this will never happen. Look at such internet “innovations” as E-Harmony, Chemistry.com, etc. They aren’t changing the social dynamics, but using technology to reinforce them. Women still trade primarily on sex appeal and men on looks, power, and position. They just start the process electronically.

Most readers don’t really want change; they only want the illusion of change. They want the old tropes in new clothes or new technology, but most of them want old-style men in new garb, and brilliant women who are sexy, but still defer to men who sweep them off their feet.

Again… I’m not saying this is true of all readers, and it’s probably not true of the majority of SF readers. But, as a literature of ideas and exploration, the more that SF explores and challenges established social dynamics, the fewer new readers it will attract, particularly today, when it’s becoming harder and harder to create true intellectual challenge, because so few people want to leave their comfort zones. That’s an incredible irony, because our communications technologies have made it easier and easier for people to avoid having their preconceptions challenged.

Most fantasy, on the other hand, merely embellishes various existing social structures with magic of some sort, and it’s becoming increasingly popular every year. Perhaps that’s because, like it or not, technology has made one fundamental change in our economic and social structure, and that is the fact that physical strength is no longer an exclusively predominant currency in determining income levels. More and more women are making good incomes, often more than their husbands or other males with whom they interact. Sociological studies suggest that male-female relationships often reach a crisis at the point where the woman gains more income, power, or prestige than the man. It’s unsettling, and it’s happening more and more.

Enter traditional fantasy, with its comforting traditional structures. Now… isn’t that a relief?

The Popularity of Fantasy — Reflection of a "Magic World"?

When ANALOG published my first story, there really wasn’t that much of a fantasy genre. Oh, Tolkien had published The Lord of the Rings, and there were some Andre Norton Witch World novels, as I recall, and Jack Vance had published The Dying Earth, but fantasy books were vastly outnumbered by science fiction novels. Today, virtually every editor I know is looking for good science fiction. They can find plenty of decent, if not great, fantasy novels and trilogies to publish [good short fantasy stories are another matter].

What happened?

First, over the last forty years science got popular, and simultaneously more accessible and more complicated than ever. Second, technology complicated everyone’s life. Third, the computer made the physical process of writing easier than ever before in history. And fourth, the world became “magic.”

Science is no longer what it once was. Philo Farnsworth was a Utah farm boy, and effectively he invented television on his farm. RCA stole it from him, but that’s another story, and the important point is that one man, without a research laboratory, made the key breakthroughs. Likewise, Goddard did the same thing for the rocket. Earlier, of course, the Wright brothers made the airplane possible. Today, science breakthroughs that effectively change society require billions of dollars and teams of scientists and engineers. Writing about the individual in a meaningful sense in this context becomes difficult, and even if an author does it well, it’s usually not that entertaining to most readers. Add to that what science and technology have delivered to the average North American or European. We have near-instant world-wide communications, travel over thousands of miles in mere hours, pictures of distant galaxies and the moons orbiting distant planets in our own solar system, lights that can be turned on with a handclap, voice activated equipment… the list is seemingly endless. So much of what once was science fiction is now reality.

As I’ve noted in a previous blog, technology is no longer the wonder it once was. Too often technology becomes the source of strain and consternation, and for all that it delivers, most people want to escape from its stress and limitations. Admittedly, many of them use it for escape into forms of alternative reality, but more and more readers don’t want to read about technology.

Then there’s the impact of the computer, which makes the physical process of writing easier. It doesn’t, however, make the process of learning and understanding science and technology easier, and understanding science is generally useful for writing science fiction. So what do so many of those would-be speculative fiction writers concentrate on? Fantasy and its offshoots.

But the biggest factor, I believe, is that we now live in a “magic world.” A little more than a century ago, if one wanted light, it required making a candle or filling a lantern with expensive oil and threading a wick and using a striker or a new-fangled match to light the lantern or candle. Today… plug in a lamp and flip a switch. How does it work? Who knows? Most young people would have a hard time explaining the entire process behind that instant light. In a sense, it’s magic. Once transportation meant a long slow walk, or feeding, saddling, grooming a horse, taking care of the animals, breeding them, and still having to make or purchase bridles, saddles, and the like. Today, step into a car and turn the key. In more than 95% of all cars the transmission is automatic, and, again, how many people can even explain what a transmission or a differential does? It’s magic. You don’t have to understand it or explain it. I could go through example after example, but the process — and the results — would be the same.

As a society, we act as though almost all our physical needs are met by magic. Even the environmentalists believe in magic. How would many of them deal with the coal-fired power plants that fuel so much of our magic? By replacing them with solar and wind power, of course. But building solar cells creates much more pollution than using a coal-fired power plant for the same amount of power. And wind turbines, while helpful, cannot be counted on to provide a constant and continuing power source for our magic.

This mindset can’t help but carry over into what we do for entertainment. We act as though our society’s needs are met by magic, and we want to escape the incredible stress and complexity beneath the surface of our magic society. How many readers really want to deal with those factors, accelerated as they will be in the future? [And don’t tell me that technology will make things simpler. It never has. Physically easier, but not simpler. Allowing individuals to do more in the same amount of time, but only at the cost of more stress.]

To me, the “magic society” has far more to do with the comparative growth of the popularity of fantasy and the comparative decline of science fiction than the fact that we’ve reached the moon and surveyed planets and their satellites.

Technology and the Future of the Overstressed Society

Have you noticed how “stressed” everyone is today? Professionals, white collar workers, tech workers, sales workers, even high school and college students all complain about being stressed or overstressed. Many older Americans dismiss such complaints as the whining of a younger generation, a group that just can’t take it… but are these complaints mere whining… or do they have a basis in fact?

One fact is fairly clear. Americans today, on average, have a better life than did Americans seventy-five or a hundred years ago. Very, very few in the work force today had to live through the Great Depression. Nor do they have to worry about children dying of polio and whooping cough. The statistics show that most people are living longer and doing so in better health. There is a greater range of choice in occupations, and more Americans are able to get [and do obtain] higher education. The size of the average house is larger, and most houses have conveniences hardly imaginable a century ago. Although the average American work week is now longer than in any other industrialized western nation, it’s far less physically arduous than the work of a century ago.

So why the complaints about stress?

Technology — that’s why. It’s everywhere, and it’s stressing us out in more ways than one. Those scanners in supermarkets and every other store? They not only ring up the sales and feed into inventory calculations, but they also rate the checkers on how fast and efficiently they handle customers. I knew this in the back of my head, so to speak, but it was brought home to me when a single mother who was a checker at a local store told me she’d been demoted to the bakery because she didn’t meet speed standards.

Computers, especially those with color graphics and associated high-speed printers, are another source of stress. Why? Because they successfully invite revision after revision by overcareful supervisors and clients. Do it over… and over… and over.

Then, there are instant messaging, emails, and texting. IMs and texting, especially among the young, lead to carelessness in spelling and grammar, and that feeds back into the need for those endless document revisions, because, believe it or not, those grammar and spell-checkers just don’t catch everything. Then… emails… which encourage everyone to get in on everything, until at times, it seems as though everyone is watching and looking for ways to make completing anything difficult. On top of that, add bosses who feel slighted if one doesn’t answer emails quickly, and all that answering and justifying and explaining doesn’t get the projects done. It just takes up time that can’t be used to do real work, a problem that some supervisors just don’t get.

As for students, keeping in touch through the technology of cell-phones, emails, and texting seems to occupy their every waking, walking, and driving moment. Add to that the allure of the wonders of hundreds of cable or satellite channels, and the need to earn money for an ever-more expensive education — or vehicle payments — and they’re stressed out.

The impact of technology pervades everything. Computerized legal databases and software make litigation ever more complex — not to mention expensive and stressful.

Healthcare has even more problems. We have more than 47 million Americans without health insurance, and the number is growing faster than the population. Why? Because expenses are growing, thanks to a proliferation of medical technology and drugs that raises costs. When my grandfather was a doctor, diagnostic technology was essentially limited to a few blood tests, a stethoscope, and an X-ray machine. Today, the average doctor’s office is filled with equipment, and that equipment creates an expectation of perfect medicine. That expectation, combined with the opportunism of the technologized legal system, leads to far more litigation. That leads to higher malpractice insurance, more stress on doctors, and more and more expensive tests and procedures to make sure that nothing gets missed — or to cover the doctor from legal challenges. It’s not uncommon for some medical specialties to have malpractice premiums in excess of $200,000 a year. Assume that a doctor actually sees patients 5 hours a day in the office some 50 weeks a year, the rest of the time being spent on things like hospital rounds, reviewing charts, etc. Under those conditions, the malpractice premium alone requires a charge of more than $80 an hour. If the doctor has a million dollars in medical and office equipment — and that’s not unusual either — the amortization will be in excess of $100 per patient hour. Needless to say, this creates stress and pressure, and for all the complaints about the medical profession, doctors have one of the lower life expectancies among professionals.
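For the curious, that overhead arithmetic can be sketched in a few lines. The five-day patient week and the eight-year equipment amortization schedule are my own assumptions, since the paragraph above doesn’t specify them:

```python
# Rough per-patient-hour overhead, using the dollar figures cited above.
# Assumed here (not stated above): a five-day patient week and
# straight-line amortization of the equipment over eight years.

malpractice_premium = 200_000      # dollars per year
equipment_cost = 1_000_000         # dollars
amortization_years = 8             # assumption

patient_hours = 5 * 5 * 50         # 5 hrs/day x 5 days/week x 50 weeks = 1,250

premium_per_hour = malpractice_premium / patient_hours
equipment_per_hour = equipment_cost / amortization_years / patient_hours

# The premium alone works out to $160 per patient hour under these
# assumptions -- comfortably "more than $80" -- and the equipment
# adds roughly another $100.
print(f"premium per patient hour:   ${premium_per_hour:,.0f}")
print(f"equipment per patient hour: ${equipment_per_hour:,.0f}")
```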

In higher education, computerization has led to ubiquitous on-line evaluations and anonymous ratings of professors, and the subsequent inevitable grade inflation, because tenure often depends on pleasing the students. It’s also led to a proliferation of policies and procedures, so easily printed on those handy-dandy computerized systems. In my wife’s university, the policies and procedures for rank advancement and tenure have been rewritten and changed once or twice every year over the past decade, with scores of drafts being circulated electronically before each revision was finalized.

In effect, the expectations of technology have created more stress for modern society than the wind, rain, and inconsistent weather ever did for our agricultural forebears — largely because technology also makes people more and more accountable, even when they can’t do anything about it. The way technology is used today also creates what my father called “being eaten to death by army ants.” No one wants to kill you, but everyone wants a little something — reply to these emails, revise that set of documents, change that phrase to please the attorneys, change this one for the boss’s supervisor — and when it’s all said and done, who has time to do actual new work?

Yet, if you ignore the army ants, everyone thinks you’re difficult and uncooperative, and you lose your job. Is it any wonder that American professionals are working longer and longer hours?

But… ah, the blessings of technology.

The "Literary Canon," Education, and F&SF

Roughly twenty years ago, Allan Bloom published an incendiary book entitled The Closing of the American Mind. In it, Bloom charged that abandoning the traditional literary canon in favor of multiculturalism and gender- and ethnic-based literary selections had effectively gutted the American liberal arts education. I’m oversimplifying his charges, but they run along those lines.

During the 1960s and 1970s, and thereafter, but particularly in those turbulent years, there were numerous and loud cries for “relevance” in higher education. Those cries reverberate today in such legislation as the No Child Left Behind Act and the growing emphasis on institutions of higher education as a version of white collar and professional trade schools. Less than ten percent of U.S. collegiate undergraduates major in what might be called “liberal arts,” as compared to twenty percent in business, sixteen percent in health, nine percent in education and six to ten percent in computer science [depending on whose figures one uses]. Less than three percent major in English and history combined.

As a writer who effectively minored in English, I’ve thought about the writers and poets I had to study in the late 1950s and early 1960s and those studied by students today. Back then, for example, there was a fairly strong emphasis on poets such as T.S. Eliot, W.B. Yeats, W.H. Auden, and Wallace Stevens, none of whom are now listed as among the topmost poets assigned in college English classes. Now… times do change, but I realized that poets such as Eliot bring certain requirements that poets and writers such as Maya Angelou, Jane Austen, and Toni Morrison do not. For much of Eliot or Yeats to make sense, the student has to have a far wider grasp of literature and history. Much of the difference between those writers once assigned and those now assigned, from what I can tell, is that a far greater percentage of those now assigned are what one might call self-affirming writers. They affirm a set of values that are either explicitly contained in the work at hand, or they affirm current values. By contrast, poets such as Eliot and Yeats often question and use a wide range of references and allusions unfamiliar to most students, some of which are current and some of which are historical and few of which are “common” knowledge.

In that sense, the best of F&SF, in my opinion, is that which stretches the reader into considering old values in a new light and “new” values through the light of experience, accepting neither at face value. Many F&SF writers present the “new” in a way that proclaims its value uncritically, while others present and trash the “new,” as Michael Crichton does oh so well. Then there are those who appear to believe that shocking readers is equivalent to making them think and stretching their horizons. Most of the time, it’s not.

According to Mark Lilla, a professor of political philosophy at Columbia, recently quoted in The New York Times, “What Americans yearn for in literature is self-recognition.” But struggling with unfamiliar themes and values and searching out allusions and references require work, can be alienating to students, and certainly don’t boost self-recognition.

Particularly in the late 1960s and early 1970s, it seemed to me, there was a concerted effort in the SF field to raise issues while adhering to some degree to the tradition of the “literary canon,” and this effort continues with at least some authors today. This melding represents, again in my opinion, one of the great strengths of the field, but paradoxically, it’s also another reason why F&SF readership tends to be limited, at least for these types of F&SF, because a reader either has to be knowledgeable or willing to expand his or her comfort zone.

This gets down to an issue at the basis of education, primarily but not exclusively undergraduate higher education: Is the purpose of higher education to train people for jobs or to teach them to think so that they can continue to learn? Most people would ask why both are not possible. Theoretically, they are, but it doesn’t work that way in practice. Job training emphasizes how to learn and apply skills effectively and efficiently. Training in thinking makes one very uncomfortable; it should, because it should force the student out of his or her comfort zone. At one time, that was one of the avowed goals of higher education, and part of the so-called literary canon was chosen to provide not only that challenge but also a cultural history of values as illustrated by literature, rather than a mere affirmation of current values.

In addition, today, with the smorgasbord approach to education, a student can effectively limit himself or herself to the courses that merely reinforce his or her existing beliefs and biases. It’s comfortable… but is it education?

Future Fact? Present Fraud? Or…?

Once more, just the other day, someone said to me and my wife, “We never really went to the moon. It was all a fraud.” This person is not uneducated. In fact, the individual has an earned graduate degree and spent some fifteen years as an executive in the financial industry.

It doesn’t seem to matter to this individual — or the millions who share such a belief — that scientists are bouncing laser and radio beams off the reflectors left on the moon by our astronauts. Nor do the photographs and records that could not have been obtained any other way count. Nor does the agreement between ground-based and space-based evidence. Nor does the fact that we and other countries have put dozens of astronauts into space.

Nope. To such people, the moon landings were all a fraud.

Maybe this kind of belief has something to do with the brain. A recent study confirmed that there is indeed a difference between the way “liberals” and “conservatives” process and react to information, and that that difference goes far beyond politics. Liberals tend to be more open to new experiences, conservatives more entrenched and unwilling to move away from past beliefs. And, of course, interestingly enough, there are those who classify themselves as liberals who actually have a conservative mind-set, who will not deviate from what they believe regardless of evidence, and there are those who claim that they are conservative who are very open to new evidence and ideas.

Neither mindset is necessarily “good” or “bad.” As many conservatives would say, and have, “If you don’t stand for something, you’ll fall for anything.” That can be very true. On the other hand, no matter how hard one wants to believe that the earth is flat, I’m sorry. It just isn’t. When new information arrives that is soundly and scientifically based, regardless of opinion and past beliefs, a truly intelligent person should be willing to look at it objectively and open-mindedly.

In a sense, I think, most people are basically conservative. We really don’t want to change what we believe without very good reason. In evolutionary, historical, and social terms, there are good reasons for this viewpoint. Just as in mutations affecting an organism, most changes in social and political institutions are bad. Only a few are for the best.

The problem occurs when the probability of danger from an event is not absolute, or unitary, as some economists put it, but the event remains likely to occur, and when that occurrence would be catastrophic to the human race. Over the history of Homo sapiens — some hundreds of thousands of years, or millions, depending on one’s definition of exactly when our forebears became thinking human beings — this kind of situation did not occur until the past half century. While it might be unthinkable and improbable to most, a nuclear war would be devastating to the human race. So, it appears, would runaway global warming, regardless of cause.

The “conservative” view is to wait and let things sort themselves out. After all, hasn’t this worked throughout history? Well… not always, but in terms of survival and civilization, there was always someone else to carry on. When the Mayan civilization fell because it hadn’t planned well enough for unforeseen droughts, other human civilizations carried on. The same was true of the Anasazi, and recent satellite measurements and photographs suggest that the same occurred to the Cambodian peoples who built Angkor Wat, then a city complex of over a million people, when drought struck in the 1500s.

But what happens when we as a race face a potential climate catastrophe as devastating as global warming could be? One that affects an entire world, and not just a continent or subcontinent? Can we afford to be conservative? Or is it a situation where, in reacting, we could fall for anything?

Is global warming a fraud perpetrated by scientists, as those who deny the moon landings believe about that program? Or is it a real and present danger? Or is it over-hyped, the way all the warnings about DDT appear to have been — a real danger in limited areas and to certain species, but truly not the harbinger of a universal silent spring? And how should we react, whether conservative or liberal?

Flash and Substance in F&SF

As some of you know, I’ve been involved in fantasy and science fiction for some time — otherwise known as “too long” by those who don’t like what I write and “please keep writing” by those who do. For almost as long as I’ve been writing, I’ve wondered why a number of good solid, inventive, and talented writers failed to be recognized — or when recognized, were essentially “under-recognized” or recognized late. That’s not to take away from some who were recognized, like Jim Rigney [Robert Jordan], but to point out that sometimes recognition is not necessarily fair or just.

One of them was, of course, Fred Saberhagen. Another, I believe, was Gordy Dickson, as was Murray Leinster. Among writers still living and writing who haven’t received their due, in my opinion, I might include Sheri Tepper. There are certainly others; my examples are far from all-inclusive.

But why has this happened, and why has it continued to go on?

One of the problems in the F&SF genre and, indeed, in every field of writing — and, as I discovered over nearly 20 years in Washington, D.C., also in politics — is that the extremists among the fans, reviewers, academics, and critics have a tendency to monopolize both the dialogue and the critical studies. And, for better or worse, extremists naturally tend to praise and support the extremes. In writing, from what I’ve seen, the extremes tend to be, on one end, extraordinary skill in crafting the individual sentence and paragraph, usually to the detriment of the work as a whole, and, on the other, incredible action and pseudo-technical detail and devices and/or magical applications in totally unworkable societies and situations.

While I can certainly appreciate the care and diligence involved in the construction of the Gormenghast trilogy — books whose “action” moves at the speed of jellied consommé, uphill, and that may overstate the pacing — that trilogy is not a work of literature, regardless of all the raves by the extremists. Likewise, month after month, I see blogs and reviews which praise books that, when I read them, seem not to have much depth, relying on action and clever prose to disguise that lack; or on well-crafted words and not much else; or almost totally on humor, often at such basic levels as to be embarrassing; or… the list of sins is long. What I don’t see much of are reviews which note books with deep and quiet crafting, relying neither too much nor too little upon words, actions, inventions, or humor, but balancing all of them to create a realistic world with people and situations that draw in the reader, engaging both emotion and thought and provoking a reconsideration of some aspect of what we call reality.

Now… I have no problem with brilliant unrealism, or incredibly moving prose. I do have great difficulty with books being termed good or great solely on such criteria, particularly when the critics of the extremes often tend to overlook excellent prose, plotting, and even incredibly credible devices and societies because the author has presented them so quietly and convincingly.

In a determined but comparatively quiet way, by creating Jim Baen’s Universe, Jim Baen and Eric Flint attempted to create a solid-paying market for good stories that appealed to a wide range of readers, and not primarily to the extremists. Will this effort work? I hope so, and it looks promising, but it’s still too early to tell.

Shock value and novelty do indeed attract readers. Sometimes they even sell books. I won’t contest that. Nor will I contest the fact that much of what doesn’t appeal to me is obviously very appealing to others. What I will point out is that work which engages readers on all levels and raises fundamental issues tends to sell and stay in print over the years [so… maybe I was wrong about Gormenghast… or maybe it’s the exception that proves the point].

Calling All Tenors, Baritones, and Basses

For those young men who have a good voice and the ability and desire to learn music… how would you like a job where you can travel the world — or at least the United States — and get paid for it, and where adoring young women often follow your every word and note? If so… have you considered being a collegiate-level professor of voice?

While the openings in full-time, tenure-track university positions for female singers with doctorates are almost non-existent, universities and colleges are always looking for qualified and talented male tenors, baritones, and basses. “All” you have to do is become a classical singer qualified to teach on the university level. This does require work in addition to talent, and getting a doctorate in music is not for everyone, nor is it without some cost, but the very top positions in the field can earn close to $100,000 a year, and that doesn’t count fees for performing outside the university. Now… admittedly, a starting salary for a male tenure-track junior voice faculty member is “only” $35,000-$50,000, but a full-time position usually includes health care and one of the best and most portable retirement pension systems in the country.

More than a few times, when my wife has suggested that male students might have a future in majoring in music, the usual response is, “I won’t make enough money.”

And exactly how true is that? The latest data from the Census Bureau notes that the median income of men working full-time in the USA is slightly over $42,000. The median for men with professional degrees [law, medicine, science, MBA] is around $72,000. Of course, all young men and women will be above average, just as the children in Lake Wobegon are all above average, and all will make fantastic salaries.

But what is fantastic? The average veterinarian makes $65,000, the average architect $57,000, the average accountant $41,000, the average secondary school teacher $47,000. For every junior attorney making $130,000, there are many more making $40,000-$60,000. With the median salary of attorneys around $80,000, that means half are making less than that, often after years of practice.

So how unattractive is the possibility of a $75,000-a-year income after 15 years, for a nine-month contract, with all sorts of fringe benefits — such as health care, retirement, tickets to sports and cultural events, and even free or subsidized parking?
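To put that nine-month figure against the twelve-month medians above, here is a quick illustrative calculation; the annualization is my framing, not part of the original argument:

```python
# Annualize the nine-month academic salary so it can be compared
# with the twelve-month median incomes quoted earlier.

nine_month_salary = 75_000        # figure cited above
annualized = nine_month_salary * 12 / 9

median_male_fulltime = 42_000     # Census median cited above
median_professional = 72_000      # professional-degree median cited above

print(f"twelve-month equivalent:       ${annualized:,.0f}")                        # $100,000
print(f"vs. median full-time male pay: {annualized / median_male_fulltime:.1f}x")  # 2.4x
print(f"vs. median professional pay:   {annualized / median_professional:.1f}x")   # 1.4x
```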

But don’t apply if you’re female. Because schools can legally discriminate by specifying voice type, there are on average at least twice as many positions for men, despite the fact that most voice students are female, and on average, you’ll only make 75% of what the men do.

Deception and Greed

A century or so ago, and certainly earlier, the general consensus, both among the public and the scientific community, was that Homo sapiens was the only tool-using creature, and certainly the only one with self-consciousness. But recent studies of various primates, New Caledonian crows, and other species have proved that mankind is not the only tool-user, merely the most advanced of tool-users. More recent studies also suggest that some primates and jays, and possibly even elephants, have at least a form of self-consciousness.

What led to this conclusion? Experiments in the use of deception and self-imagery. In essence, certain species hide food and deceive others as to where they’re hiding the food. The way in which they used deception, and the varying levels of deception, depending on the closeness and relationship of those nearby, suggests that they are aware of themselves as individuals, and are also aware of others as individuals.

What I find intriguing about these studies is that there appears to be a link between intelligence and greed and deception. Now… a wide range of species accumulate food and other items, but only a handful exhibit what might be called “greed.” Greed can be defined as the drive to acquire and maintain possession of more physical goods or other assets than the individual or his family/clan could possibly ever physically utilize, often to the detriment of others.

One thing that’s interesting about human beings is that we also possess the greatest degree of concentrated greed and deception of any species. No other species comes close. This raises an intriguing question: To what degree is intelligence linked to greed and deception?

Are greed and deception by-products of intelligence, or are they the driving force to develop intelligence?

While the evolutionary/historical record suggests that species capable of greed and deception tend to be more successful in attaining control of their environment, what happens next? Intelligence develops tools, and individuals with greed and deception put those tools to use in varying ways to enhance their own power to the detriment of other members of the species. As the tools become more powerful, their use by those who possess them also tends to concentrate power and wealth, yet almost every successful society has also incorporated deception of some sort into its social framework.

Kurt Vonnegut made the observation in Slaughterhouse-Five — through a Nazi character, of course — that the greatest deception perpetrated by the American system was that it was easy to make money. Because it was then thought to be so, income inequality was justified, because anyone who wanted to work hard could “obviously” become wealthy.

Historical institutional “deceptions” include the divine right of kings, the caste system of India, Aryan racial supremacy, the communist “equality of all” myth, and on and on.

But what does this bode in an increasingly technological information age, where hacking, phishing, and all other manner of informational deception have increased, involving not just the criminal element, but industry, politics, and entertainment on all levels? Does it mean that the survivors will have to be even more intelligent, or that social structures will come crashing down because no one can trust anyone about anything? Or will we manage to muddle through? Will surviving deception be the ultimate Darwinian test of the fittest? Maybe… there’s an idea for a book…

The Weaker Sex… Revisited

Years ago, James Tiptree, Jr. [Alice Sheldon] wrote a novella entitled “Houston, Houston, Do You Read?” in which present-day male astronauts are catapulted into a future where there are no men. The implication of the story is that, despite their greater physical strength, men were indeed the weaker sex and perished, albeit with a little “help.” Some recent analyses of educational achievement by gender suggest that Sheldon just might have been on to something.

Over the past few years, sociologists, psychologists, teachers, and even politicians have been raising an increasing number of questions about the gender and educational implications of a high-tech society. Three generations ago, women students were a clear minority in higher education and in almost all “professional” occupations. Today, female students comprise the majority of undergraduate college students and graduates, with a nationwide ratio running in excess of 56% to 44%, a ratio that is becoming more unbalanced every year. So many more women are attending college that on many campuses, particularly at elite universities and liberal arts colleges, they’re being subjected to another form of discrimination. In order to keep a gender balance, many schools effectively require female students to meet higher academic standards than male students.

A recent report [Gender Equity in Higher Education: 2006] found that in college men spent more time watching television, playing video games, and partying, while women earned better grades, held more leadership posts, and claimed far more honors and awards.

The trend of women seeking professional education is also continuing in many graduate fields, such as law and medicine, where women outnumber men at many institutions.

In more and more state universities, women outnumber men, in some cases composing sixty percent of the student body. Even at Harvard, the latest freshman class has more women than men. The only areas where men are numerically dominant are in the hard sciences and in engineering, but even there, a greater and greater percentage of science and engineering students are female. Not only that, but in the vast majority of institutions, the majority of awards and honors are going to women. Now, admittedly, this female expertise hasn’t yet managed to reach and shatter the “glass ceiling” prevalent in the upper reaches of the faculty in higher education or in corporate America, but it’s coming, one way or another. There are female CEOs, but many women are simply choosing to create their own businesses, rather than play the “good old boy game.” Others become consultants.

Another factor that I’ve noted in my occasional university teaching, and one also noted by various family members, three of whom are professors at different universities, is that a decreasing percentage of college-age men are inclined to apply themselves to anything that requires more than minimal effort, whether physical or intellectual. This is a disturbing trend for society in a world where education and intellectual ability have become increasingly important, both to hold society together and to achieve a comfortable lifestyle suited to personal satisfaction and raising children. Even more disturbing is that this gender-based educational disparity becomes greater and greater as the income of the parents decreases. In short, men from disadvantaged homes are often only half as likely to get a college degree as women from the same cultural and economic background.

Both my wife [who is a full professor] and I have watched male students turn down full-tuition scholarships because the requirements were “too hard,” while talented women literally had to claw their way through the same program.

Do these trends represent a failure of our educational system… or is it that too many men can’t really compete with women once the playing field starts to get more level? Or that men need special treatment to achieve on an equal basis? After all, the real hunters are the lionesses, not the “noble” lion.

Reading… Again

In the headlines recently have been more stories about how little Americans read. According to an AP-IPSOS study, twenty-seven percent of all adults in the United States have not read a book in the past year. The remainder — those who claimed to have read at least one book over the past 12 months — averaged seven books. According to an earlier Gallup poll, some 57% of Americans had not read a book [besides the Bible] in the previous year.

I’m not troubled by the fact that there are those who haven’t read any books. In any society, there are people who just aren’t readers. But I am troubled by the numbers and the way they fall out.

I wasn’t surprised that the readers of fantasy and science fiction comprised less than 5% of all readers. Nor was I exactly astounded to discover that, over the past 15 years, book-reading percentages are down for the 18-25 age group, from close to 90% to less than 60%. More than half of all frequent readers are over age 50, and more than 55% of all books are purchased by those over 50. The highest concentrations of readers are among those who are older and college-educated.

Yet book sales are up. Exactly what does that mean? I’m reminded of a study done for the National Opera Association several years ago. Sales of opera tickets were up, and everyone was pleased until they looked closely at the numbers — which showed that while the number of tickets sold was up, the actual number of patrons was down, and that the average age of patrons was increasing.

The statistics on book reading seem to be following a similar pattern, and for years now, various pundits and social scientists have been worried that Americans are losing their reading skills — and that a smaller and smaller percentage of Americans are highly literate. Yet the U.S. economy still dominates the world stage, and, despite the difficulties in the Middle East, our military machine has no equal — even in situations where we have to deal with sectarian civil wars. So what’s the problem?

The problem is information-processing. To make intelligent decisions, human beings need information. They can obtain that information in one of three ways: direct personal experience, listening and watching, or reading. The first two means, while often necessary, share one basic problem. They’re slow, and the information flow is very restricted. Even slow readers generally can process written information several times faster than auditory information, and they can always refer back to it. That’s one reason, often forgotten, why stable civilizations did not emerge until written languages developed. The invention of the printing press in Europe provided a tremendous informational advantage to western European civilization, which, until that time, had lagged behind the Asiatic cultures, particularly the Chinese. The Chinese culture effectively used an elaborate written pictograph-based character language to restrict social and political access to a comparatively minute fraction of the population, which resulted in a tremendous information gap — and the comparative decline of Chinese power and influence — once western cultures combined alphabet-based languages with widespread use of the printing press.

In its own fashion, an auditory-visual media culture limits and shapes information flow, first by selectively choosing what information to promulgate and second by tying that information to visual images. Now, immediately, someone will question this by pointing out the multiplicity of media outlets and the different media channels. There are hundreds of cable and satellite channels; there are thousands of blogs and web-sites. How can I claim this is limiting? First, on every single cable and satellite station, the information flow is effectively limited to less than one hundred words a minute. That’s the top rate at which most people can process auditory input, and most facts have to be put in words. Second, while the internet remains primarily text-based, the vast majority of internet users is limited to what best might be called “common access” — and that is extremely limited in factual content. If you don’t believe me, just search for Mozart or Einstein or anything. In most cases, you’ll find hundreds, if not thousands, of references, but… you’ll seldom find anything beyond a certain “depth.” Oh… I’m not saying it’s not there. If you’re a university student, or a professor using a university library computer, or if you want to pay hundreds or thousands of dollars in access fees, or if you live in a very affluent community with integrated library data-bases, you can find a great deal… but Joe or Josephine on the home computer can’t.

In reality, the vast majority of internet users circulate and re-circulate a relatively small working data-base… and one which contains far less “real” information than a very small college library, if that.

Then add to that the fact that close to 60% of college graduates, according to a Department of Education study published last year, are at best marginally literate in dealing with and understanding information written at the level of a standard newspaper editorial.

These facts lead to the next question: Why does all this matter?

I’d submit that it matters because we live in the most highly technical age in human history, where no issue is simple, and where understanding and in-depth knowledge are the keys to our future… and possibly whether we as a species have a future. Yet the proliferation of an auditory-visual media culture is effectively limiting the ability of people, particularly the younger generations, to obtain and process the amount of information necessary for good decision-making and replacing those necessary reading and researching skills with simplistic audio-visuals and an extremely limited common informational data-base. This makes for a great profit for all the media outlets, but not necessarily for a well-informed citizenry.

Like it or not, there isn’t a substitute for reading widely and well, not if we wish what we have developed as western semi-representative governments to continue. Oh… some form of “civilization” will continue, but it’s far more likely to resemble a westernized version of pre-printing-press Chinese society, with a comparatively small elite trained in true thought and mental information processing, all the while with the media and communications types enabling “sound-bite” politicians with simplistic slogans while trumpeting freedom of expression and banking greater and greater profits.

Come to think of it… I wrote a story about that. It’s entitled “News Clips from the NYC Ruins.” If you didn’t read it in The Leading Edge, you can read it in my forthcoming story collection — Viewpoints Critical — out next March from Tor.

The Anecdotal Species?

A recent article in the Economist suggested that the U.S. space program was headed further astray by concentrating on manned missions when so much more knowledge could be obtained at a lower cost from instrumented unmanned missions. After reading that, my first reaction was to disagree — on the grounds that unmanned missions keep getting their funding cut because they’re “just” research, and research always tends to get cut first in any consensus budgeting process, either in the corporate world or in government. In addition, I had the feeling that most people don’t identify with space cameras, comet probes, asteroid probes, near-earth orbit surveys, and the like, despite the cries and protests from many in the scientific community that “science missions,” as opposed to astronaut missions, are being under-funded, even though they provide far more information per dollar spent.

But then, ever since the Soviet space program collapsed, much of the impetus for U.S. space development seems to have collapsed as well, whether for manned or unmanned projects. Only when Americans perceive an immediate and real threat do we seem able to press for research in technological or military areas.

As I considered these points, I began also to reflect upon the time I spent at the U.S. EPA, when there was a great furor over possible contamination from leaking Superfund sites. Now, there was no question that a considerable number of abandoned waste sites were in fact leaking and contaminating the environment near those locations, and public opinion was clear and decisive: fixing Superfund sites was top priority. Somewhat later, the Agency looked further into environmental priorities and issued several reports. The gist of the findings was that, in general, the public's priorities for environmental improvement were almost inversely related to the real dangers to people and health. The actual illnesses and deaths from leaking Superfund sites were far lower than those from at least five other major environmental issues. How could this be? It happened, I believe, because we are an anecdotal and egocentric species. Those dangers we see and hear personally, those we can understand easily, and those which can be related to us personally by those we know and trust — these are the dangers we believe in. When chemically-caused cancer occurs in a local community because of groundwater contamination, we react. But when the EPA or a state health agency notes that fatalities are rising from exposure to natural radon or from skin cancer caused by the thinning of the ozone layer, we don't. When health agencies point out that smoking is a far greater health hazard than any of the environmental concerns, such notice has comparatively small effect. Even when massive damage claims arrive from increased hurricane activity, we put little personal priority on looking into why hurricanes seem to be more of a problem — we just want someone else to pay for the repairs.

We all want our problems solved first. Then, and only then, will we devote resources to other people's difficulties. This is a practical and natural approach from a Darwinian and historical point of view. If we don't solve our problems first, we and our children may not be around to solve anyone else's problems. But what happens when a non-immediate problem could become a far larger problem threatening us and everyone else?

This difficulty isn't a new problem in American society, nor is it confined to the U.S. Prior to roughly 1938, no one wanted to consider the implications of what Stalin was doing in the USSR, Hitler in Germany, or Mussolini in Italy. No one in the "western" world paid all that much attention to the Japanese "rape of Nanking" and what it foreshadowed. Today, because the area has no oil and no strategic import, and because few Americans have seen or experienced the brutality and continual death, most Americans don't pay much real attention to the genocide in Darfur.

This same mindset applies to the exploration of space — or the issues surrounding global warming. If something doesn’t pose an imminent danger or have an immediate entertainment or economic value… one that can be exploited quickly, why bother?

Then… add one more complicating factor. In neither space exploration nor global warming do we truly have a certain solution. While we've reached the point where it appears that a majority of the knowledgeable scientific community believes that some form of global warming is occurring, there is no strong consensus on what might be called a workable strategy. What one group calls workable is called punitive by another. Reducing carbon emissions is one possibility, but, if carried out, that will penalize third-world and developing nations disproportionately. Unilateral action by industrial nations means their citizens bear higher costs. Reducing other greenhouse gases is another possible approach, but that cost falls more heavily on the high-tech economies. Requiring more fuel-efficient cars increases costs and decreases choices most for those who need cars to get to their jobs… And so it goes.

The bottom-line question might well be: can a species that's been hard-wired over a million years to be short-term, personally and familially oriented, and anecdotal cope with problems that require long-term planning and widespread consensus?

Belief?

Believing in something does not make it true. Disbelieving in something does not mean that it cannot exist. Admittedly, on the quantum level, the act of observing often changes or fixes what is, but so far, at least, the question is not whether a particle or wave or photon exists, but in what form and exactly where.

The problem in human behavior is that belief has consequences, often deadly ones. I cannot imagine that a Supreme Being, should one exist, could possibly care whether the correct prophet happened to be the son or the nephew, or whatever, of the old prophet. Nor do I think that it is at all rational that rigid belief in one set of rituals about a God will give one worshipper eternal favor while rigid belief in another set of rituals about that same God will damn a different worshipper eternally. And I have great difficulty in thinking that any deity will grant one eternal and blissful life for slaughtering those who believe differently, particularly those who have done nothing to offend one except not to share the same beliefs.

All that said, in human affairs it doesn't matter much whether I or others have difficulty understanding why people care passionately enough about such differences to kill in an attempt to force their beliefs on those who would choose to believe differently — or not to believe in a deity at all. The fact is that, both now and throughout history, millions upon millions of people have been killed over beliefs, not just religious beliefs, but political beliefs, cultural beliefs, and even economic beliefs.

Yet there is no true proof behind these beliefs, especially religious beliefs. Oh, there are books, and testimonies, and prophets, and visions, and unexplained phenomena, but true proof, in the scientific sense, is missing. Even for some well-accepted political beliefs, solid verifiable proof of their efficacy is scant or lacking altogether.

Science, at least in theory, is supposed to test propositions and to verify them. We apply such methodology to every physical aspect of life in modern society, yet there is no comparable test for “belief.” All one has to do is to say, “I believe.”

And so, despite astronomical, atomic, chemical, and geologic evidence that the universe is close to 14 billion years old, there are those who choose to believe that it was created far more recently than that. Despite a long fossil record documenting evolution, creationists cite gaps and faults in the fossil chronology, yet dismiss the counter-argument that there is no physical record at all suggesting "instant" divine creation. Nor is there any true physical evidence suggesting an afterlife.

So… what’s the problem with belief? Everyone has some belief or another.

Beliefs have consequences, and not just the obvious ones. Take the widely held belief in some form of an afterlife, a belief held by close to seventy percent of all Americans and eighty percent of Americans over 50, according to recent surveys. What does that mean? One of the greatest dangers of this commonly held belief is that it allows cruelty in the name of all manner of rationales. After all, if there is a supreme deity, if there is an afterlife, well… all those folks who got slaughtered have another chance to repent and redeem themselves. It’s not like it’s forever.

But… what if it is? What if one life is all anyone gets? There’s lots of belief about eternal life, but there’s no proof, not scientific proof. We want all sorts of tests about whether our food is safe to eat, whether our cars are safe to drive, whether our water is pure, whether our air is clean. Yet, we believe in an afterlife without a single shred of scientific proof. Are there two standards in life? Those for the physical world, where everything must be proven and proven again, where lawsuits abound over the tiniest discrepancies… and those for beliefs, where, regardless of the consequences, we accept something totally unproven?

Is that because we can’t face, and don’t want to face, the truly toxic aspect of belief in an afterlife — that it allows us all sorts of justifications for cruelty, for oppression, for suppression? If the life we have now is the only one we will ever have, and if we accept that, could we live with all that we have done to others?

Then, too, the truly terrifying possibility is that we could, and that the results would be worse, far worse. Does that mean that belief in unproven deities is preferable to the alternative? If so, what does that say about us as a species?

Thoughts on Reader Commentaries

Like many authors, I do read at least some of the posts and commentaries about my work, not so much for ego-surfing, because one nasty comment wounds more than a score of positive ones heal, but to see what reactions [if any] my writing provokes. After many years, certain patterns have become obvious.

First, a number of readers believe that whatever my protagonists say and do is what I believe. So do I believe in pre-emptive action [as do Jimjoy Wright, Nathaniel Firstborne Whaler, and Gerswin], in semi-pre-emptive action [a la Lerris, Lorn, Trystin, or Van Albert], or in reaction with massive force [Ecktor deJanes, Anna, or Secca]?

Because different protagonists react in different fashions, I find that this occasionally engenders one of two reactions from readers. The first is that I am being inconsistent. The second, which is far more common, is that the reader fixates on a particular type of hero or behavior and ignores all the others. For example, many readers believe that I only write coming-of-age stories about young heroes. But even in the Recluce Saga, of the fourteen books published [or about to be published], exactly half deal with coming of age. None of the Spellsong Cycle novels use that approach, and only one of the Corean Chronicles is really a coming-of-age tale. Almost none of my science fiction novels deal with coming-of-age themes. By these figures, less than twenty percent of my work is true "coming-of-age" work.

Then there is the charge that I write the “same” book, over and over. To this charge, I plead “partly guilty,” in that there are common sub-themes in every book I write: the hero or heroine learns something and accomplishes something and there’s some form of romantic interest. I’m not terribly interested in writing books where the protagonist learns nothing and/or accomplishes nothing. In practice, a protagonist either learns or doesn’t learn, accomplishes something or doesn’t. Now, in the James Bond books, and in many of the endless series with the same cast of characters, a great deal of action takes place, but when it’s all over, what exactly has happened? Isn’t the norm that one set of disposable characters has been killed or jailed, or been made love to and discarded, only to be replaced by another set for the next episode? Has the real structure of the world or the political system changed — or has the scenery just been replaced, so to speak, and made ready for another series of adrenaline-filled [or lust-filled or whatever-filled] adventures?

Nor am I interested in writing nihilistic or “black” fiction. Years ago, in my closest approach to the dark side, I did write one classical tragedy in three volumes, and sales of the third volume plummeted. Interestingly enough, now that The Forever Hero has been reprinted in a single fat trade paperback, it has continued to sell modestly… but reader reaction has been more than a little mixed. Even so, I seldom write books with unabashedly “everything is well” endings. Most of what I write has what I’d call “bittersweet” endings, those where the protagonists achieve their goals, but end up paying more, if not far more, than they dreamed possible. I’ve also discovered that, because I often don’t make that explicit, a number of readers don’t always catch the darkness veiled within the ending.

In a larger sense, however, ALL writers write the same books over and over. As Heinlein pointed out over 35 years ago, there are only a handful of basic plots, and every story presents one of them in a different guise.

Oh… and then there’s the reader reaction to the food. More than a few times, there have been questions and comments about why my books have so many scenes where the characters eat. With those comments and questions have come observations about the food, ranging from why it’s so simple in some books to why it’s so elaborate in others. Why the meal scenes? Because, especially in low-tech societies, meals are about the only opportunity for conversations and decisions involving more than two people. As for the fare served, I try to make it appropriate to the time and culture, as well as to the economic status of those at the table.

Finally, as exemplified by the reaction of some few readers to my comments and amplifications on why most people don't like or aren't interested in F&SF, there are always those who believe that, by what I have written, I am attacking their most cherished beliefs, and that because I am, I'm clearly an idiot. By this standard, I suspect all of us are idiots to someone, and writers more so, because writers who continue to be published have to say something, and something will always offend someone. My personal belief is that a writer who offends no one usually has little to offer.

Most professional writers do offend someone, and in that sense, you as readers can judge authors not only by their supporters and friends, but also by those who dislike them, although I would also suggest, based on experience, that most readers who dislike an author cannot be impartial in evaluating that dislike. Why? Because most writers published by major publishing houses produce an acceptable technical product [even if editors must proof and correct it in some cases]. So when someone claims to dislike a writer because his work is "badly written," "excessively verbose," "so spare you can't follow the action," "filled with cliches," and the like, all too often that criticism veils a deeper dislike within the reader, one based more upon conflicting values than upon the writer's technical deficiencies. Now, I am far from claiming that we as writers do not make technical mistakes or that we do not occasionally manifest such deficiencies, but any writer with the glaring technical deficiencies cited by some readers would not get book after book published. In the end, most criticism reflects as much, if not more, about the critic as about the author.

More on the "Death" of Science Fiction

A recent article/commentary in Discover suggested that science fiction, if not dead, was certainly dying, and one piece of evidence the author offered was the prevalence of middle-aged [and older] writers at the Nebula/SFWA awards, which he took to suggest a lack of new ideas and creativity. Needless to say, as a moderately established writer who is certainly no longer young, I find such an "analysis" not only irritating, but fallacious, on two counts.

First, age, per se, is no indicator of creative ability in science fiction or any other literary form, and it never has been, contrary to Bruno Maddox's apparent assumptions. If one looks at the record of the past, Robert Heinlein was 52 the year Starship Troopers was published and 54 when Stranger in a Strange Land came out. At 31, Roger Zelazny wasn't exactly a callow youth when Lord of Light was published. Arthur C. Clarke was in his early thirties when his first novel [Against the Fall of Night] was published as a serial. William Gibson was 36 when Neuromancer was published. Even today, the "hot new" SF writers, such as Jo Walton, Alastair Reynolds, Charles Stross, Ken MacLeod, and China Mieville, while not old by any stretch, are in their late thirties or early forties.

Second, those talented and even younger writers who have not yet been widely recognized are often at the stage of having stories and first and second novels published. They are generally not the most prosperous of individuals, or they have demanding "day jobs," and they tend not to attend the more expensive and distant conventions and conferences in as great a proportion. Nonetheless, they exist, even if most weren't at the Nebula awards.

Science fiction may not always get it right, but the writers are still in there pitching, with far more ideas than Mr. Maddox, who seems to equate experience and flowery Hawaiian shirts with a lack of creativity and inspiration.

MediaPredict — The End of "Literature"… Or Even Just "Good" Books?

The New Yorker recently reported on Simon & Schuster's efforts with MediaPredict to develop what would amount to a "collective judgment of readers" to evaluate book proposals by reading selections presented on a website. The reason why any bottom-line-oriented publisher would attempt to discover a better way of determining which books will be commercially successful is obvious to anyone familiar with the publishing industry — something like seven out of every ten books published lose money. Needless to say, more than a few people responded with comments suggesting this "American Idol" approach would doom the publishing industry to institutionalized mediocrity.

As those of you who have read more than a few of my books know, I believe that, with a few oft-cited exceptions, extremely popular works of art in any field tend not to be excellent, and many of the few that are both popular and excellent tend to be those from earlier historical periods that have been propagandized by well-established cultural and social institutions. This is the way it is and has always been… and it may well continue. In the publishing industry, charges and countercharges have flown back and forth for years, on subjects such as editorial elitism, genre segregation, reviewer bias, critical prejudice against commercially successful authors… and on and on.

For all that, the publishing industry has managed a remarkable diversity in publication, and in the F&SF field, small and niche publishers have broadened that diversity, as have more recent internet publishers.

What bothers me more about the MediaPredict approach is that it substitutes the judgment of one small group — those who enjoy reading off electronic displays and have time to read online — for that of another smaller group — editors and agents. Since my work has been far more popular with readers than with editors and agents — with the notable exception of one long-time editor — I certainly have always questioned the collective judgment of editors and agents. Any competent editor or agent can certainly tell what kind of novels are selling. On the other hand, it takes a truly outstanding editor to determine what kind of book that isn’t currently being published will sell, and there are very few editors who can make an accurate judgment like that on a consistent basis.

But will the MediaPredict approach make any better judgment on the commercial potential of a book? I doubt it… and here’s why.

Both online readers and editors are largely self-selected groups, if self-selected for different reasons, and this reflects the larger problem I see with the MediaPredict approach. The self-selection criteria for being an online reader effectively eliminate large groups of individuals from the selection process. Even some twenty years into the wide-scale personal computing/cellphone/PDA age, the majority of novel readers don't read and don't want to read books off a screen… any kind of screen. It takes a certain mindset to enjoy doing so, and I suspect that mindset differs from that of non-screen-readers. MediaPredict might do quite well at determining what kinds of books appeal to that audience, but that audience is currently a minority of readers — especially outside the F&SF and possibly the thriller fields.

Editors, for all their shortcomings, and they do have many, are held responsible over time for the success of their selections, and editors who have too many commercial failures generally have short careers. There's no comparable check on the MediaPredict approach, nor has anyone asked one other critical question: do the "screen-readers" accurately predict not only who likes the books being previewed, but whether those who like them represent actual buyers? In short, will those on-screen preferences translate accurately into bottom-line profits? Because, in the end, that's how the industry measures success.
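That last question is, at least in principle, testable. Purely as a minimal sketch, with every number below invented for illustration: if one had both the panel's preview scores and the eventual sales figures for the same titles, a simple correlation would show whether on-screen enthusiasm tracks actual purchases.

    from statistics import correlation  # requires Python 3.10 or later

    # Invented data, for illustration only: panel preview scores [0-10]
    # for a handful of hypothetical titles, and their actual first-year
    # unit sales.
    panel_scores = [8.5, 7.2, 9.1, 4.3, 6.8, 5.5]
    unit_sales = [12000, 30500, 9800, 4100, 22000, 15700]

    # A high correlation would suggest the screen-reading panel's tastes
    # translate into bottom-line sales; a low or negative one would not.
    r = correlation(panel_scores, unit_sales)
    print(f"panel score vs. sales: r = {r:.2f}")

Even a strong correlation on such a small sample wouldn't settle the matter, of course; it would only tell us whether the panel's enthusiasm and the buying public's wallets point in the same direction. So far as I know, no one has published that comparison.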

If It’s Not in the Database…

The other day, my wife attempted to book a hotel room online, a relatively simple task even for those of us who had to learn computers at an advanced age, say, over thirty when we first encountered what then passed for personal computation. Everything went fine until she attempted to enter our home address.

Her reservation was rejected because our actual street name did not match the one in the U.S. Postal Service database. The Postal Service version eliminates the word "south" and runs the last two words together. We did manage to figure it out and get the reservation, but, frankly, this sort of hassle could foreshadow a far greater problem.

After the momentary hassle was resolved, I looked at my driver's license. It shows the correct street address, and not the one that the Postal Service database says is "correct." Then I went outside and looked at the street sign. The name on the sign matches my driver's license and the legal description in our property tax records. But… the government database gives the wrong address.

Does that mean that at some time in the future, if we have more security crackdowns at airports, I — or my wife, or some other unfortunate soul whose address does not match — will be dragged aside because the database used by the government is flawed, and because computers aren't smart enough to figure out that a phrase like "West Ridge" might be the same as "Westridge"?

So long as the mail gets here, I don't much care whether it's addressed to the equivalent of West Ridge or Westridge, but I do care when the wrong terms get put into a computer that may well affect my personal freedom because the correct address is flagged as "wrong" in a federal database. As we all know, computers aren't that "smart." If it doesn't match, it doesn't match. Now… all it takes is a literal-minded security official combined with a faulty database, and the difficulties begin. We've already had the spectacle of a five-year-old boy [as I recall] being denied passage on an airliner because his name matched that of a suspicious person.
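The underlying problem is easy to demonstrate. What follows is a minimal sketch in Python, using hypothetical addresses and an arbitrary similarity threshold; real postal-address standardization is far more involved, but the point is that an exact string comparison fails where even crude fuzzy matching succeeds.

    import re
    from difflib import SequenceMatcher

    def canon(address: str) -> str:
        # Lowercase and strip everything except letters and digits, so
        # spacing, case, and punctuation differences vanish.
        return re.sub(r"[^a-z0-9]", "", address.lower())

    def probably_same(a: str, b: str, threshold: float = 0.8) -> bool:
        # The 0.8 threshold is arbitrary, chosen for illustration; the
        # fuzzy ratio tolerates a dropped word like "south," where an
        # exact comparison cannot.
        return SequenceMatcher(None, canon(a), canon(b)).ratio() >= threshold

    on_license = "1234 South West Ridge Road"  # hypothetical street address
    in_database = "1234 WESTRIDGE ROAD"        # hypothetical postal form

    print(on_license == in_database)               # False: literal match fails
    print(probably_same(on_license, in_database))  # True for these two strings

A screening system built on the literal comparison in the first print statement is exactly the kind of system that would flag my driver's license as "wrong."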

Years ago, I read a story [whose title and author I've forgotten] in which an innocent man is executed because a computer glitch turned a citation for overdue library books into a murder conviction. At the time, I found it amusing… and far-fetched. Now… I'm beginning to worry that such a possibility is neither. What concerns me even more is that I haven't seen much written about such cases as indications of a systemic problem, rather than as isolated instances that will just go away or be the problem of a few individuals. But what if I — or you — happen to be those individuals?

Just how are we going to prove that the database is wrong — especially in time to catch the flight or avoid detention as a suspicious individual?

The "Facts" We All Know

A recent scientific article reported the results of a study of the conversational patterns of men and women. The results? That men and women actually utter almost the same number of words daily. The topics talked about did differ by gender — men talked more about tools, gadgets, and sports, women more about people — but the difference in the volume of conversation indicated that, on average, women talked only about three percent more on a daily basis. More interesting was the fact that the “extreme” talkers were male.

What I found most interesting about the study was its genesis. One of the researchers kept coming across references to a “fact” that women talked three times as much as men did, but he could never find any research or statistics to support that contention. I can’t help but wonder how many more such facts are embedded in our culture… especially in the science fiction and fantasy subculture.

Science fiction, in very general terms, is supposed to be based on what is theoretically possible in the sciences, and over the years I've heard more than a few authors talk, both sotto voce and loudly and boisterously, about how they wrote "hard science fiction," solid stuff, based on science. And to give them their due, most of them did. But with a tradition of such "hard" SF going back over seventy years, why is it that SF writers have had such a poor record of predicting the future?

The first reason, I submit, is that many of the "facts" accepted by writers don't stay facts. They were theories or assumptions based on science that was either already outdated or that soon became outdated, yet they remained widely accepted. For example, Tom Godwin's "classic" story ["The Cold Equations"] basically suggested that there was absolutely no flexibility in the oxygen supplies of a spacecraft, largely, I believe, because he did not anticipate oxygen recycling and the like, or the kind of human engineering and ingenuity that allowed Apollo 13 to make a miraculous return to earth. The other problem with "The Cold Equations" is that Godwin combined the "laxity" of long-accepted technology with the totally tight margins of experimental and pioneering craft. The only barrier against intrusion into a spacecraft about to launch was a sign? For a culture sending ships across interstellar space? Yet, so far as I can tell, few if any writers or critics noted this at the time.

Also, like the "conversation" fact uprooted by this recent study, there are other cultural facts and myths so deeply a part of our society [as well as different "facts" deeply rooted in other cultures] that we seldom question them. There is the "fact" that the ace pilot is tall, lean, and rangy. In reality, the pilots best able to take high gee forces are usually shorter and less rangy.

A second reason is that technology — or magic, if we're considering fantasy — is only one of many factors. Costs and economic viability usually trump technology. That's the principal reason why there's no follow-on aircraft to the Concorde, and why we don't all have those personal helicopters predicted at the 1939 New York World's Fair. Yet all too many SF authors don't consider the economics of their cultures or futures.

So… if you really want to write something that's accurate, consider very carefully those "facts" you have tucked away… and don't forget about the cost of implementing that nifty technology.