Archive for the ‘General’ Category

The Instant Society… and Rise of Stress and the Decline of Forethought

Final examinations are nearing at Southern Utah University, and student stress is building to incredible levels, as it does near the end of every semester these days.

Every day, my wife, who is a full professor at S.U.U., is deluged by students who are “so stressed” that they’re having trouble coping. They have great trouble dealing with the term papers, the projects, the juries, the performances, and the examinations that all come due in the last week of the semester. Now… such requirements aren’t exactly new. They’ve been a part of collegiate curricula literally for generations, and my wife and other professors continually warn students not to procrastinate and attempt to get them to think ahead. But very few of them do, and this generation seems to have far more difficulty in dealing with the situation than any previous generation. Yet the world that awaits them beyond school is filled with deadlines and pressures, and eliminating or reducing such pressures from college, as some institutions are apparently attempting to do, hardly seems a good way to prepare students for “real” life.

Why? Is it just that they’re more verbal about the pressures? No… I don’t think so. There are too many other indications that they actually do feel stressed out. But why? Why should these college students be so stressed? They have the highest standard of living of any group of students in history and the most opportunities. When I was their age, the country was in turmoil, there were riots over the Vietnam War, and a goodly percentage of young men faced the draft or enlisted in the service of their “choice” before the draft claimed them for the Army. When my parents were students, it was the middle of the Great Depression, Germany was turning to Nazism, and World War II loomed. When their parents were students, the era of the Robber Barons was in full swing, and the nation was heading into World War I.

The vast majority of problems faced by today’s students are internal, arising out of their own chosen life-styles and habit patterns. Yes, there is a drug problem, but they don’t have to use or abuse; that’s a matter of choice. Even war, for them, is a matter of choice, given that we have an all-volunteer military. HIV, AIDS… those, too, are essentially a matter of choice, except in very rare cases. Whether one gets into the “right” university or graduate school is not a matter of survival, unlike being conscripted for WWI, WWII, Korea, or Vietnam. And while the “right” school may confer greater opportunities, those opportunities don’t come down to actual survival, but to a higher level of income and prosperity.

Yet “stress” and college counselors abound, and most students seem to complain about being “stressed out.”

I’d submit that this widespread epidemic of stress is the result of our “instant society.” Back before the age of computers, doing something like a term paper required a certain amount of forethought. Papers, strangely enough, were far longer then and required more research, with extensive footnotes and bibliographies. Typing them took more time, and anything more than punctuation revisions could not be made without retyping the entire page. Tables had to be carefully measured and hand-typed. Graphs were hand-drawn. What can be done in minutes today on a computer took hours and then some.

Today’s students are used to getting everything “instantly.” When I was a student, unless you were wealthy, telephone calls required either lots of quarters and a pay phone [now nearly obsolete] or a recipient who would accept the charges. That necessitated at least some forethought. Today, it’s just flip open the cellphone and call. There was exactly one fast-food restaurant in the town where my alma mater is located, and it was a long walk from campus, and the college grill closed at 10:00 p.m. And late-night or Sunday shopping for paper or supplies… forget it.

Now… I’m not praising the “good old days.” I’m just saying that they were different, and that difference required a basic understanding that you couldn’t do everything at the last moment, because very little in society was “instant.” Even so, some students procrastinated… and flunked out. Today, they can procrastinate, and technology sort of allows them to throw something together… but it’s often a mess… and they end up stressed out.

No matter what anyone says, it just doesn’t occur to most of them to plan ahead. Why should it? Between watered-down high school curricula where last minute preparation usually suffices, especially for the brighter students, and a society that caters to instant gratification on all levels, very few of them have ever had to plan ahead in terms of dealing with day-to-day work and studies.

They’re intelligent; they’re incredibly quick at some things, like video and computer games and tasks and internet searches. What they aren’t good at is foreseeing the convergence of the mundane into a barrier that can’t be surmounted at the last minute. Nor are they all that good at seeing beyond the immediate visual superficiality and assessing how what they see may play out in the long run.

So… we have stressed-out students, many of whom will turn into adults who end up even more stressed out when it turns out that neither technology nor the instant society has an instant solution for their lack of forethought… when they truly have run out of time.

The Commentator Culture

Last weekend, as with almost every weekend this fall, the college football pundits were proven wrong once more, as Oklahoma upset Missouri and West Virginia lost. All of which got me to thinking about just that — commentators.

We have sports commentators, who are “experts” on everything from bowling, golf, and football to anything that appears on some form of television — and that’s anything professional, in addition to the collegiate “money” sports. We have financial commentators. We have political commentators. We have news analysts and commentators. We have religious commentators. We even have F&SF reviewers and commentators.

Yet all too many of these commentators are really just dressed-up versions of Monday morning quarterbacks, with explanations of why things happened after they already did. Pardon me, but anyone with a certain amount of intelligence and knowledge about a field ought to be able to explain what did happen. But how many of them, particularly outside of sports, have that good an average in predicting what will happen?

Besides, what about the old idea of thinking for oneself? Doesn’t anyone think out their own views — by themselves — anymore?

While it’s always been obvious that a certain percentage of any population is unable to formulate coherent and logical opinions about much of anything, I have to wonder whether many are even trying these days. Oh, I’m certain that people retain that capability, but with instant polls on everything from whether anyone agrees with what Celebrity X is doing to who leads in what Presidential primary state or whether the results of the Hugo voting are superior to the results of the World Fantasy Awards or whether some other writers and books really deserved the “award,” we’re inundated with commentary and interpretation of news, polls, and events, so much so that it’s often hard to find a complete set of facts by which one might, just might, have the opportunity to make a judgment based on facts, rather than on commentary.

It almost seems that, in more and more fields, commentary is replacing facts and news about the events, as if readers and viewers could not be bothered with learning the facts and deciding by themselves. I know that I have to take and read more and more periodicals, often more obscure ones, just to find information. Even news stories in the local papers are filled with speculations and commentaries on why something happened, so much so that it’s difficult, if not sometimes impossible, to discover the facts.

I’m dating myself, but I really miss the attitude of Jack Webb on the old Dragnet, when he’d say, “Just the facts, sir, just the facts.”

That’s one reason why I’ve been so pleased with the unpredictability of the college football season. At least somewhere, real life is destroying the false image of the infallibility of “professional” commentators.

Writers: Is It Overused “Theme” or Truthful Observation?

Over the years, I’ve noticed that various readers and reviewers have remarked on the fact that I seemed obsessed with the “theme” of power, and sometimes the “theme” of gender and sexual politics. Other writers get identified with these or other “themes,” and usually, but not always, the noted identification carries the implication that the writer under discussion should get on with it and stop pounding at that theme.

But… is there a distinction between observation of human nature and a theme that underlies human behavior? Or is it just a matter of reader and reviewer opinion? Is it a repetitive and unnecessary theme when the reader or reviewer doesn’t want to accept the observations, but merely life-like when they do?

For better or worse, before I became a full-time writer, I spent almost thirty years in the worlds of the military, business, and government and politics, and in these worlds I received a thorough education in how power is used and abused in all fashions by human beings. As many others before me have noted, and as doubtless many others after me will note, very few people really understand and know how to use power effectively, and even fewer use it for what might be called the “greater good.” This is not a “theme.” It’s an observed fact, and if I include fictionalized versions and variations on what I’ve observed, as an author, I’m being true to human nature.

This issue applies to other aspects of writing science fiction and fantasy as well.

In the Spellsong Cycle, Anna continues to use the same tactics, often in battle after battle. So do various others of my characters in other books, and some readers have complained that was “unrealistic,” that such tactics wouldn’t continue to work. In combat, effective tactics are based on the abilities of the combatants, the weapons at hand, the geography, and various other limited factors. The range of effective tactics is indeed limited, and tactics are used effectively over and over again. This is why military strategists study ancient and modern campaigns. In addition, weapons change their form, but their functions change slowly over time, and sometimes not at all over centuries. Today, the function of the vast majority of modern weapons is the same as two centuries ago — to apply various destructive and explosive devices to the most vulnerable aspects of the enemy. We’ve gone from musket balls to cluster-bombs and RPVs, but the function remains the same. Even in science fiction, this observation holds true.

Likewise, so does another human variable: the slowness of human beings, especially in groups, to learn from experience. Even after WWI, the armies of most industrialized nations, including the U.S., still retained cavalry units — with horses — despite the clear knowledge that mounted cavalry was ineffective and counter-productive against such weapons as the machine gun. Castles took a long time to vanish after the development of artillery. Yet I can’t count the number of times I’ve had readers complain about — and even some reviewers comment on — why one side or the other doesn’t learn how to cope with something after one or two battles. Borrowing from another medium… Duhhh!

After a certain amount of experience, I learned that fights of all kinds consist of short and violent action, punctuated by far longer periods of comparatively little action. As a beginning Naval aviator, I was told that flying was “99% boredom and one percent sheer terror.” In a sense, it’s true. Most time in the air is spent getting to a place where intense action occurs or is undertaken before you return. Some missions are designed to have no action; you’re either gathering information or waiting on station in the event something might happen. Yet far too many books depict only the action and all action… and more action. To me, that’s incredibly unrealistic.

Yes, fiction has to offer entertainment, and no one wants to read, and I certainly don’t want to write, something as boring as a moment-by-moment adaptation of boring reality. By the same token, not taking into account the crux of human nature and human brilliance and stupidity — and at least some of the waiting in between — can only result in the written version of a high-speed video game.

I don’t write those, but they do get written, and that’s part of the marketplace. I don’t mind that, either, believe it or not, but what I do mind is when readers and reviewers with a “video-game” mindset criticize those authors who are trying to enlighten and educate, as well as entertain, because their books are more true to life. Some themes are true, both in life and fiction, and ignoring them is one reason why conflicts like Vietnam and Iraq, or the Middle East, or… [pick your favorite politico-military morass] have occurred and will continue to happen.

Overused theme or time-tested observation? In the end, it still depends on the reader’s viewpoint.

Living Forever — Fact, Faith, or F&SF?

The other day, my wife made an interesting observation. She asked, “If so many people believe in Heaven and an afterlife, and Heaven is so wonderful, why is everyone trying to live forever?” That got me thinking about the associations and corollaries. According to the polls and statistics, the United States is the most “religious” nation in the world. And from what I read and can determine, we’re also the nation that spends by far the most money on medical research and procedures to keep older people young and to extend life-spans. We’re also the nation where talk of practical immortality and agelessness holds great sway — where, supposedly, the singularity will lead to agelessness, if not immortality. The issue of immortality has been a staple of both science fiction and fantasy from the beginning, from the immortal land of faerie to such works as Zelazny’s This Immortal.

Yet, if the true believers are right, what’s the point? Heaven is obviously a far better place than here on earth. If it weren’t, how could it be Heaven? So why are we spending billions to keep the most elderly barely alive, if that, when they could be in a better place… that is, if you’re a believing and practicing Christian or Muslim? And why have so many books and stories centered on immortality?

Now, I’m not disparaging medicine or medical research. People shouldn’t have to suffer horrible diseases or die of infections or be paralyzed for life or otherwise incapacitated when medicine can cure them or improve their lives or conditions. Yet the plain fact of medicine is that, in the United States, the vast majority of medical care and expense goes to those who are in their last year of life, and far, far less money in research and treatment goes to children and infants.

If those dying of old age are going to a better life anyway, wouldn’t it make much more sense to spend more of that medical funding on finding cures for children’s ailments… or providing better nutrition and preventative care for the young?

But then, do all those true believers really believe in Heaven and the afterlife? It’s often been said that actions speak louder than words and that people put their money in what they believe… and they read that which interests them. If that’s so, all the medical scrambling to extend lives and find immortality might suggest a certain, shall we say, shallowness of belief. Even hypocrisy, perhaps? Or, too, perhaps they do indeed believe in an afterlife, and subconsciously don’t want to face the theological nether regions reserved for those whose actions are less than charitable and worthy.

Either way, I find it food for thought. Exactly why does a society with so many true believers support medical age-extension and the quest for physical and earthly immortality anyway? And why is there now such an increase in books about immortal vampires and werewolves and the like? Are the two trends connected… and if they are… how do they square with the fact that the fastest growing religions are those which are best described as fundamentalist evangelical… with the attendant belief in an afterlife?

Genre Chaos

This past week saw yet another group of reviewers post their “best books of the year” listings, and there will be more yet to come. In issues to come, at least four or five of Locus’s respected gurus will list their choices. Why, if an author can’t get something somewhere into something labeled as best, he or she obviously hasn’t been trying hard enough.

But what constitutes “the best”? According to my outdated Funk & Wagnall’s, “best” means “of the highest quality.” This doesn’t help much, because, also according to that same dictionary, “quality” is defined as “the degree of excellence,” and “excellence,” in turn, is defined as “superior quality.” When definitions circle back on themselves like this, it’s a fairly good indication of subjectivity. “Excellence” or “quality” falls into that category that might be described as, “I can’t really quantify or objectively explain why this is good, but by [the appropriate deity] I know excellence when I see it.”

Compared to what? To other books just like it? To all fiction? To a selected body of work based on the subjective criteria of the reviewer?

As all too many readers of speculative fiction know, a number of writers of “mainstream” fiction, or thrillers, or romances, or mysteries have adopted SF themes in their work — the majority of them, sad to say, badly, if not totally ineptly. The critique has often been, first, that yes, they were writing science fiction and, second, that they did it badly. The real critique should have been more direct — they wrote bad books.

This basic issue of quality has been obscured by the “dictates of the marketplace” and aided and abetted by the growth of the book chain superstores [yes, yet another great sin laid at the feet of the evil empires of book marketing]. In their zeal to sell as many books as possible as easily as possible, clearly in competition with the comparative mindlessness of broadcast/satellite/multimedia entertainment, the publishers and the book chains have broken fiction into genres and sub-genres, and sub-sub genres. We have whole sections of bookstores devoted to media-spin-off-teenage vampire series or Star Trek spin-offs, or… the list is long and getting longer.

Yet for all this splintering of fiction into genres or sub-genres — perhaps better identified as niche marketing opportunities that require less thought and consideration by would-be readers — what we’re seeing is a lower and lower percentage of the population remaining as serious readers. One explanation for this is that reading is “merely” entertainment, and with the proliferation of other venues of entertainment — video and online gaming, satellite and cable television, multiplex theatres, DVDs, year-round broadcast sports, etc. — the proportion of readers is bound to decline. I certainly can’t argue with that, because it’s exactly what appears to have occurred.

When I was growing up, back in the dark ages when television meant three network channels and one independent local TV station, reading was effectively subdivided into non-fiction, fiction, and magazines and comics. Comics were for kids, and reading only magazines suggested a certain lack of intellectual perseverance, which may have been why there were book clubs where people really read the books and discussed them. And… more people of all ages used to read books.

All the multiplicity of fiction genres, and their accompanying awards, to my way of thinking, puts more and more emphasis on following genre or sub-genre rules than on writing a good, intriguing, entertaining, well-written, and logically internally consistent work. Yet, every time I look around, it seems as though there’s another set of awards, based on another media offshoot, or another genre or sub-genre. As all these awards have proliferated, and as the book marketing empires segregate books, often quite artificially, a smaller and smaller percentage of the general population reads. Amid all the genre-chaos and confusion, a few handfuls of authors succeed in establishing themselves as “brands,” which is one of the few ways in which a writer can transcend the limitations of genre-niche-marketing taken to extremes. Others work on media or gaming tie-ins. The rest… well, the numbers are showing there are more and more writers being published, and most of them are selling fewer copies of individual titles than their predecessors of a generation earlier. Yet the multiplicity of awards continues to proliferate.

But, no matter, if I get an award for the best novel dealing with alternate history of an established fantasy series universe [if it’s my own universe, anyway], based on the greatest logical constructions of fantasy improbabilities, I’ll take it… graciously and gratefully, and my publicist will probably find a way to get it on the cover of my books after that.

The Wrong/Incomplete Data

Several years ago, an acquaintance made a comment that almost caused me to take his head off. He said, “Your wife has a really cushy job. She doesn’t even leave for work until 9:30 every morning.” I refrained from homicide and tried to explain that, first, because she is both a college professor and an opera director, as well as a performer, she seldom got home before nine or ten o’clock at night, and usually it was later, far later; that she worked four out of five weekends at the university; and that overtime compensation was non-existent. He replied by pointing out that she only had to work nine months out of the year. I just shook my head and walked away, because that wasn’t true, either. Generally, she only gets paid for nine months, but she works between eleven and twelve months a year — admittedly “only” about forty hours a week in the summer — to catch up on what won’t fit in the year, to research and often write the shows for the coming year, to conduct job searches, and to write the required scholarly articles. And for all that, with all of her graduate work and international expertise, and as a tenured full professor, she makes far less money than almost any of our offspring — only one of whom has more degrees.

I’m not writing this to say how down-trodden professors are — I do know some who truly skate by, although they’re a tiny minority, and that could be yet another example — but to offer the first instance of what might be called “data abuse.”

The second example is that of the Mars probe that crashed several years ago, because its systems clashed. One system had been programmed for “English” measurements, the other for metric. A third example is NASA itself, and the fact that manned space exploration has actually declined in scope and in accomplishments ever since the Apollo missions of more than 30 years ago.

A fourth example is the issue of school voucher programs, a proposal that was just defeated in Utah. Proponents argued that providing vouchers for roughly $3,000 a year per student for those who wished to go to private schools would actually allow more money for those students who remained. Mathematically, this would have been true, but the most salient points were minimized and never addressed in all the sound-bite coverage. First, even if every student received the maximum voucher amount, on average families would have to come up with an additional $4,000 per student. Exactly how many families making less than the Census Bureau’s “middle-class” income of $42,000 are going to be able to come up with an additional $8,000 in after-tax income [assuming two children in school]? Currently, only about 15% of all private school students receive financial aid, and that means that schools cannot afford to grant significant additional aid, not without raising tuition. Second, a great many communities in the state have no private schools at all. Third, the program did not provide additional funding to pay for the voucher program, but would have diverted it from existing [and inadequate] public school funds. So, in effect, the voucher program would not have benefited low-income students, or most middle-class students, but, for the most part, would have subsidized the tuition of those who could already afford such schools. Certainly, the program would have done little for the public school system, even though the supporters claimed that it would have.
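The voucher arithmetic above can be sketched in a few lines of Python. The roughly $7,000 implied tuition is my inference from the $3,000 voucher and the $4,000 average gap cited in the post, not a figure from the ballot materials:

```python
# Back-of-the-envelope voucher arithmetic, using the figures cited above.
voucher = 3_000            # maximum annual voucher, per student
gap_per_student = 4_000    # average additional tuition families must pay
implied_tuition = voucher + gap_per_student   # ~$7,000 (my inference)

children = 2
family_gap = gap_per_student * children       # $8,000 for two children

median_income = 42_000     # "middle-class" income cited in the post
share_of_income = family_gap / median_income  # ~19% of pre-tax income

print(implied_tuition, family_gap, round(share_of_income, 2))
```

Even before taxes, that gap is nearly a fifth of the cited middle-class income, which is the crux of the affordability objection.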

Another example is the “core” inflation version of the Consumer Price Index, which is supposed to measure the rate of price inflation and is the index the government uses to measure how inflation affects consumers. Several years ago, however, changes in the prices of food and energy were removed because they were too “volatile.” Yet 67% of all petroleum products go to transportation, and the majority of that goes into the tanks of American cars. So, even though the cost of a barrel of oil has risen almost 60% over the past year or so, that increase doesn’t appear as part of the inflation measurements. The other 33% of the petroleum we use goes into making industrial products, such as rubber, plastics, and chemicals. But those costs are reduced by “hedonics,” or implied quality improvements. If your new car has better disc brakes, or cruise control, or automatic stability, the CPI auto component for durable goods is adjusted downward to reflect the quality improvement. The only problem is that the price paid by the consumer doesn’t go down, but up, yet the statistics show a decline in the durable goods index.
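To make the core-versus-headline point concrete, here is a toy index calculation. The category weights and the non-energy price changes are invented purely for illustration; only the roughly 60% oil-price rise comes from the discussion above:

```python
# Toy price index: "headline" inflation includes every category,
# while "core" inflation strips out food and energy as too "volatile".
weights = {"energy": 0.10, "food": 0.15, "everything_else": 0.75}  # invented
price_change = {
    "energy": 0.60,           # ~60% oil-price rise cited above
    "food": 0.05,             # invented
    "everything_else": 0.02,  # invented
}

headline = sum(weights[k] * price_change[k] for k in weights)
core = price_change["everything_else"]  # food and energy excluded

print(f"headline: {headline:.2%}, core: {core:.2%}")
```

With these made-up weights, the headline figure comes out several times the core figure, which is exactly why dropping “volatile” categories makes measured inflation look tamer than what consumers pay at the pump and the grocery store.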

These are all examples of what I’d loosely term “using the wrong data.” At times, as in the case of the Mars probe, such usage can be truly accidental. At other times, as in the case of my acquaintance, the incorrect usage arises because the user fits the existing data to a prejudice and doesn’t really want to seek out conflicting and more accurate data.

In other cases, as exemplified by the NASA budget, other data, chosen to exploit other political priorities, take precedence. And, as illustrated by the voucher issue or the CPI measurements, all too often those with a political agenda have no real interest in using or examining the full and more accurate range of data.

What is often overlooked in all of these cases, however, is that in none of them did those involved use “incorrect” data. The figures used were accurate, if often selective. Yet in political and policy debates, in inter-office and departmental budget or resource-allocation tussles, even in conversation, what people focus on all too often is whether the numbers are accurate, rather than whether they’re the numbers that should be considered at all. Seeking accuracy in irrelevant data isn’t exactly a virtue.

It’s not just whether the data is accurate, but whether it’s the right data at all.

More on the Hugos

Several people have contacted me about my proposal for a Hugo for Betty Ballantine, and one pointed out that the World Science Fiction Society Constitution limits what Hugos can be given, and further stated that the special award given to Betty in 2006 was probably the only practical kind of recognition possible.

After reviewing the WSFS Constitution, I will note that section 3.3.15 states:

Additional Category. Not more than one special category may be created by the current Worldcon Committee with nomination and voting to be the same as for the permanent categories. The Worldcon Committee is not required to create any such category; such action by a Worldcon Committee should be under exceptional circumstances only; and the special category created by one Worldcon Committee shall not be binding on following Committees. Awards created under this paragraph shall be considered to be Hugo Awards.

I note the last sentence: Awards created under this paragraph shall be considered to be Hugo Awards.

Now… was Betty Ballantine’s special award in 2006 a Hugo under the rules? I honestly do not know, but, given the comments I’ve received, it doesn’t appear to be, and more than a few life-time professionals in the field have declared that what Betty received is not a Hugo.

I still believe that Betty deserves a Hugo, but in studying the WSFS Constitution, I discovered what I believe to be a serious fault, and the fact that Betty has not received a Hugo is just one example of that fault.

The fault is simple, but basic and obvious. There is no single standing and permanent award for achievement in a body of work, whether in writing, editing, art, or publishing. Every single award is for work appearing in the previous year. Now, for authors who have a substantial body of work, and who have not received a Hugo, at some point, there is a chance that a “late-in-career” book will receive a nomination and a Hugo, one that it probably does not merit, in order for the voters to recognize, if belatedly, someone who has been overlooked in the annual popularity contest. The same is true of artists, and under the revisions involving editors, for them as well.

Wouldn’t it be far better simply to create an ongoing Hugo for lifetime achievement, the way the World Fantasy Convention has [horror of horrors], than to keep ignoring those whose contributions may have been less spectacular in any given year, but whose overall achievements dwarf those of many one-time Hugo award winners?

If the WSFS does not wish to address this, then perhaps the Constitution should be amended to read — “The Hugo awards reflect only popularity among a limited number of readers in the previous year and do not attempt to reflect continued and sustained excellence by members of the speculative fiction community.”

Is this an issue that members of the WSFS wish to address, one way or another, or is everyone happy with the continuation of the annual popularity polls and the ignoring of long-standing contributions to the field?

A Hugo for a True F&SF Pioneer

When I was at the World Fantasy Convention earlier this month, I had the privilege of having breakfast with Betty Ballantine, whom I had never met before. Even at 88, she’s sprightly and has a cheerful and feisty wit, but after that breakfast, I realized that only a comparative handful of people truly know or understand the contribution that Betty, along with her late husband Ian, made to western literature and publishing, and particularly to science fiction and fantasy.

Betty and Ian began importing mass-market paperbacks from the United Kingdom in 1939 before helping to form Bantam Books and then launching their own firm, Ballantine Books. Prior to the Ballantines’ efforts, there were virtually no paperback books in the United States, other than those they themselves had imported. Ballantine Books became one of the earliest publishers of original science fiction books, publishing such authors as Arthur C. Clarke, Anne McCaffrey, and H.P. Lovecraft. They even published the first “authorized” U.S. edition of J.R.R. Tolkien’s works. By their efforts, they effectively lifted science fiction and fantasy out of the pulp magazines and into paperback books and created a commercially viable genre that in turn laid the groundwork for such media successes as Star Trek and Star Wars, not to mention such later bestsellers as The Wheel of Time and Harry Potter.

Of course, one of the reasons why Betty was at the convention was that she had been selected to be the recipient of a Lifetime Achievement award from the World Fantasy Convention. But, as noted by many, it did seem rather strange, in retrospect, that this woman, who has done so much for both science fiction and fantasy, has never been honored with a Hugo — the most recognized popular award in speculative fiction.

While I understand that L.A. Con IV did offer a “special committee” award to Betty Ballantine in 2006, a special committee award is almost a slap in the face for someone to whom every speculative fiction author and reader owes so much.

All too often, those who pioneered and made something possible are forgotten in the glare of the successes of other people, successes that the pioneers made possible. That’s particularly true today, when fame is more fleeting than ever and celebrity so often overshadows true achievement. Sometimes, after they’re dead, such visionaries and pioneers are remembered and memorialized, but while that’s great for posterity, it really doesn’t show much appreciation for the real living person, and Betty certainly deserves that appreciation.

So… what about a Hugo for Betty Ballantine in Denver next year? A real Hugo, voted on by all those whose reading was made possible and affordable by Betty and by those whose writing and cinematic and video achievements might never have come to be without her efforts?

And… for the record, and the skeptics, I’ve never been published by any imprint even vaguely related to those created by Betty… and I strongly doubt I ever will be. I just happen to think it’s a good idea.

The Under-Recognized Passion… and Its Future

Most of us, when someone mentions passion, think of sex, at least first. But an article in New Scientist got me to thinking about another passion that is far stronger and far less recognized than sex — greed.

In January 1820, a transplanted German who had taken the British name of Frederick Accum published a book, A Treatise on Adulterations of Food, and Culinary Poisons. The book was an exposé of how those in London’s food trade adulterated their wares and poisoned their consumers. Accum named names and spared no one, illustrating how bakers used gypsum and pipe clay in bread, how lemonade was flavored with sulfuric acid, how new wines were aged with sawdust, and how phony green tea was created using poisonous copper carbonate.

And what was the reaction to Accum’s book? It sold out, and then came anonymous threats against him. Those who didn’t like what he wrote followed him around until he was observed ripping several pages containing formulae from a book in the Royal Institution library. He was immediately charged with theft, and his reputation was attacked and destroyed, all for the sake of profit, however obtained. Although the charges were dismissed, Accum was forced to return to Germany. Not until thirty years later did the British medical journal The Lancet and Dr. Arthur Hill Hassall address the problem, and Parliament finally passed the Food Adulteration Act in 1860. It took far longer in the United States, until after the muckraking of the early 1900s.

You think that’s all in the past? Flash forward to today.

We have had the experience of cheap pet food from China being contaminated, and almost every week some food manufacturer is recalling something. And it goes well beyond food.

Enron built a phony trading room in order to further its energy shell game, and then left all the shareholders and employees holding the bag. Similar shenanigans occurred with WorldCom and Global Crossing. And what about all the sleazy mortgage brokers who sold naive homeowners mortgages that they wouldn’t be able to afford once the “teaser” rates vanished? Or the payday lenders who charge effective interest rates of 100% and more?
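The payday-lending arithmetic is worth spelling out. Here is a minimal sketch of how such effective annual rates are computed; the $15-per-$100, two-week terms are illustrative figures of my own, not taken from any particular lender:

```python
def effective_apr(fee, principal, term_days):
    # Annualize the per-term fee as a simple, non-compounding rate.
    return (fee / principal) * (365 / term_days) * 100

# A hypothetical but typical payday loan: a $15 fee to borrow $100 for two weeks.
print(round(effective_apr(15, 100, 14)))  # prints 391
```

Even this simple annualization lands near 400%; compounding the fee over repeated rollovers would push the figure far higher still.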

Even in “legitimate” commerce, greed has its place, from the hedge fund traders who make hundreds of millions of dollars for shifting paper… a number of whom just lost hundreds of millions, if not billions of dollars… to the airline industry.

As just one example, airlines have scheduled 61 flights to depart from New York’s JFK International Airport between 8:00 A.M. and 9:00 A.M. every morning. There’s the small problem that existing systems and technology only allow for 44 departures an hour. The Federal Aviation Administration has suggested either (1) charging airlines more for “prime” take-off slots or (2) limiting the number of flights per hour. The Air Transport Association, representing the major carriers, finds both options unacceptable and states that the FAA needs to adopt new GPS-based and high-tech radar control systems. The FAA probably will have to do this sooner or later, but there’s a small problem. It’s called funding. The airlines don’t want to pay for improving a system that’s already heavily subsidized by the taxpayers; Congress doesn’t want to pay; and neither do passengers.
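To see why that gap matters well beyond the first hour, consider a minimal sketch. Only the 61-scheduled and 44-capacity figures come from the paragraph above; the later-hour schedules are hypothetical:

```python
capacity_per_hour = 44        # departures the system can actually handle
scheduled = [61, 50, 44]      # 8 A.M. hour from the text; later hours hypothetical
backlog = 0                   # flights pushed into the next hour

for hour, demand in zip(("8-9", "9-10", "10-11"), scheduled):
    # Each hour must absorb its own demand plus whatever spilled over.
    backlog = max(0, backlog + demand - capacity_per_hour)
    print(f"{hour}: {backlog} flights delayed into the next hour")
```

Note that in the third hour, even though the schedule exactly matches capacity, 23 flights are still backed up: once the morning is overscheduled, the delays persist all day unless some hour is scheduled well under capacity.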

What else is greed besides not wanting to supply honest goods — in this case, on-time departures — for a reasonable price? Instead of trying to solve the problem, the airlines and the politicians will ensure we’ll get more delays because everyone wants a service more cheaply than it can be provided… and that’s also a form of greed.

Oh… and by the way, in 1820, the last section of Accum’s Treatise concluded by recommending that “the painting of toys with colouring substances that are poisonous, therefore, ought to be abolished.”

So why are we still seeing children poisoned by lead paint almost 200 years later? And why will this still be a problem fifty or a hundred years into the future?

Tell me again why greed isn’t stronger than sex. Except… sex sells more books, and I keep trying to ignore that, because sex is transitory, and greed isn’t.

The "Singularity" or "Spike" That Won’t Be

Over the past decade, if not longer, more than a few futurists have predicted that a decade or so from now, modern technology will change human society on a scale never before seen or imagined, from implementing the linked society envisioned in Gibson’s Neuromancer to wide-scale nanotech and practical AIs.

It won’t happen. Not even close. Why not? First, because such visions are based on technology, not on humanity. Second, they’re based on a western European/North American cultural chauvinism.

One of the simplest rules involved in implementing technology is that the speed and breadth of such implementation are inversely proportional to the cost and capital required to implement that technology. That’s why we don’t have personal helicopters, technically feasible as they are. It’s also why, like it or not, there’s no supersonic aircraft follow-on to the Concorde. It’s also why iPods and cellphones are ubiquitous, as well as why there are many places in the third world where cellphones are usable, but where landlines are limited or non-existent.

A second rule is that while new technology may well be more energy efficient than older technology, its greater capabilities result in greater overall energy usage, and energy is getting ever more expensive. A related human problem is that all the “new” technology tends to shift time and effort from existing corporate and governmental structures back onto the individual, sometimes onto higher-paid professionals. For example, the computer has largely replaced secretaries and typists, which means that executives and attorneys spend more time on clerical work. Interestingly enough, both the hours worked/billed and the rates of pay for junior attorneys are way up. Another example is how financial institutions at all levels are pushing their customers to “go paperless.” I don’t know about everyone else, but I need hard copies of a number of those documents. So if I “go paperless,” all it means is that I spend my own time, energy, and paper printing them out.

In short, technology is expensive, and someone has to pay for it, and it’s doubtful that we as a world have the resources to pay for all that would be required to create the world of the spike or singularity.

Another factor involved in tying all one’s bills and payments to automated systems is that one loses control — as my wife and I discovered in trying to unscramble all the automated payments her father had set up. After his death, in some cases, it was impossible to even discover where the payments were going. A number of companies kept charging for services he obviously didn’t need and siphoning money from his bank account, despite the fact that he was dead. It took the threat of legal action and the actual closure of some accounts to get the banks to stop honoring such automatic withdrawals.

Technology has also enabled a greater range of theft and misrepresentation than was ever possible before the internet and computers.

The other factor is cultural. The idea of a spike or a singularity assumes that everyone on the planet wants to be plugged in, all the time, and on call continuously, while working harder and harder for the same real wages in employment positions that seem increasingly divorced from what one might call the real physical world. While those in the upper echelons of the professions and management may find this useful, even necessary, exactly how are the vast numbers of service workers employed at Wal-Mart, McDonald’s, Home Depot, and the like going to afford such services when they’re far more worried about basic health care?

Am I saying the world won’t change? Heavens, no. It will change. More people will in fact have cellphones, and, like it or not, it’s possible that they’ll replace location-fixed telephones for the majority of the population. Portable devices such as the iPhone will change entertainment, and fewer books will be printed and read, and more of what will be read, either in print or on screen, will be “genre” fiction, how-to, or religion. Published poetry and “mainstream literature” will decline further. More and more “minor” lawbreaking will be detected by technology in industrialized societies. “Major” lawbreaking may even be treated and handled by some form of cranial implant and locator devices. Various forms of environmentally less damaging power generation will doubtless be adopted.

But for even a significant minority of the world’s population, or even that of the USA, to engage in a “post-singularity” world will require that more and more other people take care of support services, such as real-world, real-time child care, medical services, and the physical production, transportation, and distribution of food. And don’t tell me that we’ll have duplicators for food. That’s most unlikely, because making such devices nutritionally practical would require analytical and formulation technology that we won’t have, not to mention a large “stockpile” of the proper sub-ingredients. And, of course, a great deal more energy at a time when energy is becoming ever more expensive.

That doesn’t even take into account the cost and technological requirements for medical services and maintenance… and that’s a whole other story.

Economics and the Future of Biotech

Recently, I exchanged several emails with a newer writer, David Boultbee, on the subject of plants genetically engineered to remove toxins from land and water, and the exchange got me to thinking. A number of years ago, when I was a full-time environmental regulatory consultant, a number of cities were experimenting with various ways in which growing plants could be used to filter and purify sewage and waste water, including removing heavy metals and various types of organic and bacterial contamination.

That was twenty years ago, and there’s been surprisingly little progress in this area, particularly given the need. That brings up the question of why such progress is so slow… and the answer, I believe, is quite simple. It’s not a question of biology or even development costs, but of the structure of our economic system.

Growing plants in large concentrations effectively constitutes agriculture. These days, agriculture is largely unprofitable on anything but a large scale, and the greatest amount of profit doesn’t usually lie in producing and selling the raw material, but in the distribution and end-point sales. That’s why orange growers, almond growers, and others form grower cooperatives that attempt to control the product all the way from production to final [or next-to-final] sales.

Now… even if a genius biologist does produce an oilseed plant with a huge amount of refinable oil, where does the profit lie? With the refiner and distributor, who must earn profits competitive with other industries in order to attract the capital necessary to build the enormous required infrastructure. And in what industries do the highest profits lie? In those that produce small goods with low production costs, high demand, and an existing market.

Agricultural products seldom fit that profile. Take wheat. It’s practically ubiquitous, world-wide, and while different varieties have been developed for different uses and climates, within those climates any competent farmer can grow it. The entire U.S. farm subsidy program was developed because too much of too many agricultural products was being grown, with the result that prices fell so low that too many farmers went bankrupt, to the point that, as noted above, only large farms — or specialty farms — remain profitable.

So… what happens if the biologists develop miracle plants? Before long, the entire world has them, and they cost less, and the profit margin is low — and they’ve either replaced products that had a higher profit margin, or they replace pollution control technology that does. And whole industries lose substantial profits. You can see why certain industries just might not be exactly supportive of really effective large-scale and widespread biotech. Biotech is just fine in making new high-margin pharmaceuticals, but fungible energy supplies or pollution control remedies, those are a different matter.

This isn’t a new story in human history. Way back when, sometime before, say, 200 B.C., there was a plant that grew in the Middle East, well-documented in more than a few writings, paintings, and even sculptures. Taken in some oral form, it was apparently a reliable contraceptive. It became extinct before the Christian era. Why? Because it filled a social need, a desperate one for women in poor societies who felt they could not afford more children, but no one could see a profit in growing or preserving it. Now, whether this plant was as effective as the various writings claim isn’t really the point. The point is that people thought it was, and yet there was no profit in cultivating it, and thus it was harvested and used until there were none left.

So… I have grave doubts that we’ll see many biological solutions to our energy and environmental problems until someone can figure out a way to make mega-profits out of any new biological developments.

Sometimes We Get it Right

Although we science fiction writers like to claim that we predict or foreshadow the future in our work, historically our record isn’t really as great as we’d like to think, for a number of reasons, some of which I’ve discussed in previous blogs.

Arthur C. Clarke predicted communications satellites and the like very early and effectively, something like 60 years ago, but he also predicted we’d have cities on the moon and be able to travel to Jupiter by 2001. That was six years ago, and the way things are going, it may be sixty before any of that occurs — if it does at all. In The Forever War, Joe Haldeman predicted that we’d have interstellar travel by now. Isaac Asimov did all right in anticipating the hand-held computer/calculator [as he said, he even got the colors of the display for the first calculators right], but we’re nowhere close to his pocket-size fusion generators, intelligent humanoid robots, or even affordable automatic irising doors. Most of my incorrect speculations lie in my early short stories, and I’m content to let them remain there in obscurity. I tend not to have made as many incorrect speculations in recent years, not because I’m necessarily brighter than other writers, but because all of my SF novels are set far enough in the future that enough time has not yet passed to reveal where I may have been wrong. Writing the near future is indeed a humbling experience, and I prefer not to be humbled in that fashion.

For one reason or another, many of the past staples of science fiction have never come to be. We don’t have wide-scale use of personal hovercraft or helicopters, and likely never will. Despite quantum mechanics and linked electrons, it’s doubtful that we’ll ever have instant doors or transporters to other locales, even on earth. And for all the speculations about genetic engineering [or natural mutations] that will bring agelessness or immortality to us, research to date seems to suggest that while life spans can be extended and physical health as we age greatly improved, there are several biological stone walls to attaining great age, let alone immortality, one of which is that greater cellular regenerative capacity appears to be linked to greater carcinogenic propensity. As for a cloned copy of you — or me — that’s not going to happen anytime soon, either, if ever, because recent research appears to indicate that even identical twins aren’t, due to prenatal conditions, genetic “expression,” and other factors.

Against this backdrop, I am pleased to announce that astronomers have just discovered a billion-light-year-long void in the universe, a space absolutely devoid of normal matter, without stars or galaxies. A full report will appear in a forthcoming edition of the Astrophysical Journal. For those of you who have read The Eternity Artifact, you will understand my pleasure at having one of my speculations proved right. At this point, however, since the locale is more than 6 billion light years away, there is no way to ascertain whether the reason for this void is as I postulated in the book. But… I did put it in print almost three years before the void was discovered.

“Coincidence” or not, sheer undeserved good fortune or not, I’ll take consolation in having at least one of my far-fetched speculative postulates confirmed.

Feminism, Social Change, and Speculative Fiction

The other day I received an interesting response to my blog about the impact of social change in science fiction on readership. The respondent made the point that she felt, contrary to my statements, that fantasy had more social change depicted in it because at least there were more strong female characters in fantasy. Depending on which authors one reads, this is a debatable point, but it raises a more fundamental question. Exactly what are social change — and feminism — all about, both in genre literature and society?

The other day there was an interesting article in the Wall Street Journal, which reported on a study of the performance of mutual fund management teams. The study concluded that the results from funds managed by all-male teams and those managed by all-female teams were essentially the same, while funds managed by mixed-gender teams reported significantly less profitable returns. The tentative rationale reported for such results was that mixed-gender teams suffered “communications difficulties.” Based on my years as a consultant and additional years as an observer of a large number of organizations, I doubt that “communications” are exactly the problem. In mixed-gender organizations, where both sexes have some degree of power and responsibility, I have noted that, almost inevitably, men tend to disregard women and their advice and recommendations to the degree possible. If their superior is a woman, a significant number try to end-run or sabotage the female boss. If the superior is a male, because women professionals’ suggestions tend to get short shrift, the organization is handicapped because half the good ideas are missing, either because they’re ignored or because women stop making them after a while. Maybe one could call that communications difficulties, but, as a male, I’d tend to call it male ego and insecurity.

What does this have to do with feminism in speculative fiction? A great deal, it seems to me, because merely changing who’s in control doesn’t necessarily change the dynamics below the top. This is one of the issues I tried to highlight in my own Spellsong Cycle, as well as in some of my science fiction. In “Houston, Houston, Do You Read?” the solution proposed by James Tiptree, Jr. [Alice Sheldon] was to eliminate the conflict by eliminating males. As a male, I do have a few problems with that particular approach.

In Sheri S. Tepper’s The Gate to Women’s Country, the males get to choose between being “servitors” to women or warriors limited to killing each other off, while the “violence” gene [if not expressed in quite those terms] is bred out of the male side of the population.

Ursula K. Le Guin addressed the dynamics of gender and societal structure in The Left Hand of Darkness, suggesting, it seems to me, that a hermaphroditic society would tend to be just as ruthless as a gender-polarized one, if far more indirect, and not so bloodthirsty in terms of massive warfare.

In the end, though, the question remains. In either fiction or life, is feminism, or societal change, about a restructuring of the framework of society… or just about which sex gets to be in charge?

Notes to Would-Be Reviewers

Heaven — or something — save us writers from the amateur reviewers, and some professionals, who pan a book with phrases similar to “trite plot” or “worn-out character type” or “overused plot device,” “all too typical young hero,” “standard PI,” etc., ad infinitum.

Far be it from me to be the one to say that all books all writers write are good. They aren’t. Nor will every book I write appeal to those who read my work. It won’t, and probably shouldn’t. But… those of you who are reviewers or who aspire to be reviewers, please, please, don’t display your ignorance by basing your judgments on “worn-out” character types or “overused plots.”

As Robert A. Heinlein noted in his “Channel Markers” speech to the U.S. Naval Academy more than 35 years ago, there are NO new plots. There are only a limited number of basic plots. As a result, there are no overused or trite plots. There are writers who handle plots badly, for a myriad of reasons, just as there are writers who handle them well. There are writers whose characters do not fit the plots, but the problems don’t lie in the “plot.” They lie in how the plot was or was not handled.

Almost every plot Shakespeare used in his plays was cribbed from somewhere else or someone else, but his work remains “fresh” and “original” after more than four centuries because of the way in which he handled those very common plot elements.

The same type of analysis applies to characters. Certain archetypes or types appear and reappear in novels, not because they’re tired or the authors are lazy, but because they’re necessary. If one writes a courtroom drama, there will be good attorneys and bad attorneys and brilliant attorneys. There may even be marginally competent attorneys and evil ones, but there won’t be moronic ones because they can’t pass the bar. Mercenaries will almost always be ex-military types, because that’s where one gets that kind of experience. Private investigators will almost always be ex-police or ex-military, or possibly disbarred attorneys, for the same reasons. In fantasy, knights should almost always be either wealthy, or older retainers of the wealthy who have worked their way up from common armsmen, or professional military, because in any half-realistic society, those are the only ways to gain the necessary resources and experience. Pilots need to have a high degree of training and education and good reactions — and good judgment, because they’re in charge of rather expensive equipment and lives.

All too often both critics and social reformers tend to forget that stereotypes arise for a reason. They’re real. There are “good cops” and “bad cops.” And whether one likes it or not, if you see a large minority male in gang-like attire emerging from an alley and heading in your direction at night, discretion is indeed the better part of valor, stereotype or no stereotype. The same is true of the sharp-dressing WASP male who wants to sell you a large bridge for the smallest of sums. Obviously, stereotypes and archetypes can be and are overused, but slavish avoidance of such is as much a contrivance as overuse.

Likewise, try not to criticize a writer because he or she writes a particular kind of book. I don’t see reviewers trashing mystery writers, or “literary” writers, or romance writers because they write the same type of book time after time. One can note that the writer continues to write a particular type of book — but if you say that, make sure that’s all that writer writes. You can certainly point out that the writer didn’t handle it as well as in the past — or that the writer improved, but don’t trash it because you wanted the writer to write something different.

So… if you want to review… go ahead. Just try to do it with a touch of professionalism and understanding.

F&SF Short Fiction

Recently, Stephen King wrote an essay that appeared in The New York Times suggesting, at least as I read it, that one of the reasons for the decline of short fiction is that all too many short works of fiction are written for the editors and the critics, and not necessarily for the readers. Among the host of those who have commented, Scott Edelman, the editor of SciFi Weekly, has written a response pointing out that, while it wasn’t King’s intention, King has effectively told readers that there are so few good short fiction stories that all of the good ones are in his anthology and that readers really needn’t look farther.

Both King and Edelman are correct in noting that the short fiction market is “broken.” After all, eighty years ago, F. Scott Fitzgerald was paid so much for stories sold to popular magazines that just two story sales in a year earned him more than the average annual earnings of either doctors or U.S. Congressmen — and he sold far more than two stories a year. Even then it took money to live in Paris.

There are some gifted short fiction writers in F&SF, and so far as I know, not a one of them can make a living purely off short fiction. By some counts more than a thousand short speculative fiction stories are published annually. This sounds impressive, unless you know that around a thousand original speculative fiction novels are published every year, and novels pay quite a bit more. The sales of major F&SF print magazines have been declining for years, and until the advent of Jim Baen’s Universe last year, the rates paid for short fiction had been low and essentially static.

It’s also a well-known, if seldom-stated, fact that the majority of F&SF magazines are edited as much to promulgate and further the editorial preferences of the editors as to appeal to the full range of potential readers.

Jim Baen was well aware of these facts, and so is Eric Flint. That, as I understand it, was the reason why they created Jim Baen’s Universe, the online magazine. In fact, Eric once told me that his goal was not to publish stories designed to win awards, but to publish outstanding stories that would entertain and challenge readers, and that he felt that too many editors had lost sight of that goal. So far as I’ve been able to determine, Universe has a higher rate scale for writers than any of its F&SF competitors, and Eric and Mike Resnick are obviously working hard to create a magazine that will boost the F&SF short fiction market and increase reader interest.

Yet, interestingly enough, neither King nor Edelman ever mentioned Universe, and how it came to be, and Edelman certainly ought to have been aware of it. Why didn’t he mention it? I don’t know, but I do know that it’s a part of the debate/issue that shouldn’t be ignored.

Science Fiction… Why Doesn’t Society Catch Up?

As I noted in passing in my earlier blog, various “authorities” have suggested for close to twenty years that one of the reasons why science fiction readership has dropped off, and it has, at least in relative terms as a percentage of the population and possibly even in absolute terms, is that all the themes that were once the staples of science fiction are now scientifically possible and have often been done. We have put astronauts in orbit and sent them to the moon, and the reality is far less glamorous than the “Golden Age” SF writers made it seem. We have miniaturized computers of the kind that only Isaac Asimov forecast in work published around 1940. We have lasers — and so far they don’t work nearly so well as the particle beams in Clarke’s Earthlight or the lasers in 2001. We’ve created a supersonic passenger aircraft and mothballed it.

These reasons all sound very plausible, but I’m not so certain that they’re why SF readership has dropped off and why fantasy readership has soared. Earlier, I also explored this in terms of the “magic society,” but my personal feeling is that there is also another reason, one that has to do with people — both readers and the people and societies depicted in much current SF… and that includes mine, by the way.

Socially, human beings are incredibly conservative. We just don’t like to change our societies and domestic arrangements. Revolutions do occur, but just how many of them really end up radically changing society? When MacArthur “restructured” Japanese society after WWII, the economic and political bases were changed dramatically, but the domestic and social roles remained virtually unchanged for another forty years. It really wasn’t until the 1990s that significant numbers of Japanese young women decided they didn’t want to follow the roles laid out by their mothers. Corrupt as the Shah of Iran may have been, one of the largest factors leading to his overthrow was that he was pressing to change social and religious structures at a rate faster than his people could accept.

While at least some of us in the United States like to think that we’re modern and progressive, has anyone noticed that “traditional” marriage is making a come-back? It’s making so much of a come-back that gays and lesbians want the benefits and legal structure as well. Despite the growth of the number of women in the workplace, women still do the majority of domestic chores, even when they’re working the same hours as their husbands, and the vast majority of CEOs and politicians are still male.

Now… what does this have to do with SF readership?

I’d submit that there’s a conflict between what’s likely technically and what’s likely socially, and social change will be far slower than predicted. In fact, that’s already occurred.

When my book Flash was published several years ago, one of the reviewers found it implausible that private schools would still exist some 200 years in the future in North America. I’d already thought about this, but the fact is that the traditional school structure goes back over 2,000 years. The structure works, if it’s properly employed, as many, many private schools and some charter schools can prove, and with 20 centuries of tradition, it’s not likely to vanish soon.

Yet more than a few books suggest the wide-spread growth of computerized learning, radical new forms of social engagement, and the like. Much of this will never happen. Look at such internet “innovations” as E-Harmony, Chemistry.com, etc. They aren’t changing the social dynamics, but using technology to reinforce them. Women still trade primarily on sex appeal and men on looks, power, and position. They just start the process electronically.

Most readers don’t really want change; they only want the illusion of change. They want the old tropes in new clothes or new technology, but most of them want old-style men in new garb, and brilliant women who are sexy, but still defer to men who sweep them off their feet.

Again… I’m not saying this is true of all readers, and it’s probably not true of the majority of SF readers. But, as a literature of ideas and exploration, the more that SF explores and challenges established social dynamics, the fewer new readers it will attract, particularly today, when it’s becoming harder and harder to create true intellectual challenge, because so few people want to leave their comfort zones. That’s an incredible irony, because our communications technologies have made it easier and easier for people to avoid having their preconceptions challenged.

Most fantasy, on the other hand, merely embellishes various existing social structures with magic of some sort, and it’s becoming increasingly popular every year. Perhaps that’s because, like it or not, technology has made one fundamental change in our economic and social structure, and that is the fact that physical strength is no longer an exclusively predominant currency in determining income levels. More and more women are making good incomes, often more than their husbands or other males with whom they interact. Sociological studies suggest that male-female relationships often reach a crisis at the point where the woman gains more income, power, or prestige than the man. It’s unsettling, and it’s happening more and more.

Enter traditional fantasy, with its comforting traditional structures. Now… isn’t that a relief?

The Popularity of Fantasy — Reflection of a “Magic World”?

When ANALOG published my first story, there really wasn’t that much of a fantasy genre. Oh, Tolkien had published The Lord of the Rings, and there were some Andre Norton Witch World novels, as I recall, and Jack Vance had published The Dying Earth, but fantasy books were vastly outnumbered by science fiction novels. Today, virtually every editor I know is looking for good science fiction. They can find plenty of decent, if not great, fantasy novels and trilogies to publish [good short fantasy stories are another matter].

What happened?

First, over the last forty years science got popular, and simultaneously more accessible and more complicated than ever. Second, technology complicated everyone’s life. Third, the computer made the physical process of writing easier than ever before in history. And fourth, the world became “magic.”

Science is no longer what it once was. Philo Farnsworth was a Utah farm boy who effectively invented television on his farm. RCA stole it from him, but that’s another story; the important point is that one man, without a research laboratory, made the key breakthroughs. Goddard did much the same for the rocket. Earlier, of course, the Wright brothers made the airplane possible. Today, scientific breakthroughs that effectively change society require billions of dollars and teams of scientists and engineers. Writing about the individual in a meaningful sense in this context becomes difficult, and even if an author does it well, it’s usually not that entertaining to most readers. Add to that what science and technology have delivered to the average North American or European. We have near-instant world-wide communications, travel over thousands of miles in mere hours, pictures of distant galaxies and of the moons orbiting distant planets in our own solar system, lights that can be turned on with a handclap, voice-activated equipment… the list is seemingly endless. So much of what once was science fiction is now reality.

As I’ve noted in a previous blog, technology is no longer the wonder it once was. Too often technology becomes the source of strain and consternation, and for all that it delivers, most people want to escape from its stress and limitations. Admittedly, many of them use it for escape into forms of alternative reality, but more and more readers don’t want to read about technology.

Then there’s the impact of the computer, which makes the physical process of writing easier. It doesn’t, however, make the process of learning and understanding science and technology easier, and understanding science is generally useful for writing science fiction. So what do so many of those would-be speculative fiction writers concentrate on? Fantasy and its offshoots.

But the biggest factor, I believe, is that we now live in a “magic world.” A little more than a century ago, if one wanted light, it required making a candle or filling a lantern with expensive oil and threading a wick and using a striker or a new-fangled match to light the lantern or candle. Today… plug in a lamp and flip a switch. How does it work? Who knows? Most young people would have a hard time explaining the entire process behind that instant light. In a sense, it’s magic. Once transportation meant a long slow walk, or feeding, saddling, grooming a horse, taking care of the animals, breeding them, and still having to make or purchase bridles, saddles, and the like. Today, step into a car and turn the key. In more than 95% of all cars the transmission is automatic, and, again, how many people can even explain what a transmission or a differential does? It’s magic. You don’t have to understand it or explain it. I could go through example after example, but the process — and the results — would be the same.

As a society, we act as though almost all our physical needs are met by magic. Even the environmentalists believe in magic. How would many of them deal with the coal-fired power plants that fuel so much of our magic? By replacing them with solar and wind power, of course. But building solar cells creates much more pollution than using a coal-fired power plant for the same amount of power. And wind turbines, while helpful, cannot be counted on to provide a constant and continuing power source for our magic.

This mindset can’t help but carry over into what we do for entertainment. We act as though our society’s needs are met by magic, and we want to escape the incredible stress and complexity beneath the surface of our magic society. How many readers really want to deal with those factors, accelerated as they will be in the future? [And don’t tell me that technology will make things simpler. It never has. Physically easier, but not simpler. Allowing individuals to do more in the same amount of time, but only at the cost of more stress.]

To me, the “magic society” has far more to do with the comparative growth of the popularity of fantasy and the comparative decline of science fiction than the fact that we’ve reached the moon and surveyed planets and their satellites.

Technology and the Future of the Overstressed Society

Have you noticed how “stressed” everyone is today? Professionals, white collar workers, tech workers, sales workers, even high school and college students all complain about being stressed or overstressed. Many older Americans dismiss such complaints as the whining of a younger generation, a group that just can’t take it… but are these complaints mere whining… or do they have a basis in fact?

One fact is fairly clear. Americans today, on average, have a better life than did Americans seventy-five or a hundred years ago. Very, very few in the work force today had to live through the Great Depression. Nor do they have to worry about children dying of polio and whooping cough. The statistics show that most people are living longer and doing so in better health. There is a greater range of choice in occupations, and more Americans are able to get [and do obtain] higher education. The size of the average house is larger, and most houses have conveniences hardly imaginable a century ago. Although the average American work week is now longer than that of all other industrialized western nations, it’s far less physically arduous than the work of a century ago.

So why the complaints about stress?

Technology — that’s why. It’s everywhere, and it’s stressing us out in more ways than one. Those scanners in supermarkets and every other store? They not only ring up the sales and feed into inventory calculations, but they also rate the checkers on how fast and efficiently they handle customers. I knew this in the back of my head, so to speak, but it was brought home to me when a single mother who was a checker at a local store told me she’d been demoted to the bakery because she didn’t meet speed standards.

Computers, especially those with color graphics and associated high-speed printers, are another source of stress. Why? Because they successfully invite revision after revision by overcareful supervisors and clients. Do it over… and over… and over.

Then, there are instant messaging, emails, and texting. IMs and texting, especially among the young, lead to carelessness in spelling and grammar, and that feeds back into the need for those endless document revisions, because, believe it or not, those grammar and spell-checkers just don’t catch everything. Then… emails… which encourage everyone to get in on everything, until at times, it seems as though everyone is watching and looking for ways to make completing anything difficult. On top of that, add bosses who feel slighted if one doesn’t answer emails quickly, and all that answering and justifying and explaining doesn’t get the projects done. It just takes up time that can’t be used to do real work, a problem that some supervisors just don’t get.

As for students, keeping in touch through the technology of cell-phones, emails, and texting seems to occupy their every waking, walking, and driving moment. Add to that the allure of the wonders of hundreds of cable or satellite channels, and the need to earn money for an ever-more expensive education — or vehicle payments — and they’re stressed out.

The impact of technology pervades everything. Computerized legal databases and software make litigation ever more complex — not to mention expensive and stressful.

Healthcare has even more problems. More than 47 million Americans lack health insurance, and the number is growing faster than the population. Why? Because expenses are growing, thanks to a proliferation of medical technology and drugs that raises costs. When my grandfather was a doctor, diagnostic technology was essentially limited to a few blood tests, a stethoscope, and an X-ray machine. Today, the average doctor’s office is filled with equipment, and that equipment creates an expectation of perfect medicine. That expectation, combined with the opportunism of the technologized legal system, leads to far more litigation. That leads to higher malpractice insurance, more stress on doctors, and more, and more expensive, tests and procedures to make sure that nothing gets missed — or to cover the doctor against legal challenges. It’s not uncommon for some medical specialties to have annual malpractice premiums in excess of $200,000. Assume that a doctor actually sees patients five hours a day in the office some 50 weeks a year, the other time being spent on hospital rounds, reviewing charts, and the like. Under those conditions, covering the annual malpractice premium alone requires a charge of more than $80 an hour. If the doctor also has a million dollars in medical and office equipment, and that’s not unusual either, the amortization will be in excess of $100 per patient hour. Needless to say, this creates stress and pressure, and for all the complaints about the medical profession, doctors have one of the lower life expectancies among professionals.
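The overhead arithmetic above can be sketched directly. The schedule assumptions below (five patient hours a day, five days a week, 50 weeks a year, and an eight-year equipment amortization period) are illustrative fill-ins, not figures stated in the original:

```python
# Back-of-the-envelope sketch of per-patient-hour overhead for a specialist.
# Assumptions (illustrative): 5 patient hours/day, 5 days/week, 50 weeks/year,
# and $1,000,000 of equipment amortized straight-line over 8 years.

premium = 200_000                  # annual malpractice premium, dollars
equipment = 1_000_000              # medical and office equipment, dollars
amortization_years = 8             # assumed amortization period

hours_per_year = 5 * 5 * 50        # 1,250 patient hours per year

premium_per_hour = premium / hours_per_year
equipment_per_hour = (equipment / amortization_years) / hours_per_year

print(f"Malpractice premium: ${premium_per_hour:.0f} per patient hour")
print(f"Equipment amortization: ${equipment_per_hour:.0f} per patient hour")
# → Malpractice premium: $160 per patient hour
# → Equipment amortization: $100 per patient hour
```

Under these assumptions the premium alone works out to $160 per patient hour, comfortably "more than $80 an hour," and the equipment adds roughly $100 more, before rent, staff, or supplies.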

In higher education, computerization has led to ubiquitous on-line evaluations and anonymous ratings of professors, and the subsequent inevitable grade inflation, because tenure often depends on pleasing the students. It’s also led to a proliferation of policies and procedures, so easily printed on those handy-dandy computerized systems. In my wife’s university, the policies and procedures for rank advancement and tenure have been rewritten and changed once or twice every year over the past decade, with scores of drafts being circulated electronically before each revision was finalized.

In effect, the expectations of technology have created more stress for modern society than the wind, rain, and inconsistent weather ever did for our agricultural forebears — largely because technology also makes people more and more accountable, even when they can’t do anything about it. The way technology is used today also creates what my father called “being eaten to death by army ants.” No one wants to kill you, but everyone wants a little something — reply to these emails, revise that set of documents, change that phrase to please the attorneys, change this one for the boss’s supervisor — and when it’s all said and done, who has time to do actual new work?

Yet, if you ignore the army ants, everyone thinks you’re difficult and uncooperative, and you lose your job. Is it any wonder that American professionals are working longer and longer hours?

But… ah, the blessings of technology.

The “Literary Canon,” Education, and F&SF

Roughly twenty years ago, Allan Bloom published an incendiary book entitled The Closing of the American Mind. In it, Bloom charged that abandoning the traditional literary canon in favor of multiculturalism and gender- and ethnic-based literary selections had effectively gutted the American liberal arts education. I’m oversimplifying his charges, but they run along those lines.

During the 1960s and 1970s, and thereafter, but particularly in those turbulent years, there were numerous and loud cries for “relevance” in higher education. Those cries reverberate today in such legislation as the No Child Left Behind Act and in the growing treatment of institutions of higher education as a version of white-collar and professional trade schools. Less than ten percent of U.S. collegiate undergraduates major in what might be called “liberal arts,” as compared to twenty percent in business, sixteen percent in health, nine percent in education, and six to ten percent in computer science [depending on whose figures one uses]. Less than three percent major in English and history combined.

As a writer who effectively minored in English, I’ve thought about the writers and poets I had to study in the late 1950s and early 1960s and those studied by students today. Back then, for example, there was a fairly strong emphasis on poets such as T.S. Eliot, W.B. Yeats, W.H. Auden, and Wallace Stevens, none of whom are now listed as among the topmost poets assigned in college English classes. Now… times do change, but I realized that poets such as Eliot bring certain requirements that poets and writers such as Maya Angelou, Jane Austen, and Toni Morrison do not. For much of Eliot or Yeats to make sense, the student has to have a far wider grasp of literature and history. Much of the difference between those writers once assigned and those now assigned, from what I can tell, is that a far greater percentage of those now assigned are what one might call self-affirming writers. They affirm a set of values that are either explicitly contained in the work at hand, or they affirm current values. By contrast, poets such as Eliot and Yeats often question and use a wide range of references and allusions unfamiliar to most students, some of which are current and some of which are historical and few of which are “common” knowledge.

In that sense, the best of F&SF, in my opinion, is that which stretches the reader into considering old values in a new light and “new” values through the light of experience, accepting neither at face value. Many F&SF writers present the “new” in a way that proclaims its value uncritically, while others present and then trash the “new,” as Michael Crichton does so well. Then there are those who appear to believe that shocking readers is equivalent to making them think and stretching their horizons. Most of the time, it’s not.

According to Mark Lilla, a professor of political philosophy at Columbia, recently quoted in The New York Times, “What Americans yearn for in literature is self-recognition.” But struggling with unfamiliar themes and values and searching out allusions and references require work, can be alienating to students, and certainly don’t boost self-recognition.

Particularly in the late 1960s and early 1970s, it seemed to me, there was a concerted effort in the SF field to raise issues while adhering to some degree to the tradition of the “literary canon,” and this effort continues with at least some authors today. This melding represents, again in my opinion, one of the great strengths of the field, but paradoxically, it’s also another reason why F&SF readership tends to be limited, at least for these types of F&SF, because a reader either has to be knowledgeable or willing to expand his or her comfort zone.

This gets down to an issue at the basis of education, primarily but not exclusively higher undergraduate education: Is the purpose of higher education to train people for jobs or to teach them to think so that they can continue to learn? Most people would ask why both are not possible. Theoretically, they are, but it doesn’t work that way in practice. Job training emphasizes how to learn and apply skills effectively and efficiently. Thinking training makes one very uncomfortable; it should, because it should force the student out of his or her comfort zone. At one time, that was one of the avowed goals of higher education, and part of the so-called literary canon was chosen so as to provide not only that challenge but also a cultural history of values as illustrated by literature, rather than a mere affirmation of current values.

In addition, today, with the smorgasbord approach to education, a student can effectively limit himself or herself to the courses that merely reinforce his or her existing beliefs and biases. It’s comfortable… but is it education?

Future Fact? Present Fraud? Or…?

Once more, just the other day, someone said to me and my wife, “We never really went to the moon. It was all a fraud.” This person is not uneducated. In fact, the individual has an earned graduate degree and spent some fifteen years as an executive in the financial industry.

It doesn’t seem to matter to this individual — or the millions that share such a belief — that scientists are bouncing laser and radio beams off the reflectors left on the moon by our astronauts. Nor do the photographs and records that could not have been obtained any other way count against this belief. Nor the fact that ground-based and space-based evidence agree. Nor does the fact that we and other countries have put dozens of astronauts into space matter.

Nope. To such people, the moon landings were all a fraud.

Maybe this kind of belief has something to do with the brain. A recent study confirmed that there is indeed a difference between the way “liberals” and “conservatives” process and react to information, and that the difference goes far beyond politics. Liberals tend to be more open to new experiences, conservatives more entrenched and unwilling to move away from past beliefs. And, of course, interestingly enough, there are those who classify themselves as liberals who actually have a conservative mind-set, who will not deviate from what they believe regardless of evidence, and there are those who claim to be conservative who are very open to new evidence and ideas.

Neither mindset is necessarily “good” or “bad.” As many conservatives would say, and have, “If you don’t stand for something, you’ll fall for anything.” That can be very true. On the other hand, no matter how hard one wants to believe that the earth is flat, I’m sorry. It just isn’t. When new information arrives that is soundly and scientifically based, regardless of opinion and past beliefs, a truly intelligent person should be willing to look at it objectively and open-mindedly.

In a sense, I think, most people are basically conservative. We really don’t want to change what we believe without very good reason. In evolutionary, historical, and social terms, there are good reasons for this viewpoint. Just as in mutations affecting an organism, most changes in social and political institutions are bad. Only a few are for the best.

The problem occurs when the probability of danger from an event is not absolute, or unitary, as some economists put it, but still likely to occur, and when that occurrence would be catastrophic to the human race. Over the history of Homo sapiens, some hundreds of thousands of years, or millions, depending on one’s definition of exactly when our forebears became thinking human beings, this kind of situation did not occur until the past half century. While it might seem unthinkable and improbable to most, a nuclear war would be devastating to the human race. So, it appears, would runaway global warming, regardless of cause.

The “conservative” view is to wait and let things sort themselves out. After all, hasn’t this worked throughout history? Well… not always, but in terms of survival and civilization, there was always someone else to carry on. When the Mayan civilization fell because they hadn’t planned well enough for unforeseen droughts, other human civilizations carried on. The same was true of the Anasazi, and now recent satellite measurements and photographs suggest that the same occurred to the Cambodian peoples who built Angkor Wat, then a city complex of over a million people, when drought struck in the 1500s.

But what happens when we as a race face a potential climate catastrophe as devastating as global warming could be? One that affects an entire world, and not just a continent or subcontinent? Can we afford to be conservative? Or is it a situation where, in reacting, we could fall for anything?

Is global warming a fraud perpetrated by scientists, as those who deny the moon landings believe about that program? Or is it a real and present danger? Or is it over-hyped, the way all the warnings about DDT appear to have been: a real danger in limited areas and to certain species, but truly not the harbinger of a universal silent spring? And how should we react, whether conservative or liberal?