
The Vanishing/Vanished Midlist?

Several weeks ago, I attended a science fiction convention where the guest of honor was a writer who spent some 20 years as what one might call a “high mid-list author,” someone able to work full-time as a writer and pay the bills.  Except… several years ago, this came to an end for the writer.  Oh… the writer in question still publishes two books a year, but they aren’t selling as well as earlier books, although those who read them claim they’re as good as, if not better than, the earlier work, and now, to make ends meet, the writer has to take on additional outside work as a consultant and educator.  To make matters worse, at least from my point of view, this writer produces work that is more than mere entertainment and mental cotton-candy.

Interestingly enough, more and more of the books cited by “critical” reviewers in the F&SF field [with whom I have, as most know, certain “concerns”] seem to come from smaller presses.  This is creating, I believe, an almost vicious cycle in F&SF publishing.  The more the books praised by reviewers come from small presses, the more the larger publishers get the message that “good” or “edgy” or “thoughtful” books don’t sell as well, and the greater the almost subconscious pressure to opt for “fiction-fun” or “fiction-light.”  To their credit, certain publishers, including mine, thankfully, are resisting this trend, but I’m still seeing more of those novels that are gaming and media tie-ins or endless series.  And yes, the Recluce Saga is long, but… as I keep pointing out, no character has more than two books.  I don’t have eight or ten or fifteen books endlessly spinning improbable stories and extensions about the same character or characters.

With the drastic changes in wholesale distribution over the past decade or so, virtually no mid-list books receive such distribution, except perhaps lower-selling titles of big-name authors.  As a result of these trends, the midlists of at least some large publishers that were once the home of “thoughtful” books are shrinking.  Some such midlist writers have found homes with the smaller presses, but small press distribution systems often are not as extensive.  That has resulted in lower sales for the authors who wrote those books, and lower sales mean lower incomes, and either cutting back on writing or holding down other jobs… or… trying to re-invent oneself with another form of “fiction-light.”

I’ve heard from many who believe that e-book sales can help here, but the sales figures I’ve seen suggest that e-books do more for those books that already have high sales levels and wide distribution in hardcover and paperback – and those aren’t the midlist books.

It almost appears that midlist F&SF titles are going to become a ghetto within the genre… and that concerns me.  It’s certainly affecting all authors, but particularly those who once wrote good midlist books and made a living at it… and now can’t.

Electronic Free-Loading… and Worse

Even with spam “protection,” the amount of junk email that my wife and I receive is astronomical – less than one in fifty emails is legitimate.  The rest are spam and solicitations.  Now I’m getting close to a hundred attempted “spam” comments on the website daily, all of them with embedded links to sell or promote something. That’s just one facet of the problem.  Another facet is the continual proliferation of attempts at phishing and identity theft.  It makes one want to ask – have there always been so many people trying to make a buck, rupee, ruble, Euro, or whatever by freeloading or preying on others?

I know that con artists have been around since the beginning of history, but never have such numbers been so obvious and so intrusive to so many.  Is this the inevitable result of an electronic technology that makes theft, fraud, and blatant self-promotion at the expense and effort of others a matter of keyboarding at a distance?  At one time, these types of offenses had to be carried out in person and embodied a certain amount of risk and a probability of detection and usually criminal punishment.  Now that they can be accomplished via virtually untraceable [for practical purposes] computer/internet access, they’ve proliferated to the point where virtually every computer connected to the net runs the risk of some sort of loss or damage – a form of computer Russian roulette.

But what I find the most disheartening about this is the fact that so many people, once the risk and criminal penalty factors were so dramatically reduced by technology, set out to exploit and fleece others.  Even those of us not yet fleeced or exploited have to spend time and effort, and buy additional software, to deal with these intrusions.  I have to sort through the potential comments quarantined by the system several times a day, because a few are legitimate, and deserve to be posted, and I still have to take time to delete all the unwanted email.  I have to pay for protective software, and so forth.  In effect, every computer user is being taxed in terms of time, money, and risk by this radical expansion of the unscrupulous.
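For the curious, the triage such a quarantine system performs isn’t mysterious.  A toy version of a link-counting heuristic – my own illustration, not the actual filter this site runs, and the word list is purely hypothetical – might look like this:

```python
import re

# Toy comment-spam heuristic -- an illustration only, not the actual
# filter this site uses.  Real filters weigh many more signals.
LINK_PATTERN = re.compile(r"https?://|\[url", re.IGNORECASE)
SPAM_WORDS = {"cheap", "pills", "casino", "replica"}  # hypothetical list

def looks_like_spam(comment: str, max_links: int = 2) -> bool:
    """Quarantine comments with too many embedded links or sales words."""
    links = len(LINK_PATTERN.findall(comment))
    words = {w.strip(".,!").lower() for w in comment.split()}
    return links > max_links or bool(words & SPAM_WORDS)

print(looks_like_spam("Visit http://a.example http://b.example "
                      "http://c.example for cheap pills"))   # True
print(looks_like_spam("I disagree about midlist sales."))    # False
```

The catch, as the quarantine pile demonstrates daily, is the false positives – which is exactly why a human still has to sort through it.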

Now… those who are extreme technophiles will claim that the downsides of our technologically based communications/computing systems are negligible… or at least that the benefits far outweigh the downsides.  But the problem here is that most of the benefits, especially in terms of costs, go to large institutions and the unscrupulous, while the downsides fall on the rest of us.  I don’t see, for example, that the internet enables more good writers; it enables writers who are better self-promoters, and some good writers are, and a great many aren’t.  In trying to evaluate honestly what I do on the net, I suspect that my internet presence is similar to treading water.  I’m not losing much ground to the blatant self-promoters, but for all the effort it requires, I’m not gaining either, and it’s time spent when I can’t be writing.  Yet if I don’t do it, my sales will suffer – especially given, I have to admit after looking at recent sales figures [and yes, some of you were right], the recent spurt in the growth of e-books.

I don’t see that the internet is that useful in enabling small businesses, because there are so many, and the effort and ingenuity required to attract customers are considerable, but it certainly allows large ones to contact everyone.  And it certainly allows every variety of cyber-criminal potential access to a huge variety of victims with almost no chance of being detected, let alone prosecuted and punished.  The idea of privacy has become almost laughable, even for those of us who don’t patronize social networking sites.

Cynical as I may be, my hopes have always been that technology would be employed to enable the best to be better, and the rest to improve who and what they are.  Yet… I have this nagging feeling that, more and more, technology, particularly communications technology, is dragging down far more people than it is improving, especially ethically… and, even if it isn’t, it’s creating a tremendous diversion of time from actual productive work.  That diversion may be worthwhile in manufacturing-based industries, but it’s a definite negative force in areas such as writing and other creative efforts.  In a society that is becoming ever more dependent on technology, unless matters change, this foreshadows a future in which marketing and hype become ever more present and dominant, even as the technophiles are claiming communications technology makes life better and better.

Better and better for whom?  And what?

Fantasy… Should be Fun?

The other day, when reading a blogger’s review of The Soprano Sorceress, I came across an interesting question, clearly meant to be rhetorical – what point was there to reading a fantasy if the reader didn’t like the fantasy world created by the author?  It’s a good question, but not necessarily in the way that the reviewer meant, because his attitude was more one of wanting to avoid reading about worlds he didn’t like, particularly since he went on to ask what fun there was in reading about such a world.

Yet… I have to confess that there are authors I probably won’t read again because I don’t care that much for their worlds, just as there are authors I won’t read again because I don’t care for their characters.  In particular, I don’t care for characters who make mistakes and errors that would prove fatal in any “realistic” world situation, yet who survive for book after book [I presume, because the series continues, even if I’m no longer reading them].  Obviously, those kinds of books have great appeal, because millions upon millions of them sell, and maybe that’s the “fun” in reading them.

But there’s a distinction between “good” and “fun,” and often one between “entertaining” and “thought-provoking,” and there are readers who prefer each type, although sales figures suggest that “fun” and “entertaining” are the categories that tend to outsell others significantly, often by orders of magnitude.

The question the blogger reviewer asked, however, holds within it an assumption that all too many of us have – that “our” view is the only reasonable way of looking at a particular book… and that, I think, is why I tend to be reluctant in reading reviews, either those considered “professional” or those less so, because the vast majority of reviewers start from the unconscious presupposition that theirs is the only “reasonable” way of looking at a given book.  The more “professional” the reviewer, the less likely this presupposition is to surface, but there are still well-known reviewers and review publications that fall regularly into this mind-set.  The problem lies not only in the expectations of the reviewer, but also in the knowledge base – or the lack of knowledge – that the reviewer possesses.  A novel that uses allusions heavily to disclose character will seem shallow to the reader or reviewer who does not understand those referents.  A reader unfamiliar with various “sub-cultures,” such as the corporate or legal worlds, politics, the military, or academia, is likely to miss many subtleties of the type where explanation would destroy the effect.  Because of this “sub-culture” blindness, certain books, or parts of certain books, tend to be less entertaining – or even boring – to those unfamiliar with the subculture, whereas a reader who understands those subcultures may be smiling or even howling with laughter.

As a side note, despite the impression that some bloggers have apparently gained from this site, I do read blog reviews of my work and that of other authors on a continuing basis, if sometimes reluctantly.  Why reluctantly?  Because it’s more often painful than not.  As a writer, for me such blogs often raise the question of why the reader didn’t understand certain matters that appear so obvious to me.  Could I have done something better, or was the matter presented well and the reader didn’t get it?  Half and half?  Such questions and second-guessing, I feel, are necessary if any writer wants to improve, no matter how long he or she has been writing… but I suspect any author who claims the process is enjoyable or entertaining is either lying or a closet masochist.  As part of being a professional, an author should know, I personally believe, the range of reactions to his or her work, as well as the reasons behind those reactions, but, please, let’s not have commentators suggest that we’re somehow outdated, out of touch, or unreasonable when we suggest that the process isn’t always as pleasurable to us as it apparently is to those who take great delight in complaining about what they perceive as deficiencies in what we write.  Sometimes, indeed, the deficiencies are the writer’s, but many times the deficiencies lie in the reviewer, and where the deficiencies may lie, or even if there are such deficiencies, isn’t always obvious to most readers of either blog or professional reviews… or even of professional blog reviews.

Sometimes… Just Sometimes… We Get It Right

Way back in 1958, in the so-called “Golden Age” of science fiction, Jack Vance wrote a book called The Languages of Pao, in which he postulated that language drastically affects human thought patterns and, thus, the entire structure of a culture or civilization.  A more scholarly statement of this is the linguistic relativity principle, otherwise known as the Sapir-Whorf hypothesis, of which there are two versions: a strong version stating that language limits and determines cognitive categories, and a weaker version merely suggesting that language influences thought and certain non-linguistic behaviors.  The Sapir-Whorf hypothesis was thought to have been discredited by color-related experiments in the 1960s, because researchers found that language differentials did not seem to affect color perception or usage.

Recent studies of human brain patterns and linguistic development, reported in the June 1st edition of New Scientist, strongly suggest that, first, there is not, as previously thought, a genetically-determined “universal” human instinct/hard-wired pattern for language that is common to all human beings, but that languages are in fact learned and used in often totally different ways by those speaking different tongues.  Thus, as speculated by Vance, languages do in fact shape not only the way we think, but the very way in which we see the world.  And, as occasionally happens, though not so often as we science fiction writers would like to think or claim, one of us has actually anticipated a fundamental discovery, and one that has profound implications for human civilization, implications that I don’t think most people have fully considered.

If this research is accurate, then, for example, intractable cultural differences may well lie in the linguistic patterns of a culture.  A language that offers many ways in which to accurately express the same concept or thought would likely promote more openness of thought than a language in which there is literally only one correct way in which that thought can be expressed.  A language/culture that allows rapid linguistic innovation may promote change and development… but it might well have the downside of undermining standards, because standards, as represented by language, are not seen as fixed or immutable.  We already know that words expressing concepts, such as “freedom” or “equality,” do not “translate” into the exact same meanings in different cultures, and this research offers insights into why the differences go beyond mere semantics.

These possibilities have certainly been considered in human history, if only instinctively or subconsciously.  For centuries, the Roman Catholic church resisted the translation of the Bible into any other language, insisting it be read and taught only in Latin.  Since 1635, with a few years in abeyance during the French Revolution, L’Academie Francaise has policed usage and linguistic development in France, attempting to restrict or eliminate the use of Anglicisms.  And languages do affect other aspects of human behavior.  Recent studies have shown that speakers of tonally-inflected languages have far, far higher rates of perfect pitch than do speakers of languages that are not tonally inflected.  Not entirely coincidentally, it seems to me, speakers of such languages also appear to produce more successful classical musicians.

A more disturbing aspect of the research is the possibility that linguistic differences may well create cultural “understanding” divides that are difficult, if not impossible, to bridge, simply because the languages create antithetical patterns of thought, so that a speaker of one language literally cannot comprehend, emotionally, the concepts and values behind the words of a speaker of another language.  The initial research suggests that the magnitude of variances in linguistic learning patterns ranges from very slight to quite significant… and it will be interesting to see if such differences can ever be quantified.  But it does appear that speaking another language goes far beyond the words.

And a science fiction writer pointed out the cultural implications and ramifications for societies first.

Pressing the Limits

As both individuals and as a species, human beings have always had a tendency to press the limits, both of their societies and their technologies.  This tendency has good points and bad points… good because, without it, we as a species wouldn’t have developed, and life would still be in the “natural state” – “nasty, brutish, and short,” in the pithy observation of Thomas Hobbes in Leviathan.  The “bad” side of pressing the limits has been minimized, because the advantages have, over time, been so much greater than the drawbacks.

Except… the costs and the consequences of pushing technology to the limit may now in some cases be reaching the point where they outweigh the overall benefits, and not just in military areas.

The latest and most dramatic evidence of this change is, of course, the current Gulf of Mexico oil rig explosion and the subsequent oil blowout.  Deep-sea drilling and production platforms are required to have in place redundant blow-out preventers… as did the BP rig.  But the blow-out preventer failed.  Such failures are exceedingly rare.  Repeated tests show that the equipment works over 99% of the time, though something like 60 units have failed in such tests.  The Gulf oil disaster just happens to be one of the few times a failure has occurred in actual operation, and it represents the largest such failure in terms of crude oil released.  What’s being overlooked, except by the environmentalists, who, so far as I can tell, are operating more on a dislike of off-shore drilling than on a reasoned technical analysis, is the fact that around 6,000 offshore drilling platforms are in service world-wide in some form or another, and the number is increasing.  It will increase whether the U.S. bans more offshore drilling or not.  From 1992 to 2006, the Interior Department reported 39 blow-outs at platforms in the Gulf of Mexico, and although none were as serious as the latest, that’s more than two a year – yet that represents a safety record of 99.93%.  In short, there’s not a lot of margin for error.  What makes the issue more pressing is that drilling technology is able to drill deeper and deeper – and the pressures involved at ever greater depths put increasing stress on the equipment, to the point where, as is apparent with the BP disaster, stopping the flow of oil in the case of a failure becomes extraordinarily difficult and exceedingly expensive, as well as time-consuming.  Because crude oil is devastating to the environment, the follow-on damage to the ecosystems and the economy of the surrounding area will create far greater costs than capping the well.
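As a rough check on that 99.93% figure – the implied denominator is my own back-of-the-envelope inference, since the report’s base number isn’t given here:

```latex
\frac{39 \ \text{blow-outs}}{1 - 0.9993} \;=\; \frac{39}{0.0007} \;\approx\; 55{,}700 \ \text{well operations, 1992--2006}
```

On that scale, even “three nines” reliability still yields a couple of blow-outs every year, which is precisely why there’s not a lot of margin for error.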

Pushing technology beyond safe limits is nothing new to human beings.  When steam engines were first introduced, the desire for power and speed led to scores, if not hundreds, of boiler explosions.  Occasionally, disasters led to changes, such as the phasing out of hydrogen dirigibles after the Hindenburg fire and crash, but that change was also made easier by the improvements in aircraft, which were far faster than dirigibles.  The costs of other disasters are still with us – and we tend to overlook them.  The town of Centralia, Pennsylvania, has largely been abandoned because the coal seams in the mostly worked-out mines beneath the town caught fire and have been smoldering away for more than forty years, causing the ground above to collapse and continually releasing toxic gases.  In Pennsylvania alone, there are more than 30 such subterranean fires.  World-wide there are more than 3,000 such fires, some of which release more greenhouse gases and other toxic fumes than some coal-fired power plants.  Yet few of these fires are more than monitored, because no technology exists that can extinguish them in any fashion close to cost-efficient – and in some cases they cannot be extinguished at all, because the fires burn so deep.

Pushing electronic technology to the limits, without regard for the implications, costs, and other downsides, has resulted in a world linked together in such a haphazard fashion that a massive solar flare – or a determined set of professional hackers – could conceivably bring down an entire nation’s communications and power distribution network – and that doesn’t even take into account the vast increase in the types and amounts of exceedingly toxic wastes created on a world-wide scale, most of which are still not handled as they should be.  Another area where technology is being pressed to the limits is that of bio-tech, where scientists have reported creating the first synthetic cell.  While they engineered in considerable safeguards, once that technology is more widely available, will everyone who uses it be so careful?

As illustrated by the BP disaster, when we, as a society, push technology to its limits on a large scale, for whatever reason, the implications of a technological or systems failure are getting to the point where we require absolute safety in the operation of those systems – and obtaining such assurance is never inexpensive… and sometimes not even possible.

But then again… if we tweaked existing technology just a bit more so that we could get even more out of it…. get more oil, more bandwidth, make more profit…

When to Stop Writing… [With Some “Spoilers”]

The other day I ran across two comments on blogs about my books.  One said that he wished I’d “finish” more books about characters, that he just got into the characters and then the books ended.  The other said that I dragged out my series too long.  While the comments weren’t about quite the same thing, they did get me to thinking.  How much should I write about a given character?  How long should a series be?

The simple and easy answer is that I should write as long as the story and the series remain interesting.  The problem with that answer, however, is… interesting to whom?

Almost every protagonist I’ve created has resulted in a greater or lesser number of readers asking for more stories about that particular character, and every week I get requests or inquiries asking if I’ll write another story about a particular character.  That’s clearly because those readers identified with and/or greatly enjoyed that character… and that’s what every author likes to hear.  Unfortunately, just because a character is memorable to readers doesn’t mean that there’s another good story there… or that another story about that character will be as memorable to all readers.

Take Lerris, from The Magic of Recluce.  By the end of the second book about him, he’s prematurely middle-aged as a result of his use of order and chaos to save Recluce from destruction by Hamor… and his actions have resulted in death and destruction all around him, not to mention that he’s effectively made the use of order/chaos magic impossible on a large or even moderate scale for generations to come.  What is left for him in the way of great or striking deeds?  Good and rewarding work as a skilled crafter, a happy family life?  Absolutely… but there can’t be any more of the deeds, magic, and action of the first two books.  That’s why there won’t be any more books about Lerris.  If I wrote another book about Lorn… another popular character… for it to be a good book, it would have to be a tragedy, because the only force that could really thwart or even test him is Lorn himself.  After a book in which a favorite character died, if of old age after forty years of magic working – and after all the flak I took from readers who loved her – I’m understandably reluctant to go the tragic route again.  So… for me, at least, I try to stop when the best story’s been told, and when creating an even greater peril or trial for the hero would be totally improbable for the world in which he or she lives.

For the same reason, because I’ve never written more than three books about a given main character, my “series” aren’t series in the sense of eight or ten books about the same characters, but groupings of novels in the same “world.”  Even so, I hear from readers who want more in that world, and I read about readers who think I’ve done enough [or too much] in that world.  Interestingly enough, very few of the complainers ever write me; they just complain to the rest of the world, and for me that’s just as well.  No matter what they say publicly, I don’t know a writer who wants to get letters or emails or tweets telling them to stop doing what they like to do… and I’m no different.

But those who complain about series being too long usually aren’t dealing with the characters or the stories. From what I’ve seen and read, they’re the readers who’ve “exhausted” the magic and the gimmicks.  They’re not there for characters and insights, but for the quicker “what’s new and nifty?”  And there’s nothing wrong with that, but it’s not necessarily a reason for an author to stop writing in that world; it’s a reason for readers who always want the “new” to move on.  There’s still “new” in the Recluce Saga; it’s just not new magic.  Sometimes, it’s stylistic.  I’ve written books in the first person, the third person past tense, the third person present tense.  I’ve connected two books with an embedded book of poetry.  I’ve told the novels from both the side of order and the side of chaos, and from male and female points of view.  Despite comments to the contrary, I’ve written Recluce books with teenaged characters, and those in their twenties, thirties, forties, and older. That’s a fair amount of difference, but only if the reader is reading for what happens to the characters… and virtually all the critics and reviewers have noted that each book expands the world of Recluce.  I won’t write another Recluce book unless I can do that, and that’s why there’s often a gap of several years between books.  The same is true of books set in my other worlds.

So… I guess, for me, the answer is that I stop writing about a character or a world when I can’t show something new and different – although the newness may be quiet, or a matter of character.

Technology, Society, and Civilization

In today’s modern industrial states, most people tend to accept the proposition that the degree of “civilization” is fairly directly related to the level of technology employed by a society.  Either as a result or as a belief, then, each new technological gadget or invention is hailed as an advance. But… how valid is that correlation?

In my very first blog [no longer available in the archives, for reasons we won’t discuss], I made a number of observations about the Antikythera Device, essentially a clockwork-like mechanical computer dating to 100 B.C. that tracked and predicted the movements of the five known planets, lunar and solar eclipses, and the moon, as well as the future dates of the Greek Olympics.  Nothing this sophisticated was developed by the Roman Empire, or anywhere else in the world, until more than 1500 years later.  Other extremely sophisticated devices were developed in Ptolemaic Egypt, including remote-controlled steam engines that opened temple doors and magnetically levitated statues in those temples.  Yet both Greece and Egypt fell to the more “practical” Roman Empire, whose most “advanced” technologies were likely the invention of concrete, particularly concrete that hardened under water, and military organization.

The Chinese had ceramics, the iron blast furnace, gunpowder, and rockets a millennium before Europe, yet they failed to combine their metal-working skill with gunpowder to develop and continue developing firearms and cannon.  They had the largest and most advanced naval technology in the world at one point… and burned their fleet.  Effectively, they turned their backs on developing and implementing higher technology, but for centuries, without doubt, they were the most “civilized” society on earth.

Hindsight is always so much more accurate than foresight, but often it can reveal and illuminate the possible paths to the future, particularly the ones best avoided. The highest level of technology used in Ptolemaic Egypt was employed in support of religion, most likely to reinforce the existing social structure, and was never developed in ways that could be used by any sizable fraction of the society for societally productive goals.  The highest levels of Greek technology and thought were occasionally used in warfare, but were generally reserved for the use of a comparatively small elite.  For example, records suggest that only a handful of Antikythera devices were ever created.  The widest-scale use of gunpowder by the early Chinese was for fireworks – not weapons or blasting powder.

Today, particularly in western industrial cultures, more and more technology is concentrated on entertainment, often marketed as communications, but when one considers the time spent on and the number of applications on such devices, the majority are effectively entertainment-related.  In real terms, the amount spent on basic research and immediate follow-up in the United States has declined gradually, but significantly, over the past 30 years.  As an example, NASA’s budget is less than half of what it was in 1965, and in 2010, its expenditures will constitute the smallest fraction of the U.S. budget in more than 50 years.  For the past few years, the annual budget of NASA has been running around $20 billion.  By comparison, sales of Apple’s iPhone over 9 months exceeded the annual NASA budget, and Apple is just one producer of such devices.  U.S. video game software sales alone exceed $10 billion annually.
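To put that “smallest fraction” in rough perspective – my arithmetic, assuming total federal outlays of roughly $3.5 trillion for fiscal 2010:

```latex
\frac{\$20 \ \text{billion (NASA)}}{\$3{,}500 \ \text{billion (total outlays)}} \;\approx\; 0.6\%
```

That compares with roughly four percent of the federal budget at the Apollo program’s mid-1960s peak.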

By comparison, the early Roman Empire concentrated on using less “advanced” technology for economic and military purposes.  Interestingly enough, when technology began to be employed primarily for such purposes as building the Colosseum, flooding it with water, and staging naval battles with gladiators, subsidized by the government, Roman power, culture, and civilization began to decline.

More high-tech entertainment, anyone?

Sacred? To Whom?

I’ll admit right off the top that I have a problem with the concept that “life is sacred” – not that I don’t feel that my life, and those of my wife and children and grandchildren, are sacred to me.  But various religions justify various positions on social issues on the grounds that human life is “sacred.”  I have to ask why human life, as opposed to other kinds of life, is particularly special – except to us.

Once upon a time, scientists and others claimed that Homo sapiens were qualitatively different and superior to other forms of life.  No other form of life made tools, it was said.  No other form of life could plan logically, or think rationally.  No other form of life could communicate.  And, based on these assertions, most people agreed that humans were special and their life was “sacred.”

The only problem is that, the more we learn about life on our planet, the more every one of these assertions has proved to be wrong.  Certain primates use tools; even New Caledonian crows do.  A number of species do think and plan ahead, if not in the depth and variety that human beings do.  And research has shown and is continuing to show that other species do communicate, from primates to gray parrots.  Research also shows that some species have a “theory of mind,” again a capability once thought to be restricted to human beings.  But even if one considers just Homo sapiens, the most recent genetic research shows that a small but significant fraction of our DNA actually comes from Neandertal ancestors, and that genetic research also indicates that Neandertals had the capability for abstract thought and speech.  That same research shows that, on average, both Neandertals and earlier Homo sapiens had slightly larger brains than do people today.  Does that make us less “sacred”?

One of the basic economic principles is that goods that are scarce are more valuable, and we as human beings follow that principle, one might say, religiously – except in the case of religion.  Human beings are the most common large species on the planet earth, six billion plus and growing.  Tigers and pandas number in the thousands, if that.  By the very principles we follow every day, shouldn’t a tiger or a panda be more valuable than a human?  Yet most people put their convenience above the survival of an endangered species, even while they value scarce goods, such as gems and gold, more than common goods.

Is there somehow a dividing line between species – between those that might be considered “sacred” and those that are not?  Perhaps… but where might one draw that line?  A human infant possesses none of the characteristics of a mature adult.  Does that make the infant less sacred?  A two-year-old chimpanzee has more cognitive ability than does a human child of the same age, and far more than a human infant.  Does that make the chimp more sacred?  Even if we limit the assessment of species to fully functioning adults, is an impaired adult less sacred than one who is not?  And why is a primate who can think, feel, and plan less sacred than a human being?  Just because we have power… and say so?

Then, there’s another small problem.  Nothing on the earth that is living can survive without eating in some form or another something else that is or was living.  Human beings do have a singular distinction there – we’re the species that has managed to get eaten less by other species than any other species.  Yes… that’s our primary distinction… but is that adequate grounds for claiming that our lives, compared to the lives of other thinking and feeling species, are particularly special and “sacred”?

Or is a theological dictum that human life is sacred a convenient way of avoiding the questions raised above, and elsewhere?

Making the Wrong Assumption

There are many reasons why people, projects, initiatives, military campaigns, political campaigns, legislation, friendships, and marriages – as well as a host of other endeavors – fail, but I’m convinced that the largest and least recognized reason for such failures is that those involved make incorrect assumptions.

One incorrect assumption that has bedeviled U.S. foreign policy for generations is that other societies share our fundamental values about liberty and democracy.  Most don’t.  They may want the same degree of power and material success, but they don’t endorse the values that make our kind of success possible.  Among other things, democracy is based on sharing power and compromise – a fact, unfortunately, that all too many U.S. ideologues fail to recognize, which may in fact destroy the U.S. political system as envisioned by the Founding Fathers and as developed by their successors… until the last generation.  Theocratically-based societies neither accept nor recognize either compromise or power-sharing – except as the last resort to be abandoned as soon as possible.  A related assumption is that peoples can act and vote in terms of the greater good.  While this is dubious even in the United States, it’s an insane assumption in a land where allegiance to the family or clan is paramount and where children are taught to distrust anyone outside the clan.

On a smaller scale, year after year, educational “reformers” in the United States assume, if tacitly and by their actions, that the decline in student achievement can be reversed solely by testing and by improving the quality of teachers.  This assumption is fatally flawed because student learning requires two key factors – those who can and are willing to work to teach, and those who can learn and are willing to learn.  Placing all the emphasis on teachers and testing assumes that a single teacher in a classroom can and must overcome the pressures of society and the media, the peer pressure to do anything but learn, the idea that learning should be fun, and all the other influences that are antithetical to the work required to learn.  There are a comparative handful of teachers who can work such miracles, but basing educational policy and reforms on those who are truly exceptional is both poor policy and doomed to failure.  Those who endorse more testing as a way to ensure that teachers teach the “right stuff” assume that the testing itself will support the standards, which it won’t if the students aren’t motivated, not to mention the fact that more testing leaves less time for teaching and learning.  So, in a de facto assumption, not only does the burden of teaching fall upon educators, but so does the burden of motivating the unmotivated and disciplining the undisciplined, at a time when society has effectively removed the traditional forms of discipline without providing any effective replacements.  Yet the complaints mount, and American education is failing, even as the “reformers” keep assuming that teachers and testing alone can stem the tide.

For years, economists used what can loosely be termed “the rational person” model for analyzing the way various markets operated.  This assumption has proved to be horribly wrong, as recent studies – and economic developments – have demonstrated, because in all too many key areas, individuals do not behave rationally.  Most people refuse to cut their losses, even at the risk of losing everything, and most continue uneconomic behaviors not in their own interests, even when they perceive such behaviors in others as irrational and unsound.  Those who distrust the market system assume that regulation, if only applied correctly, can solve the problems, and those who believe that markets are self-correcting assume that deregulation will solve everything.  History and experience would suggest both assumptions are wrong.

In more than a few military conflicts dating back over recent centuries, military leaders have often assumed that superior forces and weapons would always prevail.  And… if the military command in question does indeed have such superiority and is willing to employ it efficiently to destroy everything that might possibly stand in its way, then “superiority” usually wins.  This assumption fails, however, in all too many cases where one is unable or unwilling to carry out the requisite slaughter of the so-called civilian population, or when military objectives cannot be quickly obtained, because, in fact, in virtually every war of any length a larger and larger fraction of the civilian population becomes involved on one side or another, and “superiority” shifts.  In this regard, people usually think of Vietnam or Afghanistan, but, in fact, the same sort of shift occurred in World War II.  At the outbreak of WWII in 1939, the British armed forces had about 1 million men in arms, the U.S. 175,000, and the Russians 1.5 million.  Together, the Germans and Japanese had over 5 million trained troops and far more advanced tanks, aircraft, and ships.  By the end of the war, those ratios had changed markedly.

While failure can be ascribed to many causes, I find it both disturbing and amazing that seldom are the basic assumptions behind bad decisions ever brought forward as causal factors… and have to ask, “Why not?”  Is it because, even after abject failure or costly success that didn’t have to be so costly, no one wants to admit that their assumptions were at fault?

Ends or Means

By the time they reach their twenties, at least a few people have been confronted, in some form or another, with the question of whether the ends justify the means.  For students, that’s usually in the form of cheating – does cheating to get a high grade in order to get into a better college [hopefully] justify the lack of ethics?  In business, it’s often more along the lines of whether focusing on short-term success, which may result in a promotion or bonus [or merely keeping your job in some corporations], is justified if it creates long-term problems or injuries to others.

On the other hand, I’ve seldom seen the question raised in a slightly different context.  That is, are there situations where the emphasis should be on the means? For example, on vacation, shouldn’t the emphasis be on the vacation, not on getting to the end of it?  Likewise, in listening to your favorite music, shouldn’t the emphasis be on the listening and not getting to the end?

I suppose there must be some few situations where the end is so vital that the means don’t matter, but the older I get, the fewer examples of that I’ve been able to cite because I’ve discovered that the means so affect the ends that you can seldom accomplish the ends without a disproportionate cost in collateral damage.

This leads to those situations where one needs to concentrate on perfection in accomplishing the means, because, if you don’t, you won’t get to the end.  Some such instances are piloting, downhill ski racing, Grand Prix driving [or driving in Los Angeles or Washington, D.C., rush-hour traffic], or undertaking all manner of professional tasks, such as brain or heart surgery, law enforcement, or fire fighting.

The problem that many people, particularly students, have is a failure to understand that, in the vast majority of cases, learning the process is as critical as the result [if not more so].  Education, for example, despite all the hype about tests and evaluations, is not about tests, grades, and credentials [degrees/certification].  Even if you get the degree or certification or other credential, unless you’ve learned enough in the process, you’re going to fail sooner or later – or you’ll have to learn all over what you should have learned the first time.  Unfortunately, because many entry-level jobs don’t require the full skill set that those providing the education were attempting to instill, that failure may not come for years… and when it does, the results will be far more catastrophic.  And, of course, some people will escape those results, because there are always those who do… and, unfortunately, those “evaders” are almost invariably the ones whom people unwilling to do the work pick as examples of why they shouldn’t bother learning the processes behind the skills.

Studies done on college graduates two generations ago “discovered” that such graduates made far more income over their lifetimes than did those without a college degree.  Unfortunately, the message became that the degree was what mattered, not the skills represented by that degree, and ever since then people have focused on the credential rather than on the skills, a fact emphasized by rampant grade and degree inflation and documented by the noted scholar Jacques Barzun in his book From Dawn to Decadence: 500 Years of Western Cultural Life, 1500 to the Present, where he observed that one of the reasons for the present and continuing decline of Western Civilization is the fact that our culture now exalts credentials over skills and real accomplishments.

One of the most notable examples of this is the emphasis on monetary gain, as exemplified by developments in the stock and securities markets over the past two years.  The “credential” of the highest profit at any cost has so distorted the process of underwriting housing and business investment that the profit levels reaped by various sectors of the economy bear no relationship to their contribution to either the economy or the culture.  People whose decisions in pursuit of ever higher and unrealistic profit levels destroyed millions of jobs are rewarded with the “credential” of high incomes, while those who police our streets, fight our fires, protect our nation, and educate our children face salary freezes and layoffs – all because the ends supposedly justify any means.

Hypocrisy… Thy Name Is “Higher” Education

The semester is over, or about over, in colleges and universities across the United States, and in the majority of those universities another set of rituals will be acted out.  No… I’m not talking about graduation.  I’m talking about the return of “student evaluations” to professors and instructors.  The entire idea of student evaluations is a largely American phenomenon that caught hold sometime in the late 1970s, and it is now a monster that not only threatens the very concept of improving education but is also a poster child for the hypocrisy of most college and university administrations.

Now… before we go farther, let me emphasize that I am not opposing the evaluation of faculty in higher education.  Far from it.  Such evaluation is necessary and a vital part of assuring the quality of faculty and teaching.  What I am opposed to is the use of student evaluations in any part of that process.

Take my wife’s music department.  In addition to their advanced degrees, the vast majority have professional experience outside academia.  My wife has sung professionally on three continents, played lead roles in regional operas, and has directed operas for over twenty years.  The other voice professor left a banking career to become a successful tenor in national and regional opera before returning to school and obtaining a doctorate in voice.  The orchestra conductor is a violinist who has conducted in both the United States and China.  The band director spends his summer working with the Newport Jazz Festival.  The piano professor won the noted Tchaikovsky Award and continues to concertize world-wide.  The percussion professor performs professionally on the side and has several times been part of a group nominated for a Grammy.  This sort of expertise in a music department is not unusual, but typical of many universities, and I could come up with similar kinds of expertise in other university departments as well.

Yet… on student evaluations, the students rate their professors on how effective the professors are at teaching, whether the curricula and content are relevant, whether the amount of work required in the course is excessive, etc.  My question/point is simple:  Exactly how can 18-24 year-old students have any real idea of any of the above?  They have no relevant experience or knowledge, and to obtain it is presumably why they’re in college.

Studies have shown that the strongest correlation with high student evaluations is ease: the professors with the easiest courses and the highest percentages of As get the best evaluations.  And, since evaluations have become near-universal, college-level grades have experienced massive inflation.  In short, student evaluations are merely student Happiness Indices – HI!, for short.

So why have the vast majority of colleges and universities come to rely on HI! in evaluating professors for tenure, promotion, and retention?  It has little to do with teaching effectiveness or the quality of education provided by a given professor and everything to do with popularity.  In the elite schools, student happiness is necessary in order to keep student retention rates up, because that’s one of the key factors used by U.S. News and World Report and other rating groups, and the higher the rating, the more attractive the college or university is to the most talented students, and those students are most likely to be successful and eventually boost alumni contributions and the school’s reputation.  For state universities, it’s a more direct numbers game.  Drop-outs and transfers represent lost funds and inquiries from the state legislatures that provide some of the funding.  And departments that are too rigorous in their attempts to maintain or [heaven forbid] upgrade the quality of education often either lose students or fail to grow as fast as other departments, which results in fewer resources for those departments.  Just as Amazon’s reader reviews greatly boosted Amazon’s book sales, HI! boost the economics of colleges and universities.  Professors who try to uphold or raise standards face an uphill and usually unsuccessful battle – as evidenced by the growing percentage of college graduates who lack basic skills in writing and logical understanding.

Yet, all the while, the administrations talk about the necessity of HI! [sanctimoniously disguised as thoughtful student evaluations] in improving education, when it’s really about economics and their bottom line… and, by the way, in virtually every university and college across the country, over the past 20 years the percentage growth in administration size has dwarfed the growth in full-time, tenure-track, and tenured faculty.  But then, why would any administration really want to point out that perceived student happiness trumps academic excellence every day and in every way, or that all those resources are going more and more to administrators, while faculties, especially at state universities, have fewer and fewer professors and more and more adjuncts and teaching assistants?

Newer… Not Always Better

Somehow people, especially students, don’t get it.  As the title above suggests, just because something is newer, it isn’t necessarily better – even in computers.  I have yet to find a commercial graphing program in existence today that is anywhere even close to the Boeing Graph program of some 25 years ago.  And as techno-historians know, the Beta videotape system was far superior to the VHS system.

What’s interesting now, though, is that for some applications – such as viewing student voice teachers and critiquing them – VHS tapes are far superior to DVDs.  Why?  Because the tapes can be paused at any given second, or rewound to a precise point.  Commercial DVDs and equipment can’t.  When a voice professor is studying vocal dynamics, that’s important.  Having to play through sections, even at high speed, takes time and often overshoots or undershoots the point in question.  Yet my wife’s pedagogy students complain that she uses “antiquated equipment” and makes them use old-fashioned tapes instead of new hip digital disks.  What they don’t seem to understand is that “new” isn’t better if it doesn’t do what you want it to, especially when “old” technology does.

This isn’t confined to the sometimes arcane area of vocal pedagogy, but applies across our techno-society.  Typewriters do a far better job of filling in forms – at least those not available on one’s own computer – than do computers.  Word Seven is a much faster word processing system for text than is the current version of Word [which I do have for the other applications], and the search capabilities of fifteen-year-old WordPerfect 6.0 still exceed those of any current version of Word.  As I noted in an earlier post, a keyed ignition is far more effective at turning off a runaway engine than a new high-tech keyless ignition, not to mention safer.  My “old” color ink-jet printer delivers a far cleaner and clearer image than does the new and improved laser printer, even if the laser is faster.  And in terms of overall medical effectiveness, there’s no solid proof that the newer NSAIDs offer any more benefits than does good old aspirin, and although aspirin does have a slightly higher propensity to create gastro-intestinal bleeding, it also has many other benefits, such as reducing the risk of heart attacks and colon cancer – and it’s one of the oldest drugs around.  Certainly, the now-retired Concorde was far superior to any commercial aircraft now in service in getting passengers across the ocean quickly, and more than a few pilots still claim that the retired F-14 exceeds anything now flying for total air superiority.  Photographic film still provides a better image than does comparable digital photography.

Going back to recording equipment, if you happen to have a phonograph with a working needle, you can still play vinyl and other old records nearly a century old.  You certainly can’t do that with tapes even half that old, and a single light scratch effectively destroys the usefulness of a CD.  That’s fine for entertainment products that aren’t meant to outlast the current fad, but is it acceptable for recording data or information with a longer lifespan?

So why aren’t newer products always better?  The plain fact is that superiority is often far down the list in product qualities, usually behind cost of production/operation, novelty appeal, style, ease of operation, and profitability.  Another factor is that, especially in computer and communications products, manufacturers try to cram in as many applications as possible so as to appeal to the widest possible number of consumers. The multiplicity of applications generally results in the overall degradation of the capability of all functions, but that degradation usually isn’t perceptible, or relevant, to most users.

This often results in cheaper products, but the downside is that those products often don’t suit the needs of professionals in specialized fields… and because it’s getting harder and harder to develop or produce products for users with particular needs – such as my professorial wife – those users have to make do with either improvised or older equipment… and risk being termed dinosaurs and out of date.

In the end… newer isn’t always better; it’s always only newer.

Complete Piracy at Last

It’s now official.  According to my editor and Macmillan, the parent company of Tor Books, every single one of my titles has now appeared somewhere as a pirated edition, in some form or another.  I’d almost like to claim this as a singular distinction.  I can’t.  Macmillan also believes that every single book it has published in recent years – something like the last three decades – has appeared in pirated editions of some sort.

I can’t say I’m surprised.  Every time I attempt to check up on how my books are doing, I discover website after website offering free downloads of everything I’ve ever written, including versions of titles that were never issued in electronic format, and even titles whose offered editions haven’t been in print for more than twenty years.  I could spend every minute of every day trying to chase them down… without much success.  So I grit my teeth and bear it.

Ah… the wonders of the electronic age.

Coincidentally, and unsurprisingly, the sales of mass-market paperback fiction have also begun to decline.  This is likely due in part to the collapse of a section of the wholesale distribution system, but that shrinkage doesn’t account for most of it, because the decline is also occurring with titles and authors who were never distributed widely on a wholesale basis, and whose books were largely sold only through bookstores.  This hasn’t been so obvious in the F&SF field, because, while the average paperback print run has decreased, the number of paperback titles has increased slightly, but according to knowledgeable editors, the decrease is happening pretty much across the board, and some very big name authors – far bigger names than mine – have seen significant decreases in paperback book sales… and that’s without a corresponding increase in e-book sales.  Obviously, this isn’t true for every single author, and it’s impossible to determine for newly published authors because, if they haven’t published a book before, there are no previous sales from which to measure a falling-off.

Despite all the talk, it appears that the popular mantra that information and entertainment need to be free remains in force for a small but significant fraction of former book buyers – even if such “free editions” reduce authors’ incomes and result in publishers eliminating yet more mid-list authors because declining sales have made them unprofitable, or even money-losing.

The other day I came across an outraged comment about the price of an e-book version of my own Imager’s Challenge.  The would-be reader was incensed that the electronic version was “only” a few dollars less than the hard-cover edition, especially since the paperback edition won’t be out for four months or so.  Somehow, it doesn’t seem to penetrate that while paper may be the single largest component of “physical” publishing costs, it still only amounts to something like 10-15% of the publisher’s cost of producing a book, i.e., a few dollars.  Even without paper, the other costs remain, and they’re substantial – and publishing remains, as I have written time and time again, a very low margin business.  That’s why publishers really don’t want to cannibalize their hardcover revenues by undercutting the hardcover prices before the paperback version is on the shelves, especially given the decline in paperback sales.
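A purely illustrative calculation – round numbers of my own invention, not Tor’s actual cost structure: if the publisher’s all-in cost of producing and selling a hardcover runs, say, $20 a copy, then paper accounts for

```latex
0.10 \times \$20 = \$2 \quad\text{to}\quad 0.15 \times \$20 = \$3
```

which is exactly why dropping the paper justifies an e-book priced only a few dollars below the hardcover, not one priced at a fraction of it.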

There are many problems with piracy, including the fact that authors essentially get screwed, but the biggest one for readers seems to be overlooked.  The more piracy exists and the more widespread it becomes, the less choice readers will have in finding well-written, well-edited books, especially books that are not popular best-sellers.  The multi-million selling popular books – and the “popcorn books,” as my wife calls them – will survive piracy.  The well-written books for smaller audiences won’t.  So readers could very well be left with dwindling choices… and scrambling through thousands of self-published e-volumes, most of which are and will be poorly written and unedited, in search of that rare “gem” – a good and different book that doesn’t appeal to everyone.

But… after all, information and entertainment want to be free.

The Instant Disaster Society?

Last Thursday, the stock market took its biggest single-day point drop in its history, roughly a thousand points as measured by the Dow Jones Industrial Average.  While the market recovered sixty to seventy percent of that drop before Thursday’s close, the financial damage across the world was not inconsiderable.  Did this happen because Greece is still close to a financial meltdown, or because economic indicators were weak?  No… the precipitating factor may have been a typographical error – a trader reportedly entered a sell order for $16 BILLION of exchange futures instead of a mere $16 million – and there are a number of other possibilities, but the bottom line [literally] was that, whatever the cause, all the automated and computerized trading engines immediately reacted – and the market plummeted.  Later, NASDAQ canceled a number of trades, but that was long after the damage had been done.

From the Terminator movies onward, there have been horror stories about computers unleashing doomsday, but the vast majority of these have involved nuclear and military scenarios – not world economic collapse.  While I don’t fall into the “watch out for those evil computers” camp, I have always been, and remain, greatly concerned about the growth and uses of so-called “expert systems” in all areas of society, largely because computers are the perfect servants – they do exactly what their programming tells them to do, even if the result will be disastrous.

For example, Toyota is now having all sorts of problems with runaway acceleration.  When this first occurred, my question was simple enough:  why didn’t the drivers either shift into neutral or turn off the ignition?  As it turns out, at least some of them may not have been able to, at least not quickly, because their cars had keyless ignition systems.  Yet the automakers are talking about cars that will be not only keyless but totally electronic – that is, even the shifting will be electronic rather than physical/manual.  And if the electronics malfunction, exactly how will a driver be able to quickly “kill” the system?  Let’s think that one over for a bit.

President Obama and the health care reformers want all medical records to be electronically available, both for cost-saving purposes and for ease of access.  The problem with that kind of ease of access is that it also offers greater ease of hacking and tampering, and, I’m sorry, no system that offers the kind of ease the “reformers” are proposing can be made hacker-proof.  The access and security requirements are mutually antithetical.  Years ago, Sandra Bullock starred in a movie called “The Net,” and while many of its computer references are outdated and almost laughable, one aspect of the movie remains all too plausible.  At least two characters die because their medical records are hacked and changed.  In addition, national databases are manipulated and identities switched.  Now… the computer experts will say that these sorts of things can be guarded against… and they can be.  But will they be?  Security costs money, and good security costs a lot of money, and people use computers to cut costs, not to increase them.
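The antithesis between access and security is easy to see in a toy probability sketch – the per-point breach odds and the counts below are invented numbers, purely for illustration:

    # Toy illustration: the more points of easy access a record system has,
    # the higher the odds that at least one is compromised.
    # p and the access-point counts are assumptions, not real statistics.
    p = 0.01   # assumed chance any single access point is breached in a year
    for n in (1, 10, 100, 1000):
        breach = 1 - (1 - p) ** n   # chance at least one point is breached
        print(f"{n:5d} access points -> {breach:.1%} chance of a breach")

Under those assumptions, one access point carries a 1% annual risk, a hundred carry better than a 63% risk, and a thousand make a breach a near certainty – every convenience added for legitimate users is also added for the hackers.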

As far as economics go, now that an “accident” has shown just how vulnerable securities markets are to inadvertent manipulation, how long before some terrorist or other extremist group figures out how to duplicate the effect?  And then all the programmed trading computers will blindly execute their trades… and we’ll get an even bigger disaster.

Why?

Because we’ve become an instant-reaction society, and electronic systems magnify the effects of both system glitches and human error.  Those programmed securities-trading computers were designed to take advantage of market fluctuations on a micro- if not nano-second basis.  For better or worse, they make decisions faster than any human trader possibly could – and they do so based on data that may or may not be accurate.
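To make that mechanism concrete, here is a deliberately crude toy simulation – the stop levels, order sizes, and price-impact rule are all invented for illustration, not a model of any real exchange:

    # Toy cascade: one mistyped order triggers a chain of automated sellers.
    # All thresholds, sizes, and the price-impact rule are assumptions.
    price = 100.0
    bots = [{"stop": 100.0 - k, "sold": False} for k in range(1, 21)]

    def market_sell(size, price, impact=0.002):
        """Each unit of sell pressure knocks the price down slightly."""
        return price * (1 - impact * size)

    # The "fat finger": an order a thousand times larger than intended.
    price = market_sell(size=16, price=price)   # intended: size = 0.016

    # Each program that sees its stop level breached sells, pushing the
    # price down far enough to trigger the next program -- no human pauses.
    for bot in sorted(bots, key=lambda b: -b["stop"]):
        if not bot["sold"] and price <= bot["stop"]:
            bot["sold"] = True
            price = market_sell(size=5, price=price)

    print(f"Final price: {price:.2f}")
    print(f"Programs triggered: {sum(b['sold'] for b in bots)} of {len(bots)}")

In this toy run, the mistyped order knocks the price down about three percent, and every one of the twenty automated sellers fires in sequence, driving it down more than twenty percent in all – while the correctly sized order would have moved the price by a few thousandths of a percent and triggered nothing.  That, in miniature, is last Thursday.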

We’re seeing the same thing across society.  Today’s young people are being trained to react, rather than to think.  Instead of letters or even email, they use Twitter.  Instead of bridge or old-fashioned board games like Risk or Diplomacy, they prefer fast-acting, instant-reaction videogames with a premium on speed.  More and more of the younger generation cannot form or express complex concepts, even as technology is taking us into an ever more complex world.  Business places a greater and greater emphasis on short-term gains and profits.  People want instant satisfaction.

The societal response to this increase in speed is to rely on computers and electronic systems to a greater and greater extent – but, as happened last Thursday, what happens when one’s faithful and obedient electronic servants do exactly what their inputs dictate – and the result is disaster?

Do we really want – and can our society survive – a world where a few high-speed mistakes can destroy more than a trillion dollars’ worth of assets in seconds… or do even worse damage than that?  Not to mention one where thinking is passé, or relegated to the old fogies of an earlier generation, and where all that matters is instant [and shallow] communication and short-term results that may well lead to long-term disaster.

Stupid Questions/Bureaucratic Catch-22s

A few weeks ago, the Canadian science fiction writer Peter Watts was convicted of “assaulting” U.S. border guards because he failed to heed instructions to remain in his car when he was pulled over for a search at a border crossing.  Although the guards’ testimony that Watts had physically assaulted them was refuted, Watts was found guilty because, under the law, failure to follow instructions constitutes “assault” – although the only action he took was to be stupid enough to get out of his car when told not to.  While he was fined and given a suspended sentence, as a now-convicted felon Watts will henceforth be denied entry to the United States, and if he were careless enough to sneak in and be discovered, he’d be in far more serious trouble.  While more than a few readers and supporters were outraged at Watts’s treatment, Watts and others were even more outraged at a law that classes “failure to obey” the same as assault.

Unfortunately, this sort of legal trickery and legerdemain has a long and less than honorable history in the United States, and probably elsewhere in the world.  The American justice establishment has found a number of indirect ways to place people in custody and otherwise convict and sentence them.  Perhaps the best-known example was the conviction of the gangster Al Capone, not for the murders, fraud, and mayhem he perpetrated, but for, of all things, income tax evasion.

In 1940 Congress passed, and the president signed, the Alien Registration Act, otherwise known as the Smith Act, which made illegal, among other things, membership in any organization that advocated the violent overthrow of the U.S. government – or even helping anyone who belonged to such an organization.  In effect, that meant the government could legally prosecute anyone who had ever been a member of the Communist Party, or anyone who had ever helped a member of that party with any party-related activities, no matter how trivial.  Initially, the Act was used only against those who had actually been involved in such activities, but in the late 1940s and early 1950s the FBI, Senator Joe McCarthy, and the House Committee on Un-American Activities charged thousands of Americans with violating the provisions of the Smith Act.  If someone admitted helping another person who had belonged to the Communist Party, they could theoretically spend up to 20 years in jail.  If they denied it and proof was found otherwise, they were guilty of perjury and could also go to jail.  Eventually, the Supreme Court declared many of the more far-reaching interpretations and prosecutions under the law unconstitutional, but not before hundreds of people had been sent to jail or had their lives and livelihoods destroyed, either directly or indirectly, for what often amounted to association with friends and business associates.

Flash to the present.  According to the Salt Lake Tribune, the U.S. Customs and Border Protection Form No. 1651-0111 asks the following questions:

Have you ever been or are you now involved in espionage or sabotage, or in terrorist activities, or genocide, or between 1933 and 1945, were involved in persecutions associated with Nazi Germany or its allies?

Are you seeking entry to engage in criminal or immoral activities?

Now… it’s a safe bet that no one will ever check the “yes” box following either of these questions, and many people will ask why the government bothers to ask such stupid questions.

The government knows no one will ever admit to either set of acts or intentions.  But… if anyone is ever caught doing something even immoral, not necessarily illegal, and the prosecutors can’t come up with as much evidence as they’d like to lock that someone away, they can dig out the handy-dandy form and charge the “entrant” in question with perjury.  It’s effectively a form of after-the-fact bureaucratic insurance.

Personally, I can’t say that it exactly reinforces my confidence in American law enforcement’s ability to find and prosecute the worst offenders when every immigrant who ever shoplifted or visited an escort service could be locked away.  But then, they did lock up Big Al, even if they couldn’t prove a thing against him on the worst crimes he ordered or committed.  So… maybe I shouldn’t complain.  Still… Peter Watts is now a felon for what amounts to stupidity, or at the least a lack of common sense, although he never threatened anyone or lifted a hand against either guard.

Conservative Suicide/Stupidity?

As many of you know, I live in Utah, and as most of you may not know, I was the Legislative Director for William Armstrong, one of the most conservative congressmen and senators of his time, as well as the staff director for Ken Kramer, his successor in the House – also one of the most conservative congressmen – not to mention being Director of Legislation and Congressional Relations for the U.S. EPA during the first Reagan administration.  These days, however, even as a registered Republican, I seldom vote for Republicans, and what follows may explain one of the reasons why.

Utah’s two U.S. senators are Bob Bennett and Orrin Hatch, both conservative Republicans, and according to the various political ratings, they’re among the most conservative in the Senate.  BUT… they’re not “perfect,” with Bennett receiving “only” an 84% rating and Hatch an 88% rating from the ultra-conservative American Conservative Union.  According to recent polls, over 70% of the GOP delegates to the Utah state Republican convention believe that both Hatch and Bennett should be replaced because they’re not conservative enough.  Bennett is up for re-election and probably will not even win his party’s nomination.  He might not even survive this week’s party convention.

Now… although I certainly don’t believe in or support many of their policies and votes, I can see where others might… and might wish for all their votes to follow “conservative” principles – but to throw out a three-term conservative incumbent over such ratings?  Does it really make any sense?

No… it doesn’t, and that’s not because I’m a great fan of either senator.  I’m not.  But here’s why replacing Bennett – or Hatch – is totally against the so-called conservatives’ own best interests.

First, the ratings are based on “political litmus test” votes, often on issues that signal ideology rather than on bills that might actually make a difference.  Second, the “difference” between Bob Bennett’s 84% rating and a perfect 100% rating represents all of four votes taken over the entire year of 2009.  Third, seniority in the Senate represents power.  It determines who chairs or who is the ranking minority member on every committee and subcommittee, and that helps determine not only what legislation is considered, but when it’s considered and what’s actually included in it.  The Senate is an extremely complex body, and it takes years even to truly understand its workings.  To toss out an incumbent who is predominantly conservative, but not “perfectly” conservative, in favor of a challenger who may not even win the election – and who, if he does, will have little knowledge of the Senate and less power – is not an act of conscience, but one of stupidity.  Fourth, no matter how conservative [or how liberal] a senator is, each senator is restricted by the rules of the body to voting on what is presented.  In the vast, vast majority of cases, that means that the vote of an “imperfect” conservative can be no different from that of a “perfect” conservative.

I can certainly see, and have no problem with, conservatives targeting a senator who seldom or never votes in what they perceive as their interest, but to remove a sitting senator with power and influence who votes “your way” 80-90% of the time, in favor of someone who may not win the election and who will have little understanding or power if he does… that, I have to say, is less than rational.

In the interests of fairness, I will point out that the left wing of the Democratic Party is also guilty of the same sort of insane quest for ideological purity, and the majority of Americans are fed up with these sorts of extremist shenanigans.  But in the current political climate, where most Americans are also fed up with Congress, they may well vote to throw whoever’s in office right out… along with Bob Bennett.  And then, next year, when legislative matters are even worse from their point of view, they’ll be even angrier… even though almost none of the voters will admit that everyone wants more from government, in one way or another, than anyone wants to pay for – except for those on the extreme, extreme right, who want no government at all… and that’s a recipe for anarchy in a world as technologically and politically complex as ours.

Reality or Perception?

The growth of high technology, particularly in electronics, entertainment, and communications, is giving new meaning to the question of what is “real.”  As part of that question, there’s also the issue of how on-line/perceptual requirements are both influencing and simultaneously diverging from physical-world requirements.

One of the most obvious impacts of the instant communications capabilities embodied in cell phones, netbooks, laptops, and desktops is the proliferation of emails and text messages.  As I’ve noted before, there’s a significant downside to this in terms of real-world productivity because, more and more, workers at all levels are required to provide status reports and replies on an almost continual basis.  This constant diversion encourages so-called “multitasking,” which studies show actually takes more time and creates more errors than handling tasks sequentially – as if anyone in today’s information society is ever allowed to handle tasks sequentially and efficiently.

In addition, anyone who has the nerve or the foolhardiness to point this out, or who refrains from texting and on-line social networking, is considered out of touch, anti-technology, and clearly non-productive for refusing to “use the latest technology” – even if his or her physical productivity far exceeds that of the “well-connected.”  No matter that such individuals may have a cellphone and a laptop with full internet connectivity and can use them to obtain real physical results, often faster than those immersed in social connectivity – they’re still “dinosaurs.”

In addition, the temptations of the electronic world are such, and have created enough concern, that some companies have tried to take steps to limit what on-line activities are possible on corporate nets.

The real physical dangers of this interconnectivity are minimized, if not overlooked.  There have been a number of fatalities, even here in Utah, when individuals locked into various forms of electronic reality, from iPods to cellphones, have stepped in front of traffic and trains, totally unaware of their physical surroundings.  Given the growing intensity of the “electronic world,” I can’t help but believe these will increase.

Yet, in another sense, the electronic world is also entering the physical world.  For example, thousands and thousands of young Asian men and women labor at various on-line games to amass virtual goods that they can effectively trade for physical-world currency and goods.  And it works the other way as well: there have already been murders over what happened in “virtual reality” communities.

The allure of electronic worlds and connections is so strong that hundreds of thousands, if not millions, of students and other young people walk past those with whom they take classes and even work, ignoring their physical presence, in favor of an electronic linkage that might have seemed ephemeral to an earlier generation, but whose pull is far stronger than physical reality…

Does this divergence between the physical reality and requirements of society and the perceptual “reality” and perceived requirements of society herald a “new age,” or the singularity, as some have called it, or is it the beginning of the erosion of culture and society?

Important Beyond the Words

Despite all the “emphasis” on improving education and on assessment testing in primary and secondary schools, education is anything but improving in the United States… and there’s a very good reason why.  Politicians, educators, and everyday parents have forgotten one of the most special attributes that makes us human and that lies behind our success as a species – language, in particular, written language.

An ever-increasing percentage of younger Americans – well over a majority of those under twenty – cannot write a coherent paragraph, nor can they synthesize complex written information, either verbally or in writing, despite all the testing and all the supposed emphasis on “education.”  So far, this has not proved an obvious detriment to U.S. science, business, and culture, but that is because society, any society, has always been controlled by a minority.  The past strength of U.S. society has been that it allowed a far greater percentage of “have-nots” to rise into that minority, a rise enabled by an educational system that emphasized reading, writing, and arithmetic – the three “Rs.”

While mastery of more than those three basics is necessary for success in a higher-technology society, ignoring absolute mastery of those subjects for the sake of knowledge in others is a formula for societal collapse, because success will be limited to those whose parents can obtain for their children an education that does require mastery of those fundamentals, particularly writing.  And because in each generation there are those who will not or cannot truly master such basics, whether through lack of ability or lack of dedication, the number of those able to control society will become ever more limited, and a greater and greater percentage of society’s assets will be controlled by fewer and fewer people, whose abilities diminish as their numbers dwindle.  In time, if such a trend is not changed, social unrest builds and usually results in revolution.  We’re already seeing this in the United States, particularly in dramatically increased income inequality, but everyone seems to focus on the symptoms rather than the cause.

Why writing, you might ask.  Is that just because I’m a writer, and I think that mastery of my specialty is paramount, just as those in other occupations might feel the same about their area of expertise?  No… it’s because writing is the very foundation upon which complex technological societies rest.

The most important aspect of written language is not that it records what has been spoken, or what has occurred, or that it documents how to build devices, but that it requires a logical construct to be intelligible, let alone useful.  Good writing requires logic, whether in structuring a sentence, a paragraph, or a book.  It requires the ability to synthesize and to create from other information.  In essence, mastering writing requires organizing one’s thoughts and mind.  All the scattered facts and bits of information required by short-answer educational testing are useless unless they can be understood as part of a coherent whole.  That is why the best educational institutions have always required long essay tests, usually under time pressure.  In effect, such tests both develop and measure the ability to think.

Yet the societal response to the lack of writing – and thus thinking – ability has been to institute “remedial” writing courses at the college entry level.  This is worse than useless, and a waste of time and resources.  Basic linguistic and writing ability, as I have noted before, is determined roughly by puberty.  Those who cannot write and organize their thoughts by then will effectively always be limited.  If we as a society want to reverse the trend of social and economic polarization, as well as improve the abilities of the younger generations, effective writing skills have to be developed at the primary and early secondary school levels.  Later than that is just too late.  Just as you can’t begin at age eighteen to learn to be a concert violinist or pianist, or a professional athlete, the same is true for developing writing and logic skills.

And because, in a very real sense, a civilization is its written language, our inability to address this issue effectively may assure the fall of our culture.

The Failure to Judge… Wisely

In last Sunday’s education supplement to The New York Times, there was a table showing a sampling of U.S. colleges and universities and the distribution of grades “earned” by students, as well as the change from ten years earlier – and in a number of cases, the change from twenty, forty, or fifty years ago.  Not surprisingly to me, at virtually every university over 35% of all grades granted were As.  At most universities the figure was over 40%, and at a number, over half of all grades were As.  This represents roughly a 10% increase over the past ten years, but even more importantly, it represents a doubling, and in some cases a tripling, of the percentage of As being given 40-50 years ago.  Are the teachers two to three times better?  Are the students?  Let us just say that I have my doubts.

But before anyone goes off and blames the more benighted university professors, let’s look at society as a whole.  Almost a year ago, or perhaps longer, Alex Ross, the music critic for The New Yorker, pointed out that almost every Broadway show now gets a standing ovation, when a standing ovation was relatively rare some fifty years ago.  When I was a grade-schooler, there were exactly four college football bowl games, on New Year’s Eve or New Year’s Day, while today there are something like thirty spread over almost four weeks.  Until about half a century ago, there weren’t any “divisions” in baseball.  The regular-season champion of the American League played the regular-season champion of the National League.  It’s almost as though we, as a society, can’t accept the judgment of continual success over time.

And have you noticed that every competition for children has almost as many prizes as competitors – or so it seems?  Likewise, there’s tremendous pressure to do away with grades and/or test scores in determining who gets into what college.  And once students are in college, they get to judge their professors on how well they’re being taught – as if any 18-21-year-old truly has a good and full understanding of what he or she needs to learn [admittedly, some professors don’t, but the students aren’t the ones who should be determining this].  Then we have the global warming debate, where politicians and people with absolutely no knowledge or understanding of the mechanics and physics of climate insist that their views are equal to those of scientists who’ve spent a lifetime studying climate.  And, of course, there are the intelligent-design believers and creationists who are using politics to dictate science curricula in schools, based on their beliefs rather than on what can be proven.

And there’s the economy, business, and education, where decisions are made essentially on the basis of short-term profit figures rather than on longer-term considerations… and as a result, as we have seen, the economy, business, and education have all suffered greatly.

I could list page after page of similar examples and instances, but these all point out an inherent flaw in current societies, particularly in western European societies, and especially in U.S. society.  As a society, we’re unwilling or unable, or both, to make intelligent decisions based on facts and experience.

Whether it’s because of political pressure, the threat of litigation, the fear of being declared discriminatory, or the honest but misguided belief that fostering self-esteem before establishing ability creates better students, the fact is that we don’t honestly evaluate our students.  We don’t judge them accurately.  Forty or fifty percent do not deserve As – not when less than thirty percent of college graduates can write a complex paragraph in correct English and follow the logic [or lack of it] in a newspaper editorial.

We clearly don’t hold our economic and financial-industry leaders to effective standards, not when we pay them tens, if not hundreds, of millions of dollars to implement financial instruments that nearly destroyed our economy.  We judge those running for political office equally poorly, electing them on their professed beliefs rather than on their willingness to solve problems for the good of the entire country, or to compromise in order to resolve those problems – despite the fact that no political system can survive for long without compromise.

Nor are we, again as a society, particularly accurate in assessing and rewarding artistic accomplishment, or the lack of it, when rap music, American Idol, and “reality” shows draw far more in financial reward and audience than do old-fashioned theatre, musical theatre [where you had to be able to compose and sing real melodies], opera, and classical music, and when hyped-up graphic novels are the fastest-growing form of “print” fiction.  It’s one thing to enjoy entertainment that’s less than excellent in quality; it’s another to proclaim it excellent.  The ability to differentiate between popularity and technical and professional excellence is, again, a matter of good judgment.

In fact, “judgment” is becoming the new “discrimination.”  Once, to discriminate meant to choose wisely; now it means to be horribly biased.  The latest evolution in our current “newspeak” appears to be that to judge wisely on the basis of facts is a form of bias and oppression.  It’s fine to surrender judgment to the marketplace, where dollars alone decide, or to politics, where those most successful in pandering for votes decide… but to decide based on solid accomplishment – or the lack thereof, as in the case of students who can’t read or write or think, or of financiers who lose trillions of dollars – that’s somehow old-fashioned, biased, or unfair.

Whatever happened to judging wisely?

Jeremiads

Throughout recorded history runs a thread in which an older and often distinguished figure rants about the failures of the young, how they fail to learn the lessons of their forebears, and how this will lead to the downfall of society.  Many cite Plato and his words about the coming failure of Greek youth, who, failing to learn music and poetry, could not distinguish among the ancient levels of wisdom ascribed to gold, silver, and bronze – but such warnings precede the Greeks and follow them through Cicero and others.  They also occur in cultures other than those descended from western Europe.

Generally, at the time of such warnings – as in the case of Alcibiades and Socrates – there are two reactions, one usually from the young and one usually from the older members of society.  The first: “We’re still here; what’s the problem?  You don’t understand that we’re different.”  The other: “The young never understand until it’s too late.”

I’ve heard my share of speeches and talks debunking such words of warning, and generally these “debunkers” point out that Socrates and Cicero and all the others warned everyone, and yet today we live at the peak of human civilization and technology.  And we do… but that’s not the point.

Within a generation of Plato’s reports of Socrates’ warnings, Greece was spiraling down into internecine warfare from which it, as a civilization, never fully recovered.  The same was true of Cicero, although the process was far more prolonged in the case of the Roman Empire – the Roman Republic, which laid the foundation of the empire, was essentially dead at the time of Cicero’s execution/murder.

The patterns of rise and fall, rise and fall, of cultures and civilizations permeate human history, and so far, no civilization has escaped such a fate, although some have lasted far longer than others.

There’s an American saying that was popular a generation or so ago – “From shirt-sleeves to shirt-sleeves in four generations.”  What it meant was that a man [because it was a society even more male-dominated then] worked hard to build up the foundation for his children, and then the next generation turned that foundation into wealth and success, and the third generation spent the wealth, and those of the fourth generation were impoverished and back in shirt-sleeves.

To build anything requires effort, and concentrated effort requires dedication and expertise in something, which in turn requires concentration and knowledge.  Building also requires saving in some form or another, and that means forgoing consumption and immediate satisfaction.  In societal terms, that requires the “old virtues.”  When consumption and pleasure outweigh those virtues, a society declines, either gradually or precipitously.  Now… some societies, such as Great Britain’s, have for years pulled themselves back from the total loss of “virtues.”

But, in the end, the lure of pleasure and consumption has felled, directly or indirectly, every civilization.  The only question appears to be not whether this will happen, but when.

So… don’t be cavalier about those doddering old fogies who predict that the excess of pleasure-seeking and self-interest will doom society.  They’ll be right… sooner or later.