Archive for the ‘General’ Category

Accuracy Gets No Notice

The December issue of The Atlantic Monthly contains a rather interesting article [“I was wrong, and so are you”] by Daniel Klein, a conservative/libertarian, who had published an op-ed piece in the Wall Street Journal in June of 2010 arguing that, based on a study that he and another economist had earlier conducted, liberals/progressives had a far poorer grasp of basic economics than did conservatives.  Right wing and conservative groups trumpeted the results, and comments on the study were the second-highest of anything published in the Journal for the month in which it was printed. Klein’s in-box was also filled with messages suggesting that he had rigged the study.

After considering the reaction and the criticisms of the analysis of that study [which had been designed for another purpose], Klein and his co-author designed a second study specifically to evaluate the accuracy of people’s economic perceptions and to compare their political outlook with the accuracy of their economic views on various issues.  To Klein’s surprise, the second study indicated, astonishingly, that all across the political spectrum of respondents, each group was equally wrong when evaluating the accuracy of economic statements at variance with its political beliefs. As Klein wrote, “the more a statement challenged a group’s position, the worse the group did” [in accurately evaluating the statement].

In short, in all cases, respondents were less accurate in economic judgments that conflicted with their underlying biases and views, and the greater the conflict, the lower the accuracy.  What was even more interesting was that the level of education seemed to matter very little or not at all.

To me, all this was scarcely surprising; what was surprising was that, while scholarly reviewers found the new study accurate, there was essentially no public or media reaction to the release of its results, even though Klein was very clear in declaring that the new study invalidated the results of the earlier work.  Given that the results of the second study were also at variance with Klein’s own political predilections, one might have expected at least more than polite notice of it.

There wasn’t. The few academic/critical reviewers who did comment essentially said, “There’s a lot of confirmation bias out there.”  The conservative/right-wing types have said nothing, in contrast to their trumpeting of the earlier [and incorrect] work, and there seems to be little liberal reaction either.

In short, we all want to hang on to our biases, even in the face of information to the contrary, and the more that information challenges what we believe, the more strongly we dispute it.

Is it any wonder Congress can’t get anything constructive done?

 

The “App” Society

One of my smallest granddaughters is enchanted with the “apps” on her mother’s smartphone [she can’t be enchanted with mine, because I only have a new version of an old-fashioned cellphone], and everywhere I look or read, there’s another “killer app.”  And I don’t have a problem with “apps.”  I do have an enormous problem with what they represent… in the deeper sense.

The other week, I was reading an article about the difference between inventors and “tweakers,” and one of the points made by the writer was that, in general, initial inventions are seldom what change society.  It’s the subsequent “tweaks” to those basic innovations that make the difference.  Bill Gates didn’t invent the personal computer, but the tweaks provided by Microsoft made it universal.  Steve Jobs was a superb tweaker and marketer, and those abilities led to the iPhone, among other commercially and societally successful products, and to all the smartphone clones that are changing communications patterns in technological societies.  And, of course, killer apps are another form of tweaking.

But… as I’ve noted before, for all our emphasis on tweaking and commercialization, we’ve seen very little development and implementation of basic technological innovation in more than a half century. We still generate the vast majority, if not essentially all, of our electricity based on 1950s (or earlier) principles; aircraft and automotive propulsion systems are merely tweaked versions of systems in use more than a half century earlier, and we don’t travel any faster than in 1960 (and actual travel time is longer, given security and other problems).

In some areas, we’ve actually shelved technology that was superior in performance to currently used technology for reasons of “economic efficiency,” i.e., cheaper. That tends to remind me of the ancient Chinese and the Ptolemaic Greeks, and even the Romans, who never implemented technological advances because slaves or servants were cheaper.

Take Burt Rutan, one of the most prolific and dynamic aircraft designers of the past generation.  What I find most interesting is that, for all the technical success of his designs, few indeed have ever been produced in large numbers – and it’s not because his aircraft are particularly expensive [as aircraft go, that is].

Of course, all this raises the question of whether we’ve reached the effective limits of technology. This issue was raised more than a century ago, when some U.S. luminaries proposed closing the patent office because there was nothing new to discover.  It certainly wasn’t so back then, but all the emphasis on tweaking and commercialization I see now raises that same question once again, if from a slightly different perspective.  Have we hit the limits of basic science and technology?  Or are we just unwilling to invest what is necessary to push science further, and will we settle for a future limited to “killer apps”?

 

Of Mice, Men, and Ethics

I hate sticky traps. But sometimes, there’s no recourse, not when the rodent hides in crannies where the cats can’t follow, and in spaces where it’s impossible to place “humane” or regular traps.  But sticky traps create another problem – and that’s what to do with a living creature that looks at you with fearful eyes.  Despite having seen the damage mice can do when uncontrolled, I still hate having to dispose of them.  But it takes days to clean and sterilize the mess even one mouse can leave… and, like other creatures that sample domestic comfort, mice that are released have this tendency to return.  So I have a simple rule with various pests – stay out of the house, and I’ll leave you alone.

In the aftermath of the rodent episode, however, I was reading a commentary by a reviewer on “ethics” and whether characters created by various authors lack ethics when they kill without showing remorse and angst, even when those they kill are people who, by any reasonable standard, are truly evil.  Since some of my characters have been charged, upon occasion, with such behavior, I couldn’t help thinking about the issue.

It seems to me that the issue for all too many people is either whether the “killer” feels sorry or concerned about his acts or whether the acts take place in a setting where the one doing the killing has “no choice.”  And over the years, I’ve realized that, for many, many readers, the ones who are dispassionate or don’t feel “bad,” regardless of the impact of their actions, are generally considered bad guys, or antiheroes at best, as in the case of Dirty Harry or others, while the good guys are the ones who reluctantly do what must be done.  If a protagonist doesn’t show reluctance… well, then he or she is either a villain, soulless, or an antihero without true ethics.  Part of this attitude obviously stems from a societal concern about individuals without social restraints – the sociopaths and the psychopaths – but is it truly unethical [and I’m not talking about illegal, which is an entirely different question, because all too often the application of the law itself can be anything but ethical] to kill an evil person without feeling remorse?  And does such a killing make the protagonist unethical?

How can it be more “ethical” to slaughter other soldiers in a battle, other soldiers whose greatest fault may well be that they were on the “other side,” than to quietly dispose of an evil person on a city side street?  Well… one argument is that the soldiers were ordered to kill, and no one authorized the disposal of the evil individual.  By that reasoning, Nazi death camp guards were acting ethically.  Yet… we don’t want individuals taking the law into their own hands.  On the other hand, what can individuals do in such a circumstance when the law offers no protection?

These are all issues with which we as writers, and as citizens, must wrestle, but what bothers me is the idea that, for some people and some readers, the degree of ethics rests on the “feelings” of the individual who must face the decision of when to use force and to what degree.  Was I any more or any less ethical in killing the rodent vandalizing my kitchen because I felt sorry for the little beast?  It didn’t stop me from putting an end to him.  Isn’t the same true in dealing with human rodents?

And don’t tell me that people are somehow “different.”  With each passing year, research shows that almost all of the traits once cited as distinguishing humans as unique also exist in other species.  Ravens and crows, as well as the higher primates, use tools and have what theorists call a “theory of mind.”  The plain fact is that every species kills something, whether for food, self-defense, territory, or other reasons.

So…perhaps a little less emphasis is warranted on whether the feelings about the act of killing determine whether the killing is “ethical” or not.  Admittedly, those characters who show reluctance are certainly more sympathetic… but, really, should they be?  Or should they be evaluated more on the reasons for and the circumstances behind their acts?

 


Insanity – Political and Otherwise

At the end of the movie Wall Street: Money Never Sleeps, the protagonist says something like, “Insanity is doing the same thing time after time and expecting a different result.  All of us are insane at times, but what happens when more and more of us are insane at the same time?”

Recent off-year city council elections here in Cedar City reminded me of this rather forcefully.  Two incumbents were running for re-election, and both were handily defeated – and replaced by candidates with exactly the same backgrounds, views, and general attitudes as the incumbents – and those new councilmen have absolutely no experience in municipal government. As I noted more than a year ago, the voters of Utah did essentially the same thing in replacing the then-incumbent ultra-conservative Republican senator with an ultra-conservative clone.  In national politics generally, the Democrats continue to reinforce their ideology and the Republicans theirs, and each party continues to do the same thing it’s always done in the hope of a different result.

And that different result isn’t going to happen, because increased taxes [the Democratic view] can’t cover the annual deficit, let alone the debt; and there’s no way to cut federal programs and regulations [the Republican view] to the degree necessary to reduce massive deficits without destroying both government and the economy.  But both sides resist compromise and continue to do the same thing… and that is truly insanity, and no one is calling them on it.

From what I can see, this is exactly what’s happening politically in the United States, and perhaps elsewhere around the world as well.

Have we reached the point in society where our illusions mean more to us than the survival of our society?  Where ideological “purity” is all, and practical compromise is a dirty filthy thing not to be mentioned anywhere?

Well… certainly various forms of purity have run rampant before, such as the Nazi effort for racial purity and the endless wars/massacres over religious/ethnic/political purity, ranging from those that plagued Europe for some 500 years, to the Chinese and Russian revolutions, to Pol Pot in Cambodia, to even the Mountain Meadows massacre in Utah.  And somehow, after all the fighting was over, and the hundreds of millions of dead bodies buried or ignored, there were still two sides left, two views conflicting, if temporarily more quietly.  Protestantism and Catholicism still coexist in Europe, including Ireland and the rest of the British Isles.  The Mormon Church remains predominant in Utah, but it’s far from exclusive, and non-Mormons outnumber Mormons in Salt Lake City itself. Both China and Russia have had to come to terms with capitalism, and right-wing racial hate groups still exist, if in far smaller numbers, across Europe.

Perhaps… it just might be well to recall that when “ideals” ignore reality, they all too easily become illusions.  Yet, without ideals… everything is sold to the most powerful or wealthiest.  And balancing ideals with reality is also a compromise… like life.

Insanity is not only doing the same thing time and time again and expecting a different result; it’s also failing to recognize that inflexible adherence to any ideal inevitably leads to unrest, disruption, and all too often… death and destruction… all the while each set of true believers claims that everything would be fine – if only the other side would realize the error of its ways.

 

Another Take on Hypocrisy

Some ten years ago, I attended a memorial service for a woman who had died from a heart attack – the last of a series over a year or so.  The church was filled to overflowing, and everyone had wonderful things to say about her.  She was excellent technically in the position she held, and, as a single woman, she had even fostered a wayward teen girl and tried to set her – and her daughter – on the path to a more productive life.  She worked hard and long at her job, and she was helpful to her colleagues. But she had one fault. She wasn’t averse to pointing out when she was given a stupid or non-productive assignment, and, worse, she was almost invariably accurate in her assessments.

The result?  Her superiors piled more and more work on her while effectively cutting her pay and status, and because she was in her late fifties or early sixties trying to support herself and two others, she had little choice but to keep working.  For whatever reason, the one colleague with whom she worked well had her job abolished – only to have it reinstated a year or so later and filled by a man [who didn’t last all that long, either].  Employees in other departments who tried to be advocates for her were either ignored or told that it was none of their business… and, besides, she brought it on herself because of her sharp tongue. After her first heart attack, as soon as she could, she went back to work because her position wasn’t covered by short-term disability insurance, and she was too young for Social Security.  She died, of course, some months later, after she’d lost her house and was living in a trailer.

Just another sad story, another one of the countless tales of people who have run afoul of adversity after adversity. Except… a goodly portion of those people who had offered tributes at her memorial service were the very people who had effectively undercut her and driven her to her death.

They praised her talents, but hated her honesty.  They praised her charity toward others, while practicing little toward her.  And, in the end, after the memorial service was over, she was quietly forgotten, and the once-wayward teen moved out of town, and life went on for the men who had driven an honest, if acerbic, woman to death.

Why do I remember these events?  Because, in reflecting on one woman’s death, I see them played out on a larger and larger scale, day after day, when the voices of honesty and reason are drowned in a sea of rhetoric, often quietly fomented by those who created so many of today’s major problems, especially the politicians and the financial community.  At the same time, no one with the power to resolve the situation wants to do so, or to recognize the embarrassing facts about their own part in creating the current problems… even while romanticizing the acts and deeds of deceased politicians with whom they often disagreed and paying lip service to hard-working Americans whose real wages have declined over the past decade.

But then, maybe calling the acts of the perpetrators and their subsequent rhetoric mere hypocrisy is too generous.

 


Tolerance and Hypocrisy

Tolerance of the unjust, the unequal, and the discriminatory is anything but a virtue, nor is fiction that brings such problems in society to light a vice.  Yet among some readers and reviewers there seems to be a dislike of work that touches upon such issues. Some have even gone so far as to suggest that such fiction, in accurately portraying patterns of intolerance, inequality, and gender discrimination, actually reinforces support of such behaviors.  Over the past few years, I’ve seen reviews and comments denigrating my fiction and that of other writers because we’ve portrayed patterns of discrimination, whether on the basis of gender, race, ethnicity, or sexual orientation.  I certainly hope what I’ve seen are isolated incidents, but even if they are, I find them troubling, especially when readers or reviewers complain that illustrating in fiction what occurred historically or continues to occur in present-day society constitutes some form of discrimination, and that showing how it operates is hateful and insulting.

Discrimination is hateful, insulting, and degrading, but pretending it doesn’t exist while preaching tolerance is merely a more tasteful way of discriminating while pretending not to do so… and that’s not only a form of discrimination, but also a form of hypocrisy. It somehow reminds me of those Victorians who exalted the noble virtues of family and morality and who avoided reading “unpleasant” books, while their “upstanding” life-style was supported at least in part by child labor, union-breaking tactics that included brutality and firearms, and sweat-shop labor in which young women were grossly underpaid.

Are such conditions better than they were a century ago?  Of course they are – in the United States and much of the developed world.  But gender/sexual discrimination still exists even here – it’s just far more subtle – and it remains rampant in much of the developing and third world.  So… for a writer to bring up such issues, whether in historical, fantasy, or futuristic science fiction, is scarcely unrealistic, nor is it “preaching” anything.  To this day, Sheri Tepper’s The Gate to Women’s Country is often violently criticized – if seldom in “respectable” print, but often in male-oriented discussion – because it postulates a future society quietly dominated by women and portrays men as driven by excessive aggression and sexual conquest, yet a huge percentage of fantasy has in fact historically portrayed men almost “heroically” in exactly that light. Why the criticism of writers such as Tepper?  Might it just be that too many readers, largely male, don’t like reading and seeing historically accurate patterns of sexual discrimination reversed?  And how much easier it is to complain about Tepper and others than to consider the past and present of our own world.

There’s an old saying that what’s sauce for the goose is sauce for the gander…

 

Helpful Technology?

A week or so ago, my trusty and ancient writing computer bit the dust, and I replaced it with a brand-new version, equipped with the latest version of Word.  After a fair amount of muttered expletives, I managed to figure out the peculiarities of the latest word processing miracle from Microsoft, or at least enough to do what I do.  Then I discovered that every time I closed the program, the new defaults for page setup and font that I’d established vanished when I opened the program.  My local techs couldn’t figure out why, but they did give me a support number for Microsoft.  The first tech was cheerful, and when we quickly established that I’d been doing all the right things, and she couldn’t figure it out either, she referred me to another tech.  In less than five minutes, he’d guided me through things and solved the problem – and it wasn’t my fault, but that of a piece of software installed by the computer manufacturer.  Word now retains my defaults, and we won’t talk about some of the other aspects of the program [since I’ve dwelt on those before].

All that brings me to the next incredible discovery – and that’s the blundering idiocy known as a grammar checker.  Unfortunately, the Microsoft people didn’t retain a wonderful feature of my old Word 7.0 – the separation of the spell-check and grammar features.  So… if I want to spell-check a document – which I do, because my typing is far from perfect – I must endure a grammar check.  Now… I wouldn’t mind an accurate grammar check, but what passes for a grammar check is an abomination for anyone who writes sentences more complex than subject-verb-object, and especially for someone who likes a certain complexity in his prose. The truly stupid program [or the programmers who wrote it] cannot distinguish between the subject of the main clause and the subject of an embedded subordinate clause, and if one is plural and the other singular, it insists that the verb in the subordinate clause be changed to match the subject of the main clause.

[It also doesn’t recognize the subjunctive, but even most copy-editors ignore that, so I can’t complain about that in a mere program.]  There are also a number of other less glaring glitches, but I’m not about to enumerate them all.
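To illustrate how that naive rule goes wrong, here is a minimal, hypothetical sketch in Python – emphatically not Microsoft’s actual code; the word lists and the sample sentence are invented for the example – of a checker that ties every verb to the number of the first (main-clause) subject it finds:

```python
# Hypothetical sketch of the naive rule described above: check every verb
# against the FIRST noun in the sentence (the main-clause subject) and
# ignore any embedded subordinate clause.

SINGULAR_NOUNS = {"writer", "editor", "program"}
PLURAL_NOUNS = {"writers", "editors", "sentences", "errors"}
SINGULAR_VERBS = {"is", "arrives", "contains"}
PLURAL_VERBS = {"are", "arrive", "contain"}

def naive_agreement_check(sentence):
    """Return the verbs a naive checker would flag as agreement errors."""
    main_subject_plural = None
    flagged = []
    for word in sentence.split():
        w = word.lower().strip(".,")
        if main_subject_plural is None and w in SINGULAR_NOUNS | PLURAL_NOUNS:
            main_subject_plural = w in PLURAL_NOUNS      # locks onto the main subject
        elif w in SINGULAR_VERBS | PLURAL_VERBS and main_subject_plural is not None:
            if (w in PLURAL_VERBS) != main_subject_plural:
                flagged.append(word)                     # spurious "error"
    return flagged

# 'contain' correctly agrees with 'sentences', but the naive rule ties it to
# the singular main subject 'writer' and demands that it be changed.
print(naive_agreement_check("The writer who edits sentences that contain errors is careful."))
# -> ['contain']
```

A checker that doesn’t parse clause structure before testing agreement produces exactly the spurious corrections described above.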

For me, all this isn’t a problem, although it’s truly an annoyance. But for all those students learning to write on computers it is a problem, especially since most of them have absolutely no idea about the basics of grammar, let alone about how to write correct complex sentences – and now we have a computer grammar-checking program that can only make the situation worse!

There are definitely times when “helpful” technology is anything but, and this definitely qualifies as such.

 

Good-bye?

When I returned to Cedar City after going to the World Fantasy Convention in early November, I was surprised – and appalled – to find merchants, especially our single “big-box” chain store, busy replacing the Halloween displays and immediately putting up Christmas decorations and sales promotions.  There was little space or mention given to Thanksgiving.  And I wondered if this happened to be a mere local phenomenon.  Then I went on my whirlwind tour for Scholar and discovered that the same thing was happening in all the cities I visited.  In fact, more than two weeks before Thanksgiving, I didn’t see any commercial references to Thanksgiving, only to Christmas, and in most stores and malls Christmas music was playing.  Then I read that some merchants were pressing to begin the Christmas madness sales at midnight on Thanksgiving Day, forcing sales personnel to stay up all night or make do with little sleep – to cram in a few more hours of sales madness, pushing “black Friday” into Thanksgiving Thursday.

Years ago, I remember reading a short story by Fred Pohl called “Happy Birthday, Dear Jesus,” set in a future where the “Christmas season” begins in September.  I’m sure that many readers found that delightfully exaggerated back in 1956, when the story was first published, but Fred certainly anticipated a point we’ve almost reached.

To say that I find this trend disturbing would be an understatement.  Halloween and Christmas squeezing out Thanksgiving?  A Christmas buying season now beginning in October?

Yet, on reflection, it’s certainly understandable.  Thanksgiving was a holiday originally celebrated for giving thanks for having survived hard times and having attained modest prosperity.  And how many people really give thanks today?  After all, don’t we deserve all the goods and goodies we have?  Aren’t we entitled to them?  Then, too, Thanksgiving doesn’t put that much loot in the pockets of the merchants.  It’s a time for reflection and quiet celebration at home.  It requires personal time and preparation to be celebrated properly.  You just can’t go out and spend money and buy love or assuage your guilt with material gifts.  You have to consider what your blessings are, and what you’re thankful for… and reflect upon those who don’t have much for which to be thankful.

Christmas and Halloween have much in common in current American culture.  They’ve become all about the goodies – both for the consumer and for the merchants… and both our son, who manages an upscale men’s fashion outlet in New York City, and my editor have made the point that the comparative success or failure of their year depends on how much they sell in the “Christmas” season.  They’re certainly not alone, and many jobs, and the earnings of many workers, depend on such sales.  Yet the economic health of a nation depending on holiday conspicuous consumption?  That’s frightening in itself. Add to that the fact that such consumption is crowding out times of personal family reflection and an appreciation of what we do have, in favor of a frenzy devoted to what we don’t have.

Economic necessity or not… couldn’t we still reserve a small space of dedicated time for Thanksgiving between the buying and selling frenzy?

 


Return to the Past?

After finishing a whirlwind tour – seven cities and some of their suburbs in seven days – I’ve seen a trend I noticed years ago becoming even stronger… and more than a little disturbing.  Once upon a time, books were so expensive and hard to come by that only the very wealthy possessed more than a few, and most people had none.  Libraries were few and reserved effectively for the well-off, because few of those less than well-off could read or could manage access to them.

What does that have to do with today or my tour?

Only the fact that, despite such innovations as ebooks and e-readers, in a subtle yet substantive way we’re on a path toward the past insofar as books are concerned.  Yes, millions of books are printed and millions are now available, or soon will be, in electronic formats, but obtaining access to those books is actually becoming more and more difficult for an increasing percentage of the population across the United States.  With the phase-out of small mall bookstores, more than 2,000 bookstores that existed thirty years ago are now gone.  While they were initially replaced by some 1,300 “big-box” bookstores, with the collapse and disappearance of Borders and consolidation by other chains, the number of chain bookstores has dropped at least 25%, if not more, in the last few years.  Add to that the number of independent bookstores that have closed, and the total shrinkage in bookstores is dramatic.

Unhappily, there’s another aspect of this change that’s far worse.  Overwhelming numbers – over 90%  – of large bookstores in the United States are situated in “destination” locations, invariably near or in wealthy areas of cities and suburbs, reachable easily only by automobile.  At the same time, funding for public and school libraries is declining drastically, and, in many cases, funds for books are slim or non-existent and have been for years.

But what about electronic books… ebooks?

To read an ebook, one needs an e-reader of some sort, or a computer.  In these economically straitened times, adults and children from less affluent backgrounds, especially those near or below the poverty level, have difficulty purchasing an e-reader, let alone ebooks. Somehow, this fact tends to be overlooked, as if reading weren’t even considered a concern for the economically disadvantaged.

In the seven cities I visited on my recent book tour, every single chain bookstore or large independent was located in or adjacent to an affluent area. Not a single major bookstore remains in less affluent areas.  As I mentioned in a much earlier blog, this is not a new pattern, but an affluent location has apparently become almost an absolute requirement for new bookstores. Yet who can blame the bookstores? Small mall bookstores aren’t nearly so profitable as trendy clothes retailers, and most mall rents are based on the most profitable stores. Hard times in the book industry have resulted in the closure of unprofitable stores, and those stores are almost invariably located in less affluent areas. These economic realities affect the WalMart and grocery store book sections as well.  In particular, grocery retailers in less affluent areas are less likely to carry books at all.

But no matter what the reason, no matter what the economic considerations may be, when a city and its suburbs totaling more than two million people have fewer than ten major bookstores, with only one major independent, and all of those stores are located in economically well-off areas, I can’t help but worry that we are indeed on a road to a past that we shouldn’t be revisiting.

 


The Comparative Species

For all our striving as a species to find clear and absolute answers to everything, from what is “right” to the deepest mysteries of the universe, at heart, human beings remain a highly comparative species.  In its best form, this compulsive comparativeness can fuel high achievement in science and technology.  Whether we like it or not, competitive comparativeness fueled the space program that landed men on the moon, the early development of the airplane, even the development of commercial and residential electrification, not to mention untold advancements in many fields.

The worst aspects of comparativeness remind me, however, of the old saying that all comparisons are odious.

In personal affairs, comparisons tend to be subjective and unfair, particularly in politics and business.  The late Richard Nixon was pilloried for taping conversations in the White House, yet White House taping had gone on in several previous administrations.  He resigned under threat of impeachment for covering up the Watergate burglaries, yet cover-ups have occurred in government for generations.  The full extent of the naval oil reserve scandals in the Harding administration didn’t come out for decades, nor did the extent of Jack Kennedy’s philandering in the White House.  While both Kennedy and Nixon had grave faults, Nixon in point of fact had many accomplishments as president, while Kennedy’s sole measurable achievement was averting nuclear war in the Cuban Missile Crisis, yet in popular opinion there’s essentially no comparison.  The ballyhooed presidential debate between Kennedy and Nixon was another example of the fickleness of comparativeness.  Among those who heard the debate on radio, a significant majority felt Nixon had “won.”  Among those who watched it on television, a majority opted for Kennedy.  Same debate, same words – but physical appearance carried the day.

Likewise, study after study has shown that taller men receive more pay and more respect than shorter men, even shorter men with greater ability and achievement, and interestingly enough, in almost all U.S. presidential elections, the taller candidate has been the winner.

Another example surfaced with the recent deaths of Steve Jobs and Dennis Ritchie.  While the entire world seemed to know about Jobs, and mourn his early and untimely death, only a comparative handful of people seemed to know about Dennis Ritchie, the pioneer who created the C programming language and co-developed the UNIX operating system, both of which made possible the later success of Steve Jobs and Bill Gates alike. Yet news of Jobs’ death appeared everywhere, while Ritchie rated a small paragraph buried somewhere in the newspapers, if that.  Although Ritchie’s death was widely mentioned in technical and professional journals, it went almost unnoticed in the popular media.

In the end, the question may be: Is it that comparisons are so odious, or that the grounds on which we make those comparisons are so odious?

 

Unforeseen Results

Just before I left on this book tour [and yes, I’m writing this on the road, which I usually don’t do], I read an article on how unprepared recent college graduates and even those getting advanced degrees happen to be in terms of what one might call personal preparedness.  The article, by a professional business recruiter, stated that most graduates had little idea of even what to wear to an interview, let alone how to get one.

Then, on one of the airplane jaunts, I read about how college students are moving out of engineering and science courses because “they’re too hard,” despite the fact that the average college undergraduate studies half as much today as the average student did thirty years ago.  To complete this depressing litany, I finished up with an opinion piece by a scientist who lectures occasionally and who cited figures to show that today’s students have trouble learning anything in science without repeated exposure to the material, because they don’t easily retain what they’ve heard in the classroom.

But to top it all off, last night I ran into an attorney who teaches part-time at a prestigious southern law school, and we got to talking after the signing at the bookstore.  What she told me was truly astounding.   She set up a class where attorneys in various fields came and discussed the actual practice of law and where the students, all in their final year of law school, were told to be prepared to ask questions and were given the time and opportunity to do so.  First off, all were buried in their laptops, and not a single one could ask a question without reference to the laptop or notebook.  Second, not a one could ask a follow-up question or one not already prepared on the computer.  Third, not a one engaged in extended eye-to-eye contact with the visiting attorneys, and fourth, not a single one asked any of the visiting attorneys for a business card, despite the fact that none of them had job offers and all would be looking for positions in six months.  Considering the fact that almost all law firms are becoming very picky about new hires and that many have actually laid off experienced attorneys, none of these law students seemed to have a clue about personal interaction or personal networking.  Oh… and almost none of them actually wore better clothes to that class.

If this is what the new, computerized interactive net-based society has to offer, we’re all in big trouble, and those of us who are approaching senior citizen status may well have to keep working a lot longer for more reasons than economic necessity.

 

No Objective Truth?

The other day, one commenter on a blog asked if I wanted to write about the growth of a belief structure in American society that essentially denies the existence of “objective truth.”  Actually, I’ve written about aspects of this before, particularly as a subset of the selective use of information to reinforce existing confirmation bias, but I find the growth of the feeling that there is no objective truth, or that scientifically confirmed “facts” remain a matter of opinion – and that everyone’s opinion is equal – to be a disturbing but almost inevitable outcome of the fragmentation of the media along lines corresponding to existing belief structures, as well as of the increasing role that the internet and social media play in the day-to-day life of most people.

The basic ground rule of any successful marketing effort is to get the target audience to identify with your product.  Usually that’s accomplished by positioning the product to appeal to biases and beliefs.  Information – which, outside of stringently peer-reviewed scientific journals, is largely no longer news or factual/objective reporting – apparently need no longer have more than a tangential relationship to facts or objectivity, but has its content manipulated to appeal to its desired target audience.  Now… this is scarcely new.  Modern yellow journalism dates back more than a century, but because the economics of journalistic production limited the number of perspectives that could be specifically pandered to, because the law did have an effect insofar as actual facts were concerned, and because there remained a certain basic integrity among at least some media outlets until comparatively recently, facts were not quite so routinely ignored or distorted in quite so many ways.

One of the mixed blessings of technology is that millions and millions of people in every high-tech society have access to, and the ability to use, comparatively sophisticated media techniques (particularly compared to those available even a generation ago) to spread their views and versions of the “facts” in ways that can be appealing and often compelling.  In turn, the multiplicity of ways of presenting and distorting verified facts creates the impression that such facts are not fixed, and the next step for many people is the belief that facts are only a matter of opinion… and since everyone’s opinion is valid… why then, “my” view of which fact or interpretation is correct, or can be ignored, is just as good as anyone else’s.

This “personalization of truth” leads to some rather amazing results.  For example, as the scientific consensus on global warming has become almost unanimous that, first, such warming is occurring and, second, that there is a strong anthropogenic component to it, popular agreement with those findings has dropped almost twenty percent.

Unfortunately, occurrences such as global warming, and mechanisms such as oncoming vehicles combined with high-volume earbuds, famines and political unrest, viruses and bacteria, and high-speed collisions, are all present in our world. Consequently, rising sea levels, violent weather changes, deaths from disease among the unvaccinated, starvation, and fatalities from failure to wear seatbelts will all take their toll, regardless of the beliefs of those who ignore the facts.

Belief is not a viable defense or preventative measure against climate change, biology, oncoming heavy objects, or other objective impingements upon subjective solipsistic unreality… no matter what or how you believe.

 

“Downers,” Stereotypes, and Literary Quality

Yesterday, when I was flying back from the World Fantasy Convention in San Diego, I read through two “best-of-the-year” anthologies… and landed in Cedar City thoroughly depressed… somewhat disgusted… and more than a little irritated.  No… I wasn’t irritated that the anthologists hadn’t picked one of my infrequent short stories.  In the first place, I know that it’s unlikely anything I write will find favor with most anthologists [although there have been exceptions].  In the second place, I hadn’t published any short fiction for the year in question.

My anger, irritation, and depression came from the same root cause.  Out of something like fifty stories, all but perhaps ten were downers.  Out of the ten that weren’t, eight were perhaps neutral or bitter-sweet, and only two could be called upbeat.  Now… I don’t have anything against downbeat or depressing stories.  I don’t even have anything against them being singled out as good stories.  Certainly they were all at least better than competently written, and some were indeed superbly written.  And I definitely don’t think a story has to be upbeat to be great or “best-of-the-year,” but after many, many years of writing professionally, and even more of reading all manner of books and stories, ranging from “genre” fiction to the acclaimed classics, it’s clear to me that the overwhelming citation of depressing stories as marks of literary excellence is hardly a sign of intellectual distinction, let alone of impartial judgment.

All this, of course, reinforces my feelings about those critics and anthologists who seem to dismiss anything with any upbeat feel or positive aspects… or anything that isn’t “literary mainstream.”

The latest New York Times book section contains a review with an opening along the lines of “A literary novelist writing a genre novel is like an intellectual dating a porn star.”  Supposedly, reviewers who write about books should be able to transcend stereotypes, not reinforce them, but then, snobbery is often based on the enshrinement of stereotypes contrary to the snob’s world view, and all too many critics, reviewers, and even some anthologists are little more than snobs.

A good story is a good story, and a bad story is a bad story, whether it’s “literary” or genre.

More Wall Street Idiocy

I recently discovered that the cable company Hibernia Atlantic is spending $300 million to construct and lay a new transatlantic cable between London and New York [New Scientist, 1 October].  Why? To cut 6 milliseconds from the 65-millisecond transit time and thereby persuade more investment trading firms to use its cable.  For 6 milliseconds?  That’s apparently a comparative age when computers can execute millions of instructions in a single millisecond, and London traders must think that those 6 milliseconds will make a significant difference in the prices paid and/or received.

And they may well.  Along the same lines, a broker acquaintance of mine pointed out that New York City real estate closest to the New York Stock Exchange computers commands exorbitant rents and prices for exactly the same reason… but I find the whole idea totally appalling – not so much the additional data cable as the rationale for its use. Human beings can’t process much of anything in 6 milliseconds, so the speed advantage is only useful to computers using trading algorithms.  As I’ve noted earlier, the use of programmed and computer trading has shifted the rationale behind trading to almost total reliance on technical patterns, which, in turn, has increased volatility in trading.  Faster algorithmic trading can only increase that volatility and, regardless of those who deny it, can also only increase the possibility of yet another “flash crash” like that of May 2010; even if the new “circuit-breakers” cut in and work as designed, the results will still disrupt trading significantly and likely penalize the minority of traders without superspeed computers.
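To make the rationale concrete, here is a toy sketch in Python – purely illustrative, with the 65 ms figure and the 6 ms savings taken from the article, while the “single quote, first order to arrive wins” model, the reaction time, and the firm names are my own simplifying assumptions – of why a few milliseconds matter to a trading algorithm even though no human could perceive them:

```python
# Toy model: two algorithmic traders in London react to the same signal and
# race their orders to New York; whoever arrives first takes the one quote.

LINK_LATENCY_MS = {
    "firm_on_old_cable": 65.0,        # existing transatlantic route
    "firm_on_new_cable": 65.0 - 6.0,  # the shorter cable described above
}
ALGO_REACTION_MS = 0.1                # assumed time for the algorithm to decide

def race_to_quote(signal_time_ms=0.0):
    """Return (winning firm, arrival time in ms) for a single shared signal."""
    arrivals = {
        firm: signal_time_ms + ALGO_REACTION_MS + latency
        for firm, latency in LINK_LATENCY_MS.items()
    }
    winner = min(arrivals, key=arrivals.get)
    return winner, arrivals[winner]

winner, t = race_to_quote()
print(f"{winner} reaches the quote first, at t = {t:.1f} ms")
# -> firm_on_new_cable reaches the quote first, at t = 59.1 ms
```

In a model this simple, the slower firm loses the race every single time, which is why a 6-millisecond edge can look worth a $300 million cable to firms trading at machine speeds.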

Philosophically speaking, the support for building such a cable also reinforces the existing and continually growing reliance on maximizing short-term profits and minimizing longer-term concerns, as if we didn’t already have a society that is excessively short-term. You might even call it the institutionalization of business thrill-seeking and attention-deficit disorder. This millisecond counts; what happens next year isn’t my concern.  Let my kids or grandkids worry about what happens in ten or twenty years.

And one of the problems is that this culture is so institutionalized that any executive who questions it essentially destroys his or her future. All you have to do is look at what happened to those who questioned it before the last meltdown.

Yes, the same geniuses who pioneered such great innovations as no-credentials-check mortgages, misleadingly “guaranteed” securitized mortgages, banking deregulation, fees-for-everything banking, and million-dollar bonuses for crashing the economy are now going to spend mere hundreds of millions to find another way to take advantage of their competitors… without a single thought about the implications and ramifications.

Isn’t the free market wonderful?

 

Why Don’t the Banks Get It?

Despite the various “Occupy Wall Street” and other grass-roots movements around the country, banks, bankers, and investment bankers really don’t seem to get it.  Oh, they understand that people are unhappy, but, from what I can tell, they don’t seem terribly willing to accept their own role in creating this unhappiness.

It certainly didn’t help that all the large banks ducked out of the government TARP program as soon as possible so that they wouldn’t be subject to restrictions on salaries and bonuses for top executives – bonuses that often exceeded a million dollars an executive and were sometimes far, far greater.  They all insist, usually off the record, that they feared “losing” top talent, but where would that talent go?  To other banks?

Then, after losing hundreds of billions of dollars on essentially fraudulently rated securitized mortgage assets, they took hundreds of billions of dollars in federal money, but apparently not to lend very much of it, especially not to small businesses, which are traditionally the largest creators of new jobs in the country. At the same time, they continue to foreclose on real estate on a wholesale basis, even when ordered not to by judges, states, and regulators, and even in cases where refinancing was feasible with an employed homeowner.

And then… there’s the entire question of why the banks are having financial difficulties.  I’m an economist by training, and I have problems understanding this.  They’re getting money cheaply, in some cases, almost for free, because what they pay depositors is generally less than one percent, and they can obtain federal funds at an even lower rate.

Mortgages are running 4-6%, and interest on credit card debt is in the 20% range and often in excess of 25%.  Yet this vast differential between the cost of obtaining the product and the return on it apparently isn’t sufficient?
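As a rough, back-of-the-envelope illustration of that differential – hypothetical balances, with the rates taken from the figures quoted above – the gross spread per dollar lent looks like this:

```python
# Illustrative only: gross interest spread using the rates cited above.
COST_OF_DEPOSITS = 0.01      # roughly what depositors are paid (under 1%)
MORTGAGE_RATE = 0.05         # midpoint of the 4-6% range
CREDIT_CARD_RATE = 0.225     # midpoint of the 20-25% range

for product, rate in [("mortgage", MORTGAGE_RATE), ("credit card", CREDIT_CARD_RATE)]:
    spread = rate - COST_OF_DEPOSITS
    print(f"{product}: gross spread of {spread:.1%} per dollar lent")
# -> mortgage: gross spread of 4.0% per dollar lent
# -> credit card: gross spread of 21.5% per dollar lent
```

Even before fees, that is the margin the paragraph above is pointing at.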

And that brings us to the latest bank fiasco.  For years, the banks, all of them, have been urging customers to “go paperless.”  Check your statement electronically; don’t write checks; use your debit card instead. Then, after the federal government tried to crack down on excessive fees for late payments, overdrafts, and the like, several of the largest banks began floating the idea of a monthly fee for debit card use.  Wait a second!  Wasn’t this the banks’ idea in the first place?  Wasn’t it supposed to reduce costs?  So why are they going to charge depositors more to use their own money?

And the banks still don’t get it?  With all those brilliant, highly compensated executives?

Or don’t they care?

What Is a Cult?

Recently, members of the Christian right have apparently been suggesting that presidential candidate Mitt Romney is not a “Christian,” but a member of a “cult.” As a resident of Utah for nearly twenty years, and as a not-very-active Episcopalian who still resents the revision of the King James version of the Bible and the Book of Common Prayer, I find the raising of this issue more than a little disturbing – not so much the question of what Mr. Romney believes as the implication that his beliefs are any stranger or weirder than the beliefs of those who raised the issue.

Interestingly enough, the top dictionary definitions of the word “cult” are “a system of religious rites and observances” and “zealous devotion to a person, ideal, or thing.”  Over the past half-century or so, however, the term has come to be used in a more pejorative sense, referring to a group whose beliefs or practices are considered abnormal or bizarre.  Some sociologists draw the distinction that sects – Baptists, Lutherans, Anglicans, Catholics, etc. – are products of religious schism and therefore arose from and maintain a continuity with traditional beliefs and practices, while cults arise spontaneously around novel beliefs and practices. Others define a cult as an ideological organization held together by charismatic relationships and the demand of total commitment to the group and its practices.

Mitt Romney is a practicing Mormon, a member of the Church of Jesus Christ of Latter-day Saints, but does that make him a member of a cult?  Since the LDS faith specifically believes in Jesus Christ, follows many “Christian” practices such as baptism and belief in an omnipotent God and his son Jesus Christ, and rejected the practice of polygamy more than a century ago, can it be said to be a totally “novel” faith, or any more “bizarre” or “abnormal” than any number of other so-called Christian faiths?  Mormonism does demand a high degree of commitment to the group and its practices, but is that degree of commitment any greater than that required by any number of so-called evangelical but clearly accepted Christian sects?

While I’m certainly not a supporter of excessive religious beliefs of any sort, as shown now and again in some of my work, and especially oppose the incorporation of religious beliefs into the structure of secular government, I find it rather amazing that supporters who come from the more radical and even “bizarre” [in my opinion] side of Christianity are raising this question.  What troubles me most is the implication that fundamentalist Christianity is somehow the norm, and that Mormonism, which, whether one likes it or not, is clearly an offshoot of Christianity, is somehow stranger or more cultlike than the beliefs of the evangelicals who are raising the question.

This isn’t the first time this kind of question has been raised, since opponents of John F. Kennedy questioned whether the United States should have a Catholic president, with the clear implication that Catholicism was un-American, and it won’t be the last time.  The fact that the question has been raised at all in this fashion makes me want to propose a few counter-questions.

Why are those politicians who endorse and are supported by believers in fundamentalist Christianity not also considered members of cults?

Are we electing a president to solve pressing national problems or one to follow a specific religious agenda?

Does rigid adherence to a religious belief structure make a more effective president or a less effective one?  What does history show on this score?

And… for the record, I’m not exactly pleased with any of the candidates so far.

 

The Wrong Message

Social media are here, regardless of whether we like them, dislike them, use them, or don’t use them.  They’re also becoming a part of education, and school districts and colleges and universities across the country are struggling with policies that allow constructive use of social media while curbing abuse.  Some school districts prohibit their use in education entirely, while others range from restricted use to almost unrestricted use.

Time will tell, as with many things, just what uses will be allowed, but there’s one aspect of all of this that, I must say, troubles me greatly.  One educator, cited in a recent article in The Christian Science Monitor, made an observation along the lines that he had to give feedback on assignments to students through Facebook because students never checked email since email just wasn’t part of their world.

I relayed that comment to my wife the college professor, and she nodded sagely, informing me that a growing percentage of college students simply never check their email or answer telephone messages. She should know, since her university system will inform her whether any email she has sent has even been opened – and many aren’t.  An increasing number of students only respond, and not necessarily reliably, to text messages and Facebook postings.

What?  Since when are students determining what forms of communication will be used in education?  The issue here, it seems to me, is not just whether social media have a place in education, and what that place should be, but also who exactly is setting the standards and the ground rules.

To begin with, for a teacher to reach a student through a social network, the teacher must belong to that network and, depending on the settings, must ask the student to accept him or her as a “friend,” or ask the student to make that request. In short, either party can refuse communications, and, in effect, the students are setting the requirements for what communications they’ll receive and how.  I can certainly see students – and parents – rebelling if teachers required communications via FedEx, UPS, or carrier pigeon, but not accepting emails as opposed to Facebook messages?  Email is a non-obligatory electronic communications system far more open to all users and recipients, and it takes no more time or equipment than does Facebook or any other social network. Also, teachers should be teachers, not “friends,” because even the most brilliant of students should not be encouraged to think of themselves as the equals of their teachers, no matter how much greater some of them may doubtless end up.

Again, I may be antiquated, but at this point using social networks for any form of “official” communication, whether educational, governmental, or business, raises questions about security, privacy, scholastic policies, discipline, and propriety that certainly have not been answered.

 

Too Much Instantness?

Who’s the leading GOP presidential candidate this moment?  Romney? Perry? Cain? Is Christie in or out? What about Palin? The stock market’s up three hundred points – oops, down four hundred, up one hundred, down two hundred…  The latest on Amanda Knox, or whatever celebrity’s hot, bestseller numbers on Amazon reported hourly… commodity reports tracked by the millisecond, commodities and stocks traded by the nanosecond….

Forget about telephone calls.  Keep up with Twitter, 140-character quick bits, or friend messages, quick text messages on your iPhone.  Forget about so-called instant messages; they’re too slow, and emails… obsolete!

Have we as a society lost our minds?

There’s an old, old saying – Act in haste; repent at leisure – and I have the feeling that almost no one has heard it or remembers it. We’re inundated with instant information and pressured to act and decide instantly.  The worst of it is that, because there’s so much instant communication and information, people often take longer and longer to get around to working on projects and doing actual work, because they have to deal with all the instant information first. That means more and more decisions and actions are taken with less and less forethought, because there’s less and less time to actually consider them, and almost everything becomes an instant decision.

For example, when the liquidators took over Borders, they didn’t have “enough time” to consider selling blocks of leases to other bookstores and chains, or to sell book stock in lots.  In the end, I suspect, they raised far less cash than if they’d taken a bit more time to plan things out.

My son and I tried to buy a bathing suit for his daughter, because she’d inadvertently left hers behind.  This was the first weekend in August – still summer, one might think.  We had to try four stores before we could find any bathing suits at all – in the suburbs of Washington, D.C., where the temperature stays above eighty degrees until October.  Why?  Because instant automated decisions insist that the summer buying season is over in mid-July.

Programmed computer trades, made in nanoseconds, have transformed the stock market from a marketplace where fundamentals and logic had a role into a largely “technical” market based on using algorithms to make quick profits, but the result is an extremely volatile market, and one in which the risks of catastrophic losses and meltdowns become more and more probable, even when the underlying fundamentals of many securities are sound.  What’s happening is that the instant information drags the entire market up or down almost in lockstep, regardless of the differentials in values of various stocks.  So “hot” stocks with little behind them behave in much the same way as issues with solid fundamentals. That has turned the market into even more of a casino than it was. We’ve already had one “flash crash” in the market, and I’d be astonished if we don’t have another.

The instant emphasis pervades everything, it seems, even when there’s a question as to whether it makes sense, but, after all, “instant” is so much better.

 

Dead or Alive?

No… I’m not going to write about “wanted” posters, but about the awareness of being alive.  What sparked this was a New York Times article about how Grand Prix racing had gone from a sport that killed drivers every year on a predictable basis to one that seldom sees fatalities, thanks to the improved safety technology incorporated in the race cars… and how its public profile has dropped in the American media.

Every so often my wife and I may glance at a story or an ad or something that depicts so-called extreme sports.  Almost invariably, even when she says not a word, I know what she’s thinking.  She can’t understand why anyone would engage in something that dangerous, and she thinks they’re idiots for doing so.

My attitude is a bit different. Not only do I think they’re foolish, but I tend to feel sorry for them. Anyone who can only feel alive when risking death and annihilation, or who can only find a thrill or meaning in life in such circumstances, most likely isn’t truly alive most of the time anyway.  Many of those individuals, interestingly enough, claim that the rest of us aren’t truly alive because we don’t understand what it is to be alive in the face of danger.

Obviously, we’re all different, and I’d like to think that it shouldn’t take the imminent threat of death to feel alive.  What bothers both of us even more than that, though, is the apparently growing popularity of such “sports”… where, like the crowds in the Roman Coliseum or the Circus Maximus, everyone roars when there’s a death or a crash.  But then, some Republicans roar when a governor boasts about the executions in his state. I’m all too aware that life can be fragile, and that no one so far has managed to get out of it alive, but I find it a sad commentary on humanity that bystanders and voyeurs can get a thrill or pleasure out of death and destruction.

Oh… I know that tendency has been around throughout history, and that less than two generations ago in parts of the United States, lynching was a spectator sport.  I’m also more than casually aware that death is, sooner or later, potentially all too close to most military personnel… but shouldn’t death be regarded as a reluctant necessity rather than greeted with excitement or treated as entertainment?

And what does it say about us as a culture that the more violent forms of “entertainment” seem to be the most popular?

Dead or alive…?

 

The Same Book? [And Lots of Spoilers]

For at least several years, I’ve been puzzled by the handful of readers/reviewers who insist I write “the same book” over and over.  My first reaction was that they weren’t reading all of what I wrote… but several of these reader reviewers have clearly read much of what I write.  So my latest reaction tends to be, “If you find what I write so objectionable in its repetition, why do you keep reading my work and repeating your objections?”  If you don’t like it… then don’t read it.  I understand that my work doesn’t appeal to everyone.  No author’s work does.

But perhaps they feel so strongly that they’re compelled to try to persuade others that my work is “repetitious” or the like?  Why?  What’s the point?  I’ll admit that there are books and series that I feel the same way about… but I don’t spend time and ink trying to make that point to those who love those books and series.  If their followers enjoy them, then that’s their pleasure.

This “sameness” criticism has been applied especially to the Recluce Saga, and since several amateur reviewers [who consider themselves superior] continue harping on it, I thought I’d take a more analytical look at the saga and see if I could identify persistent areas of “sameness/repetition.”

One charge is that I always write about young people trying to find their way, yet out of the 16 books in the Recluce Saga, only four deal with protagonists younger than 20 [six, if you count the second books in the cases of Lerris and Cerryl], and those young people come from very different backgrounds, ranging from being an orphan to being the son of a ruler.  In six of the sixteen books, the protagonists are well-established in their occupations and all over 30. Do they all then go from rags to riches?  In only three cases in all the Saga do the protagonists become absolute rulers – Cerryl, Lorn, and Saryn.  While Cerryl does move from “nothing” to high wizard, Lorn is the son of the fourth most powerful man in Cyad, and takes two books and much effort to reach the top spot. Saryn begins as number two in Westwind and ends up as number one in Lornth. Creslin starts out as the son of a ruler and ends up as one of five members of the ruling council, in roughly the same place after a great deal of trial and tribulation.  Kharl is a prosperous cooper who loses everything and finally manages to become a modestly endowed junior member of the aristocracy.  Dorrin comes from a prosperous background, is exiled, fights, and ends up as what might be called an engineering tribune who founded Nylan. Justen begins as an engineering mage and ends up as a druid-influenced gray wizard and far from wealthy.  Rahl begins as a scrivener and ends up as the Mage-Guard advisor to the provincial governor. Nylan begins as a ship’s engineer and ends up as a gray mage in Naclos.  So… most of them did somewhat better for themselves, if at rather high costs, but not all did.

Well… maybe the books are stylistically similar.  Of the sixteen, two were written in the first-person past tense, four in the third-person present tense, and ten in the third-person past tense [which is the POV used in about 90% of all F&SF books].  That doesn’t present an overwhelming “similarity” in approach and actually differs greatly from the average.

Then does this purported sameness lie in the plot or the characters?  I’d be the first to admit that there is one definite element of similarity – the main characters all do survive and succeed to some degree, but the degree of their physical success varies considerably.  Creslin and Megaera effectively lose their entire families and end up trying to build a land on a desert isle.  Lerris ends up with no wealth, and no family except his wife.  Lorn becomes emperor, but loses his father and sister, and his remaining sister exiles herself. Justen spends his life as a wandering gray mage.  Rahl becomes a high-ranking mage-guard and does marry his love.  Kharl loses his wife and children, but eventually gains true love and a small estate.  Nylan gains nothing, except his wife and son, and loses his daughter.  Cerryl gains great power, and will spend the rest of his life looking over his shoulder.  Maybe I’m missing something, but the only similarity I see is that these characters have paid high prices for their survival and success, and those prices differ in how and when they were paid.

Heinlein once observed that there were only three plots in fiction – the success story and its opposite, the tragedy; the love story; and the story of the person who learned something.  I’ve only written one tragedy [The Forever Hero], and while many of my books incorporate love stories, I will admit that most of my books do center on people who have learned something and who have succeeded to some degree – if generally at a high personal cost.

If some reviewers claim that this is writing the same book again and again, then the same claim could be lodged against 90% of all the books ever written, because every book with a plot will have a basic sameness compared to what came before, and like pretty much every writer, I’m guilty of that sameness.

So what else is new?