Archive for the ‘General’ Category

More Musings on Morality

What is morality?  Or ethics?  The simple answer is “doing the right thing.”  But the simple answer merely substitutes one definition for another, unless one can come up with a description or definition of what “right” or “ethical” or “moral” might be.  A few days ago, a reader (and writer) asked what would seem to many to be an absurdly abhorrent question along the lines of, “If morality represents what is best for a culture or society, then isn’t what maximizes that society’s survival moral, and under those circumstances, why would a society that used death camps [like the Nazis] be immoral?”

Abhorrent as this type of question is, it raises a valid series of points.  The first question, to my way of thinking, is whether ethics [or morality] exist as an absolute or whether all ethics are relative.  As I argued in The Ethos Effect, I believe that in any given situation there is an absolute, objectively correct moral way of acting, but the problem is that in a universe filled with infinite combinations of individuals and events, one cannot aggregate those individual moral “absolutes” into a relatively simple and practical moral code or set of laws, because every situation is different.  Thus, in practice, a moral code has to be simplified and made relative to something, and relativity can be used to justify almost anything.

Granting, however, that survival on some level has moral value, can a so-called “death camp” society ever be moral?  I’d say no, for several reasons.  If survival is a moral imperative, the first issue is on what level it is imperative.  If one says individual survival is paramount, then, taken admittedly to the point of absurdity, that would in theory give the individual the right to destroy anyone or anything that might be a threat.  Under those circumstances, there is not only no morality, but no need of it, because that individual recognizes no constraints on his or her actions.  But what about group or tribal survival?  Is a tribe or country that uses ethnic cleansing or death camps being “moral” – relative to the survival of that group?

Again… I’d say no, even if I agreed with the postulate that survival trumps everything.  Tactics and practices that enhance one group’s survival through the forced elimination or reduction of others within a society – particularly when those eliminated are chosen for possessing, or failing to possess, certain genetic characteristics – are almost always likely to reduce the genetic variability of the species and thus run counter to species survival, since a limited genetic pool makes a species more vulnerable to disease and to global or environmental pressures ranging from climate change to all manner of other changes.  Furthermore, the use of “ethnic cleansing” puts an extraordinary premium on physical/military power or other forms of control, and while that control may, in effect, represent cultural/genetic “superiority” in the short run, or in a specific geographic area, it may actually be counterproductive, as it was for the Third Reich, when much of the rest of the world decided it had had enough.  Or it may result in the stagnation of the entire culture, which is also not in the interests of species survival.

The principal problem with a situation such as that created by the Third Reich and others [where so-called “ethnic cleansing” is or has been practiced] is that such a “solution” actually runs counter to species survival.  The so-called Nazi ideal was a human phenotype of a very narrow physical range, and the admitted goal was to reduce or eliminate all other types as “inferior.”  While there’s almost universal agreement that all other types of human beings were not inferior, even had they been so, eliminating them would have been immoral if the highest morality is in fact species survival.

Over primate/human history, various characteristics and capabilities have evolved and proved useful at different times and in differing climes.  The stocky body type and small-group culture of the Neanderthals proved well-suited to glacial times, but did not survive massive climate shift.  For various reasons, other human types also did not survive.  As a side note, the Tasmanian Devil is now threatened with extinction, not by human beings, but because the genetic makeup of all existing Tasmanian Devils is so similar that all of them are susceptible to a virulent cancer – an example of what can happen when all members of a species become too alike… or “racially pure.”

Thus, at least from my point of view, if we’re talking about survival as a moral imperative, that survival has to be predicated on long-term species survival, not on individual survival or survival/superiority of one political or cultural subgroup.

 

Public Works or Public Boondoggle?

For the past several months, an almost continually simmering issue at City Council meetings here in Cedar City has been the new aquatic center.  First came the charges and countercharges over the cost overruns, and although most people eventually conceded that the additional work was necessary, there was great debate over the price tags.  Then came the continuing arguments over the operating costs, which most likely resulted in two incumbent city council members being defeated in the municipal election, and the third whose term was up not even running for re-election.  At present, revenues cover only a bit more than sixty percent of the operating costs, and all three of the newly elected councilmen declare that the center should be self-sustaining.

Right!  A survey by one of the state news organizations discovered that not a single aquatic center in all of Utah had revenues that covered its costs.  One managed to recover almost eighty percent of its annual operating costs, another only about fifty percent, and all the rest fell in between.  Why?  Because, like it or not, the people who use aquatic facilities are predominantly either families or seniors, and the majority of both have limited funds.  Increasing fees drops the number using the facility, and if fees are considered too high for the local community, total revenue drops even with higher per capita fees.  Add to that the fact that Cedar City is a rural university town located in the county with the lowest family income in the state, and the potential for raising fees is pretty limited.

This debate raises the eternal question about publicly funded projects.  Which are justified and which are boondoggles?  Comparatively, very few people seem to complain about public park budgets, for which no out-of-pocket fees are ever collected, but many would say that’s because they’re open to everyone.  Open, yes, but I have to say that although we have good parks here, and I’m for them, and for my tax money being used for them, I’ve set foot in them only twice in the eighteen years I’ve lived here.  I’m for them, and for the aquatic center, because they make the community a better place.  I’m also for them because I’ve lived all over the USA, and I can see that the tax levels here are low, most probably too low, and the local politicians certainly aren’t spendthrifts with the public money.  Sometimes, though, they’re idiots.

Cedar City is home to the Utah Shakespeare Festival, a good regional theatre [it won a Tony some ten years ago as one of the best regional theatres in the United States] based largely on the campus of Southern Utah University.  Founded some fifty years ago, it’s grown from a three-day event to almost a half-year full repertory theatre.  The university, however, has also grown enormously over the past two decades, from around 3,000 students to over 8,000, and there’s really not enough theatre space for both the University theatre, dance, and music programs and the Festival.  The Festival professionals have recognized this, and for years have been working on an expansion plan that would make the Festival far less dependent on university facilities.  In order to obtain some state and foundation funding, the Festival requested a grant of two million dollars from the local RDA, controlled by the city council, in order to demonstrate the required local support.  Several council members objected, and the entire $20 million plus expansion project was threatened before reason finally prevailed.

Was that $2 million a boondoggle?  Scarcely.  Economic studies have shown that the Festival generates between thirty-five and forty million dollars annually for local businesses, and provided a great economic cushion for the town some thirty years ago when the iron mines closed, and that’s been with minimal economic support from the town. For fifty years the town has benefited from the University’s support of the Festival.  Yet the decreasing percentage level of state support for the University [and any higher education institution in Utah] and the need to raise student tuition to compensate has placed the University in a position where it can no longer be so generous to the Festival.  Despite the enormous economic benefit to the town from the Festival, some politicians would call a two million dollar grant a boondoggle.

A decade ago, local politicians decided the town needed a good local theatre, one independent of the educational institutions… and they built one that holds almost 1,000 seats, with good acoustics and modest associated convention facilities.  As a consequence, Cedar City has been able to host events from traveling operas to American Idol vocalists and everything in between.  But once again, the new councilmen are demanding that the theatre make money… despite the fact that the previous director [who was forced out by the new council] came very close to doing so.  No decent performance theatre in a town of 40,000 people can do that [a lot of Broadway theatres can’t, and they charge exorbitant rates, which isn’t possible here].  But what that “borderline” economic performance doesn’t show is the thousands of people who travel to Cedar City from nearby, and sometimes not so nearby, rural areas for those shows and other events, and the hundreds of thousands, if not millions, of dollars they spend in town on those trips.  Nor does it count the food and lodging paid for by the performers [and when those performers include 100-member symphony orchestras, that’s not inconsequential].

Especially in rural areas like Iron County, whether a town or small city prospers or withers depends not just on low taxes, but also on the quality of life, and often a “good” quality of life can generate enormous economic benefits, which tend to flow back in tax and other indirect revenue sources.  Past management of the quality of life has led to Cedar City being named as an outstanding community for both families and retirees, but with the recent rise of Tea Party type politicians, there’s been a cry for lower taxes and spending, despite the fact that they’re already too low.  There’s a huge difference between managing public facilities well and concentrating on profit-loss figures from single facilities or projects as an indication of their community usefulness and “profitability.”

Yes… there are many public boondoggles, and I’ve seen all too many of them, but just because a public facility or expenditure doesn’t cover its operating costs directly doesn’t mean it’s a boondoggle… or that the town isn’t “profiting.”   And that’s something too many people and politicians fail to understand.

 

The Hidden Costs of Transportation

A number of family members visited us over the holidays, and I ended up having to ship gifts, ski clothes, etc., back to them.  Some of them stayed almost a week, which we appreciated, because we live great distances from them, and with everyone working [which, as I’ve mentioned before, more and more often requires more and more time and effort for those who have jobs and wish to keep them], we don’t get to see them often.  Staying longer does require a few more clothes, especially in the case of small children, even though our washing machine was running much of the time, and more clothes mean more weight.  More weight means checked suitcases… and since Southwest doesn’t fly to Cedar City, checked bags add to the cost of travel.

Then I recalled that, a little over ten years ago, a checked bag was not only free, but you could put 60 pounds of clothes and gear in it, rather than the current 50.  That ten-pound reduction doubtless reduced the strain on baggage handlers, and most probably accounted for some fuel savings – and cost savings – for the airlines.  All in all, though, these cost-saving measures for the airlines add to the cost for the traveler.  They also add to the inconvenience, since the overhead luggage bins are not adequate for all the carry-ons when a flight is full – and most are these days.  Then, too, there are the charges for seats with slightly more leg-room, and the elimination of in-flight meals in coach [often replaced with a “menu” of items whose prices are just short of exorbitant].

Airport security also adds to the time spent in travel – from an additional 30-45 minutes at small airports to more than an hour at major hubs. And time is money, meaning that the more security agents on duty [to reduce waiting] the higher the cost to government.

Then I discovered that, because December 26th was a holiday this year, all the packages we’d hoped to ship back to the various coasts on Monday had to wait until Tuesday – and one of my sons and I wasted gas and money finding that out, because the local shippers never said they were closed; their recorded telephone messages merely said they were busy and asked callers to leave a message or call back.  Now, except for the various layers of government, banks, and the stock market, most other businesses – the shippers aside – were open, obviously believing that Sunday, December 25th, was the holiday, and not Monday.

Given the “efficiency,”  “effectiveness,” and self-centeredness of government, banks, and financiers, to find shippers following their lead gave me a very disconcerted feeling… and, well… you all know what I think about government, banks, and financiers, not to mention the airline industry.

 

The Difference Between Science and Scientists

Recently, I’ve posted a few blogs dealing with various aspects of personal opinion and confirmation bias and how the combination can, to an outsider, make any individual, in certain circumstances, look like a complete idiot.  That even includes scientists, sorry to say, yet “science” as a whole has an unprecedented record of accuracy over time, regardless of what climate change deniers and creationists say.  If scientists can be as personally biased and opinionated as all the rest of us, how does “science” end up with such a long-term record of accuracy?

There’s one basic reason, and that is that the modern structure of science, if you will, requires proof, and all the proof that is submitted is subject to scrutiny and attack from all quarters.  What emerges from this often withering barrage almost always turns out – in time – to be more correct and more accurate than that which preceded it.  That’s not to say that, upon occasion, it hasn’t taken the scientific establishment time to get things right, but eventually better techniques and better thought proved that plate tectonics was correct, just as, regardless of the creationists, there’s an overwhelming body of evidence in favor of evolution, and that relativity provides a more accurate picture of the universe than did Newton, or the Ptolemaic theorists.

But there are several “problems” with the scientific method.  First, establishing more accurate knowledge, information, or theories takes time, often large amounts of resources, and a willingness to winnow through a fair amount of uncertainty along the way.  Second, it requires reliance on data and proof; mere opinion is not sufficient.  Third, it’s not as set in stone as human beings would like.  The early Greek scientists had a fair idea about the earth and the moon, but their measurements and calculations were off.  As methods, equipment, and techniques improved, so did the measurements; Newton did far better, and his methods and theories yield a high degree of accuracy for most earth-bound measurements and systems, but Einstein and his successors have provided an even more accurate explanation and more accurate measurements.  And fourth, at present, the scientific method isn’t absolutely precise in predicting the specific future results of massive interacting inputs.

That lack of absolute precision in dealing with future events often causes people to doubt science as a whole, even though its record is far better than any other predictor or prediction system.  Part of its accuracy comes from the fact that science as a structure adapts as more information becomes available, but some people regard this adoption of new data and systems as unsettling, almost as if they were saying, “If science is so good, why can’t you get it right the first time?”  An associated problem is that science is far more accurate as a descriptor than a predictor, and most people subconsciously assume that the two are the same.

Even so, one could easily adapt Churchill’s statement about democracy to science, in saying that it’s the poorest way of describing the universe and predicting how things will happen – except for any other way that’s ever been tried.  And that’s because the structure of modern science is greater than any individual scientist.

 

Lateness as a Reflection on the Pool of Self

The other Sunday, I was finishing up my morning walk/run with the crazy sweet Aussie-Saluki some two blocks from home, and the church bells rang the hour.  A few minutes later, as we passed the church, I saw cars speeding in and people hurrying into the church.  A block later, people were still hurrying to the church [not my church, since I confess to being a less than diligent congregant at another one]. Once upon a time, I was indeed a most religious young man, president of a church youth group and an acolyte at services every Sunday. Consequently, I had the chance to observe just how many people were late to services, and, frankly, late-comers were rare, extraordinarily quiet, and invariably their body posture reflected a certain discomfort. I doubt I saw as many late-comers in all the years I served as an acolyte as I saw on my walk on that single recent Sunday morning.

This observation got me to thinking, and realizing that lateness, or a lack of interest in punctuality, has become an increasing staple of our society.  When my wife produces an opera at the college, there are always between twenty and fifty attendees who come in after the first break, and that doesn’t count those who straggle in during the overture.  When we attend local concerts, the same thing is true.  More and more college professors I encounter relate tales of students who cannot seem to arrive on time, and some have had to resort to locking doors to avoid disruptions from late-comers.  My wife even got a jury notice emphasizing that, if she were picked for a jury, she needed to be punctual or she could face a stiff fine.  This morning, in the paper, there was a story about a surgeon who was late to a court appearance – and who was jailed when the judge proved less than impressed.

What exactly has happened to a society where cleanliness was next to Godliness and punctuality was a virtue?  And where even professional people who should know better don’t?

Oh… I know this is a western European-derived “virtue.”  When my wife did a singing tour of South America, no concert ever started “on time,” and in one case, the performance actually started more than an hour after the announced time because there was social jostling among the “elite” to see who could be the most fashionably late… as if to announce their power to make others wait.  And I have to confess that I tend to have an obsession with being on time because my father almost never was.

Still… what is it about being late?  Is it because, as our lives have gotten more and more crowded [often with trivia], we have trouble fitting everything in?  Is it because, with an internet/instant communications society, each of us feels more and more like the center of the universe, and our schedule takes precedence over that of others?  Is it merely a way of demonstrating personal power and/or indifference to others, or a lack of caring about the inconvenience being late can cause to others?  Is it a symptom of the growing emphasis of “self” over others?

I don’t have an answer… but I do know that I think most uncharitable thoughts about late-comers to anything, apparently oblivious or even enjoying the scene, whose lateness disrupts everyone else’s concentration and enjoyment… or even more important activities, like judicial proceedings.  And I seriously doubt I’m alone in those thoughts.

 

Accuracy Gets No Notice

The December issue of The Atlantic Monthly contains a rather interesting article [“I was wrong, and so are you”] by Daniel Klein, a conservative/libertarian, who had published an op-ed piece in the Wall Street Journal in June of 2010 arguing that, based on a study that he and another economist had earlier conducted, liberals/progressives had a far poorer grasp of basic economics than did conservatives.  Right wing and conservative groups trumpeted the results, and comments on the study were the second-highest of anything published in the Journal for the month in which it was printed. Klein’s in-box was also filled with messages suggesting that he had rigged the study.

After considering the reaction and the criticisms of the analysis of that study [which had been designed for another purpose], Klein and his co-author designed a second study specifically to evaluate the accuracy of people’s economic perceptions and to compare respondents’ political outlooks with the accuracy of their economic views on various issues.  To Klein’s surprise, the second study indicated that, all across the political spectrum of the respondents, each group was equally wrong when evaluating the accuracy of economic statements at variance with its political beliefs.  As Klein wrote, “the more a statement challenged a group’s position, the worse the group did” [in accurately evaluating the statement].

In short, in all cases, respondents were less accurate in economic judgments that conflicted with their underlying biases and views, and the greater the conflict, the lower the accuracy.  What was even more interesting was that the level of education seemed to matter very little or not at all.

To me, all this was scarcely surprising, but what was surprising was that, while scholarly reviewers found the new study sound, there was essentially no public or media reaction to the release of its results, even though Klein was very clear in declaring that the new study invalidated the results of the earlier work.  Given that the results of the second study were also at variance with Klein’s own political predilections, one might have expected at least more than polite notice of the second study.

There wasn’t. The few academic/critical reviewers who did comment essentially said, “there’s a lot of confirmation bias out there.”  The conservative/right wing types have said nothing, in contrast to their trumpeting the earlier [and incorrect] work, and there seems to be little liberal reaction either.

In short, we all want to hang on to our biases, even in the face of information to the contrary, and the more that information challenges what we believe, the more strongly we dispute it.

Is it any wonder Congress can’t get anything constructive done?

 

The “App” Society

One of my smallest granddaughters is enchanted with the “apps” on her mother’s smartphone [she can’t be enchanted with mine, because I only have a new version of an old-fashioned cellphone], and everywhere I look or read, there’s another “killer app.”  And I don’t have a problem with apps.  I do have an enormous problem with what they represent… in the deeper sense.

The other week, I was reading an article about the difference between inventors and “tweakers,” and one of the points made by the writer was that, in general, initial inventions are seldom what change society.  It’s the subsequent “tweaks” to those basic innovations that make the difference.  Bill Gates didn’t invent the personal computer, but the tweaks provided by Microsoft made it universal.  Steve Jobs was a superb tweaker and marketer, and those abilities led to the iPhone, among other commercially successful and societally accepted products, and to all the smartphone clones that are changing communications patterns in technological societies.  And, of course, killer apps are another form of tweaking.

But… as I’ve noted before, for all our emphasis on tweaking and commercialization, we’ve seen very little development and implementation of basic technological innovation in more than a half century. We still generate the vast majority, if not essentially all, of our electricity based on 1950s (or earlier) principles; aircraft and automotive propulsion systems are merely tweaked versions of systems in use more than a half century earlier, and we don’t travel any faster than in 1960 (and actual travel time is longer, given security and other problems).

In some areas, we’ve actually shelved technology that was superior in performance to currently used technology for reasons of “economic efficiency,” i.e., cheaper. That tends to remind me of the ancient Chinese and the Ptolemaic Greeks, and even the Romans, who never implemented technological advances because slaves or servants were cheaper.

Take Burt Rutan, one of the most prolific and dynamic aircraft designers of the past generation.  What I find most interesting is that, for all the technical success of his designs, few indeed have ever been produced in large numbers – and it’s not because his aircraft are particularly expensive [as aircraft go, that is].

Of course, all this raises the question of whether we’ve reached the effective limits of technology.  This issue was raised more than a century ago, when some U.S. luminaries proposed closing the patent office because there was nothing left to discover.  It certainly wasn’t so back then, but all the emphasis on tweaking and commercialization I see now raises the same question once again, if from a slightly different perspective.  Have we hit the limits of basic science and technology?  Or are we just unwilling to invest what is necessary to push science further, and will we settle for a future limited to “killer apps”?

 

Of Mice, Men, and Ethics

I hate sticky traps. But sometimes, there’s no recourse, not when the rodent hides in crannies where the cats can’t follow, and in spaces where it’s impossible to place “humane” or regular traps.  But sticky traps create another problem – and that’s what to do with a living creature that looks at you with fearful eyes.  Despite having seen the damage mice can do when uncontrolled, I still hate having to dispose of them.  But it takes days to clean and sterilize the mess even one mouse can leave… and, like other creatures that sample domestic comfort, mice that are released have this tendency to return.  So I have a simple rule with various pests – stay out of the house, and I’ll leave you alone.

In the aftermath of the rodent episode, however, I was reading a commentary by a reviewer on “ethics” and whether characters created by various authors lack ethics when they kill without showing remorse and angst, even when those they kill are people who, by any reasonable standard, are truly evil.  Since some of my characters have been charged, upon occasion, with such behavior, I couldn’t help thinking about the issue.

What it seems to me is that the issue for all too many people is either whether the “killer” feels sorry or concerned about his acts or whether the acts take place in a setting where the one doing the killing has “no choice.”  And over the years, I’ve realized that, for many, many, readers, the ones who are dispassionate or don’t feel “bad,” regardless of the impact of their actions, are generally considered as bad guys, or antiheroes at best, as in the case of Dirty Harry or others, while the good guys are the ones who reluctantly do what must be done.  If a protagonist doesn’t show reluctance… well, then he or she is either a villain, soulless, or an anti-hero without true ethics.  Part of this attitude obviously stems from a societal concern about individuals without social restraints – the sociopaths and the psychopaths – but is it truly unethical [and I’m not talking about illegal, which is an entirely different question, because all too often application of the law itself can be anything but ethical] to kill an evil person without feeling remorse?  And does such a killing make the protagonist unethical?

How can it be more “ethical” to slaughter other soldiers in a battle, other soldiers whose greatest fault may well be that they were on the “other side,” than to quietly dispose of an evil person on a city side street?  Well… one argument is that the soldiers were ordered to kill, and no one authorized the disposal of the evil individual.  By that reasoning, Nazi death camp guards were acting ethically.  Yet… we don’t want individuals taking the law into their own hands.  On the other hand, what can individuals do in such a circumstance when the law offers no protection?

These are all issues with which we as writers, and as citizens, must wrestle, but what bothers me is the idea that, for some people and some readers, the degree of ethics rests on the “feelings” of the individual who must face the decision of when to use force and to what degree.  Was I any more or any less ethical in killing the rodent vandalizing my kitchen because I felt sorry for the little beast?  It didn’t stop me from putting an end to him.  Isn’t the same true in dealing with human rodents?

And don’t tell me that people are somehow “different.”  With each passing year, research shows that almost all of the traits once cited as distinguishing humans as unique also exist in other species.  Ravens and crows, as well as the higher primates, use tools and have what theorists call a “theory of mind.”  The plain fact is that every species kills something, whether for food, self-defense, territory, or other reasons.

So… perhaps a little less emphasis is warranted on whether the feelings about the act of killing determine whether the killing is “ethical” or not.  Admittedly, those characters who show reluctance are certainly more sympathetic… but, really, should they be?  Or should they be evaluated more on the reasons for and the circumstances behind their acts?

 

Insanity – Political and Otherwise

At the end of the movie Wall Street: Money Never Sleeps, the protagonist says something like, “Insanity is doing the same thing time after time and expecting a different result.  All of us are insane at times, but what happens when more and more of us are insane at the same time?”

Recent off-year city council elections here in Cedar City reminded me of this rather forcefully.  Two of the candidates running for re-election were incumbents, and both were handily defeated – and replaced by candidates with exactly the same backgrounds, views, and general attitudes as the incumbents – and those new councilmen have absolutely no experience in municipal government.  As I noted more than a year ago, the voters of Utah did essentially the same thing in replacing the then-incumbent ultra-conservative Republican Senator with an ultra-conservative clone.  In national politics generally, the Democrats continue to reinforce their ideology and the Republicans theirs, and each party continues to do the same thing it has always done in the hope of a different result.

And that different result isn’t going to happen, because increased taxes [the Democratic view] can’t cover the annual deficit, let alone the debt; and there’s no way to cut federal programs and regulations [the Republican view] to the degree necessary to reduce massive deficits without destroying both government and the economy.  But both sides resist compromise and continue to do the same thing… and that is truly insanity, and no one is calling them on it.

From what I can see, this is exactly what’s happening politically in the United States, and perhaps elsewhere around the world as well.

Have we reached the point in society where our illusions mean more to us than the survival of our society?  Where ideological “purity” is all, and practical compromise is a dirty filthy thing not to be mentioned anywhere?

Well… certainly various forms of purity have run rampant before, such as the Nazi effort for racial purity, the endless wars/massacres over religious/ethnic/political purity, ranging from those that plagued Europe for some 500 years, to the Chinese and Russian revolutions, to Pol Pot in Cambodia, to even the Mountain Meadows massacre in Utah.  And somehow, after all the fighting was over, and the hundreds of millions of dead bodies buried or ignored, there were still two sides left, two views conflicting, if temporarily more quietly.  Protestantism and Catholicism still exist in Europe, Ireland, and the British Isles.  The Mormon Church remains predominant in Utah, but it’s far from exclusive, and non-Mormons outnumber Mormons in Salt Lake City itself. Both China and Russia have had to come to terms with capitalism, and right wing racial hate groups still exist, if in far smaller numbers, across Europe.

Perhaps… it just might be well to recall that when “ideals” ignore reality, they all too easily become illusions.  Yet, without ideals… everything is sold to the most powerful or wealthiest.  And balancing ideals with reality is also a compromise… like life.

Insanity is not only doing the same thing time and time again and expecting a different result; it’s also failing to recognize that inflexible adherence to any ideal inevitably leads to unrest, disruption, and all too often… death and destruction… all the while each set of true believers claims that everything would be fine – if only the other side would realize the error of its ways.

 

Another Take on Hypocrisy

Some ten years ago, I attended a memorial service for a woman who had died from a heart attack – the last of a series over a year or so.  The church was filled to overflowing, and everyone had wonderful things to say about her.  She was excellent technically in the position she held, and, as a single woman, she had even fostered a wayward teen girl and tried to set her – and her daughter – on the path to a more productive life.  She worked hard and long at her job, and she was helpful to her colleagues. But she had one fault. She wasn’t averse to pointing out when she was given a stupid or non-productive assignment, and, worse, she was almost invariably accurate in her assessments.

The result?  Her superiors piled more and more work on her while effectively cutting her pay and status, and because she was in her late fifties or early sixties trying to support herself and two others, she had little choice but to keep working.  For whatever reason, the one colleague with whom she worked well had her job abolished – only to have it reinstated a year or so later and filled by a man [who didn’t last all that long, either].  Employees in other departments who tried to be advocates for her were either ignored or told that it was none of their business… and, besides, she brought it on herself because of her sharp tongue. After her first heart attack, as soon as she could, she went back to work because her position wasn’t covered by short-term disability insurance, and she was too young for Social Security.  She died, of course, some months later, after she’d lost her house and was living in a trailer.

Just another sad story, another one of the countless tales of people who have run afoul of adversity after adversity. Except… a goodly portion of those people who had offered tributes at her memorial service were the very people who had effectively undercut her and driven her to her death.

They praised her talents, but hated her honesty.  They praised her charity toward others, while practicing little toward her.  And, in the end, after the memorial service was over, she was quietly forgotten, and the once-wayward teen moved out of town, and life went on for the men who had driven an honest, if acerbic, woman to death.

Why do I remember these events?  Because, in reflecting on one woman’s death, I see them played out on a larger and larger scale, day after day, as the voices of honesty and reason are drowned in a sea of rhetoric, often quietly fomented by those who created so many of today’s major problems, especially the politicians and the financial community.  At the same time, no one with the power to resolve the situation wants to do so, or to acknowledge the embarrassing facts about their own part in creating the current problems… even while romanticizing the acts and deeds of deceased politicians with whom they often disagreed and paying lip service to hard-working Americans whose real wages have declined over the past decade.

But then, maybe calling the acts of the perpetrators and their subsequent rhetoric mere hypocrisy is too generous.

Tolerance and Hypocrisy

Tolerance of the unjust, the unequal, and the discriminatory is anything but a virtue; nor is fiction that brings such problems in society to light a vice.  Yet among some readers and reviewers there seems to be a dislike of work that touches upon such issues. Some have even gone so far as to suggest that such fiction, in accurately portraying patterns of intolerance, inequality, and gender discrimination, actually reinforces support of such behaviors.  Over the past few years, I’ve seen my fiction and that of other writers denigrated in reviews and comments because we’ve portrayed patterns of discrimination, whether on the basis of gender, race, ethnicity, or sexual orientation.  I certainly hope what I’ve seen are isolated incidents, but even if they are, I find them troubling, especially when readers or reviewers complain that illustrating in fiction what occurred historically, or continues to occur in present-day society, constitutes some form of discrimination, and that showing how it operates is hateful and insulting.

Discrimination is hateful, insulting, and degrading, but pretending it doesn’t exist while preaching tolerance is merely a more tasteful way of discriminating while pretending not to do so… and that’s not only a form of discrimination, but also a form of hypocrisy. It reminds me of those Victorians who exalted the noble virtues of family and morality and who avoided reading “unpleasant” books, while their “upstanding” life-style was supported at least in part by child labor, union-breaking tactics that included brutality and firearms, and sweat-shop labor in which young women were grossly underpaid.

Are such conditions better than they were a century ago?  Of course they are – in the United States and much of the developed world.  But gender and sexual discrimination still exists even here – it’s just far more subtle – and it remains rampant in much of the developing and third world.  So… for a writer to bring up such issues, whether in historical fiction or in fantasy or futuristic science fiction, is scarcely unrealistic, nor is it “preaching” anything.  To this day, Sheri S. Tepper’s The Gate to Women’s Country is often violently criticized – if seldom in “respectable” print, though often in male-oriented discussion – because it postulates a quietly feminist-dominated future society and portrays men as driven by excessive aggression and sexual conquest, yet a huge percentage of fantasy has in fact historically portrayed men almost “heroically” in exactly that light. Why the criticism of writers such as Tepper?  Might it just be that too many readers, largely male, don’t like seeing historically accurate patterns of sexual discrimination reversed?  And how much easier it is to complain about Tepper and others than to consider the past and present of our own world.

There’s an old saying about what’s sauce for the goose is sauce for the gander…

 

Helpful Technology?

A week or so ago, my trusty and ancient writing computer bit the dust, and I replaced it with a brand-new version, equipped with the latest version of Word.  After a fair amount of muttered expletives, I managed to figure out the peculiarities of the latest word processing miracle from Microsoft, or at least enough to do what I do.  Then I discovered that every time I closed the program, the new defaults for page setup and font that I’d established vanished when I opened the program.  My local techs couldn’t figure out why, but they did give me a support number for Microsoft.  The first tech was cheerful, and when we quickly established that I’d been doing all the right things, and she couldn’t figure it out either, she referred me to another tech.  In less than five minutes, he’d guided me through things and solved the problem – and it wasn’t my fault, but that of a piece of software installed by the computer manufacturer.  Word now retains my defaults, and we won’t talk about some of the other aspects of the program [since I’ve dwelt on those before].

All that brings me to the next incredible discovery – the blundering idiocy known as a grammar checker.  Unfortunately, the Microsoft people didn’t retain a wonderful feature of my old Word 7.0 – the separation of the spell-check and grammar features.  So… if I want to spell-check a document – which I do, because my typing is far from perfect – I must endure a grammar check.  Now… I wouldn’t mind an accurate grammar check, but what passes for one is an abomination for anyone who writes sentences more complex than subject-verb-object, and especially for someone who likes a certain complexity in his prose. The truly stupid program [or the programmers who wrote it] cannot distinguish between the subject of the main sentence and the subject of an embedded subordinate clause, and if one is plural and the other singular, it insists that the verb in the subordinate clause be changed to match the subject of the main sentence.

[It also doesn’t recognize the subjunctive, but even most copy-editors ignore that, so I can’t complain about that in a mere program.]  There are also a number of other less glaring glitches, but I’m not about to enumerate them all.

For me, all this isn’t a problem, although it’s truly an annoyance. But for all those students learning to write on computers it is a problem, especially since most of them have absolutely no idea about the basics of grammar, let alone about how to write correct complex sentences – and now we have a computer grammar-checking program that can only make the situation worse!

There are definitely times when “helpful” technology is anything but, and this definitely qualifies as such.

 

Good-bye?

When I returned to Cedar City after going to the World Fantasy Convention in early November, I was surprised – and appalled – to find merchants, especially our single “big-box” chain store, busy replacing the Halloween displays and immediately putting up Christmas decorations and sales promotions.  There was little space or mention given to Thanksgiving.  And I wondered if this happened to be a mere local phenomenon.  Then I went on my whirlwind tour for Scholar and discovered that the same thing was happening in all the cities I visited.  In fact, more than two weeks before Thanksgiving, I didn’t see any commercial references to Thanksgiving, only to Christmas, and in most stores and malls Christmas music was playing.  Then I read that some merchants were pressing to begin the Christmas sales at midnight on Thanksgiving Day – forcing sales personnel to stay up all night or make do with little sleep – to cram in a few more hours of sales madness, pushing “black Friday” into Thanksgiving Thursday.

Years ago, I read a short story by Fred Pohl called “Happy Birthday, Dear Jesus,” set in a future where the “Christmas season” begins in September.  I’m sure that many readers found that delightfully exaggerated back in 1956, when the story was first published, but Fred certainly anticipated a point we’ve almost reached.

To say that I find this trend disturbing would be an understatement.  Halloween and Christmas squeezing out Thanksgiving?  A Christmas buying season now beginning in October?

Yet, on reflection, it’s certainly understandable.  Thanksgiving was a holiday originally celebrated for giving thanks for having survived hard times and having attained modest prosperity.  And how many people really give thanks today?  After all, don’t we deserve all the goods and goodies we have?  Aren’t we entitled to them?  Then, too, Thanksgiving doesn’t put that much loot in the pockets of the merchants.  It’s a time for reflection and quiet celebration at home.  It requires personal time and preparation to be celebrated properly.  You just can’t go out and spend money and buy love or assuage your guilt with material gifts.  You have to consider what your blessings are, and what you’re thankful for… and reflect upon those who don’t have much for which to be thankful.

Christmas and Halloween have much in common in current American culture.  They’ve become all about the goodies – both for the consumer and the merchants… and both our son, who manages an upscale men’s fashion outlet in New York City, and my editor have made the point that the comparative success or failure of the year depends on how much they sell in the “Christmas” season.  They’re certainly not alone, and many jobs, and the earnings of many workers, depend on such sales.  Yet, the economic health of a nation depending on holiday conspicuous consumption?  That’s frightening in itself. Add to that the fact that such consumption is crowding out times of personal family reflection and an appreciation of what we do have, in favor of a frenzy devoted to what we don’t have.

Economic necessity or not… couldn’t we still reserve a small space of dedicated time for Thanksgiving between the buying and selling frenzy?

Return to the Past?

After finishing a whirlwind tour – seven cities and some of their suburbs in seven days – I’ve seen a trend I noticed years ago becoming even stronger… and more than a little disturbing.  Once upon a time, books were so expensive and hard to come by that only the very wealthy possessed more than a few, and most people had none.  Libraries were few and reserved effectively for the well-off, because few of those less than well-off could read or could manage access to them.

What does that have to do with today or my tour?

Only the fact that, despite such innovations as ebooks and e-readers, in a subtle yet substantive way we’re on a path toward the past insofar as books are concerned.  Yes, millions of books are printed, and millions are now available, or soon will be, in electronic formats, but access to those books is actually becoming more and more difficult for an increasing percentage of the population across the United States.  With the phase-out of small mall bookstores, more than 2,000 bookstores that existed thirty years ago are now gone.  While they were initially replaced by some 1,300 “big-box” bookstores, with the collapse and disappearance of Borders and consolidation by other chains, the number of chain bookstores has now dropped at least 25%, if not more, in the last few years.  Add to that the number of independent bookstores that have closed, and the total shrinkage in bookstores is dramatic.

Unhappily, there’s another aspect of this change that’s far worse.  Overwhelming numbers – over 90% – of large bookstores in the United States are situated in “destination” locations, invariably in or near wealthy areas of cities and suburbs, easily reachable only by automobile.  At the same time, funding for public and school libraries is declining drastically, and, in many cases, funds for books are slim or non-existent and have been for years.

But what about electronic books… ebooks?

To read an ebook, one needs an e-reader of some sort, or a computer.  In these economically straitened times, adults and children from less affluent backgrounds, especially those near or below the poverty level, have difficulty purchasing an e-reader, let alone ebooks. Somehow, this fact tends to be overlooked, as if access to reading were not even considered a problem for the economically disadvantaged.

In the seven cities I visited on my recent book tour, every single chain bookstore or large independent was located in or adjacent to an affluent area. Not a single major bookstore remains in the less affluent areas.  As I mentioned in a much earlier blog, this is not a new pattern, but an affluent location now appears to be almost an absolute requirement for new bookstores. Yet who can blame the bookstores? Small mall bookstores aren’t nearly so profitable as trendy clothes retailers, and most mall rents are based on the most profitable stores. Hard times in the book industry have resulted in the closure of unprofitable stores, and those stores are almost invariably located in less affluent areas. These economic realities affect the WalMart and grocery store book sections as well.  In particular, grocery retailers in less affluent areas are less likely to carry books at all.

But no matter what the reason, what the economic considerations may be, when a city and its suburbs totaling more than two million people have fewer than ten major bookstores, with only one major independent, and all of those stores are located in economically well-off areas, I can’t help but worry that we are indeed on a road to a past that we shouldn’t be revisiting.

The Comparative Species

For all our striving as a species to find clear and absolute answers to everything, from what is “right” to the deepest mysteries of the universe, at heart, human beings remain a highly comparative species.  In its best form, this compulsive comparativeness can fuel high achievement in science and technology.  Whether we like it or not, competitive comparativeness fueled the space program that landed men on the moon, the early development of the airplane, even the development of commercial and residential electrification, not to mention untold advancements in many fields.

The worst aspects of comparativeness remind me, however, of the old saying that all comparisons are odious.

In personal affairs, comparisons tend to be subjective and unfair, particularly in politics and business.  The late Richard Nixon was pilloried for taping conversations in the White House, yet White House taping had gone on in several previous administrations.  He resigned under threat of impeachment for covering up the Watergate burglaries, yet cover-ups have occurred in government for generations.  The full extent of the naval oil reserve scandals in the Harding administration didn’t come out for decades, nor did the extent of Jack Kennedy’s extensive philandering in the White House.  While both Kennedy and Nixon had grave faults, in point of fact, Nixon actually had many accomplishments as president, while Kennedy’s sole measurable achievement was averting nuclear war in the Cuban Missile Crisis, yet in popular opinion, there’s essentially no comparison.  The ballyhooed presidential debate between Kennedy and Nixon was another example of the fickleness of comparativeness.  Among those who heard the debate on radio, a significant majority felt Nixon had “won.”  Among those who watched it on television, a majority opted for Kennedy.  Same debate, same words – but physical appearance carried the day.

Likewise, study after study has shown that men who are taller receive more pay and more respect than shorter men, regardless of other qualifications, even when the shorter men are superior in ability and achievement; and, interestingly enough, in almost all U.S. presidential elections, the taller candidate has been the winner.

Another example surfaced with the recent deaths of Steve Jobs and Dennis Ritchie.  While the entire world seemed to know about Jobs, and to mourn his early and untimely death, only a comparative handful of people seemed to know about Dennis Ritchie, the pioneer who created the C programming language and co-developed the UNIX operating system, which made possible the later success of both Steve Jobs and Bill Gates. Yet Jobs’ death appeared everywhere, while Ritchie rated a small paragraph buried somewhere in the newspapers, if that.  Although Ritchie’s death was widely mentioned in technical and professional journals, it went almost unnoticed in the popular media.

In the end, the question may be: Is it that comparisons are so odious, or that the grounds on which we make those comparisons are so odious?

 

Unforeseen Results

Just before I left on this book tour [and yes, I’m writing this on the road, which I usually don’t do], I read an article on how unprepared recent college graduates and even those getting advanced degrees happen to be in terms of what one might call personal preparedness.  The article, by a professional business recruiter, stated that most graduates had little idea of even what to wear to an interview, let alone how to get one.

Then, on one of the airplane jaunts, I read about how college students are moving out of engineering and science courses because “they’re too hard,” despite the fact that the average college undergraduate studies half as much today as the average student did thirty years ago.  To complete this depressing litany, I finished up with an opinion piece by a scientist who lectures occasionally, and who cited figures showing that today’s students have trouble learning anything in science without repeated presentation of the material, because they don’t easily retain what they’ve heard in the classroom.

But to top it all off, last night I ran into an attorney who teaches part-time at a prestigious southern law school, and we got to talking after the signing at the bookstore.  What she told me was truly astounding.   She set up a class where attorneys in various fields came and discussed the actual practice of law and where the students, all in their final year of law school, were told to be prepared to ask questions and were given the time and opportunity to do so.  First off, all were buried in their laptops, and not a single one could ask a question without reference to the laptop or notebook.  Second, not a one could ask a follow-up question or one not already prepared on the computer.  Third, not a one engaged in extended eye-to-eye contact with the visiting attorneys, and fourth, not a single one asked any of the visiting attorneys for a business card, despite the fact that none of them had job offers and all would be looking for positions in six months.  Considering the fact that almost all law firms are becoming very picky about new hires and that many have actually laid off experienced attorneys, none of these law students seemed to have a clue about personal interaction or personal networking.  Oh… and almost none of them actually wore better clothes to that class.

If this is what the new, computerized interactive net-based society has to offer, we’re all in big trouble, and those of us who are approaching senior citizen status may well have to keep working a lot longer for more reasons than economic necessity.

 

No Objective Truth?

The other day, a commenter on a blog asked if I wanted to write about the growth of a belief structure in American society that essentially denies the existence of “objective truth.”  Actually, I’ve written about aspects of this before, particularly as a subset of the selective use of information to reinforce existing confirmation bias.  But I find the growing feeling that there is no objective truth – or that scientifically confirmed “facts” remain a matter of opinion, and that everyone’s opinion is equal – to be a disturbing but almost inevitable outcome of the fragmentation of the media along lines corresponding to existing belief structures, as well as of the increasing role that the internet and social media play in the day-to-day life of most people.

The basic ground rule of any successful marketing effort is to get the target audience to identify with your product.  Usually that’s accomplished by positioning the product to appeal to biases and beliefs.  Information – which, outside of stringently peer-reviewed scientific journals, is largely no longer news or factual, objective reporting – apparently need have no more than a tangential relationship to facts or objectivity, and instead has its content manipulated to appeal to its desired target audience.  Now… this is scarcely new.  Modern yellow journalism dates back more than a century, but because the economics of journalistic production limited the number of perspectives that could be specifically pandered to, because the law did have an effect insofar as actual facts were concerned, and because there remained a certain basic integrity among at least some media outlets until comparatively recently, facts were not so routinely ignored or distorted in quite so many ways.

One of the mixed blessings of technology is that millions and millions of people in every high-tech society have access to, and the ability to use, comparatively sophisticated media techniques (particularly compared to those available even a generation ago) to spread their views and versions of the “facts” in ways that can be appealing and often compelling.  In turn, the multiplicity of ways of presenting and distorting verified facts creates the impression that such facts are not fixed, and the next step for many people is the belief that facts are only a matter of opinion… and since everyone’s opinion is valid… why then, “my” view of which fact or interpretation is correct, or can be ignored, is just as good as anyone else’s.

This “personalization of truth” leads to some rather amazing results.  For example, as the scientific consensus has become almost unanimous that, first, global warming is occurring and, second, that there is a strong anthropogenic component to that warming, the share of popular opinion agreeing with these findings has dropped by almost twenty percent.

Unfortunately, occurrences such as global warming, and mechanisms such as oncoming vehicles combined with high-volume earbuds, famines and political unrest, viruses and bacteria, and high-speed collisions, are all present in our world. Consequently, rising sea levels, violent weather changes, and fatalities due to disease among the unvaccinated, to starvation, or to failure to wear seatbelts will all take their toll, regardless of the beliefs of those who ignore the facts.

Belief is not a viable defense or preventive measure against climate change, biology, oncoming heavy objects, or other objective impingements upon subjective solipsistic unreality… no matter what or how you believe.

 

“Downers,” Stereotypes, and Literary Quality

Yesterday, when I was flying back from the World Fantasy Convention in San Diego, I read through two “best-of-the-year” anthologies… and landed in Cedar City thoroughly depressed… somewhat disgusted… and more than a little irritated.  No… I wasn’t irritated that the anthologists hadn’t picked one of my infrequent short stories.  In the first place, I know that it’s unlikely anything I write will find favor with most anthologists [although there have been exceptions].  In the second place, I hadn’t published any short fiction for the year in question.

My anger, irritation, and depression came from the same root cause.  Out of something like fifty stories, all but perhaps ten were downers.  Of the ten that weren’t, eight were neutral or bittersweet at best, and only two could be called upbeat.  Now… I don’t have anything against downbeat or depressing stories.  I don’t even have anything against their being singled out as good stories.  Certainly they were all at least better than competently written, and some were indeed superbly written.  And I definitely don’t think a story has to be upbeat to be great or “best-of-the-year,” but after many, many years of writing professionally, and even more of reading all manner of books and stories, ranging from “genre” fiction to the acclaimed classics, it’s clear to me that the excessive citation of depressing stories as classics and marks of excellence is hardly a sign of intellectual distinction, let alone impartial judgment.

All this, of course, reinforces my feelings about those critics and anthologists who seem to dismiss anything with any upbeat feel or positive aspects… or anything that isn’t “literary mainstream.”

The latest New York Times book section contains a review with an opening along the lines of “A literary novelist writing a genre novel is like an intellectual dating a porn star.”  Supposedly, reviewers who write about books should be able to transcend stereotypes, not reinforce them, but then, snobbery is often based on the enshrinement of stereotypes contrary to the snob’s world view, and all too many critics, reviewers, and even some anthologists are little more than snobs.

A good story is a good story, and a bad story is a bad story, whether it’s “literary” or genre.

More Wall Street Idiocy

I recently discovered that the cable company Hibernia Atlantic is spending $300 million to construct and lay a new transatlantic cable between London and New York [New Scientist, 1 October].  Why? In order to cut 6 milliseconds from the 65-millisecond transit time, and thereby attract more investment trading firms to use their cable.  For 6 milliseconds?  That’s apparently a comparative age when computers can execute millions of instructions in a microsecond, and London traders must think that those 6 milliseconds will make a significant difference in the prices paid and/or received.

And they may well.  Along the same lines, a broker acquaintance of mine pointed out that New York City real estate closest to the New York Stock Exchange computers commands exorbitant rents and prices for exactly the same reason… but I find the whole idea totally appalling – not so much the additional data cable as the rationale for its use. Human beings can’t process much of anything in 6 milliseconds, so the speed advantage is useful only to computers using trading algorithms.  As I’ve noted earlier, the use of programmed and computer trading has shifted the rationale behind trading to almost total reliance on technical patterns, which, in turn, has led to increased volatility.  Faster algorithmic trading can only increase that volatility and, regardless of those who deny it, can also only increase the possibility of yet another “flash crash” like that of May 2010.  And even if the new “circuit-breakers” cut in and work as designed, the results will still disrupt trading significantly and likely penalize the minority of traders without superspeed computers.

Philosophically speaking, the support for building such a cable also reinforces the existing and continually growing reliance on maximizing short-term profits and minimizing longer-term concerns, as if we didn’t already have a society that is excessively short-term. You might even call it the institutionalization of business thrill-seeking and attention deficit disorder. This millisecond counts; what happens next year isn’t my concern.  Let my kids or grandkids worry about what happens in ten or twenty years.

And one of the problems is that this culture is so institutionalized that any executive who questions it essentially destroys his or her future. All you have to do is look at what happened to those who raised such questions before the last meltdown.

Yes, the same geniuses who pioneered such great innovations as no-credentials-check mortgages, misleadingly “guaranteed” securitized mortgages, banking deregulation, fees-for-everything banking, and million-dollar bonuses for crashing the economy are now spending hundreds of millions to find yet another way to take advantage of their competitors… without a single thought for the implications and ramifications.

Isn’t the free market wonderful?


Why Don’t the Banks Get It?

Despite the various “Occupy Wall Street” and other grass-roots movements around the country, banks, bankers, and investment bankers really don’t seem to get it.  Oh, they understand that people are unhappy, but, from what I can tell, they don’t seem terribly willing to accept their own role in creating this unhappiness.

It certainly didn’t help that all the large banks ducked out of the government TARP program as soon as possible so that they wouldn’t be subject to restrictions on salaries and bonuses for top executives – bonuses that often exceeded a million dollars per executive and were sometimes far, far greater.  They all insist, usually off the record, that they feared “losing” top talent, but where would that talent go?  To other banks?

Then, after losing hundreds of billions of dollars on essentially fraudulently rated securitized mortgage assets, they took hundreds of billions of dollars in federal money – but apparently not to lend much of it, especially not to small businesses, which are traditionally the largest creators of new jobs in the country. At the same time, they continue to foreclose on real estate on a wholesale basis, even when ordered not to by judges, states, and regulators, and even in cases where refinancing was feasible for an employed homeowner.

And then… there’s the entire question of why the banks are having financial difficulties.  I’m an economist by training, and I have problems understanding this.  They’re getting money cheaply, in some cases, almost for free, because what they pay depositors is generally less than one percent, and they can obtain federal funds at an even lower rate.

Mortgages are running 4-6%, and interest on credit card debt is in the 20% range and often in excess of 25%.  Yet this vast differential between the cost of obtaining the product and the return on it apparently isn’t sufficient?
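The size of that differential is easy to illustrate. In the toy sketch below, the rates come from the article's ranges, but the balance and the specific rate points chosen within those ranges are my own made-up round numbers:

```python
# A toy illustration of the spread between what banks pay for money
# and what they charge for it. Figures are illustrative, not actual
# bank data.
def annual_spread(balance: float, lend_rate: float, fund_rate: float) -> float:
    """Gross annual interest margin on money borrowed cheap and lent dear."""
    return balance * (lend_rate - fund_rate)

# $100,000 funded by deposits paying 1%, lent out as a 5% mortgage:
mortgage_margin = annual_spread(100_000, 0.05, 0.01)   # $4,000 a year

# The same $100,000 carried as credit-card debt at 22%:
card_margin = annual_spread(100_000, 0.22, 0.01)       # roughly $21,000 a year

print(mortgage_margin, card_margin)
```

Even before fees, that is a four-point gross margin on mortgages and a twenty-one-point margin on credit-card balances – which is what makes the banks' pleas of hardship hard to credit.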

And that brings us to the latest bank fiasco.  For years, the banks – all of them – have been urging customers to “go paperless”:  check your statement electronically; don’t write checks; use your debit card instead. Then, after the federal government tried to crack down on excessive fees for late payments, overdrafts, and the like, several of the largest banks began floating the idea of a monthly fee for debit card use.  Wait a second!  Wasn’t this the banks’ idea in the first place?  Wasn’t it supposed to reduce costs?  So why are they going to charge depositors more to use their own money?

And the banks still don’t get it?  With all those brilliant, highly compensated executives?

Or don’t they care?