Archive for the ‘General’ Category

Another Darwin Award?

The other day I almost committed vehicular manslaughter.  It was anything but my fault, and I’m still fuming about it.

I was driving back from the post office, approaching a light.  The light was green, and I was in the right lane, slowing and signaling for a right turn into the rightmost lane of a four-lane street.  Just as I got around the corner, a skateboarder whizzed off a sidewalk and straight down the middle of my lane going the wrong way and directly at me. I barely managed to get into the inner lane, fortunately empty at the time, to avoid hitting him. The skateboarder was no child, but a long-bearded young man, wearing earbuds and a bemused expression, easily traveling at fifteen miles an hour plus. Had I struck the distracted skateboarder, the results would have been exceedingly painful, if not fatal, for him, and possibly financially, morally, emotionally, and legally wrenching for me. 

The young man who almost hit me head-on was traveling quickly, going the wrong way, wearing earbuds and presumably distracted, and not wearing a helmet. That combination made him a perfect candidate for the Darwin Awards [a satiric award recognizing individuals who have contributed to human evolution by selecting themselves out of the human gene pool through their own unnecessarily foolish actions], as did his apparent lack of awareness of just how dangerous his behavior was.

Looking at the statistics, this was anything but a freak occurrence. While automobile fatalities and overall pedestrian fatalities have decreased in recent years, injuries and fatalities have steadily increased among distracted walkers… and among skateboarders on streets and roads, rather than at skateboard parks. The number of pedestrians injured and killed while on cell phones has prompted several cities to propose penalties and citations for distracted walking, and many schools, universities, and other institutions have imposed restrictions on skateboards because of repeated occurrences of behavior dangerous to both skateboarders and others.

Part of this is because the electronics are clearly so addictive that their users lose touch with the everyday and seemingly mundane world around them, and part of the problem is that far too many young people have been given the message that they are the center of the world.  As a result, they don’t fully appreciate that if they walk or skateboard into the path of a 2,000-5,000 pound vehicle, they run a high probability of being immediately and painfully removed from both the real world and their personal illusory world… not to mention the fact that everyone else will also pay a high price.

But then… that lack of understanding may be why they’re candidates for the Darwin Award.

No One Wants to be a Stereotype

Almost all thinking people, and more than a few who couldn’t be considered the most pensive of individuals on the planet, bridle at the thought of being stereotyped. Stereotyping is decried, particularly by individuals in groups that are most subject to negative stereotypes, and stereotyping is considered by many as merely another form of bias or prejudice, leading to one form of discrimination or another. 

Yet stereotypes continue to persist, whether publicly acknowledged or not, and even when decried.  They persist, as I’ve noted earlier, because people believe in them.  There are two reasons for such belief: first, belief in the stereotype fulfills some personal or cultural need, and second, a significant percentage of individuals within a given group behaves in ways that suggest the stereotype has some validity.  And sometimes they do, often happily, but more often, unhappily.

We have some very dear Greek friends, who have a large and vociferously vocal family passionate in expressing their views on pretty much everything – and all of them take pride in that characteristic, insisting that it is a feature of most Greek families. I have yet to meet a shy and retiring Greek, although there must certainly be more than a few.  This is a case of fairly innocuous stereotyping, but other stereotypes can be and have been brutal and fatal, as Hitler’s “final solution” for the Jewish people of Europe demonstrated.

Yet… what if a stereotype has a basis in fact, in cold, hard statistics, if you will?  What if, for example, “white collar crime” is indeed indicative of the overwhelming prevalence of Caucasians engaging in it [which does in fact seem to be the case]?

Under these circumstances, when should we ignore the stereotype?  Go out of our way to make certain we don’t “prejudice” our actions or attitudes?  In some cases, we probably should.  I certainly shouldn’t be surprised or astounded to find a quiet Greek.  But in other cases… ignoring stereotypes can in fact be dangerous.  Walking down dark alleys in inner cities, stereotyped as dangerous, is indeed dangerous, and because it is, one might be better off heeding the stereotype.

In short, like everything else, stereotypes arise for a reason, sometimes useful, sometimes not, and sometimes very deadly.  We, as individuals, have to decide where a given stereotype fits… which requires thinking, and that, unhappily, is where most of us fail, because stereotypes are a mental shortcut, and blindly accepting or rejecting shortcuts can too often lead to unexpected and unfortunate results.

The New Monopolists

As human beings, we’re quick to react to sudden and immediate dangers, from the mythical snapping twig that suggests an approaching predator to sirens or an ominous-looking individual. Often, we react too quickly and at times totally incorrectly.  But we react… to those kinds of dangers.  We also react to perceived threats to our “rights,” not so quickly, but at times even more violently.

What we don’t react to well – and then only slowly, and usually less than perfectly – are those changes in our world that have turned perceived “good things” into indicators of danger.  And the recent Department of Justice “victory” over Apple and the major publishers on ebook pricing is just one example of this.  Now… I’m not exactly an Apple fan.  I own no Apple products whatsoever, and I think that the iPhone and its clones are harbingers of disaster [although in the interest of full disclosure, I will note that my wife does own an iPad and that I am indeed an author whose income depends very much on the health of the book market].  As I noted much earlier, the DOJ case against Apple and the publishers rested on the argument that Amazon’s dominance of the ebook market [over 90% at the time] was essentially irrelevant because Amazon was charging lower prices than those Apple and the publishers were charging under the “agency model.”  And the letter of the anti-trust laws supported DOJ, as did the courts.

The problem/danger here is the failure of Congress, the Judiciary, and the American people to recognize that “lower prices” aren’t always better, and in fact can be a symptom of great danger.  Lower prices are great, assuming that your income is stable or increasing.  But are lower prices so good if they cause the actual standard of living of the majority of Americans to decline?  Certainly, homebuilders and construction workers might well argue that the oversupply and cheap prices of existing housing were anything but good for them or the economy. What is important is the relationship between wages and prices, not just how low prices are.  If prices are down twenty percent, but your income is cut in half, you lose… maybe everything.  This tends to be overlooked in today’s economy and consumer culture.

And what is the relevance to law and the Apple decision?  Simply this – old-style monopoly was the restriction of trade to raise prices and increase corporate profits.  Under the old-style definition of monopoly [which is still the current one], lower prices are not a danger but a good thing.  The problem is that today we have a new kind of monopolist, as embodied in Amazon and Walmart.  These “new” monopolists use low prices to gain a dominant market share, and once they have that share, they use their power to force their suppliers to provide goods and services at ever lower prices, outsourcing overseas, doing whatever it takes.  This means those suppliers must cut their costs to stay in business, and that means lower wages.  It also means that manufacturing here in the United States either automates or outsources to lower-wage areas.  In the end, the new monopolist still has large-scale profits, not so high in percentage terms, but so much larger in scale that the percentage decline is acceptable.  This kind of “new” monopoly has taken over especially in consumer goods and retail industries, but it’s also appearing, if more slowly, in everything from finance to automaking… and, at the same time, Americans keep scrambling for bargains… without realizing exactly what the long-term cost of those “low prices” happens to be.

Happy shopping!

 

Standing Ovations and “Discrimination”

My wife the opera singer and university professor has been involved in pretty much all levels of public performance and of voice and opera teaching, production, and administration over more than three decades… and one of the most appalling changes she [and I as well] has noticed is the shift from the standing ovation being an infrequent occurrence after a performance to its apparently becoming almost obligatory. She is certainly not the only one in the field who has noted this. Alex Ross, the music critic for The New Yorker, made the same observation, especially in regard to Broadway plays, several years ago.

There are doubtless numerous reasons for this shift, one certainly being the aging of generations taught to believe that everyone is “wonderful,” but there are two others that likely play an equal part in this apparent decline in the ability, or the willingness, to judge quality, particularly in the arts. The first is a growing belief that, in areas of society where qualitative excellence cannot be quantified or measured “objectively,” everyone’s opinion is equal, that what one likes is always excellent, and that anyone who suggests otherwise is simply out of step.

The other contributing factor is an almost inchoate belief within current society that any judgment embodying negativity, or even a belief that competence is not excellence, is somehow “bad.”  This is evidenced implicitly by the shift in the meaning of the word “discrimination” over the past fifty years.  At one time, to show discrimination meant having the ability to distinguish between good and bad, between what was merely good, very good, or excellent.  Now, to discriminate means to show bias or prejudice, a totally negative meaning with unfavorable connotations.  At present, there does not exist a single word in the English language that conveys approvingly the idea of being able to make such judgments.  Because simple and direct words are the strongest, this lack effectively denigrates the entire concept of constructive judgment or criticism.  By the same token, critical judgment now carries the connotation, if not the denotation, of severity or negativity.

Since when is NOT giving a standing ovation a measure of negativity?  Yet it appears that audiences have come to feel that “mere” applause is not enough. 

Then again, perhaps I’ve missed it all, and standing ovations are merely the supersized version of applause, the symptom of a society that always wants more, whether it’s useful or healthy.   

Another “Elephant”?

With the outcry over the verdict in the Trayvon Martin case, rhetoric, charges, counter-charges, explanations, and refutations of explanations have appeared everywhere, including in comments on this website, but there’s one elephant in the room that has yet to be satisfactorily explained, an elephant, if you will, that lies at the heart of what occurred in Florida.  And that elephant, for once, isn’t the far right wing of the Republican party, but one that has been overlooked by those who ought to be most concerned for more than a generation.

Why do black youths commit homicides at rates four times as high as the average of all murders committed by youths? 

Typically, many answers are given, but the one most currently in favor is that poverty and single-parent homes create conditions that result in alienated youths more likely to join gangs and kill others.  But there are more than a few problems with this simplistic explanation.  First, the largest racial group among the poor still remains white; nineteen million whites fall below the poverty level for a family of four, nearly twice as many as blacks. Second, the number of single-white-mother households has been increasing over the past decade, so that single-white-mother households now outnumber single-black-mother households, as well as single-Hispanic-mother households.  During this period, youth homicide rates fell across the board, but the 2010 rate for black youth still remained nearly four times that of whites and Hispanics, despite the decline in the percentage of black children living in high-poverty neighborhoods and the increase in white and Hispanic children living in such neighborhoods.

While racial tensions remain, the vast majority of black youth killings involve young black men killing other young black men, not young black men killing whites or other minorities, and most of the other criminal offenses committed by young blacks are against or within the black community. No matter what anyone claims, this is not an interracial issue, but an intra-racial problem, almost certainly a subcultural effect which, although exacerbated by larger problems, is not primarily caused by them.

The answer isn’t likely to be that there is a greater genetic/racial predilection toward violence or “less civilization” by blacks, either, not given history, which has shown great civilizations raised by peoples of all colors, or even current events, in which it appears the greatest violence and killing at present appears to be that committed by white Islamists against other white Islamists, if of a different Islamic persuasion.

Like it or not, such statistics suggest that the reason for the high level of violence perpetrated by young black males doesn’t lie primarily in externally imposed conditions, even if those conditions – such as prejudice, bigotry, poverty, poor education, and police “profiling” – are debilitating and should continue to be addressed and improved.  Large numbers of both whites and other minorities have suffered and continue to suffer these conditions and, at least so far, their young males do not murder each other at anywhere near the rate of young black males.

Might there just be some facets of the urban black culture that contribute to this situation? Facets that cannot be remedied by outsiders, no matter how well-meaning, and well-intentioned?  Facets that outsiders risk being immediately attacked as racist for even suggesting? Facets that even notable black figures have been attacked for suggesting?  

Should you…?

The New York Times recently ran an exposé of Goldman Sachs’ venture into the commodities markets and of the firm’s purchase of a company that effectively gave Goldman control over the spot market in aluminum.  The upshot of the purchase is that the price of aluminum – that essential metal for both aircraft and soft-drink cans – has doubled, as have delivery times, and the additional cost to consumers is roughly $5 billion annually.

Now… I can see the argument for large business takeovers that benefit someone besides the company taking over, and I can see some benefits to at least someone in corporate behaviors such as those of Walmart, which has driven out thousands of local stores through lower prices and lower wages to employees. The average consumer benefits from lower prices, even if the workers get screwed, and small store owners and employees lose their jobs. And there are cases where huge financial corporations do get caught for illegal manipulations, which appears to be the case in the recent charges by the Federal Energy Regulatory Commission that J.P. Morgan illegally manipulated the price of electricity.  The Goldman aluminum case is a bit different.  Prices are up, as are delivery times, and everyone gets screwed but Goldman.  The thing is… it’s perfectly legal under existing law.

In fact, as Americans are slowly realizing, a great deal of the financial machinations that led to the financial crisis, from whose results we are hopefully finally beginning to emerge, were also perfectly legal.  It turns out that it wasn’t illegal to lend money to borrowers who could not repay those loans, at least so long as the documents weren’t fraudulent.  It wasn’t illegal to collateralize and securitize those bad investments, and it wasn’t illegal to create such a mess that the government had to bail out those institutions in order to keep the banking sector from collapsing. These are facts hammered at me by more than a few legal, financial, and mortgage types over the past several years.

All this brings up a more fundamental question.  Just because something is “legal,” does that make it right?  Should you engage in legal but unethical behavior?  All too many business types do this every day, and such behavior illustrates a basic change in American as well as other societies that has occurred gradually but inexorably over the past century or so.  Once upon a time, much of the law was merely a limited codification that outlawed what society viewed as the worst excesses of human behavior. Human social codes and behaviors also exerted a certain pressure on individuals and businesses to be more ethical.

But as laws have swelled and become more complex and prescriptive, and as those social conventions have eroded, more and more people seem to have come to believe that, if the law doesn’t forbid something, it’s all right to do it.  In turn, in the United States and elsewhere around the world, people have reacted by attempting to put their own parochial religious and moral beliefs into the law… which generates more and more conflict of the type that the Founding Fathers wished to avoid by separating church and state.

Years ago, a political science professor I studied under observed that when people used power excessively and irresponsibly, society always eventually reacted to reduce or eliminate that power.  We’re beginning to see that reaction… and the result is most likely to be something that few of us will enjoy. All because we’ve decided, as a society, that if it’s not illegal, we can do it… whether we really should or not.

 

Confrontation

In a recent column, Bill O’Reilly made the observation that while Trayvon Martin’s death was a tragedy, it was also an example of the dangers of confrontation, in that Zimmerman was told by the 911 dispatcher not to follow Martin and not to confront him.  While we don’t know exactly what happened between the two, what we do know is that Zimmerman did not avoid Martin, nor Martin Zimmerman… and the result was fatal. O’Reilly went on to point out that he often has to back away from confronting stupidity or error, simply because doing so would be far too dangerous, either physically or legally.

On the surface, this is simple wisdom. Don’t get into confrontational situations, because they can escalate into dangerous or even potentially fatal incidents… or result in huge lawsuits, if not both.  But the truer that advice may be, the more it suggests how violent and/or litigation-happy our society has become… as well as how intransigent all too many people have become. I’ve seen and experienced the absolute arrogance displayed by all manner of Americans, from anti-abortionists and gun-rights absolutists, to militant feminists who declare that every act of heterosexual intercourse is an act of rape, to minority youth whose speech and attitudes show no understanding of or respect for anyone not clearly able to flatten them, and that range of arrogance and intransigence also includes professors and politicians, rednecks, students and professionals… and a whole lot of others.

A great deal of this I attribute to a society-wide attitude that anyone has the right to do anything in public short of actual physical violence to another [and sometimes even that] and to say anything to anyone, regardless of how hurtful, how hateful, or how anger-provoking it may be.  Or, for that matter, how disruptive it may be.  Hate speech may be a “right,” but it’s neither ethical nor wise. Allowing screaming children to run through the supermarket is not only unpleasant but disruptive, and can be dangerous… yet you risk physical damage if you suggest curtailing either the hate speech or someone’s unruly offspring… and that’s just the beginning.

Now… I’m scarcely arguing for confrontation, because I’m not, but whatever happened to such things as moderate behavior, in both expressing an opinion and in reacting to it?

Violent confrontation shouldn’t be socially acceptable, and neither should unruly, anti-social, or disruptive public conduct.

The Reason Why…?

I just read an online review of Princeps in which the reviewer declared that he was wrong about my motivations in writing the subseries in The Imager Portfolio that begins with Scholar.  Apparently, the reviewer had originally thought I was fighting off stagnation with Rhennthyl, but didn’t want to abandon the series.  The reviewer’s second thought was that I’d created such an enormous back-story that I just didn’t want to abandon all that work.

If this reviewer had just looked at any of my fantasy series, or even some of my science fiction, he or she might just have realized that I like to write a sweep of history… and that even in my stand-alone books, history plays a large part.  But no… the reviewer has to imply that, if I “abandon” a subseries after three books, I must be fighting stagnation or dying to use all my back-story material. What about looking at where Rhennthyl is in his life?  He’s surmounted all his dramatic enemies, and now, for the remainder of his life, he has to be essentially a high-level Imager bureaucrat [unless I choose to write totally unrealistic books] and a teacher, both of which are vital to the future of Solidar, but not generally the stuff of dramatic adventure. Or what about looking at what I’ve written or how… or even asking? As long-time readers know, I have never written more than three books about a given set of characters; the five books about Quaeryt and Vaelora are the sole exception.

All this points out the danger of ascribing motives to writers, and of not doing a certain amount of “homework” before writing a review.  Such ignorant arrogance is also the mark of either laziness or incompetence… or total amateurism, if not all three.  But it’s also symptomatic of all too much criticism and commentary that pervades the world-wide web, in that all too many “writers” or “critics” believe that all it takes to be either is a pedestrian command of language, a computer, and a little knowledge.  And, after all, all opinions are equally valid.

But they’re not, except in the mind of the opinion-giver. Everyone has an equal right to an opinion, but that right has little to do with accuracy… or understanding.

Now… I’m certainly not the only writer to be “blessed” with this sort of condescending “analysis.”  Almost any writer who has published for any length of time has received similar comments and reviews.  While I often wince at so-called factual reviews, which suggest flaws in style or in content (often non-existent, in my opinion), those reviews at least deal with the words on the page… rather than gratuitously attempting to ascribe motives to the author. The same is true of critiques of style, pacing, etc., all of which deal with what has been written, rather than motivational analysis.

So… for all of you critics and would-be critics out there… stick to what we wrote.  You can even suggest what we didn’t write and should have.  Leave the psychoanalysis to our wives, husbands, partners, or shrinks.  That way, you have better odds of being closer to accurate.

Education and the “Administrative” Model

A question occurred to me the other day: why is it that, in some organizations, such as colleges and universities, once one becomes an administrator, salaries go way up and real accountability appears to go down?  Even as a tenured full professor, my wife has to fill out an annual report on what she has accomplished, and how, and then face post-tenure review every few years.  I can’t see that any administrator faces that kind of scrutiny.

Now… I suppose that wouldn’t be so bad if I could only figure out what all those administrators do.  Over the time that she’s been at the university, the student body has essentially tripled, while the faculty, including adjuncts [as full-time equivalents], has grown only a little more than fifty percent, yet the number of administrative positions has tripled, including more deans and vice-presidents. Despite all these new administrators, the administrative requirements placed on full-time faculty have continued to increase.  The salaries for clerical staff and faculty have not, on average, kept pace with inflation, but administrative salaries have soared. Although the university president’s salary has more than doubled, as I noted in a previous blog, the Board of Regents wants to increase it by more than 13% this year, while holding faculty salaries to a one percent increase and essentially negating even that through increases in the health care costs paid by faculty and a 50%-100% increase in the health co-pay.

I tend to find this whole thing disconcerting, because the faculty members are the ones doing the teaching [and at this university, teaching, not research, is what they’re paid for], while the administrators do… well… I have yet to figure out what about half of them do, except create more work for faculty by demanding more information and more reports, and by implementing new systems that are more often than not worse and more time-consuming, at least for faculty, than the previous systems. I’m certain I’m misguided in this modern age, but I was under the impression that administrative systems are supposed to support the business at hand, not hamper it.

While I have a number of problems with professional athletics, in that field there’s at least some recognition that you can’t field a team or win games without paying players what they’re worth [if sometimes way more than they’re worth].  In education, again, at least at state universities, the big salaries seem to go to administrators – and their close relatives, the business professors.  Then come the high-profile professors, whether they’re good teachers or not.  In the middle are the tenured and tenure-track professors, and near the bottom of the full-time pile are the clerical and low-level administrative aides.  At the very bottom are adjunct instructors and teaching assistants, who now comprise over 50% of the teaching faculty at most universities.

In major league sports, even the lowest paid journeymen get a living wage, and the players outnumber the administrators. Not so in academia… which just might have a bit to do with the increasing costs of higher education.

Impeach Obama?

This past weekend, I got an unsolicited telephone message with a Washington, D.C., area code and the identifier “Impeach Obama.” I didn’t answer it.  I don’t answer most unsolicited calls, especially political ones, but the “identifier” bothered me.  I’ve been involved with or close to national politics for more than forty years, and I’ve never seen this kind of extremism before, particularly the hate-mongering in the guise of “fundamental” values on the part of groups associated with the tea partiers or the Republican Party.  I certainly don’t expect people to be wildly pleased with the president if he wasn’t their choice in the first place, but there’s a difference between informed opposition, even uninformed opposition, and rabid unthinking hatred rationalized by simplistic [and factually incorrect] sound bites and prejudice.

There’s a great deal that Obama’s done with which I don’t agree, and a great deal that I think he should have done and didn’t, but I can’t think of a single major act he’s taken that isn’t similar to at least one of his predecessors, if not several.  He’s not the first president to spy on Americans in the United States; he’s certainly not the first one to attempt to address immigration issues and to try to give illegal immigrants legal status.  He hasn’t made the kind of radical changes in the position of the federal government that Franklin Roosevelt did.  His one “arms scandal” was minute compared to Reagan’s “Iran-contra” arms deals.  He isn’t the one who struck down the Defense of Marriage Act – the Supreme Court did that all on its own.  He’s been trying to close Guantanamo Bay for years, and the Congress won’t let him. He didn’t even try to repeal the Second Amendment (although the NRA would have all its members believe that); he just wanted background checks on gun purchasers and a few restrictions on certain weapons and the size of magazines. As for the Obamacare business… has anyone else even attempted to address the plight of 46 million Americans without health insurance?  If the Republicans, or others, had attempted anything that would actually have accomplished something, I might be a tad more sympathetic, but “NO!” isn’t a program or a solution to anything.

If we’re talking about political dysfunction, the most dysfunctional branch of government isn’t the Executive Branch, but the Congress.  It can’t agree with itself on anything.  But I don’t see any large political movements to throw out members of Congress, or telephone solicitations with “Impeach Congress” identifiers. 

Some state governments are almost as dysfunctional – and stupid – especially when they pass laws that attempt to override or nullify federal law.  Like it or not, the supremacy clause of the Constitution means that states cannot override federal law, and passing laws in contravention of federal law is generally counterproductive and a waste of taxpayer dollars.  Again… I don’t see any rabid reaction to such waste and stupidity there, either.

I’ve talked with more than a few of the types who support this kind of “impeachment” rhetoric, and they all come up with semi-rational reasons.  They just can’t explain why, if they feel this way, they haven’t applied the same standards to previous presidents… or, for that matter, to other politicians… except perhaps to Bill Clinton, also perceived as too liberal, who faced impeachment essentially for lying about sexual indiscretions, as if sexual indiscretion had much to do with public policy.  That stands in contrast to the lies of the Reagan administration about the Iran-Contra arms deals, about which the right wing somehow wasn’t concerned enough to push through impeachment proceedings.  The result was merely the indictment of eleven lower-level officials, all of whom either had their sentences vacated on appeal or were pardoned by the first Bush administration.

So why do apparently Republican offshoots and/or sympathizers organize clearly significant telephone solicitation campaigns to “Impeach Obama”?  I have the very uneasy feeling that it’s a political appeal based on a barely concealed form of racism, and that appeal is being made because they either (1) don’t have another even halfway reasonable set of constructive proposals with wide enough popular appeal to win the presidency, (2) can’t raise enough support for what they really believe in, or (3) just can’t stand the thought of a black president popular enough to be elected twice.

The idea of elections is that, if the majority of Americans want a change in government, they can vote for someone else.  Clearly, a majority doesn’t want that change, or at least they didn’t in the last Presidential election.  Yet whoever is behind the “Impeach Obama” campaign can’t seem to accept the results of the election.

Whatever the reason, it’s a chilling representation of a certain mindset.

Pushing Boundaries

The other day, my wife the university professor asked another of her very good questions: “Why do so many critics equate pushing boundaries with excellence?”

Why indeed?  Does more violence, more nudity and sexual content, or the detailing of the depths of human depravity have much at all to do with excellence?  Let’s face it.  Nude human bodies are similar to other nude human bodies, and death and violence have always been with human beings. So have depraved behaviors.  With the advent of HDTV, Blu-ray, and similar high-resolution video media, nudity and violence are now depicted in stunning visual detail right in the home.  As I recall, the science fiction writer Marion Zimmer Bradley (who also wrote pornography under various pseudonyms) once made an observation to the effect that writing pornographic sex was like writing about plumbing.  And, in a way, excessive sword-and-slash fantasy is like rather crude dissection.  If adults want to watch detailed plumbing and dissection, so long as it doesn’t involve children or other perversions, that’s largely their right under the First Amendment… but let’s not equate it with excellence.

At least in my mind – and historically – excellence is the concept of striving for something higher, not a depiction in greater detail of something sordid, fatal, or demeaning. And while Game of Thrones, for example, certainly has great supporters, and its visuals – at least from the trailers/ads – are stunning, I gave up on the books shortly after the first one, simply because, although Martin writes well, that skill is employed most effectively in portraying a society where there is really no such thing as excellence except in violence and betrayal.

Perhaps I’m dated, or old-fashioned, but to me, the employment of talent to portray the worst in human behavior with no counterpoint of the best in human nature is the equivalent of moral pornography, in addition to the pornography of sex and violence.  And even if the best is portrayed along with the worst, humans being humans, audiences concentrate on the worst. In addition, such graphic portrayals desensitize at least a percentage of younger viewers, a trend that is continuing in pretty much all forms of the arts: music must be louder and simpler to retain its appeal, and movies – at least the blockbusters – are simpler (as an aside, there are so few good songs in movies that the Academy Awards might as well eliminate that category) and ask less and less of the audience in terms of knowledge and understanding.  All of this is perfectly understandable from the marketing point of view.

Then again, it could be that pushing boundaries is the only thing some of these movies and mini-series have going for them… and the rest don’t even have that.

The Dependability Fallacy

In almost every bit of advice about success there’s something about the need to be dependable.  Even Woody Allen, who, for all of the craziness of his personal life, has certainly been artistically and professionally successful, once said, “Eighty percent of success is just showing up.”  In other words, be there on a dependable basis.

The only problem with this is that it isn’t totally true. From what I’ve observed, in the military, in business, in government, and in education, people who are talented and dependable are all too often viewed, particularly the longer they’ve been in an organization, as solid and… well… dull, not terribly innovative or creative.  And I can say that I’ve seen the same thing happen in the field of writing.  Time after time, I’ve watched talented and dependable people pushed aside for younger, more “brilliant” newcomers, and in a reversal of the Woody Allen percentages, I’d say that in about 80% of the cases, those new, “young,” and brilliant types managed to screw things up.  Often the work of the “dependable” individuals is actually more creative and innovative than that of those who make brilliant presentations but never actually accomplish more than a mediocre job.

There is, of course, an underlying reason why the “dependable” are so often shunted aside, minimized, or even discarded, and it’s fairly obvious, simple, and usually ignored.  All organizations have limits.  People who are talented and dependable – and responsible – understand those limits, either implicitly or explicitly. They know, for example, why a seemingly brilliant idea won’t work and, in many cases, has failed several times, each time with another charismatic individual who was convinced that force of personality would accomplish the impossible.  Once in a very great while that happens, but the benefits of that infrequent success don’t begin to cover the costs of all the unsuccessful efforts.  But no new supervisor or executive wants to be told that his or her brilliant idea won’t work, and the dependable workers are faced with an impossible situation: if you oppose the idea, you and your career are toast, and if you do your best and it fails, you’re toast.

All this, of course, also ties into the “new is better” philosophy, which is often even worse than the “we’ve always done it this way” philosophy; the latter, at the very least, works, if not so well as an incrementally better way might.  But in most organizations steady incremental improvements are overlooked in favor of a single “brilliant” one-time achievement. I’ve seen, more than a few times, a middle-management professional double or triple, or in some cases quintuple, output with the same level of resources; but because the gains came over five or ten years, they’re overlooked in favor of the professional who posts a one-year 25% increase by spending more and burning out people, so that improvements for years to come are negligible.

 Then, too, in large multi-layered organizations or institutions, those who make the decisions on raises and promotions often never really understand what goes on at lower levels and rely on summaries and aggregated statistics presented by immediate subordinates who tend not to stay in any position very long. 

This is often why the best small companies are quite successful… and then become less productive or even fail when they’re acquired by large conglomerates – because the expertise and dependability necessary for a smaller company to survive are less vital to senior executives whose success often depends more on political maneuvering than on day-in, day-out task-oriented performance.

 The result, from what I’ve observed, is that, in the majority of organizations and institutions, the higher one moves, the less dependability is valued, unless dependability is defined as being dependably loyal to those who can reward and advance one’s career.

Not Wanting to Know

Recently here in Cedar City, there have been several letters decrying the direction of the university as a “liberal arts” institution and complaining about the high cost of tuition.  My initial – and continuing – reaction has been along the lines of what planet are these idiots from? 

The university has always had a liberal arts/teaching focus, from the days of its founding over a century ago, and its tuition is so low that its out-of-state tuition and fees are lower than the in-state fees of many universities in other states.  Now, admittedly, tuition has increased more than the administration would like, entirely because the state legislature has decided to cut per-student funding while mandating enrollment increases, not only for the local university but for most of the state institutions.  Even so, considering the quality of many programs, state tuition here and elsewhere in Utah is a comparative bargain. Here, the music, art, and theatre areas have won national awards against much larger schools; the nursing program is rated as one of the best in the state and region; pre-law and pre-med students have an enviable rate of acceptance at graduate schools; and the physical education program has been so successful that it’s known as the “coaching factory.”

Unfortunately, this disregard for the facts isn’t just about college education here in Cedar City, but is symptomatic of a larger problem.  More and more, I see people ignoring the facts that conflict with what they feel and want.  It’s as if they actively avoid facts and circumstances contrary to their beliefs, as if they simply don’t want to know.  Whether it’s global warming or deficit spending, immigration, income inequality, decreased social mobility, education…or a dozen other subjects… they don’t want to know… and trying to get them to consider “contrary” facts just makes them angrier.

Part of this is an effect of civilization. Earlier in history, if you didn’t want to believe in the perils of the time – predators, floods, fire, famine, and raiders from other tribes, for example – you ended up dead.  Now that civilization has eliminated or limited the effects of those perils, and the dangers we face are more indirect and take more time to affect us, people ignore the facts about dangers.  In this regard, global warming is a good example.  I can recall predictions dating back almost twenty years suggesting that weather would get more violent with even modest rises in overall global temperatures.  Temperatures have risen; weather has become more violent; and still people debate whether global warming and its effects are real.

On a personal level, there’s an even more stark and direct example – obesity.  Excessive weight is one of the primary causes of early death and other health hazards.  There’s absolutely no question of that… and yet the United States is the most obese nation on the face of the planet… and Americans scream bloody murder when a politician suggests banning the sale of soft drinks in 32-ounce servings.  For heaven’s sake, does anyone really need a quart of carbonated beverage at one sitting?

But then, I suppose, why anyone would want that much at once is one of those facts I don’t want to know.

 

“Real” Fiction

The New York Times best-selling author Jeannette Walls was quoted in the Times this past weekend as saying, “I’m not a huge fan of experimental fiction, fantasy or so-called escapist literature. Reality is just so interesting, why would you want to escape it?”  This kind of statement represents the kind of blindness that is all too typical of all too many “mainstream” writers and critics.

In fact, the best science fiction, fantasy, and other “escapist” literature puts a reader, in a real sense, “outside” the framework of current society and reality in a way that allows a perceptive individual to see beyond the confines of accepted views and cultural norms. Some readers will see this, and some will not.  As a simple but valid example, take my own book, The Magic of Recluce, in which the “good guys” are ostensibly and initially portrayed as the “blacks.”  In western European-derived cultures, and in the United States in particular, there is an equation of the color white with purity and goodness, as demonstrated by all too many westerns, where the good guys wear white Stetsons and the bad guys crumpled black hats.  But this is far from a universal norm.  In many cultures, white is the color of death, and other cultures use other colors for purity.  My very deliberate inversion of this western color “norm” was designed to get readers to think a bit about that… and then, when they’d thought a while, I started writing other Recluce books from the “white” perspective, in an attempt to show the semi-idiocy of arbitrarily ascribing “color-values” to people or societies, or values to colors themselves.

I’m far from the only F&SF writer to use the genres to explore such themes or to question values or concepts, and I could list a number of writers who do.  So could most perceptive readers of F&SF.  This fact tends to get lost because fiction is for entertainment: if we as writers fail to entertain, we don’t remain successful professional writers for very long, and, frankly, if we’re extremely successful at entertaining, we tend not to be taken seriously on other levels. Stephen King, for example, is technically a far, far better writer than he is given credit for, largely because of the subjects about which he writes, not because he writes poorly – which he does not.  Only recently has there been much recognition of this fact.

Even with critics within the F&SF genre, there’s a certain dismissal of writers who are “commercially” successful as writers of “mere” popular escapism, as though anything that is popular cannot be good.  Under those criteria, Shakespeare cannot possibly be good or have any depth.  For heaven’s sake, the man wrote about sprites and monsters, faery queens, sorcerers and witches, along with battles, kings, ghosts, and ungrateful children.

Good is good; popular is popular; and popular can be anything from technically awful to outstanding, although I’d be among the first to admit that works that are both good and popular are far rarer than those that are popular and technically weak or flawed.  And the same holds for so-called escapist fiction, no matter what the mainstream “purists” assert.

Then too, the fact is that all fiction, genre or mainstream, is “escapist.”  The only question is how far the author is taking you… and for what reasons.

Thoughts on Self-Sabotage

Over the years, both my wife and I have encountered quite a number of individuals who had the ability and skills to succeed, and who then proceeded to commit self-sabotage, often when they were on the brink of accomplishing something they said was important to them. Another instance just occurred, and without going into details, the individual in question suddenly stopped going to two required senior level classes, while attending other classes… and getting good grades in those.  Despite promises to do better, that individual ended up flunking both courses… and being unable to graduate for at least another semester.

It’s easier to understand why people fail if their reach exceeds their abilities, or if accidents or family tragedies occur, or if they become addicted to drugs, or suffer PTSD from combat or violent abuse, or if they suffer from depression or bipolarity, but it’s hard to understand why seemingly well-adjusted people literally throw their future, or even a meaningful life, away.  Some of that may be, of course, that they’re not so well-adjusted as their façade indicates, but I have a nagging suspicion that in at least a few instances, there’s another factor in play.

What might that be?  The realization that what they mistakenly thought was the end of something was just the beginning.  For example, far too many college students have the idea that college is an ordeal to be endured before getting a “real” job that has little to do with what was required in college.  In my wife’s field, and in many others, however, what is required in college is indeed only the beginning, and the demands of the profession increase the longer you’re in it… and some students suddenly realize that what is being asked of them is only the beginning… and they’re overwhelmed.

The same can be true of a promotion. The next step up in any organization usually involves more pay, but today, often the pay increase is minimal compared to the increased workload and responsibilities… and, again, some people don’t want to admit, either to themselves or to others, that they don’t want to work that hard or handle that much responsibility.  So the “easy” way out is self-sabotage… and often blaming others for what happens.

This certainly isn’t the only explanation for self-sabotage, but it does fit the pattern of too many cases I’ve observed over the years… and it also seems to me that cases of self-sabotage are increasing, but then, maybe I’ve just become more aware of them…or maybe the “rewards” for advancement, degrees, etc., just aren’t what they used to be… at least in the perception of some people.

American Politics – Power Now?

In past blogs, I’ve discussed the insidious and potentially deadly long-term effects of the “now” mentality, particularly on American business, and how the emphasis on immediate profits, immediate dividends, or immediate increases in stock prices, if not all three, has had a devastating effect not only on the economy, but all across the society of the United States.  There is another area of American society where the “now” culture has had an even more negative and more immediate effect – American politics and government.

Years and years ago, one of my political mentors made the observation that, in running a campaign, you had to give the voters a good reason to vote for a candidate.  Back then, that reason was tacitly assumed to be, except in certain parts of the south, positive.  Today, if one surveys political ads, campaign promises, and the like, that reason is overwhelmingly negative.  Vote for [Your Candidate] because he or she will oppose more federal government, more spending, more gun controls.  Or conversely, vote for [Your Candidate] because he or she will oppose cutting programs necessary for children, the poor, the disadvantaged, the farmer, the environment, etc. 

The synergy between the “now” culture and the ever more predominant tendency of American voters to vote their negative preferences is an overlooked and very strong contributor to the deadlock in American politics. People want what they want, and they want it now… and they don’t want to pay for it now, despite the fact that anything government does has to be paid for in some fashion: by taxes, deficits, inflation, or decreases in existing programs in order to maintain other existing programs.

 In addition, as a number of U.S. Representatives and Senators have discovered over the past few elections, voters no longer reward members of Congress for positive achievements.  They primarily [pun intended] vote to punish incumbents for anything they dislike.  So a member of Congress, such as former Senator Bob Bennett of Utah, can vote for 95% or more of what the Republicans in Utah want and make two or three votes they don’t like, and be denied renomination. At a time when federal programs are vastly underfunded, the combination of voter desires not to lose any federal benefits/programs, not to pay in taxes what is necessary to support those programs, and to punish any member of Congress who attempts to resolve those problems in a politically feasible way, such as working out a compromise, results in continual deadlock.

Then, add to that the fact that politicians want to be re-elected, that over 90% of all Congressional districts are essentially dominated by one political party, and that thirty-one of the states have both Senators from the same political party, and that means that the overwhelming majority of members of Congress cannot vote against the dictates of their local party activists on almost any major issue without risking not being renominated or re-elected. 

Yet everyone decries Congress, when Congress is in fact more representative of American culture than ever before.  We, as a society, want, right now, more than we’re willing to pay for.  Likewise, our representatives don’t want to pay for trying to fix things because they want to keep their jobs, right now, regardless of the future consequences.  But it’s so much easier to blame that guy or gal in Washington than the face in the mirror.

The Week’s Market “Crash”

Taken together, the drop in the various market indices on Wednesday and Thursday appears to be the largest two-day decline in almost a year and a half.  And what supposedly triggered the sell-off?  The fact that the Federal Reserve indicated that it just might stop buying something like $85 billion in bonds every month.  Duh!

Believing that the continuing purchase of such bonds, otherwise known by the euphemism of “quantitative easing” (or QE), would or could go on forever makes belief in the tooth fairy, the Wizard of Oz, and moderation by the Taliban look sensible by comparison. The financial “wizards” of Wall Street, including highly paid hedge fund managers, program traders, and various other supposed financial icons, had to know that such a program had to end or be throttled back.  So why, if they knew this, did they go into a panic?

Because they were using the stimulus of QE to run up stocks in the short run to bolster their own bottom lines – and bonuses – and didn’t believe that Chairman Bernanke would signal an end to the artificial bull market so quickly.  Ah yes, and these are the geniuses who are among the most highly paid executives and professionals in the United States.  They’re also the ones who created the mess of the Great Recession… and they’re at it again.

Despite Dodd-Frank, there is still little oversight of these self-proclaimed experts, and no really significant reform of either banking or investment banking. And Congress continues to tie itself in knots over anything requiring real oversight or reform – witness the scuttling of the legislation that made a very modest attempt at reforming farm subsidies… or the continued hassles over fixing a broken and essentially non-functional immigration system… and we won’t mention, except in passing, the fact that despite overwhelming public support for requiring background checks of firearms purchasers, that, too, never happened.

Just how bad will things have to get before Americans start electing politicians who are more interested in solving problems than getting elected?  I don’t know… but I’m definitely not holding my breath.

On Your Own Terms

There’s a scene in the movie Citizen Kane where Jedediah Leland tells Charles Foster Kane that Kane only wants “love on your own terms.”  It’s a great scene, and true as well as prophetic in a far larger context.

There’s no doubt that, throughout history, human beings have always wanted love, and pretty much everything else, on their own terms.  For most of human history, however, almost no one could get much of anything on their own terms, and this is still true in many parts of the world. If it didn’t rain, people were lucky to get anything to eat, let alone a Big Mac or Chateaubriand with béarnaise.  Even in the reign of Louis XIV, the “Sun King” of France and the most powerful ruler in Europe, there were times when beverages froze on the table at Versailles.

But, with the rise of more advanced technology we’ve become more and more able to get things previously unobtainable, from fresh fruits and vegetables out of season where we live to instant communications pretty much anywhere on earth.  Particularly in the United States, as a society we want everything on our own terms.  We want cheap and abundant electricity.  We want inexpensive clothing.  We want easily affordable personal transportation at our beck and call.  We want the best health care possible, and we’re getting angry that its cost is rising.  The list of what we want and can get on our own terms – or close to them – is large and growing… for the moment.

The problem with all this is that, over time, we don’t dictate the terms:  the physical condition of the world and the underlying laws of the universe do.

The current “we can have it all” answer of so-called responsible environmentalists is natural gas, because it emits roughly half the greenhouse gases of coal as well as far fewer other pollutants.  There are more than a few problems with this “solution,” the first of which is that the numbers backing the “replacement” of coal with natural gas don’t take into account the additional, and far higher than publicized, environmental costs. A number of recent studies show that from 3% to 15% of existing natural gas wells are leaking methane.  A NOAA study of one gas field in eastern Utah found that leaks amounted to 9% of the amount of gas produced.  A Cornell University study found similar leakage rates of nine percent on a national basis. Studies of gas drilling have shown leakage rates of up to 17% in some basins.  One Canadian study indicated that more than half of the typical horizontal gas fracking wells showed leakage after several years.  While high-pressure fracking wells are still in the minority in numbers, they have high initial production rates, and even a small percentage of leakage can result in a significant quantity of methane emissions. Given that there are some 500,000 natural gas wells in the United States alone, if even 3% are leaking, that’s 15,000 wells oozing or spewing methane into the atmosphere; and given that methane is a greenhouse gas roughly 100 times more potent than CO2 when initially released, and 25 times more potent even measured over a hundred years, that’s a serious problem. And that doesn’t include the tens of thousands of Canadian wells. Even EPA studies show that leakage rates above eight percent negate any benefits from converting from coal and in fact may even accelerate global warming.  Natural gas doesn’t just leak from wells, either.  Testimony before the Massachusetts state legislature this year cited 20,000 known natural gas leaks in the state, and the U.S. Energy Information Administration estimates that more than 8 billion cubic meters of natural gas are lost each year in leakage.
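For readers who want to check the arithmetic, here is a minimal back-of-envelope sketch in Python using only the figures cited above (the approximate well count, the 3%–15% leakage range, the NOAA Utah field’s 9%, and the roughly 8% break-even rate from the EPA studies). The script and its variable names are purely illustrative, not drawn from any of the studies mentioned.

```python
# Back-of-envelope check of the methane-leakage figures cited in the text.
# All inputs are the article's own numbers; nothing here is a measurement.

US_WELLS = 500_000            # approximate U.S. natural gas well count
LEAK_FRACTION_LOW = 0.03      # low end of the cited studies: 3% of wells
LEAK_FRACTION_HIGH = 0.15     # high end: 15% of wells
UTAH_FIELD_LEAK_RATE = 0.09   # NOAA eastern Utah field: 9% of production
BREAKEVEN_LEAK_RATE = 0.08    # ~8%: EPA-cited rate above which switching
                              # from coal to gas yields no climate benefit

# How many wells are leaking under the low and high estimates?
leaking_low = US_WELLS * LEAK_FRACTION_LOW
leaking_high = US_WELLS * LEAK_FRACTION_HIGH
print(f"Leaking wells: {leaking_low:,.0f} to {leaking_high:,.0f}")

# The Utah field already exceeds the break-even leakage rate, meaning gas
# from that field is worse for the climate than the coal it replaces.
print("Utah field past break-even:", UTAH_FIELD_LEAK_RATE > BREAKEVEN_LEAK_RATE)
```

Run as written, this confirms the text’s figure of 15,000 leaking wells at the 3% rate (and 75,000 at 15%), and shows that the 9% Utah field sits above the 8% break-even threshold.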

I’m not against natural gas.  In fact, I’d much rather have natural gas generating my power and heating my home than any of the conventional alternatives.  BUT… unless the drilling companies and the gas power industry are willing to spend a lot more money and other resources in cleaning up production and transmission systems, there’s not going to be any environmental improvement.  In fact, present practices could make matters worse.  Of course, coal is still cheaper – unless coal-burning power plants are cleaned up to the standards of natural gas plants, in which case, the electricity won’t be any cheaper, but more expensive.  And if we don’t clean up our energy production and usage pollution, we’ll end up frying the planet that much sooner. In short, we can’t keep having cheap energy on our terms.

I could have cited different examples in different areas, but the facts and the conclusions would be similar. Over time, the universe is going to limit what we can have on our own terms… and for how long.

That’s not even a question.  The question is how long it will take us as a society to understand that point.

Stereotypes

Over the past few years the issue of stereotyping has become and remains a hot-button topic with many people, particularly those in groups subjected to the practice. The Oxford dictionary definition of the word “stereotype” is: “a widely held but fixed and oversimplified image or idea of a particular type of person or thing.”  Unfortunately, while most enlightened individuals deplore stereotyping, the fact is that even those who deplore it still engage in it, whether they realize it or not.  For example, while it is considered prejudicial to believe that any young black male in a hoodie is a gang member, or up to no good, it’s perfectly all right to call every SUV or large pick-up truck a “gas-guzzler,” even if the owner has occupational or other needs no other vehicle can meet. But both are stereotypes.

At the same time, it is useful to consider that stereotypes exist essentially for one of two reasons: (1) a significant number or percentage of the people (or vehicles, or anything else) in a group do in fact fit the stereotype… OR (2) large numbers of people believe that they do.

It’s fairly obvious that stereotyping people based on misconceptions is prejudicial, but what if there’s a basis in fact?  For example, for centuries, there was, and still is, especially in western European-derived cultures, a stereotype of Jewish men as greedy, stingy moneylenders. 

While ancient Jewish law forbade the charging of excessive interest, and charging interest at all was deplored in some texts, money-lending with interest was allowed by the Judaic faith; but by the fifth century the Roman Catholic Church had prohibited the taking of interest, and in 1311, Pope Clement V made the ban on usury absolute.  In effect, all Christians were banned from money-lending; Jews were not. Since there were more than a few bans on what Jews could do in Europe, it wasn’t exactly a surprise that the banking business was initially predominantly Jewish, and Jewish bankers remained active and prominent well into the 20th century. Thus, in point of fact, the stereotype of money-lenders as Jewish was accurate… but it’s highly doubtful that most Jewish money-lenders were anything like the stereotypes portrayed by playwrights and writers [such as Shylock in The Merchant of Venice], simply because acting that way would have been largely counterproductive at a time when Jews were facing continual persecution – not that reality has ever made much impact on prejudice.  Only a concerted effort toward change has been effective.

As for young black men in hoodies… that’s a problem, because, according to Bureau of Justice statistics, one in three black men will serve time in jail and 40% of young African-American males will spend time in some sort of confinement.  Part of that [possibly a very large part of that] is the result of a criminal justice system that prosecutes a higher percentage of minority youths than white youths and that, for the same offense, sentences black youths to longer sentences than those received by white youths, but… for whatever reason, unhappily, the stereotype applies to a significant percentage of black males… and that means, unhappily, that one needs to be at the least wary of young black men in hoodies on dark city streets.

In the end, there is not one problem with stereotyping, but two.  The first is obvious. Viewing every individual in a particular group as a stereotype of that group is both prejudicial and discriminatory.  The second problem is not as obvious, but just as real.  When any group has a large enough percentage of individuals who fit a negative stereotype, that group, and society as a whole, has a problem that needs to be addressed, and it’s almost certain that not all of that problem is purely prejudice. This problem is not just one for minorities.  Bankers, professional cyclists, the NRA, tea-partiers, American tourists abroad [“ugly Americans”], young male Muslims, college professors[ivory tower liberals], and Republicans, among others, all also need to face their stereotypes.

Irrational Economic Values

Ever since Adam Smith, and probably before, economists and philosophers have attempted to reduce the essence of economics to simple principles, coming up with various explanations for various aspects of economics, ranging from “the invisible hand” to “surplus value of labor” all the way to the Laffer Curve, which postulates that there is an optimum rate of taxation, above which actual tax revenues will drop.  This, of course, was the rationale for the Reagan tax cuts.  But for all these theories and studies, economics has been spectacularly flawed in its application to law and public policy, and we now face an economically critical period in history.

Today, throughout the industrialized world, and particularly in the United States, political and economic leaders are faced with a series of economic problems.  First, there is a continuing, and often growing, disparity between the incomes of average individuals and the highest-earning individuals.  Second, growing productivity and profitability are resulting in greater returns to high earners but not in significantly increased employment, and certainly not enough to keep up with population growth. Third, an increasing number of governments lack the resources to maintain fiscal and financial stability. Fourth, governments and businesses are increasingly reluctant to spend on societal infrastructure, even as more and more of business and society are dependent on such infrastructure.  The combination of the first three situations is leading, in many cases (and those instances will likely increase), to political and social unrest, and the fourth situation, unless remedied, is likely to undermine attempts to resolve the first three problems.

The problem with almost all past economic theory and most political solutions proposed to date lies in more than a few questionable assumptions underlying various theories. The two that seem most prevalent and erroneous to me are:  (1) Individuals and organizations behave rationally. (2) Value is determined objectively.

Recent economic studies have shown that true objective rationality, particularly on the part of individual consumers and business owners, is seldom the case.  That lack of rationality lies partly in the fact that all of us have core sets of beliefs at variance to some degree with the world as it is, and partly in the fact that we never know enough to consider all the factors. This lack of rationality compounds the problem of determining economic “value.”  From the beginning of economic studies, economists and philosophers have groped for an answer to the basic economic question of what “value” is and how it is determined in a working society.

The so-called Classical Theory of economics effectively states that the price/value of goods is determined predominantly by the cost of the labor producing them, and in this regard, Marxism is only a variation on classical theory, since, in Das Kapital, Karl Marx asserts that the difference between skilled and unskilled labor does not factor effectively into the creation of worth.  It’s also clear that salaries and wages are determined by what the employer is able and willing to pay, and that is determined in part (and only in part) by what consumers of goods and/or services are willing to pay.

But consider the following questions.  Why are the employees of Target, Costco, and WalMart paid on very different wage scales?  All three companies provide similar goods to large numbers of consumers, and many of those goods are identical. Why do universities pay professors of similar rank and experience widely differing salaries, sometimes dependent on discipline, and sometimes not?  Why do investment banks and hedge funds continue to pay their CEOs and top executives and traders tens of millions of dollars for services that are essentially irrelevant to roughly ninety percent of the population?  And why has no government ever seriously attempted to recoup from those institutions any significant proportion of the immense financial losses they inflicted on national economies… or to tax them more heavily to help pay for the unemployment benefits of those whose jobs they destroyed?

The theoretical answer to all these questions is some variation on paying the “market rate” or “that’s the way business works.”

At the core of this “market” are the so-called laws of supply and demand.  The idea is that when goods and services are plentiful, prices go down, and when they are scarce, prices go up.  This works well, if mercilessly, in terms of commodities, or commoditized services.  After all, one bushel of durum wheat is pretty much like another bushel, but when it comes to people, the idea has more than a few flaws, and the more complex the society, the more the impact of those flaws is magnified.

And “scarcity” doesn’t always translate into higher wages or greater demand. Despite a demand for math and science teachers, there are never enough qualified applicants, although that’s likely because school systems won’t increase wages enough to attract more of them… which points out that, for all the talk about following a business model, organizations and institutions often do so only when it suits their fancy.

Take the commoditized job of a checker at WalMart, Costco, or Target.  Theoretically, checkers provide the same service for the same pay.  Having stood in many check-out lines, I can assure you that the ability of all checkers is anything but the same.  And this is a comparatively simple job.  But because it is a “simple” job, there are many people who can handle the basics of the task, and because a really good checker who wants better pay can be replaced by someone who can just manage the basics, wages stay low…  except that Costco pays better.  Why? Because they’ve learned they get better people?  But that means that service jobs are not the same… something that tends to get overlooked in economic discussions.

The same problem exists, if on a higher level, in elementary and secondary education, with an added complication.  While there are a great many would-be teachers with the proper credentials and the theoretical skills, even after years of study, the best analysts can only approximate the factors that make a truly skilled teacher. There are effective and skilled teachers who take totally different approaches to exactly the same subject, each of whom can inspire and produce better students who learn more, and while pretty much every prescriptive and descriptive analysis of teaching can describe the basics of what constitutes an effective teacher, once you go beyond that, it’s all speculation, because truly good teaching is an art.  But… pay and value are determined by average competence, and because excellent teaching is an art, so-called “merit pay” systems have largely failed where they’ve been tried, and likely will in the future.  Moreover, legal codes and the threat of litigation make it effectively impossible either to identify or to reward outstanding teachers. All this points out not only the problem with Marx’s assumption that, essentially, all labor in a given position or field has the same value, but also the problem with industrialized societies’ wide-scale application of the same principle to all jobs with the same description.

To make matters more complex, “commoditization” of jobs often is applied perversely, or not at all. As I’ve noted before, in my wife’s university, the professors in the performing arts fields work longer hours, provide more services to the university, and are paid far less, even when they have more degrees and experience than do professors in the field of business. That’s because the field of “business” is considered more profitable… but the job of professors is education, not business.  Likewise… why do college coaches make more than university presidents?  Because athletics are more “valuable” than administering an institution educating thousands, if not tens of thousands, of students?  Then, too, on a society-wide basis, women are paid less than men with equivalent, or even less, time and experience in the same professional field.

Finally, consider the high earners in society… who are they?  I looked at the top fifty names on the Forbes list of billionaires, and 30% came out of the financial sector and 30% from the entertainment and communications sector, followed by 20% in food and retail enterprises, and 10% in natural resources.  What does that say about “value” and “rationality”?  Or about the results of blindly following a so-called market economy?

Perhaps, just perhaps, it says that we really don’t want to pay for either value or rationality, just for what we want when we want it, and then we complain about what doesn’t get done because we don’t want to pay for it.