Older and Depressed?

The other day one of my readers asked, “Is there anything positive you can talk about or have you slid too far down the slope of elder grouchiness and discontent?”  That’s a good question in one respect, because I do believe that there is a definite tendency, if one is intelligent and perceptive, to become more cynical as one gains experience.

Psychological studies have shown, however, that people who suffer depression are far more accurate in their assessments of situations than are optimists, and that may be why optimism evolved – because it would be too damned hard to operate and get things done if we weighed things realistically.  For example, studies also show that entrepreneurs and people who start their own businesses invariably over-estimate their chances of success and vastly underestimate their chances of failure.  This, of course, makes sense, because why would anyone open a business they thought would fail?

There’s also another factor in play. I spent nearly twenty years in Washington, D.C., as part of the national political scene, and after less than ten years I could clearly see certain patterns repeat themselves time after time: newly elected politicians and their staffs made the same mistakes their predecessors did, and, over the longer term, each political party gained power in response to the abuses of its predecessor, abused that power in turn, tried to hold on by any means possible, only to fail, and then the party newly in power immediately began to abuse its own power… and so on. It’s a bit difficult not to express a certain amount of “grouchiness and discontent,” especially when you offer advice based on experience and have it disregarded because the newcomers “know better”… and then watch them make the same kinds of mistakes as others did before them.  My wife has seen the same patterns in academia, with new faculty and new provosts re-inventing what amounts to a square wheel time after time.

It’s been said that human knowledge is as old as written records, but human wisdom is no older than the oldest living human being, and, from what I’ve seen, while a comparative handful of humans can learn from others, most can’t or won’t.  And, if I’m being honest, I have to admit that for the early part of my life I had to make mistakes to learn, and I made plenty. I still make them, but I’d like to think I make fewer, and the ones I make are in areas where I don’t have the experience of others to guide or warn me.

The other aspect of “senior grouchiness,” if you will, is understanding that success in almost all fields is not created by doing something positively spectacular, but by building on the past and avoiding as many mistakes as possible. Even the most world-changing innovations, after the initial spark or idea, require following those steps.

I’m still an optimist at heart, in my personal actions, and in my writing, but, frankly, I do get tired of people who won’t think, won’t learn, and fall back on the simplistic in a culture that has become fantastically complex, both in terms of the levels and classes of personal interactions and in terms of its technological and financial systems. At the same time, the kind of simplicity that such individuals fall back on is the “bad” and dogmatic kind, such as fanatically fundamentalist religious beliefs and “do it my way or else,” as opposed to open and simple precepts such as “be kind” or “always try to do the right thing.”  I’m not so certain that a great portion of the world’s evils can’t be traced to one group or another trying to force their way – the “right way,” of course – upon others.  The distinction between using government to prohibit truly evil behavior, such as murder, abuse of any individual, theft, embezzlement, fraud, assault, and the like, and using it to force adherence to what amounts to theological belief was a hard-fought battle that took centuries to work itself out, first in English law, and later in the U.S. Constitution and legal system.  So when I see “reformers” – and they exist on the left and the right – trying to undermine that distinction, the one represented by the idea of separation of church and state [although it goes far beyond that], I do tend to get grouchy and offer what may seem to be depressing comments.

This, too, has historical precedents.  Socrates complained about the youth and their turning away from Athenian values… but within a century or so Athens was prostrate, and the Athenians never did recover a preeminent position in the world. Cicero and others made the same sort of comments about the Roman Republic, and within decades the republic was gone, replaced by an even more autocratic empire.

So… try not to get too upset over my observations. After all, if more people avoided the mistakes I and others who have learned from experience point out, we’d all have more reasons to be optimistic.

 

The Republican Party

Has the Republican Party in the United States lost its collective “mind,” or is it a totally new political party clinging to a traditional name – whose traditions and the policies of its past leaders it has continually and consistently repudiated over the past four years?

Why do I ask this question?

Consider first the policies and positions of the Republican leaders of the past.  Theodore Roosevelt pushed anti-trust actions against monopolistic corporations, believed in conservation, and greatly expanded the national parks. Dwight D. Eisenhower, General of the Army and president, warned against the excessive influence of the military-industrial complex and created the federal interstate highway system.  Barry Goldwater, the “Mr. Conservative” of the 1960s, was pro-choice and felt women should decide their own reproductive future.  Richard Nixon, certainly no bastion of liberalism, espoused universal health insurance, tried to get it considered by Congress, and founded the Environmental Protection Agency.  Ronald Reagan, cited time and time again by conservatives, believed in collective bargaining, was actually a union president, and raised taxes more times than he cut them.  The first President Bush promised not to raise taxes, but had the courage to take back his words when he realized taxes needed to be increased.

Yet every single one of these acts and positions has now been declared anathema by Republicans running for President and for the U.S. House of Representatives and the Senate.  In effect, none of these past Republican leaders would “qualify” as true card-carrying Republicans according to those who now compose or lead the Republican Party.  A few days ago, former Florida governor and Republican Jeb Bush made a statement to the effect that even his father, the first President Bush, wouldn’t be able to get anything passed by the present Congress.

President Obama is being attacked viciously by Republicans for his health care legislation, legislation similar to that signed and implemented by Mitt Romney as governor of Massachusetts and similar in principle to that proposed by Richard Nixon.

Now… I understand that people change their views and beliefs over time, but it’s clear that what the Republican Party has become is an organization endorsing what amounts almost to an American version of fascism, appealing to theocratic fundamentalism, and backed by a corporatist coalition, claiming to free people from excessive government by underfunding or dismantling all the institutions of government that were designed to protect people from the abuses of those with position and power.  Destroy unions so that corporations and governments can pay people less.  Hamstring environmental protection in the name of preserving jobs so that corporations don’t have to spend as much on environmental emissions controls. Keep taxes low on those making the most.  Allow those with wealth to spend unlimited amounts on electioneering, albeit in the name of “issues education,” while keeping the names of contributors hidden or semi-hidden.  Restrict women’s reproductive freedoms in the name of free exercise of religion. Keep health care insurance tied to employment, thus restricting the ability of employees to change jobs.  Allow consumers who bought too much housing to walk away from their liabilities through bankruptcy or short sales (including the honorable junior Senator from Utah), but make sure that every last penny of private student loan debt is collected – even if the students are deceased.

The United States is a representative democratic republic, and if those calling themselves Republicans wish to follow the beliefs and practices now being spouted, that’s their choice… and it’s also the choice of those who choose to vote for them.

But for all their appeal to “Republican traditions,” what they espouse and propose is neither Republican nor traditional in the historic sense.  But then, for all their talk of courage and doing the hard jobs that need to be done, they haven’t done the first of those jobs, and that’s to be honest and point out that they really aren’t Republicans, and they certainly aren’t traditional conservatives, no matter what they claim.

The Derivative Society?

Once upon a time, banking and investment banking were far less complex than they are today, especially recently.  In ancient times, i.e., when I took basic economics more than fifty years ago, banks used the deposits of their customers to lend to other customers, paying less to their depositors than what they charged those to whom they made loans.  Their loans were limited by their deposits, and banks were required to retain a certain percentage of their assets in, if you will, real dollars.  Even investment banks had some fairly fixed rules, and in both cases what was classed as an asset had to be just that, generally either real property, something close to blue chip securities, municipal, state, or federal notes or bonds, or cash. With the creeping deregulatory legislation that reached its apex in the 1990s, almost anything could, with appropriate laundering, otherwise known as derivative creation, be classed as someone’s asset.
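To make the contrast concrete, here is a minimal sketch of that older, deposit-bound model, in Python, with every figure (the deposit base, the reserve requirement, the two interest rates) invented purely for illustration:

```python
# A toy model of old-fashioned, deposit-limited banking.
# All numbers are invented for illustration; none come from real bank data.

def lending_capacity(deposits: float, reserve_ratio: float) -> float:
    """Loans are capped at deposits minus the required reserve."""
    return deposits * (1.0 - reserve_ratio)

def annual_spread_income(deposits: float, reserve_ratio: float,
                         loan_rate: float, deposit_rate: float) -> float:
    """The bank earns the loan rate on what it lends out and
    pays the deposit rate on everything it owes depositors."""
    loans = lending_capacity(deposits, reserve_ratio)
    return loans * loan_rate - deposits * deposit_rate

deposits = 100_000_000     # $100 million in customer deposits (assumed)
reserve_ratio = 0.10       # 10% held back as reserves (assumed)
loan_rate = 0.07           # rate charged to borrowers (assumed)
deposit_rate = 0.03        # rate paid to depositors (assumed)

print(f"Maximum loans:        ${lending_capacity(deposits, reserve_ratio):,.0f}")
print(f"Annual spread income: ${annual_spread_income(deposits, reserve_ratio, loan_rate, deposit_rate):,.0f}")
```

The point of the sketch is simply that, under the old rules, both the size of the loan book and the bank’s income were tied to real deposits; once nearly anything could be repackaged into someone’s “asset,” that cap effectively disappeared.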

And we all know where that led.

And for all the furor about derivatives, and the finger-pointing, something else, it seems to me, has gone largely unnoticed.  The fact is that our entire society, especially in the United States, has become obsessed with derivatives in so many ways.

What are McDonald’s, Wendy’s, Burger King, Applebee’s, Olive Garden, Red Lobster, Chili’s, and endless other restaurant chains, fast-food and otherwise, but derivatives?  What happened to unique local restaurants?  The ones with good inexpensive food often became chains, deriving their success from the original.  The others, except for a few handfuls, failed.  Every year, it seems, another big-name chef starts a restaurant franchise, hoping to derive success and profit from a hopefully original concept [which is becoming less and less the case].

Department stores used to be unique to each city.  I grew up in Denver, and we had Daniels & Fisher, with its special clock tower, the Denver Dry Goods [“The Denver”], and Neustaeder’s.  Then the May Company took over D&F, and before long all the department stores were generic. In Louisville, where my wife was raised, there were Bacon’s, Kaufmann’s, Byck’s, Selman’s, and Stewart’s. Not a single name remains.

Even Broadway, especially in musical theatre, has gone big for remakes and derivatives. Most of the new musicals appear to be remakes of movies, certainly derivative, or re-dos of older musicals. Every time there is a new twist on TV programming the derivatives proliferate.  How many different “Law and Order” versions are there?  Or CSI?  How many spin-offs from the “American Idol” concept?  How many “Reality TV” shows are there?  Derivative after derivative… and that proliferation seems to be increasing. Even “Snow White” has become a derivative property now.

In the field of fantasy and science fiction writing, the derivatives were a bit slower in taking off, although there were more than a few early attempts at derivatives based on Tolkien, but then… somewhere after Fred Saberhagen came up with an original derivative of the Dracula mythos, vampires hit the big-time, followed by werewolves, and more vampires, and then zombies.  Along the way, we’ve had steampunk, a derivative of a time that never was, fantasy derivatives based on Jane Austen, and more names than I could possibly list, and now, after the “Twilight” derivatives, we have a raft of others.

Now… I understand, possibly more than most, that all writing and literature derives from its predecessors, but there’s a huge difference between, say, a work like Mary Robinette Kowal’s Shades of Milk and Honey, which uses the ambiance of a Regency-type culture and setting in introducing a new kind of fantasy [which Kowal does], and a derivative rip-off such as Pride and Prejudice and Zombies or Emma and the Werewolves.  When Roger Zelazny wrote Creatures of Light and Darkness or Lord of Light, he derived something new from the old myths.  In a sense, T.S. Eliot did the same in The Waste Land, as did Yeats in “No Second Troy.”  On the other hand, I don’t see that in John Scalzi’s Redshirts, which appears to me as a derivative capitalization on Star Trek nostalgia.

How about a bit more originality and a lot fewer “literary” derivatives?  Or have too many writers succumbed to the lure of fast bucks from cheap derivatives? Or have too many readers become too lazy to sort out the difference between whole-cloth rip-off derivatives and thoughtful new treatments of eternal human themes?

 

Coincidences?

We’ve all been there, I think, on the telephone discussing something important to us or with someone important to us… and no one else is home, when the doorbell rings, or another call comes through, with someone equally important, or both at once.  Now, it doesn’t matter that no one has called or rung the doorbell for the previous two hours and no one will for another hour or two.  What is it about the universe that ensures that, in so many cases, too many things occur at the same time?

I’m not talking about those which aren’t random, but can be predicted, like the political calls that occur from five or six in the evening until eight o’clock, or the charitable solicitations that are timed in the same way [both conveniently excepted from the do-not-call listing]. I’m talking about calls and callers and events that should be random, but clearly aren’t.  Sometimes, it’s merely amusing, as when daughters located on different coasts call at the same time.  Sometimes, it’s not, as when you’re trying to explain why you need the heating fixed now, and your editor calls wanting an immediate answer on something… or you’re discussing scheduling long-distance with your wife and you ignore the 800 call that you later find out was an automated call, without ID, informing you that your flight for six A.M. the next morning has been cancelled… and you don’t find out until three A.M. the next morning when you check your email before leaving for the airport… and end up driving an extra 60 miles to the other airport. There’s also the fact that, no matter what time of the afternoon it is, there’s a 10-20% chance that, whenever I’m talking to my editor, either FedEx, UPS, or DHL will appear at the door [upstairs from my office] needing a signature… and we don’t get that many packages [except from my publisher] and I spend less than a half hour a week on the phone with my editor.

I know I’m not alone in this.  Too many people have recounted similar stories, but the logical types explain it all away by saying that we only remember the times these things happen, but not the times that they don’t.  Maybe… but my caller I.D. gives the times for every incoming call, and when I say that there haven’t been any calls for two or three hours, and then I get three in three minutes… it doesn’t lie – not unless there’s a far grander conspiracy out there than I even wish to consider.  And why is it that I almost always get calls in the ten minutes or so a day when I’m using the “facilities”?  No calls at all in the half hour before or after, of course.

This can extend into other areas – like supermarket checkout lines. The most improbable events occur in all too many cases in whatever line I pick.  The juice packet of the shopper in front of me explodes all over the conveyor belt.  The checker I have is the only one not legally able to ring up beer, and the manager is dealing with an irate customer in another line.  The register tape jams.  The credit/debit card machine freezes on the previous customer, just after I’ve put everything on the belt.

Now… to be fair, it sometimes works the other way. There was no possible way I ever could have met my wife.  None [and I won’t go into the details because they’d take twice the words of my longest blog], but it happened, and she’s still, at least occasionally, pointing out that it had to be destiny… or fate.  Well… given how that has turned out, I wouldn’t mind a few more “improbable” favorable coincidences, but… they’re pretty rare.  Then again, if all the small unfavorable improbabilities are the price for her… I’ll put up with them all.

 

The Next Indentured Generation?

The other day I received a blog comment that chilled me all the way through.  No, it wasn’t a threat.  The commenter just questioned why state and federal government should be supporting higher education at all.

On the surface, very much on the surface, it’s a perfectly logical question. At a time of financial difficulty, when almost all states have severe budget constraints, if not enormous deficits, and when the federal deficit is huge, why should the federal government and states be supporting higher education?

The question, I fear, arises out of the current preoccupation with the here and now, and plays into Santayana’s statement about those who fail to learn the lessons of history being doomed to repeat them. So… for those who have mislaid or forgotten a small piece of history, I’d like to point out that, until roughly 1800, there were literally only a few handfuls of colleges and universities in the United States – fewer than 30 for a population of five million people. Most colleges produced far, far fewer graduates annually than the smallest colleges in the USA do today.  Harvard, for example, averaged less than 40 graduates a year.  William & Mary, the second oldest college in the United States, averaged 20 graduates a year prior to 1800.  Although aggregated statistics are unavailable, estimates based on existing figures suggest that less than one half of one percent of the adult population, all male, possessed a college education in 1800, and the vast majority of those graduates came from privileged backgrounds.  Essentially, higher education was reserved for the elites. Although more than a hundred more colleges appeared in the years following 1800, many of those created in the South did not survive the Civil War.

In 1862, Congress created the first land-grant universities, and eventually more than 70 were founded, based on federal land grants, primarily to teach agricultural and other “productive” disciplines, but not to exclude the classics. By 1900, U.S. colleges and universities were producing 25,000 graduates annually, out of a population of 76 million people, meaning that only about one percent of the population, still privileged, received college degrees, a great percentage of these from land grant universities supported by federal land grants and state funding.  These universities offered college educations with tuition and fees far lower than those charged by most private institutions, and thus afforded the education necessary for those not of the most privileged status.  Even so, by 1940, only five percent of the U.S. population had a college degree.  This changed markedly after World War II, with the passage of the GI bill, which granted veterans benefits for higher education. Under the conditions which existed after WWII until roughly the early 1970s, talented students could obtain a college degree without incurring excessive debt, and sometimes no debt at all.

As we all know, for various reasons, that has changed dramatically, particularly since state support of state colleges and universities has declined from something close to 60% of costs forty years ago to less than 25% today, and less than 15% in some states.  To cover costs, the tuition and fees at state universities have skyrocketed.  The result? More students are working part-time and even full-time jobs, as well as taking out student loans.  Because many cannot work and study full-time, students take longer to graduate, and that increases the total cost of their education. In 2010, 67% of all graduating college seniors carried student loan debts, with an average of more than $25,000 per student.  The average student debt incurred by a doctor just for medical school is almost $160,000, according to the American Medical Association.

Yet every study available indicates that college graduates make far more over their lifetime than those without college degrees, and those with graduate degrees generally fare even better.  So… students incur massive debts.  In effect, they’ll become part-time higher-paid indentured servants of the financial sector for at least 20 years of their lives.
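To put a rough number on what that servitude looks like, here is a minimal amortization sketch using the $25,000 average cited above; the 6.8% interest rate and the 20-year term are assumptions for illustration, not figures from any of the studies:

```python
# Rough illustration of repaying an "average" student debt load.
# The 6.8% rate and 20-year term are assumed for the sketch only.

def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard fixed-rate amortization formula."""
    r = annual_rate / 12.0     # monthly interest rate
    n = years * 12             # total number of monthly payments
    return principal * r / (1.0 - (1.0 + r) ** -n)

principal = 25_000.0           # average debt per borrower cited above
payment = monthly_payment(principal, 0.068, 20)
total = payment * 12 * 20

print(f"Monthly payment:            ${payment:,.2f}")  # roughly $191
print(f"Total repaid over 20 years: ${total:,.0f}")    # roughly $45,800
```

Under those assumed terms, the borrower hands over roughly $20,000 in interest on top of the original principal, which is what being “indentured” for twenty years works out to in practice.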

The amounts incurred are far from inconsequential.  Student debt now exceeds national credit card debt [and some of that credit card debt also represents student debt]. The majority of these costs reflect what has happened as states have cut their support of higher education, and those figures also don’t reflect default rates on student loans that are approaching ten percent.

As a result, college graduates and graduates from professional degree programs are falling into two categories – the privileged, who have no debt and can choose a career path without primarily considering the financial implications, and those who must consider how to repay massive debt loads.  And as state support for higher education continues to dwindle, the U.S. risks a higher-tech version of social stratification based on who owes student loans and who doesn’t.

So… should the federal and state governments continue to cut  support of higher education? Are such cuts a necessity for the future of the United States?  Really?  Tell that to the students who face the Hobson’s Choice of low-paying jobs for life or student loan payments for life.  Or should fewer students attend college?  But… if that’s the case, won’t that just restrict education to those who can afford it, one way or another?

The Tax Question

These days an overwhelming number of political figures, especially conservatives and Republicans, continue to protest about taxes and insist that taxes should be lowered and that federal income taxes, at the very least, should be left at the lower levels set during the administration of the second President Bush. Although many conservatives protest that taxes are being used for “liberal” social engineering, the fact is that there are so many “special provisions” embodied in the tax code that such “engineering” runs from provisions purported to help groups ranging from the very poorest to the very wealthiest.  In addition, much of the complexity of the tax code arises from generations of efforts to make it “fairer.”

For all that rhetoric, the basic purpose of taxes is to pay for those functions of government that the elected representatives of past and present voters have deemed necessary through the passage of federal laws and subsequent appropriations.  Or, as put by the late and distinguished Supreme Court Justice Oliver Wendell Holmes, Jr., “Taxes are what we pay for a civilized society.”

Grumbling about taxation has been an American preoccupation since at least the 1700s when the American colonists protested the British Stamp Tax and later the tax on imported British tea.  In the case of the tea tax, the colonists were paying more for smuggled tea than for fully taxed British tea, which has always made me wonder about the economic rationality of the Boston Tea Party, and who really was behind it… and for what reason, since it certainly wasn’t about the price of British tea.

Likewise, my suspicions are that the current furor about taxes, and federal income taxes in particular, may not really be primarily about taxes themselves, but a host of factors associated with taxes, most of which may well lie rooted in the proven “loss aversion” traits of human beings.  Put simply, most of us react far more strongly to events or acts which threaten to take things from us than to those which offer opportunities, and in a time when most people see few chances for economic improvement, loss aversion behavior, naturally, becomes stronger.  And most people see higher taxes, deferred Social Security retirement ages, and higher Medicare premiums as definite losses, which they are.

What’s most interesting about this today is that the leaders of the conservative movements and the Republican party are generally from that segment of society which has benefited the most in the past twenty years from the comparative redistribution of wealth to the uppermost segment of American society, and yet they are appealing to those members of society who feel they have lost the most through this redistribution – the once more highly paid blue collar workers in the old automotive industries and other heavy manufacturing areas of the U.S. economy.  The problem with this appeal is not that it will not work – it definitely will work, especially if economic conditions do not improve – but that the policies espoused by the “keep taxes low/cut taxes” conservatives won’t do anything positive to benefit the vast majority of those to whom these conservatives are appealing.  They will, of course, greatly benefit the wealthy, but the comparative lack of federal/state revenues is already hurting education, despite the fact that conservatives and liberals both agree that improved education is vital for today’s and tomorrow’s students if they are to prosper both economically and occupationally.  The lack of money for transportation infrastructure will only hamper future economic growth, as will the lack of funding to rebuild and modernize our outdated air traffic control system and a number of other aging and/or outdated infrastructure systems.

The larger problem is, of course, that the conservatives don’t want government to spend money on anything, and especially not anything new, while the liberals have yet to come up with a plan for anything workably positive… and, under those circumstances, it’s very possible that “loss aversion” politics, and the anti-taxation mood, will dominate the political debates of the next six months… which, in the end, likely won’t benefit anyone.

 

Cleverness?

Over the years, every so often, I’ve gotten a letter or review about one of my books that essentially complains about the ruthless nature of a protagonist, who is supposed to be a good person.  These often question why he or she couldn’t have done something less drastic or resolved the situation they faced in a more clever fashion.  I realized the other day, after seeing a review of Imager’s Intrigue and then receiving an email from another writer who was disappointed that Quaeryt couldn’t be more “clever” in his resolution of matters and less reliant upon force, exactly what my grandmother had meant in one of her favorite expressions.  She was always saying that some businessman or politician was “too clever by half.”

So, I believe, are some writers.  I try not to be excessively clever, because excessive cleverness is highly unrealistic in the real world, but it’s difficult when there’s an unspoken but very clear pressure for authors to be “clever.”  My problem is that I’m moderately experienced in how the “real world” operates, and seldom is a “clever” solution to anything significant or of major import a truly workable solution. As I and numerous historians have pointed out, in WWII, with a few exceptions, the Germans had far more “clever” and advanced technology.  They lost to the massive application of adequate technology.  In Vietnam, the high-tech and clever United States was stalemated by the combination of wide-scale guerrilla warfare and political opposition within the USA.  Despite the application of some of the most sophisticated and effective military technology ever deployed, the U.S. will be fortunate to “break even” in its recent military operations in the Middle East… and given the costs already and the loss of lives for what so far appear to be negligible gains, it could be argued that we’ve lost.  I could cite all too many examples in the business world where “clever” and “best” lost out to cheaper and inferior products backed by massive advertising.  The same sort of situation is even more prevalent in politics.

“Clever,” in fact, is generally highly unrealistic as a solution to most large scale real-world problems.  But why?

Because most problems are, at their base, people problems, it takes massive resources to change the course of human inertia/perceived self-interest. That’s why both political parties in the United States mobilize billions of dollars in campaign funds… because that’s what it takes, since most people have become more and more skeptical of any cleverness that doesn’t fit their preconceptions…  partly because they’re also skeptical of the “clever” solutions proposed by politicians.  It’s why most advertising campaigns have become low-level, not very clever, saturation efforts.  Military campaigns that involve national belief structures and not just limited and clearly defined tactical goals also require massive commitments of resources – and clever just gets squashed if it stands in the way of such effectively deployed resources.

That’s why, for example, in Imager’s Intrigue, Rhenn’s solutions are “clever” only in the sense that they apply massive power/political pressure to key political/military/social vulnerabilities of his opponents.  Nothing less will do the job.

I’m not saying that “clever” doesn’t work in some situations, because it does, but those situations are almost always those where the objectives are limited and the stakes are not nearly so high.  That makes “clever” far more suited to mysteries, spy stories, and some thrillers than to military situations where real or perceived national interests or survival are at stake.

 

The Ratings-Mad Society

The other day, at WalMart, where I do my grocery shopping, since, like it or not, it’s the best grocery store within 60 miles, the check-out clerk informed me that, if I went to the site listed on my receipt and rated my latest visit to WalMart, I’d be eligible for a drawing for a $5,000 WalMart gift card.  The next day, at Home Depot, I had a similar experience. That doesn’t include the endless ratings on Amazon, B&N, and scores of other retailers, not to mention YouTube, Rate My Professors, and the student evaluations required every semester at virtually every college or university. Nor does it include the plethora of reality television shows based on various combinations of “ratings.”

It’s getting so that everything is being rated, either on a numerical scale from one to five or on one from one to ten.  Have we gone mad?  Or is it just me?

Ratings are based on opinions.  Opinions are, for the overwhelming majority of people, based on their personal likes and dislikes… but ratings are presented for the most part as a measurement of excellence.

Yet different people value different things. My books are an example. I write for people who think and like depth in their fiction… and most readers who like non-stop action aren’t going to read many of my books, and probably won’t like them… and those are the ones who give my books one star with words like “boring”… or “terminally slow.”  By the same token readers who like deep or thoughtful books may well rate some of the fast-action books as “shallow” [which they are by the nature of their structure] or “improbably constructed” [which is also true, because any extended fast-action sequence just doesn’t happen often, if ever, in real life, and that includes war].

Certainly, some of the rationale behind using ratings is based on the so-called wisdom of crowds, the idea that a consensus opinion about something is more accurate than a handful of expert opinions.  This has proven true… but with two caveats – the “crowd” sampled has to have general knowledge of the subject and the subject has to be one that can be objectively quantified.
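Those caveats are easy to demonstrate with a toy simulation. The sketch below (in Python, with the true value, the noise level, and the bias all invented for illustration) compares a crowd whose individual guesses are noisy but unbiased with a crowd that shares a common bias:

```python
# Toy illustration of the "wisdom of crowds" and its limits.
# The true value, noise level, and shared bias are invented for the sketch.
import random

random.seed(42)

def crowd_estimate(true_value: float, n: int, noise: float, shared_bias: float) -> float:
    """Average of n guesses, each off by individual noise plus a common bias."""
    guesses = [true_value + shared_bias + random.gauss(0, noise) for _ in range(n)]
    return sum(guesses) / n

TRUE_VALUE = 100.0

# A knowledgeable crowd: large individual errors, but no systematic tilt.
unbiased = crowd_estimate(TRUE_VALUE, n=1000, noise=25.0, shared_bias=0.0)

# A crowd judging something outside its knowledge: everyone shares the same tilt.
biased = crowd_estimate(TRUE_VALUE, n=1000, noise=25.0, shared_bias=20.0)

print(f"True value:     {TRUE_VALUE:.1f}")
print(f"Unbiased crowd: {unbiased:.1f}")  # lands very close to 100
print(f"Biased crowd:   {biased:.1f}")    # stays about 20 off, however large the crowd
```

Averaging cancels independent errors; it cannot cancel an error the whole crowd shares, which is precisely the situation when raters lack knowledge of the subject or when the quality being rated isn’t objectively quantifiable.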

The problem with rating so many of the things now being rated is that for some – such as music, literature, cinema, etc. – technical excellence has little bearing on popularity, and often what “the crowd” rates are aspects having nothing to do with the core subject, such as appearance, apparel, and appeal in the case of music, or special effects in the case of cinema.

Thus, broad-scale ratings conceal as much as they reveal… if not more.  Yet everyone with a product is out there trying to get some sort of rating. Obviously, those with a product want a high rating to enhance the salability of their product or service.  But why do people/consumers rely so much on ratings?  Is it because people can’t think?  Or because they’re so inundated with trivia that they can’t find the information or the time they need to make a decision?  Or because the opinion of others means more than their own feelings?

Whatever the reason, it seems to me that, in the quest for high ratings, the Dr. Jekyll idea of applying the wisdom of the crowd has been transformed into the Mr. Hyde insanity of the madness of the mob.

 

The Hullabaloo Over College Majors

Now that it’s the season for college graduation, once more the articles and commentaries are popping up everywhere – and most of them either tout certain undergraduate majors as “good” because employment in that field is up or condemn others as “bad” because immediate job prospects aren’t as good.  What’s even worse is that politicians are getting into the act, some of them going so far as to suggest that students shouldn’t major in fields that don’t pay as well or where employment prospects aren’t so good, with hints that government and universities shouldn’t offer aid to students interested in such fields.

There are enormous problems with the whole idea of over-emphasizing undergraduate collegiate majors, the first of which is that many students entering college don’t have the faintest idea what their true talents are or whether their interests match their abilities. This problem has worsened in the past several generations as the general academic rigor of high schools has declined and as more students enter colleges and universities without ever having been truly tested to the limits of their abilities.

The second problem is that the emphasis on a “profitable” major is also a growing emphasis on turning college into what amounts to a white-collar vocational school, rather than an institution devoted to teaching students how to think and to learn on a life-long basis. Colleges themselves are buying into this by pushing departments toward “accountability” and insisting that departments determine how many graduates are successful and employed in their field years after graduating.  But does that really measure success?

In addition, the emphasis on selecting a major based on future projected employability neglects two incredibly important factors.  The first is the student’s aptitudes.  A student who is weak in mathematics is highly unlikely to be particularly successful in fields that require that ability, no matter how many jobs exist.  Second, most students take four years or more to finish college.  Projecting what occupations will be hiring the most in four years is chancy.

As for the subjects students choose for their major, the “employability” measurements used are generally employment in the first year after graduation, and the differences in various fields aren’t that significant.  For example, in a recent Georgetown University study, there was only about a 10% difference in employment between the “worst” and “best” undergraduate majors. Such measurements strongly suggest that a student who likes a field and works hard to excel is more likely to land a job, even in a field where employment is not as robust, than a student who tries to game the employment field and who picks a major based on projected employment and earnings rather than on picking a field suited to his or her abilities. In short, it’s far better for students to be at the top of a field they like than at the bottom of one that they don’t.

More than a few studies have shown and projected that today’s educated workers will have changed fields of work three to four times between entering the workforce and retiring – and that today’s students will face even more need to change their field of work.  Such changes place a premium on the ability to think and to learn throughout life, not on a single set of skills tailored to one field or profession.  Yes, there are some fields where dedicated and directed learning is required from the beginning of college, but those fields are a minority and call for initial dedication.  They seldom attract students who are unsure of what they want to do in life or students with wide interests.

In the end, all the political and media concern about “appropriate” majors, despite protests to the contrary, ignores much of what is best about college and for the students by emphasizing short-term economic goals that cannot possibly benefit the majority of students.

 

Media Dumbing Down

When we first got satellite television some fifteen years ago, in the infrequent times we watched television, our tastes ran to channels like Bravo, A&E, History, and Biography. Now we almost never tune in those channels, or many others of the hundred available.  Why not?  Because over the last decade, those once-independent channels have been purchased by major networks, who changed the programming that made them attractive to us.

Where are the biographies of the Founding Fathers, the great industrialists, great painters, poets, revolutionaries, thinkers, architects, authors – or the other notables of the past and present?  They’re gone, replaced by hour after hour of “Notorious,” each hour devoted to some heinous criminal or another, or other uplifting shows like “Outlaw Bikers.”

As for the History Channel, where are the great events or pivotal points in history?  Also gone, replaced by documentaries on the history of plumbing and endless hours of “Swamp People” or “Pawn Stars.”

A&E used to provide a wide range of material, from architectural/history gems like “America’s Castles” to docudramas like “Catherine the Great” (starring Catherine Zeta-Jones, no less). Now what is there?  Six straight hours of “Storage Wars!”

I love science… but I can’t watch most science shows anymore, not when they’re presented at a third-grade level and when, after a commercial break, the narrator repeats the last minute before the break, as if the viewing audience were developmentally disabled.

And the commercials, endless minutes, each one ending, I suspect, with the immortal words, “But wait!  There’s more.  If you order now…”

The movie channels aren’t much better, except for TCM, because each channel takes its turn with the same movie.  How many times do you want to see “Secretariat”… and I liked that one a lot?  But most aren’t that good…

Now… if I wanted, I could subscribe to every sports event offered in the United States and hundreds more from across the world… but March Madness is enough sports for us for the entire year.

Yes… satellite/cable television once was a good thing… until the media titans took over and turned it into a true triumph of capitalism… dollars over quality, and while the dollars are rolling in and the quality degrades further, the politicians in Washington are trying to gut public television, which is about all that’s left with offerings that aren’t dealing in endless moronic variations on pop culture, sex, violence, or sports.  But then, public media channels, supposedly regulated by the FCC for the people, are only about the dollars, aren’t they?

Political Dialogue and Analysis

With just a bit less than six months before the fall elections, in one sense, I can’t wait for the elections to be over, if only to get a respite from political news and sensationalism… but even that respite isn’t likely to be very long, because politics has become not only continuing news, but something resembling a spectator sport.  And like all spectator sports, the political arena is filled with commentary.  Unlike athletic spectator sports, where the acts of the players and the results can be seen immediately, in politics the results of political actions, laws, and policies, in the vast majority of cases, can’t be discerned clearly for years, if ever.

This allows everyone to comment with “equal validity,” because very few members of the public have the knowledge of economics and politics, as well as the patience, to wait and see how things actually worked out.  Nor do most people accurately remember what did happen.  So they tend to trust the commentator whose views most nearly mirror their own, not necessarily the commentator or expert who’s most likely to be right.

One of the things that appalls me the most is how both parties not only distort each other’s positions, but also employ the most inaccurate comparisons and truly inapplicable facts.  What makes it worse is that very few commentators or talk show hosts, or columnists, have either the ability or the nerve to suggest that such distortions are doing extreme violence to accuracy [I won’t say “the truth,” because that’s become a pseudo-religious term] and relevance.

Some of the worst offenses against such accuracy lie in the fallacious ignoring of well-known and proven facts.  For one thing, economies react slowly, often ponderously, to changes in law and policy. So, like it or not, Bill Clinton got a tremendous boost from policies enacted by the first President Bush, and in turn, the first President Bush was forced to raise taxes by the policies of his predecessor, a fact gloriously ignored by those who cite the great Reagan prosperity. Admittedly, in Clinton’s case, he had enough sense to continue those policies when he was under pressure to change them, but the conditions for his highly praised period of expansion lay in his predecessor’s actions.  Likewise, to blame President Obama for current high unemployment and recession, when those conditions were created by policies put in place well before his election, and when the U.S. also has to absorb economic fall-out from all across the world, is politically easy but factually inaccurate, especially when political gridlock in Congress has restricted his ability to attack the problem in the way he would like.  But few of his critics will admit that they’re judging him as much, if not more, by Congressional inaction than by his own acts.

Comparing one economic recovery, or non-recovery, to another is not only inaccurate, but disingenuous, because the underlying factors differ greatly, yet such comparisons are a staple in the political arena, because politicians and their aides have an addiction to the simple and superficially relevant.

In addition, some factors are beyond any President, or any Congress’s, ability to change.  Oil is a fungible global economic good, and, in the short run, no change in U.S. environmental, energy, economic, or tax policy is going to measurably lower the price of crude oil in the months ahead, although the Saudi actions to flood the market with cheaper oil will likely cause a temporary respite, at least until world economic activity picks up.  Unwise government action can, as Richard Nixon proved with his ill-fated experiment with price controls, cause gasoline and heating oil shortages and increase prices in the long term.

Another problem in assessing government/political actions is determining how effective preventative actions are… or accepting the benefits while disavowing the means.  We know that the U.S. safety net for the poor has in fact historically reduced overall poverty in the United States – but which programs really work the best?  Which are failures?  Which work, but are so inefficient that they should be replaced?  How many of all the Homeland Security measures are truly necessary?   Most Americans seem to have forgotten that before the enactment of the Clean Water Act, the Cuyahoga River in Cleveland actually caught fire, or that the Potomac River was actually toxic.  Or that before the enactment of the Clean Air Act, office workers in Pittsburgh often took a second white shirt to work because the first got so soot-filled by midday that it looked black and gray?  Instead of the debate being about drinkable water and breathable air, it’s become about whether environmental protection costs too much and slows or hinders job creation, and almost no commentator questions the terms of the debate.

As I’ve pointed out all too many times, there has not yet been any determination of personal accountability for the latest economic meltdown – and now we’ve had a reminder, in the recent JPMorgan derivatives loss/scandal, that neither Congress nor the President [either Bush or Obama] ever truly addressed the problem, but merely papered it over.  But I’ve never heard any commentator mentioning that – or attacking the corrupt culture of the financial world and those who lead it.

Instead, we get media and political emphasis on the irrelevant, the inaccurate, the inappropriate, and the inapplicable… and the worst part of it all is that it’s only going to get worse over the next six months.

 

Excellence and Self-Promotion

I grew up in a time and a place where blatant self-promotion was deeply frowned upon.  My father made a number of observations on the subject, such as “Don’t go blowing your own horn; let your work speak for you” or “The big promoters all lived fast lives with big mansions and died broke and forgotten.”  As I’ve gotten older, I’ve learned that promotion and self-promotion have always been with us, dating back at least as far as Ramses II, who, at the very least, gilded, if not falsified, in stone, no less, his achievements in battle.  And to this day most people who know American history [a vanishing group, I fear] still think that Paul Revere was the one who warned the American colonists about the coming British attack on Concord – largely because of the poem written by Henry Wadsworth Longfellow, which promoted Paul Revere, possibly because Longfellow couldn’t find enough words to rhyme with Samuel Prescott, the young doctor who actually carried the warning through after Revere was detained by the British.

Still… in previous times, i.e., before the internet age, blatant self-promotion was limited by economics, technology, and ethics, and there were more than a few derogatory terms for self-promotion.  And who remembers when the code of ethics of the American Bar Association banned advertising by attorneys?  Lawyers who tried to promote themselves publicly were termed ambulance chasers and worse… and disbarred from the profession. The same ethics applied to doctors and pharmaceutical companies.  Of course, there were never many restrictions on politicians, and now, unsurprisingly, there are even fewer.

Unhappily, in field after field, excellence in accomplishment alone is seldom enough for success any more.  For more than modest success, excellence also requires massive promotion and/or self-promotion, even among authors. Some of us try relatively tasteful self-promotion, by attempting enlightening and hopefully entertaining websites, such as this one.  Others go for a more sensational approach, and for some, no excess is too much.  From what I’ve observed lately, massive promotion and mere marginal competence in writing, along with cheap wares, result in sales that far outshine good entertainment or excellent writing that does not enjoy such promotion. One of the associated problems is that promotion or self-promotion takes time, effort, and money, and all of these detract from the time, effort, and resources an author can devote to the actual writing.

Several years ago, Amazon embarked on a campaign to persuade people rating books to use their real names, rather than pseudonyms, because authors [yes, authors] were using aliases or the aliases of friends to blatantly praise their own work, and in some cases, to trash competing works. I have no doubts that the practice continues, if slightly less blatantly.

In today’s society especially, my father’s advice about not blowing your own horn leaves one at a huge disadvantage, because amid the storm of promotion and self-promotion all too few people can either find one’s unpromoted work or have the time or expertise to evaluate it… and if someone else blows a horn for you, it’s likely to be off-key and playing a different tune.

 

The “Competitive/Comparative” Model in Teaching

The local university just announced a merit pay bonus incentive for faculty members, and the majority of the Music Department greeted the plan with great skepticism, not because they opposed recognition of superior accomplishment… but because the proposed structure was essentially flawed.  In fact, for many university departments, and for most schools, as well as many businesses and corporations, such “merit” awards will always be fatally flawed.

Why?  Because all too many organizations regard their employees and even their executives as homogeneous and interchangeable parts, even when duties, skills, responsibilities, work hours, and achievements vary widely.  Those variances are even greater in the academic community, and yet, paradoxically, in terms of administration and pay, they’re even less recognized there than in the corporate world.

Take a music department, for example, with instrumentalists, pianists, vocalists, composers, music educators, and musicologists.  How, with any sense of fairness, do you compare expertise across disciplines?  Or across time?  Is the female opera director who built a voice program from nothing over 15 years, who has sung on low-level national stages intermittently, who is a noted reviewer in a top journal in her field, and who serves as a national officer in a professional organization more to be rewarded than the renowned pianist who won several prestigious international competitions and performs nationally, but who limits his teaching to the bare minimum?  Or what about the woodwind player who was voted educator of the year for both the university and the state, who is known regionally but not nationally  as a devoted and excellent teacher? Or the percussionist who revitalized the percussion program and performs on the side with a group twice nominated for Grammies? Or the soprano who sings in an elite choral group also nominated for a Grammy?

Then add the fact that all of them are underpaid by any comparative standard with other universities [which also indicates just how hard music faculty jobs are to find and hold]… and with other departments, even though the music faculty work longer hours as well as evenings and weekends, and the fact that the annual “merit pay” award would be a single payment of $1,000-$2,000 to only one faculty member.  In essence, the administration is attempting to address systemic underpayment and continued inequalities with a very small band-aid, not that the administrators have much choice, given that the legislature won’t fund higher education adequately and tuition increases are limited.

In primary and secondary schools, merit pay has become a huge issue, along with evaluating teachers.  Everyone, even teachers, agrees that good teachers should be rewarded and bad ones removed.  But determining who is good or average, and who gets paid what, is far, far harder than it looks, which is why most teachers have historically opposed the concept of merit pay: in all too many cases where it has actually been implemented, it’s gone to administration or parent “favorites,” who are not always the best teachers.  A competent teacher in an upper-middle-class school where parents are involved and concerned should be able to boast of solid student achievement on tests, evaluations, etc.  A brilliant, dedicated, and effective teacher in some inner-city schools may well be accomplishing miracles to keep or lift a bare majority of students to grade level, while a merely competent teacher there may have only a few students on grade level.  Yet relying on student test scores would suggest that the first teacher of these three deserved “merit pay.”  And in “real life,” the complications are even greater.  How do you compare a special education teacher with standard classroom teachers, even in the same school, let alone across schools with different demographics?

In addition, when teachers feel overworked and underpaid, and many, but not all, are, offering merit pay tends either to set people competing for the money or to turn them against the entire idea.  I’ve seen both happen, and neither outcome is good.  Yet the underlying principle of ratings and “merit pay” is that such comparisons are possible and valid.  So far, I’ve yet to see any such workable and valid plan… and neither have most teachers. And when merit pay is added to all the other problems with the educational system that I’ve discussed in other posts, all merit pay usually does is make the situation worse.  It’s an overly simplistic solution to a complex series of problems that few really want to address.  But then, what else is new?

 

 

“Willing” It to Be?

Last year, a voice professor listened, aghast, as a talented, but still far from top-tier, soprano vocal student announced she was going to forgo the more demanding Bachelor of Music degree and settle for a straight B.A., because she really didn’t need the extra work to get into the graduate school of music that was her choice. Needless to say, this spring the voice student received both her B.A. and an unequivocal rejection from graduate school. This particular scenario is becoming more and more common, according to the professor, who has been teaching at the collegiate level for over 30 years, is also a national officer of the National Opera Association, and, incidentally, is my wife.  By “scenario” she didn’t mean the rejection from graduate school, although that is also becoming more common in the field of music, particularly for women, because more women want graduate degrees and the competition is becoming more and more intense, but the growing tendency of students to make plans based on what they want, with no consideration of their abilities and no real understanding of the fields that they wish to enter.

In voice, for example, as my wife puts it, “good sopranos are a dime a dozen.”  For a soprano to get into a top-flight graduate school, she must not only have an outstanding voice, but excellent grades, performing experience, and a demonstrated work ethic.  On the other hand, the requirements for a bass, baritone, or tenor, while stiffer than they used to be, are not so demanding, for at least two reasons.  First, all operas and most musical theatre pieces have more male roles.  Second, not nearly so many men are interested in vocal performance, and many of those who are simply lack a work ethic.  So a hard-working male voice student with a decent voice and good grades may well have a better chance at both graduate school and a career than an outstanding soprano, because there are so many outstanding sopranos and fewer roles for them, not to mention the fact that there are also more tenured and tenure-track university voice positions for tenors, basses, and baritones than for sopranos and mezzo-sopranos.

This tendency for young people to ignore reality is far from limited to the fields of performance. I’ve certainly seen it in the field of writing.  Every year I run across dozens of young would-be writers convinced that they’ll be published, if only they can get an “in” with an agent or an editor… or that, once they finish their epic, they’ll self-publish it as an e-book, and the world will reward them by purchasing tens or hundreds of thousands of copies. And I’ve read enough of what they’ve written to see why most of them can’t find an editor or agent.  But, I have to admit, occasionally an author will make the big time through self-promotion and self-publishing – and those authors were usually rejected by editors or publishers not because what they had written was poorly written, but because it was outside the boundaries of what “conventional” wisdom believed was popular. Such successes will happen… in perhaps one in a thousand cases. I can name several cases where it has… at most a handful over thirty years.

I’m not knocking either ambition or dreams, but I am knocking the misleading idea that students can do anything they want, if they only work hard enough.  As I’ve said before, there’s a huge difference between “be all you can be” and “you can do anything you want.”  We all have both talents and skills… and limitations.  And willing yourself to be successful in areas where those skills don’t exist or are modest at best is usually an exercise in futility.

You can’t simply “will” something to happen because you want it badly enough.  Wanting it badly enough is merely desire.  Beyond desire, reaching a goal requires talent and polished skill, knowledge of the field, and a willingness to work one’s way up through long and grinding effort.  A noted chef declared a few weeks ago that almost none of the young people seeking to become chefs in his restaurants ever wanted to start at the bottom.

If you don’t have the basic tools and mental or physical abilities required in the field, all the work in the world won’t help.  If you have the talent, but not the work ethic, you won’t make it, either.  And even if you have all of that, sometimes you still might not make it, not because you aren’t capable, but because there are only so many openings at the top of any field… and sometimes it takes luck and timing to go from being near the top to the very top.

In the field of classical music, professional performers and experienced music professors have been making just these points to students for the past decade – and yet very few of those students, or their parents… or the politicians, appear to be listening.  In the area of writing, I’ve watched many of my colleagues make the same points, and, frankly, I imagine the same is true in other professions as well… so why aren’t the students listening?  Is it a media culture that shouts ever louder that anyone can be anything?  Or is it a national epidemic of wishful – or willful – thinking?  I have to wonder.

Patriarchy, Politics, and Religion

This past Wednesday, the lead story on the front page of the Salt Lake Tribune was entitled [unsurprisingly] “Multiplying Mormons expand into new turf.”  The story was based on the latest once-a-decade U.S. Religion Census.  According to that census, the fastest growing religions in the United States are Islam, the LDS Church, and Evangelical Protestant churches.  The single largest Christian faith is still the Catholic Church.  I find this combination rather unsettling because, despite their theological and sectarian differences, all of these faiths share one commonality.  Whatever the protests to the contrary, all are highly patriarchal/paternalistic and sexually chauvinistic, and all effectively place men in a higher socio-theological position.

In addition, the three nominally Christian faiths [I’m including the Mormons, because they consider themselves Christians, even as some Christian faiths do not] have a large and growing presence on the political front, particularly within the Republican party.  No matter what anyone does or doesn’t claim, in the end what people – and the politicians who represent them – believe tends to find expression in the political dialogue, in proposed legislation, and, eventually, in law.

Once upon a time, the vast majority of the United States was more highly religious than it is today, and there were considerable sectarian differences in belief.  Because of those fierce differences, in effect, the founding fathers created a system that attempted to keep religion out of government… and it worked for quite some time.  I’d submit that it worked because religion was a key issue for a great many people, possibly a massive majority, and no one wanted any other faith to gain an advantage through government.  But times have changed, and although 80% of Americans claim to be “Christian,” only about 50% actively belong to any type of Christian congregation, and another 16% are professed or practicing atheists.

This suggests that close to half the population doesn’t possess the same burning concern about religion that it once did… but the first political problem is that these more “moderate” believers and non-believers are placed in the position of seeming to attack religion or “morality” when they oppose the attempts of “true believers” to enact religious-based standards as part of government policies and law, even when those standards effectively discriminate against women.  The second problem is that the entire movement for true equal rights for women is essentially a secular movement.  It has to be, because, with the exception of a few faiths with very small followings [such as Christian Science or the Wiccans], the vast, vast majority of organized religions have a paternalistic and chauvinistic tradition, and only a few have made much effort to change those traditions.

While there are exceptions, in countries dominated by paternalistic religions women generally have fewer rights, and in many cases none.  Yet here in the United States, the religious faiths showing the greatest gains in adherents are those that are fundamentalist and patriarchal.  But whenever women raise the issue, as in the recent Democratic Party effort to point out that Republican legislative initiatives amount to a “War on Women,” the general reaction is that women are over-reacting.  And some Republican partisans have even suggested that the current administration’s efforts to strengthen women’s access to birth control and contraception were a war on freedom of religion.

But, of course, that does raise the question of whether freedom of religion extends to using legislation to reinforce the historical patriarchal domination of women – and whether such legislation has any place in a nation that supposedly prides itself on equality.

Spaceflight Fancy?

I recently read an interview with the noted biologist E. O. Wilson, who is rather eloquent on the need for a far more environmentally conscious public, and I was agreeing with much of what he said – until I got to the part of the interview where he essentially said that human space travel is a dangerous delusion that should be scrapped, and that, if treated properly, the earth can provide for humanity for as long as it needs a place to live.  Now… I understand what he was saying in one sense, because there is no physical way that we could ever move even a significant minority of human beings off Earth to another locale.  The earth is likely to remain habitable for far longer than the lifespan of any previous species in the history of the planet, but without greater environmental awareness and action, that habitability for humans will be threatened, if not destroyed.

Am I an unrealistic dreamer in wanting humanity to reach beyond one planet – even if only a tiny minority of men and women do so?  Or am I a realist, considering that at least once a large space object struck earth and the resulting ecological and physical disasters wiped out thousands of species, among them the dinosaurs?

One of the better traits of human beings is to reach beyond the here and now, to dream of what might be.  A second trait is that we tend to do better when we’re pursuing dreams, even impossible or impractical dreams.  We certainly made far greater strides in many fields, including technology, when we were engaged in the space race with the USSR – regardless of the motivations behind that gigantic effort.  Is it mere coincidence that the ancient Egyptian civilization that pursued its dreams of immortality, however flawed the basis of those dreams, was also the longest lasting?

We also have a tendency to become insulated and self-seeking when we don’t pursue dreams, as at present, when political and social conflict after conflict is taking place in the United States, and elsewhere, over who gets control over what.  The entire debate over healthcare is an example.  Rather than finding ways to expand healthcare coverage to those who don’t have it, all the powerful political factions are arguing over why this group and that group shouldn’t have to pay for it – an argument along the lines of “I’ve got mine; you get your own.”  The anti-immigrant debate follows the same logic, ignoring the fact that the nation made massive strides in the past based on immigrant contributions.

The science budgets of almost all major nations, except China’s, are dwindling, and certainly U.S. politicians have turned a blind eye and a deaf ear to all but modest scientific efforts.

And what are the dreams of today?  A better tiny gadget for introspection [the iPhone], video games with super graphics, the establishment of a theo-political state, the amassing of great concentrations of wealth, the celebritization of society?

No thank you, I’d prefer the dreams of endless space, and the wonders of the stars. What about you?

Incompetence and Uninterestedness

A week or so ago, I was making airline reservations online – rather, I was attempting to do so, but found I couldn’t, because my computer wouldn’t let me get beyond the first screen or so at the Delta website, claiming that the site’s security certificate had expired or was not valid.  This had happened to me once before, because the date on my computer was wrong.  So I checked my computer.  No problems that I could find.  Then I tried the other computer.  Same result.  I called my wife at her office.  She tried on her work network.  The same result.  I called Delta.  The first representative insisted it was my computer, and then I got disconnected.  I tried Delta’s technical support line, waited, and got disconnected.

I waited an hour and tried Delta again.  This time the representative actually knew about the problem and informed me that the tech team was working on it – and agreed to ticket me at the online price.

But my question is:  How on earth could the IT staff at one of the world’s largest airline systems, a system that depends heavily on website bookings, EVER let their website security certificate get close to expiring?  Or was this just the result of hacking?  I don’t know that I’ll ever know, but when I talked to one of my daughters, who used to run the IT division of a major chemical company, she told me that all too many companies have IT divisions that tend to ignore or postpone the routine “necessities” – until they become a crisis.  Of course, one of the reasons she was successful was that she didn’t allow that sort of thing to happen.

I’m certain that tracking security certificates is not the most exciting of IT tasks.  Nor is methodically checking to see what holes may have developed in a website’s security.  But both are vital.  Just last month, the state of Utah discovered that its Medicaid/Health database had been hacked, and that the hackers had had access to the addresses of 800,000 people and the Social Security numbers of more than 150,000… and the initial investigation concluded that “laxity” and failure to follow procedures for handling data were the principal causes.
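
Since the whole point is that this kind of checking is routine and automatable, here is a minimal sketch – in Python, using only the standard library, with a hypothetical hostname and an arbitrary thirty-day threshold, and emphatically not anything any airline actually runs – of the sort of scheduled check an IT shop could use to catch a certificate long before it lapses:

# A sketch of a routine certificate check: connect to a site, read its TLS
# certificate, and warn when the expiration date is getting close.  Note that
# a default SSL context will refuse the handshake outright if the certificate
# has already expired, which is exactly why the check needs to run early.

import socket
import ssl
from datetime import datetime, timezone

def days_until_cert_expires(hostname: str, port: int = 443) -> int:
    """Return the number of days before the site's TLS certificate expires."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # The 'notAfter' field looks like 'Jun  1 12:00:00 2025 GMT'.
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days

if __name__ == "__main__":
    for site in ["www.example.com"]:   # hypothetical list of sites to watch
        remaining = days_until_cert_expires(site)
        if remaining < 30:
            print(f"WARNING: certificate for {site} expires in {remaining} days")
        else:
            print(f"{site}: {remaining} days of certificate validity left")

Run from a scheduled job once a day, something this simple could flag an expiring certificate weeks in advance; the hard part isn’t the code, it’s making sure someone is responsible for acting on the output.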

I also find it interesting that my readers often get upset over a handful of typos in a 400-500 page book – typos that are annoying, that I wish didn’t happen, and that do happen despite my best efforts and those of editors and proofreaders.  But those errors don’t have anywhere near the potentially disastrous impact of software glitches in an economy that has become increasingly dependent upon computers.

In the end, it boils down to one thing.  Failure to do what is required, whether what is required is routine, dull, or boring, amounts to incompetence, no matter how skilled the technicians and engineers may theoretically be, and such incompetence leads to huge problems, if not disasters.

Boredom and uninterestedness aren’t valid excuses.  Neither is management’s failure to recognize the problem, regardless of the “costs.”  In the case of books, costs are a valid concern, but when lives and livelihoods are at stake, costs shouldn’t be the primary focus.

Culture… and Race

Over the years, even centuries, people – even learned scholars – have offered various rationales about “race,” either saying essentially that all generalizations about race and racial traits are false or, at the other extreme, claiming that racial heritage is a significant determinant of individual traits such as intelligence, muscular ability or the lack thereof, industriousness… and the list sometimes seems endless.  In the course of finishing my latest SF novel [The One-Eyed Man, which I just turned in and which my editor hasn’t even begun to read], I thought a great deal about why people are the way they are, and what factors influence them.

On Earth, civilizations have risen, and they’ve fallen, and there have been pretty impressive civilizations raised by peoples of various colors.  Ancient Egypt boasted one of the largest and most long-lasting of the early civilizations, indeed of any civilization to date.  The Nubians of the eighth century B.C. were strong enough to topple the Egyptians and ruled all the way from the southern Sudan to the sea and much of the southeastern Mediterranean.  There are massive ruins in central eastern Africa embodying huge palatial complexes that had to represent a large organized state.  The various Mayan civilizations not only represented an intricate and complex culture but one with mathematics involved enough to create a calendar that would remain largely accurate for tens of thousands of years.  The Aztecs and the Incas created significant empires despite the lack of critical resources (such as beasts of burden and transportation).  Archeologists have now discovered traces of ancient large cities in the United States, along with significant earthworks and plazas.  At one time, the Chinese empire was without peer anywhere.  The most advanced sciences in the world at one time were Islamic.  Rome controlled the entire Mediterranean basin for hundreds of years.

All of these civilizations had differing “racial” backgrounds, but all were great and advanced in their time.  If one looks at modern industrial nations, the vast majority have individuals of virtually every racial background who have great accomplishments.  Yet the Mayan civilizations of 1500 years ago collapsed and their cities were abandoned.  The great African civilizations are long gone.  So is the Roman Empire.  Egypt has been an impoverished backwater for hundreds of years.

Historians will give many answers, and all too often the most common answer among most people is that “they got conquered.”  That’s true in some cases, as in the instance of the Aztecs and the Incas, but in most instances the civilization collapsed from within, sometimes under pressure, sometimes not.  One of the most interesting and, I believe, revealing cases is that of the Mayan city-states in the northern Yucatan area.  Although they had developed sophisticated water-gathering and water-use systems and had weathered extreme droughts in the past, another drought finished them off.  The people dispersed, not from a few cities and towns, but from thousands… and they never returned, leaving the magnificent ruins we see today.  And while there is some evidence of battle and brutality, in most cases that doesn’t appear to have been the cause.

What I found intriguing was that the final decline of the Maya coincided with the rise of a new, more brutal, and perhaps even more fundamentalist religion, the worship of the serpent god Quetzalcoatl.  I’m not about to blame the decline on just that, but I do think it points out that the decline of almost every past great civilization is linked to a change in the “culture” of that civilization.  One can date the decline of the great Chinese empire to the time when a new emperor burned the entire fleet – the greatest in the world, one that had explored the Pacific and sailed all the way to east Africa.  Did that emperor change the culture?  It’s more likely he reflected a change already under way, but with that shift from outward-looking to inward-looking, the decline proceeded.  At one time, the greatest scientists in the world were Islamic, and the western European world learned from them.  Then… over a few decades, that intellectually open culture closed, and the Islamic world went into a long and slow decline.

Too often, it seems to me, those who profile “race” aren’t profiling race at all.  They’re profiling culture.  Like it or not, all too few blacks coming from U.S. inner city backgrounds, especially young males, are all that successful, and the murder rate is astounding.  Is that racial?  I doubt it.  Is it just poverty?  I doubt that as well.  It can’t be racial, because very few black males raised outside the inner city culture demonstrate the traits of inner city black males, and one can see similar patterns of violence and anti-social behavior in other impoverished groups.  Those patterns aren’t identical, because poor white culture isn’t the same as poor black culture… but it’s the culture that makes the difference, not the racial background.

And, like it or not, some cultures are toxic.  The Ku Klux Klan is a toxic culture.  So, frankly, is the current inner city black culture.  So is the pure white Fundamentalist Latter Day Saint culture.  So was the Nazi culture, and there are certainly others that could be named.  Not all cultures or subcultures are worthy of preservation or veneration, regardless of the diversity movements that are so popular among certain groups…  but I think it’s well past time to make a clear distinction between culture/subculture and race.

Medical Economics

In late March, the U.S. Supreme Court held its hearings on the Affordable Care Act [ACA], known to its opponents as Obamacare.  Polls taken at the time showed that while a clear majority of Americans oppose the Act as a whole, a majority happens to like most of its major provisions.  Seventy percent of respondents approved of the provision that prohibits insurance companies from refusing to cover people with preexisting medical conditions.  A majority approves of the expansion of certain Medicare coverages and of covering the adult children of policyholders to age 26.  What a sizable majority opposes is the mandate that all Americans obtain insurance coverage, one way or another, or pay significant federal fines and penalties.

Exactly what will happen, however, if the Supreme Court strikes down the individual insurance mandate, but upholds the remainder of the Act?

If that occurs, and it is indeed possible, healthcare insurance costs will continue to rise, and to do so at rates as fast as regulatory authorities allow.  That won’t be because the insurance companies are greedy, but because they’ll need those increased premiums to pay the healthcare costs of their policyholders.

Why?  Because the cost savings projected for the ACA were based on increasing the pool of the insured, and because all the features everyone likes will increase costs in ways that the opponents of the Act aren’t facing.  The most notable problem, it strikes me, is what happens when people who aren’t insured, and who won’t be required to purchase health insurance, come down with serious health problems.  Once the prohibition on denying coverage for preexisting conditions kicks in, those with problems may very well be able to wait to purchase insurance until after they get sick – and still get coverage.  Then, given the high cost of insurance, more people will opt out of health insurance until or unless they need major medical treatment.  This could easily undermine the entire healthcare system.  And most of those involved with the pending court decision have already noted that this would be a problem, and that if the individual mandate is declared unconstitutional, the wider coverage and the prohibition on denying coverage for preexisting medical conditions would also have to be struck down or repealed.
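
To make the arithmetic of that spiral concrete, here is a toy sketch in Python – every number in it is hypothetical, chosen only to show the direction of the effect, not to model the actual insurance market: when coverage can’t be denied and there is no mandate, each round of premium increases pushes more of the healthy out of the pool, which raises the break-even premium again.

# A toy model of adverse selection.  All figures are invented for illustration.
def premium_spiral(healthy=900_000, sick=100_000,
                   healthy_cost=2_000, sick_cost=20_000,
                   dropout_per_dollar=0.00005, rounds=5):
    """Each round: charge the break-even premium, then let some healthy people
    drop coverage in proportion to how far the premium exceeds their own
    expected costs.  The sick, who can no longer be refused, keep coverage."""
    for year in range(1, rounds + 1):
        insured = healthy + sick
        total_cost = healthy * healthy_cost + sick * sick_cost
        premium = total_cost / insured          # break-even, no profit margin
        print(f"Year {year}: insured={insured:,}  premium=${premium:,.0f}")
        excess = max(premium - healthy_cost, 0)
        healthy = int(healthy * max(1 - dropout_per_dollar * excess, 0))

premium_spiral()

With these made-up numbers, the premium climbs every year while the insured pool shrinks – which is the “undermining” described above.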

As it is, more and more doctors refuse to treat patients covered by Medicaid because they literally lose money on each patient, and every year some doctors are caught “overbilling” Medicare, many of them claiming they do so to cover costs.  All insurance is based on spreading risk across the population and across a lifetime.  Wisely or unwisely, the ACA attempted to extend benefits by mandating extended coverage.  Without that mandate, regardless of all the rhetoric, the current economics of American medical care will require both higher insurance rates in some form and more denials of expensive medical procedures.

If the universal mandate is struck down, the only ways out of this mess, in general terms, are to:  (1) totally reform the entire health care system [which is impossible in the current climate]; (2) deny more and more care to a wider number of people [possible and likely, but politically unpalatable]; or (3) continue on the course of raising prices in some form or another [higher deductibles and co-payments, higher premiums, etc.].

Once again, we have the conflict between what the public demands and what that same public is unwilling to pay for… or wants someone else to fund.

A Right to be Paid for Writing?

The other day I came across a commentary in the Libertarian e-zine Prometheus Unbound, in which the commenter declared that while writers, maybe, should be paid for their work, they had no right to be paid, essentially because ideas should not be copyrightable.  After I got over my disbelief, and swallowed my anger, I got to thinking about the question… and decided that the commenter was not only misguided, but an idiot.

While I’d be the first to admit that ideas are central and crucial to my work, frankly, that’s not why most people buy books.  Nor are ideas the difficult part of writing, as most authors, if they’re honest, will admit.  What takes work is the process of creating a work of entertainment that embodies those ideas in a way that draws in readers.  Readers buy works of fiction to be entertained, and it takes me, and every author I know, months, if not longer, to create and provide that entertainment in novel form.  By the fallacious logic of this Libertarian idiot, no one in any field has the right to be paid for their work.

Why?  Because the vast majority of occupations in a modern society require the combination of ideas and knowledge with the physical effort needed to put those ideas into practice, whether in providing a service or a physical product.  Just how long would any society last if doctors, dentists, teachers, plumbers, electricians, salespeople, and members of almost any other occupation [except perhaps politicians] did not have to be paid, except at the whim of those who used their skills and services?  Not very long.

No one is forced to buy books, mine or anyone else’s, but if they do want to read something produced by an author, why shouldn’t they pay for it?  It’s one thing to question the marketing of books, and the prices that various publishers, distributors, and booksellers charge… or even to question how authors should be paid and how much.  But to claim that a creator doesn’t have a right to be paid when someone uses something that took months to produce – that’s not Libertarianism as I understood it.  Except… I looked into it and discovered that there are actually two forms of Libertarianism: one that recognizes the private property of the individual as the basis of societal order, and one that believes in community property, i.e., socialist communalism.  Obviously, the commenter belongs to the second group, because he is saying that a novel, which is a created work of entertainment [not an idea], belongs without cost to the community.  I may be a bit old-fashioned, but that doesn’t strike me as Libertarian; it strikes me as confiscatory socialism.

All professional authors know full well that there are no original plots and very few truly original ideas in fiction, but to say that authors have no right to be paid for what they produce from those ideas because the plots and ideas aren’t original is about as valid as saying that a doctor shouldn’t be paid because all doctors draw on the same body of medical knowledge.

Knowledge without application is useless and worthless; it’s the application of knowledge that takes work, and for that work the worker has a right to compensation. One can argue and bargain about the amount and the method of payment, but the principle of pay for honest work is fundamental to any functional society.

As I’ve noted before, the idea that information wants to be free is little more than saying that people want as much as they can get from other people without paying for it, and that’s being an intellectual freeloader, not what I’d call a true Libertarian… but what do I know?