Archive for the ‘General’ Category

Specific Theology as the Basis for Public Policy?

Republican presidential candidate Rick Santorum recently declared that President Obama’s acts as President were about “some phony theology… not a theology based on the Bible.” Frankly, I find an assertion such as this incredibly disturbing, because what Santorum is effectively saying is that public policy should be based on his reading of Christian theology.  As I’ve noted before, as have Constitutional scholars for more than a century, while the Founding Fathers did mention the Deity, they made it very clear that specific theologies – or theological belief systems – were not to be a part of government.  Yet Santorum is apparently attacking the president – and anyone else who doesn’t believe as Santorum does – for failing to base their policies and acts upon specific doctrinaire theological points.

Like it or not, the President of the United States and the Congress are responsible for the health and welfare of all the people of the United States and for allowing all of them the same freedoms, as set out in the Constitution and as interpreted, again, like it or not, by the U.S. Supreme Court.

Now, according to the best figures I can find, 75% of Americans define themselves as members of faiths considered Christian by most theologians.  Fifteen percent are atheists, and the remaining ten percent belong to other faiths.  Of all Americans, roughly 25% are Catholic, 15% Baptist, and 4% evangelical or Pentecostal Christians, meaning that, in rough terms, essentially half of the American people, assuming they follow the theological guidelines of their faiths, might agree with Senator Santorum’s theological beliefs.  The problem with Santorum’s position is that as much as half the population might well not agree, and fully one quarter of all Americans are not believing Christians at all.

In addition, a study conducted by Baylor University, based on interviews with 35,000 people, definitely a healthy sample, showed that more than 40% of the people had changed their faith and belief at least once in their lifetime, which also suggests that “faith” is far less constant than the protections in the Constitution.

Even more to the point, Santorum is not talking about freedom of religion, but about imposing restrictions on all members of society, restrictions based on his theological biases, and restrictions with which tens of millions of Americans do not agree.  Those who agree with the senator are not precluded from following exactly, and with no persecution whatsoever, the dictates of their own conscience insofar as their own property and bodies are concerned.  Under the Constitution and current law, however, they are precluded from imposing those beliefs on others, and effectively limiting the rights of half the population [women]. The senator clearly wants to change this.

It’s taken a long time to reduce discrimination based on color, creed, or gender… and Santorum’s use of religion, whether intended or not, would essentially turn the clock back to a time of greater discrimination under the guise of “true religion.”

Putting power in the hands of religious true believers has been a disaster wherever it’s happened, whether in the time of the Inquisition, the Salem witch trials, the ayatollahs in Iran, the Taliban in Afghanistan, or any number of other instances.  Doing so here wouldn’t be any different… and it would be a betrayal of the Founding Fathers whom so many of Santorum’s stripe quote so readily when it suits their needs… and ignore when it doesn’t.

 

Religion and the Constitution

From the considerable amount I’ve read about the early history of the United States, one of the goals of the Founding Fathers was to protect the government – and the people – from the heavy hand of religion… and to keep organized religion from infringing the rights of the people.  So, it’s with a sense of irony that I find so many religious zealots of so many types complaining essentially about what the Constitution was designed to do – to stop government from being a tool of religion.

When the Supreme Court decided in Roe v. Wade that women had a right to abortion, the Court essentially came down on the side of individual freedom, asserting that the “right of privacy, whether it be founded in the Fourteenth Amendment’s concept of personal liberty and restrictions upon state action, as we feel it is, or, as the district court determined, in the Ninth Amendment’s reservation of rights to the people, is broad enough to encompass a woman’s decision whether or not to terminate her pregnancy.”

Under the Constitution, that sort of finding is the Court’s to make, and unless Congress and the states see fit to amend the Constitution, or the Court extends, modifies, or reverses its ruling, that ruling is the law of the land.  Period.

When a state legislator proposes laws to forbid schools from teaching about birth control unless it’s abstinence and only abstinence, that legislator is attempting to restrict the freedom of information, and to determine exactly what information is to be conveyed, for religious purposes.  This is particularly heinous because it restricts an individual’s knowledge.  When legislators oppose civil unions for same-sex couples, they’re effectively declaring that the state should grant or withhold sanction of such a legal union solely on the basis of religious traditions and practices.  When states or legislators require businesses to close on Sunday – as many did at one time – they are imposing a requirement of the Christian religions on commerce.  Why not require closure for the Jewish Sabbath… or on Saturday for Seventh Day Adventists?

One of the founding principles underlying the Constitution of the United States was an understanding that there’s a significant difference between freedom of religion and state imposition of religious requirements on everyone. It’s one thing to allow someone to close their business on their holy days.  That’s allowing individual freedom.  Requiring everyone to close on Sunday is using government for religious purposes.  Several years ago, here in Utah, constituents pressured local lawmakers to forbid civic functions on Monday nights because that interfered with the LDS practice of Monday home evenings.  Thankfully, such prohibitions weren’t imposed.

Virtually all widely accepted ethical, moral, or religious beliefs agree that such acts as theft, assault, forgery, fraud, murder, and rape should be prohibited, and their practice punished.  The obvious conflict between freedom and legal codes lies in the gray areas where various beliefs and religious codes disagree.  But under the U.S. Constitution, as articulated by the Founding Fathers, personal freedom of action or speech should not be restricted unless it poses a clear and present danger to others… and all too many “religiously associated” attempts to restrict freedom of action and of speech have little or nothing to do with preventing such clear and present dangers, and far more to do with imposing restrictions on others in furtherance of one religious doctrine over another.

 

The Blame Game…

It’s official, or at least semi-official:  the United States has the highest rate of incarceration of any industrialized nation in the world… and by a huge margin.

Why?  Obviously, there’s no one single cause, but the largest factor is our drug laws, which criminalize possession of small amounts of drugs and the use of marijuana.  One of the associated problems with criminalizing marijuana is that the drug is ubiquitous and widely used, and that means prosecution and incarceration for use, possession, or distribution is in most cases highly selective, and selective enforcement is anything but just.  On the other hand, busting everyone who uses marijuana is essentially physically impossible.

As a matter of practicality, it’s becoming clearer and clearer, not that it hasn’t been so for a long time, if anyone really cared to look, that the massive criminalization of drug use is anything but healthy for the United States.  Prisons now cost a number of U.S. states more than they spend on higher education, and those costs are rising.  The massive amounts of money and profit from illegal drugs are fueling gang violence in both the U.S. and Mexico, and, in general, police efforts have a modest effect, at best, on even holding that violence in check.

So what if we changed our approach to drugs?  What if we just legalized their use for adults over 21?

Immediately, the outcry is likely to rise – What about all those poor drug victims?

Well… what about them?  What about handling the issue the way we generally deal with alcohol?  We tried outlawing alcohol for everyone, and that was a disaster.  The compromise was to forbid its use and consumption by people under 21 [or sometimes 18], and to prosecute those who supplied it to underage drinkers.

The anti-legalization forces tend to focus, whether they realize it or not, on saving people from their own worst impulses.  This is, unhappily, an approach our society applies selectively, in some areas and not in others, and one that works in some cases and not in others.  Seatbelt laws work as well as they do, I’m convinced, because in a very real sense they require no more than a minor change in behavior.  In a car, any car, you really can’t move around that much anyway.  A seatbelt law restricts that movement slightly… and saves tens of thousands of lives annually – and it doesn’t lead to the development of a trillion-dollar underground economy in seatbelt cutters, or the like.  The same sort of argument can be made for many [but not all] health and safety regulations.

What we might better consider is legalizing drugs, requiring standards for them – and holding drug users totally responsible for their actions.  In other words, if someone chooses to use drugs and commits a crime either under the influence or to obtain funds for such drugs, the penalties should be even tougher… because they made the choice to use drugs, knowing the possible consequences.  Likewise, penalties for pushing drugs to those underage should be extraordinarily severe.

But, of course, none of this will happen, because no one really wants to hold people responsible for their actions, whether those people are students who want good grades without working hard and without studying, or politicians who haven’t the nerve to tell constituents that they can’t have more government services without more taxes, or Silicon Valley internet companies who want free content without paying for it, or Wall Street financiers who escape prosecution for what was essentially fraud and misrepresentation….

No… someone else is always to blame.

 

“Local Control” Politics

An earlier blog talked about “code” in political speak, and several incidents that have come to my attention recently caused me to think about one particular form of “code” that’s always been a part of American politics, but is now making a resurgence, particularly with the more right-wing elements – although it’s certainly not absent from the far left, either.

That’s the specter of “local control.”

For years, “local control” was used as both a justification and a means for maintaining segregation of elementary and secondary schools across the country.  Today, combined with “states’ rights,” it’s become a rallying cry for those who dislike federal laws and mandates that are contrary to local practices.  Western states that would rather fund their governments through mineral severance taxes claim that federal environmental laws and land use regulations restrict the use of “their lands” and demand greater local control.  “No Child Left Behind” regulations are cited as an example of infringement on local rights.  Religious organizations that wish to deny employees health insurance that covers birth control manifest another form of local control.  The government or the state isn’t mandating birth control;  it’s mandating the opportunity, and it’s up to the individual whether that opportunity is used.

And all too often, the push for local control is both a hypocritical protest against federal actions, often those designed to increase personal freedoms, and an attempt to restrict those freedoms.  For example, here in Utah, the governor and state legislators rail against federal control, but they attempt in various ways to curtail the sale of liquor, to mandate the longest waiting periods for women seeking abortions, to require mandatory marriage counseling before allowing divorce proceedings to be filed, to allow local school districts to opt out of providing sex education classes, and to restrict the distribution of federal funds for programs they dislike.  Right-wing legislators demand that people have the right to bear arms, even though weapons kill tens of thousands of people, while railing against abortion and contraception on the grounds that life is sacred.  If life is that sacred, why don’t they try to ban weapons as well as abortion and contraception?

So-called “local control” also pops up in other ways.  Some thirty years ago, Brigham Young University, which is essentially owned and operated by the Mormon church, had faculty who were not of the LDS faith, and full-time faculty were either tenured or on a tenure track, allowing them at least a modicum of protection if their public views were at variance with those of the church.  In more recent years, BYU has abolished tenure, and, from what I can tell, all faculty must now be members in good standing of the LDS faith.  The combination of the lack of tenure and the requirement of good standing in the Mormon church gives the church total control over the faculty.

Interestingly enough, a Utah state legislator has proposed, in two sessions running, legislation to abolish tenure at most state colleges and universities, ostensibly to make it easier to get rid of “bad professors.”  What’s interesting about this is that the state’s Board of Regents implemented a post-tenure review system over five years ago, one that has been tightened considerably in recent years… but that’s apparently not enough.  Given that the majority of faculty and administrators at the affected institutions are LDS, what would be the likely impact of this increased “local control”?  Might it just be a far greater reluctance of non-LDS faculty to even want to teach in Utah?  Might it just be, in effect, to turn state colleges and universities into institutions more “in line” with local, i.e., LDS, values?  Wouldn’t that, in practice, effectively violate the idea of separation of church and state?  And wouldn’t that be essentially antithetical to one of the fundamental purposes of higher education – to broaden a student’s exposure to other values and cultures?

From what I can see, in most cases, people advocating more “local control” are really saying, “We want to do things our way, even if it tramples on the rights of others, because our way is right.”

 

Mine! Mine! Mine!…. Ours! Ours! Ours!

This past Sunday a former Utah resident, Josh Powell, turned his Washington state residence into an inferno, killing himself and his two young sons, aged five and seven. While the exact reasons for his actions may never be fully known, what is known is that his wife, now presumed dead, vanished slightly more than two years ago under mysterious circumstances, leaving everything, including ID, car keys, and wallet, at home and that the now-deceased husband was a definite “person of interest.”  What is also known is that the courts awarded custody of the boys to the missing woman’s parents, and the husband had fought this tooth and nail, declaring that the courts had no right to take away his children.  Unhappily, this is certainly not the first time events such as this have occurred, but, to me, it’s symptomatic of a certain mindset, usually more prevalent in males, but certainly not limited to them, which regards far too many aspects of life as theirs exclusively.

Although English common law of two hundred years ago did in fact make women and children – and all they possessed – possessions of the husband, the law in both the United Kingdom and the United States has changed considerably, to the point that, at least legally, women are not the possessions of their husbands, and courts regard parents as guardians and custodians of children – and not owners.  And the U.S. Civil War resulted in the abolition of slavery, a practice that was a legally accepted way of allowing a slaveholder to declare that intelligent human beings were “Mine! Mine! Mine!”

In some countries, of course, men can still insist that women and children are their personal possessions, as witness the news story about the Afghan man who killed his wife because she had the effrontery to bear him a daughter rather than the son he had demanded.

Unfortunately, the “Mine! Mine! Mine!” mindset doesn’t limit itself to just spouses and children, but seems to be making a resurgence in other areas as well, even in the United States.  This is why we have, at least in Utah, state legislators breaking the law by riding ATVs across roadless areas and declaring that the restrictions on those federal lands don’t apply to them – because “it’s my right” to have access any way I want.  It’s also “my right” to own and employ assault weapons and fifty-caliber machine guns.  And “my right” to insist that the government force women to bear children conceived through rape or incest.

There is, of course, the other extreme – those who claim that essentially everything is “ours” and that government exists merely to decide how much of “our” stuff each of us gets to keep and use.  Over forty years ago, in “The Tragedy of the Commons,” the ecologist Garrett Hardin pointed out that, when everything is held in common, it is almost never cared for, at least not without a great deal of social control and regulation.  In short, true “communism” or “communalism” has never proved to be workable.

The upshot of all this is that no rights can be absolute in any civilized society, especially the right to insist that other people are “Mine!  Mine! Mine!” or that everything belongs to everyone.  And the first tragedy of the Powell case is that an ultra-possessive father and husband could not bring himself to understand that.  The second tragedy is that most politicians, especially those on the extreme fringes, don’t understand that either… or choose not to in order to court political favor.

 

More Problems with “Simple” Solutions

President Obama has apparently now decided to try to punish universities and colleges, even state universities and colleges, that raise their tuition by “excessive” amounts.  This is, pardon my language, absolutely asinine.  It’s also addressing a very real problem with a simplistic approach that shows either no understanding of the problem or no intention to really address it, if not both.

To begin with, he doesn’t have the leverage to do this directly, but only through the threat of withholding direct federal funding, which doesn’t include federal loans and grants made directly to students, and which amounts to less than 3% of total federal funds going to students and institutions of higher education. The real reason for the increase in student tuition, particularly at state colleges and universities, is the significant decline in the funds provided by the state legislatures over the past several decades. In just the last year, state support to state universities and colleges dropped more than 7%.

Over the last 30 years, tuition for an undergraduate degree has increased roughly 600%, while the cost of living has increased 250%.

Why and how did this happen?  It happened because, over the last twenty-five years, the number of students pursuing an undergraduate degree increased by almost 45% at a time when the share of state budgets devoted to higher education declined, resulting in a huge decrease in the percentage of each student’s costs subsidized by state governments.  So, although total state funding of higher education did initially increase by some twenty percent [until about ten years ago], that increase was overwhelmed by a huge influx of additional students, and without additional state funding the only way the state colleges and universities could cope was by increasing tuition, as the sketch below illustrates.
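
To make that arithmetic concrete, here is a minimal sketch in Python, using invented round numbers rather than the actual figures for any state, of how a 20% rise in appropriations combined with a 45% rise in enrollment pushes the per-student subsidy down and forces tuition up even if the cost of educating each student never changes:

```python
# Hypothetical illustration only -- all dollar figures are invented round
# numbers, not actual data for any state or institution.

appropriation_then = 1_000_000_000   # state appropriation, 25 years ago
appropriation_now  = 1_200_000_000   # up 20%, per the increase noted above
students_then      = 100_000
students_now       = 145_000         # up 45%, per the enrollment growth noted above
cost_per_student   = 15_000          # assume the real cost per student stays flat

subsidy_then = appropriation_then / students_then    # $10,000 per student
subsidy_now  = appropriation_now / students_now      # about $8,276 per student

tuition_then = cost_per_student - subsidy_then       # $5,000
tuition_now  = cost_per_student - subsidy_now        # about $6,724

print(f"per-student subsidy: ${subsidy_then:,.0f} -> ${subsidy_now:,.0f}")
print(f"tuition needed to cover costs: ${tuition_then:,.0f} -> ${tuition_now:,.0f}")
print(f"implied tuition increase: {tuition_now / tuition_then - 1:.0%}")
```

Even with per-student costs held flat, tuition in this toy example has to rise by roughly a third just to fill the subsidy gap; add any real cost growth on top, and the actual increases compound well beyond that.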

My wife’s own university is a good example.  The number of students enrolled has almost tripled in the last twenty years, while the faculty has increased by less than forty percent.  Faculty salaries have been frozen for at least six of the last eighteen years, and yet tuition increases have averaged roughly 7% for the last three years, with an 11% increase budgeted for next year.  Faculty pay increases have averaged 3-4% per year over the last 20 years, and that includes rank and merit raises.  As a result, in real dollar terms, despite an outstanding record and two promotions, my wife now makes only about 10% more as a tenured full professor than she did twenty years ago as a newly hired, untenured assistant professor.
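
As a quick check on that real-dollar claim, here is a minimal sketch assuming, purely for illustration, average nominal raises of about 3.5% a year against roughly 3% annual inflation – neither rate is taken from the post – showing how twenty years of such raises compound to only about a 10% gain in purchasing power:

```python
# Hypothetical check: 20 years of ~3.5% nominal raises against ~3% inflation.
# Both rates are assumptions for illustration, not figures from the post.

nominal_raise = 0.035
inflation     = 0.03
years         = 20

real_growth = ((1 + nominal_raise) / (1 + inflation)) ** years - 1
print(f"real salary growth over {years} years: {real_growth:.0%}")   # about 10%
```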

These numbers are similar for virtually every public university in the country, as study after study shows.  The problem is not, despite popular beliefs, high-paid professors and wasteful spending, but simply a massive increase in students coupled with a lagging of state support – and President Obama’s threat to colleges and universities totally ignores the basic economics.

One of the more disturbing results of this funding crisis is that, on average, the salaries of tenured or tenure-track assistant, associate, and full professors at state colleges have dropped from being roughly comparable to those at private colleges and universities twenty years ago to being 20% lower than those at private schools today.  Add to that the fact that professors at public institutions generally teach larger classes [reflected in a student-teacher ratio almost 50% higher at public institutions], and the discrepancy becomes effectively larger.  What this also means is that, over time, a larger percentage of the best professors will migrate to private colleges and universities.  Students at state institutions will not only face larger classes and capped enrollments [because classrooms are only so large, and in specialized classes professors are limited in the number of students they can effectively educate], but also fewer of the very best professors, particularly in the years ahead, when the senior tenured professors who have remained in state institutions because they’ve established roots there begin to retire.

And the threat of withholding $3 billion in federal funds does nothing to address the problem.

 

“Selective” Politics

The governor of a state I know very well just delivered his state of the state address, which seemed to consist largely of state boosterism and a blistering attack on the federal government.  Needless to say, both were well received, given that the state is one of the “reddest” in the nation.  Of course, much of what he said distorted economic statistics and political reality, not to mention the Constitution, which is possibly the most misinterpreted United States government document.  This governor, alas, is not unique.  He may be a bit more extreme in his distortions than others, but all politicians do it, whether they are on the right or the left.  They select and distort, and seldom are they called on it directly by the media.

Oh, there are political cartoons… sometimes.  There are thoughtful articles… that few read.  And any politician who’s truly direct and honest… most don’t stay politicians long.  I can still remember, and this dates me, the violent abuse that President Carter took [and I didn’t vote for him] when he made a direct and simple statement:  “Life isn’t fair.”

So why do governors and other politicians continue such misrepresentation?

Because they judge that it’s what the majority of the voters who elected them want to hear.  No one wants to hear that their state has one of the highest foreclosure rates in the country, more than three years after the housing bubble burst.  No one wants to hear that their state ranks fiftieth out of fifty states in per pupil spending on elementary and secondary education, and that it spends roughly half as much as the state that ranks forty-ninth… or that the socio-economic make-up of the school age population disguises how poor that education is, so much so that, on average, only about half the students who do attend college can manage to graduate in six years or less, and that close to half require remedial work in some field or another before they can even begin true college work.  Nor do these so highly principled opponents of the federal government ever mention that for every dollar the state spends on Medicaid, which they oppose, the federal government kicks in three dollars… or that while they’re bashing the feds, they’re also lobbying ferociously to keep open federal facilities in the state, and spending taxpayer dollars in conducting such lobbying.  And of course, no governor dares to tell his legislature to stop wasting time on passing resolutions condemning the federal government for programs and laws that the Supreme Court has found constitutional time after time.

Why… if he did that, he might not get re-elected.  Or he might actually have to address the real problems.

So why do the majority of Americans put up with this sort of idiocy from their state and local elected officials?  Is it because they feel they’ve lost control of their lives?  Yet, is electing and re-electing self-serving demagogues the way to regain that control?  From what I can tell, the more honest, the more pragmatic, the more realistic a candidate is in assessing and presenting the situation facing the country and government, the less chance he or she has of being elected, regardless of party.  And yet… voters are polarizing along party lines, even as both parties select candidates who are the least likely to come up with workable solutions.

And historians thought the “know-nothings” and “yellow dog Democrats” of the nineteenth century were bad?

 

Political Science

The other day a reader sent me a question, essentially asking for a recommended reading list for books that offered insight into the political process, adding the observation that she’d learned more about politics from my books than from all the college political science courses she’d taken.  As a practical matter, I can’t comply with her request, for the simple reason that there isn’t a single book, or even a short list of books, that would do that subject justice.  Over the past fifty years, I’ve read hundreds, if not thousands, of books dealing with politics and history and thousands of articles, and each one has added to my understanding, either of politics or of the shortcomings of some writers in the field. The principal reason why no short list of books will suffice for a good understanding of politics is that, at least in my opinion, the vast majority of books on politics approach the subject in a typical “American” fashion. They’re almost invariably “how” books – how this politician got elected, how this campaign was run, how the Federal Reserve botched the real estate bubble – or some form of biography or, occasionally, the history of political developments.

Usually there are some explanations of why things happened, buried in lesser paragraphs, when they’re mentioned at all, but very seldom do those “why” explanations deal with the basic structure of politics and government, which is something that my fiction often does.  Those explanations are sometimes dismissed as didactic or boring, but they do offer reasons why my characters do what they do or why they can’t do what they’d really like to do.

I grew up in a family that was active in low-level politics.  My father was a town councilman and acting mayor.  My mother was a state officer in the League of Women Voters.  Politics was always a dinner-time subject, and several congressmen and senators were personal friends of my parents, as was a noted U.S. Supreme Court Justice.  For whatever reason, I also read histories, any kind of histories, voraciously.  But when I got to college, a small but prestigious Ivy League school noted for its political science department, I was absolutely stunned to learn how little my professors actually knew about electoral politics, legislative branch politics, or politics on the local level.  They were all experts on the Presidency and the executive branch, but not on the other branches.  Later in life, after a tour and a half in the U.S. Navy and some industrial and real estate jobs [as a lowly and not terribly effective day-to-day realtor], I ended up as a Congressional staffer, first as a legislative director, then a staff director, before I became director of Legislation and Congressional Affairs at the U.S. EPA.  After that, I spent another six years as a consultant to a firm that attempted to influence federal government policy in various areas through the presentation of factual and political matter – call it selective fact-based lobbying, as opposed to contribution-based lobbying.  Those experiences confirmed, in general, that very few academic political scientists truly understood the entire political structure.

Frankly, I don’t know of anyone writing books on politics who has my kind of background – not that there aren’t any, but such authors must be rare.  There are certainly many distinguished authors who know far, far more than I do about given subjects, but the problem is that very few of them have both a breadth of experience and the inclination to ask why things are as they are and how they came to be that way.  The same is also true of most politicians and their staffers.  They’re preoccupied with obtaining, holding, and exercising power, but most are limited in that because they often don’t understand the nature of power in a larger sense, of the forces that shaped and are still shaping and reshaping the political structure.  They are, however, masters of manipulating the structures close to them and of maximizing their own political power.  This, by the way, is why very few senators or representatives make good presidents:  their careers and their focus are based on getting elected from a very specific constituency that can never represent the wide range of interests and problems that face the American President.  Even a senator from California or Texas is limited, because very few of them truly understand the executive branch, while, on the other hand, very few political appointees understand either industry or Congress, and those who do understand one seldom understand the other.

Add to those factors the fact that most readers of non-fiction want to know the juicy items and “hows” of politics more than the “whys.”  And the “whys” are often disconcerting and unpleasant.  After running an office in the Reagan Administration, I have a much better understanding of exactly why the Civil Service is both cumbersome and slow to react – and it makes perfect sense, given the pressures and structures [and it would take a long article to explain, which most people would dismiss, as I know after trying to explain on a number of occasions in the past… yet that is why it works the way it does, like it or not].

The other problem is that, in my books, I can show what lies behind intrigues, but in the real world it’s even more complex, and even a slight reference to a name or an event can trigger a lawsuit – and without concrete references, people tend not to believe what they regard as “theories,” particularly if those theories conflict with their beliefs and biases.  Even when a wealth of information is provided, as it has been in some cases, people tend not to believe what contradicts their preconceptions, whereas, as I’ve pointed out before, presenting real or realistic factors and structures in a fictional setting allows readers to consider what I’m showing in a more objective light, whether they agree with my presentation or not.

And that is an abbreviated, but all too long, explanation of why I can’t offer a short answer to the poor reader’s question – and why I’m not about to re-enter American politics, either as a non-fiction writer, a staffer, or a candidate.

 

Rules for the Sake of Rules?

For the last few months of 2011, Cedar City and the local papers kept announcing that at long last, beginning on January 4, 2012, Cedar City’s airport would finally have jet passenger service.  And indeed, on the afternoon of January 4, a Delta regional jet took off from the airport bound for Salt Lake City, replacing the twin-turboprop Embraer Brasilias that have provided service for almost twenty years.  Unhappily, that single jet flight has been the only one, since as soon as the regional jet landed in Salt Lake, the FAA forbade any further passenger jet flights by Delta until an environmental assessment was completed.  And that will take a minimum of two months, more likely four.

Was this for safety or environmental reasons?

Not so far as I or anyone else can determine.  The Cedar City airport has a runway almost 10,000 feet long that was recently upgraded and is capable of landing full-sized passenger jets.  It’s an approved diversion airport and has landed DC-10s.  It is already certified as meeting all standards for jet aircraft, and there are more charter and business jet flights every day than the two arrivals and two departures Delta/Skywest operates.  The regional jets would not have added to the number of flights; they would only have increased the passenger capacity per flight from 30 to 50.

As for environmental reasons, even as a former environmental consultant and an appointee to the U.S. EPA during the Reagan Administration, I can’t see any.  To begin with, the Embraers previously providing service – and once more providing that service – are turboprops.  In other words, the engines turning those props are jet turbines.  So all that we’re talking about is a slightly larger jet engine.  The airport is located away from most residential areas, and with only four flights a day, those being flown by a comparatively small regional jet, the noise isn’t an issue.  The airport has often been used by military tankers [KC-135s, as best I can determine] for practice approaches and touch-and-go landings [and their noise can be deafening], and it’s a tanker base for the Forest Service’s fire-fighting aircraft, all of which are far louder and more polluting than regional jets.

So why does the FAA require an environmental assessment when the airport already complies?  Because the FAA says so.  That’s why.  And why did the FAA wait to issue this edict until after Skywest had begun service, when Skywest had arranged for the aircraft and filed the schedule/aircraft changes months in advance?  Well… there’s no answer for that, either.

I may not like environmental rules that stop construction in an attempt to save endangered species [like the Utah Prairie Dog], and I may question the benefits of other environmental rules and legislation, but I can at least understand their goals and purposes.  But… this?  It’s an exercise in bureaucratic turf management with absolutely no value or purpose, since it’s already been determined, by the FAA, no less, that there’s no adverse environmental or safety impact.

This sort of bureaucratic nonsense is exactly why so many Americans get fed up with government and is certainly one of the reasons why Republicans are making hay in their campaign against the president and why all too many Americans have given Congress the lowest approval ratings ever.

But… “rules is rules.” Right?

 

Politics 2012: “Code” and Hypocrisy

On Sunday, a CBS commentator ripped into several of the Republican candidates for president as well as a past Democratic President for extreme hypocrisy.  The only problem I had with what she said was that she didn’t go nearly far enough – on figures in either party, on the media, on Silicon Valley and Hollywood, just for starters.

When one of the leading Republican candidates running on a “family values” platform has had three marriages, with several affairs with other women while married to someone else, what exactly does this say?  What it says to me is that “family values” is really “code” for “I’m for the traditional, patriarchal, chauvinistic society of the 1950s, and let’s not have any serious talk about gender or sexual equality.”  Now… if that’s what you want… and that’s what the voters want, why not say it?  Because it would reveal too much about what too many people really want?  No… hypocrisy is so much more comfortable.

And when did “downsizing” and corporate deconstruction become “jobs creation” instead of unfettered pursuit of profit regardless of the human costs?

And how exactly does legislation that extends government into family planning [or the prohibition of family planning methods], and that declares the victims of felonious acts [rape and incest] must bear a child – an additional punishment – square with the constant rhetoric against “big government”?  Why not just say that any man can get any woman pregnant by any means and she has to have the child?  But don’t justify it under family values or as part of a railing against big government.

But let’s not let those on the other side feel too self-righteous or comfortable, either.

When they talk about the rich paying their fair share of taxes [and, again, I agree with the premise that the top one percent shouldn’t have the right to a 15% tax rate on earned income because of a special definition, when those of us making far less are taxed at rates from 19% upward], they’re really talking about trying to find a way to get more revenue so that they don’t have to think about taxing the 53% of the population who pay no federal income taxes… and that’s hypocritical, too, especially for a nation whose government is supposed to be of all the people and for all the people, because it says that “we want the rich to pay more in taxes while lots of people pay nothing.”  Shouldn’t the majority of Americans pay something in federal income taxes, assuming we are going to remain even semi-democratic?

The “liberals” just mounted a huge campaign against two pieces of legislation designed to stop internet piracy and protect copyright.  I’d be the first to admit that the procedures used to bring the bills up… and some of the provisions… leave more than a little to be desired, but the hue and cry about intellectual freedom is as hypocritical as they come.  As the comedian and commentator Bill Maher noted, “People just want free shit.”  Google and Facebook want content as cheap as they can get it, and millions of Americans and others really like their pirated books – and I know about that, because every novel I’ve ever written is available somewhere free and pirated.  Hollywood, of course, wants to keep every dime it can, regardless of whether the methods tromp all over the First Amendment.  For all the rhetoric, though, it’s not about censorship, but about “free media” in the worst sense of the word “free” on one side and big media profits on the other.

Politicians on both sides are against immigrants, especially illegal immigrants.  Of course, every single person on the North American continent is either an immigrant or the descendant of immigrants.  So what they really mean is something along the lines of, “I want immigrants here for cheap jobs no one else will take, but don’t give them real opportunity or education, because they might actually work harder and their children might take jobs from mine.”  Just look at how hard the other candidates blasted Rick Perry for wanting to allow the children of illegal immigrants access to higher education.

And then there’s education, where both sides have proposed all sorts of “reforms,” ranging from “No Child Left Behind” to demanding more and more of teachers who have fewer and fewer real resources, or to throwing more and more funding at schools.  Yet none of these “popular” and politically easy fixes have worked – while both people and politicians have largely ignored the few schools that have actually made education work.  And why haven’t they followed those good examples?  Because doing so requires firm standards and making parents and students responsible, not just teachers, and no politician ever wants to suggest that perhaps, just perhaps, a lot of the problem isn’t with the teachers, but with the students and their parents.

So… when making your choices, such as they are, in the weeks and months ahead, try, just try to think about what all those slogans and buzzwords really mean… and try not to get too ill over all the hypocrisy they embody.

 

Responsibility

The other day, my wife informed me that one of her favorite lamps had stopped working.  Well, actually, if I’m going to be totally truthful, she told me before Christmas, but since that lamp gets replaced by a Christmas lamp over the holidays, I didn’t get around to dealing with it until the other day.  I discovered, as I’d suspected, that the three-way switch-bracket had shorted out and needed to be replaced.  No problem – except that I had to go to several stores to find a replacement switch, because, apparently, there’s not much of a market for replacement parts for lamps.

Once I got the part, it took less than ten minutes to replace the old switch and get the lamp back in service.  I didn’t look at the printed directions for replacement, of course, because I’ve done the task more than a few times, but when I was about to toss the package on which the directions were printed, I noticed a large “WARNING” label. I couldn’t help but wonder what I was being warned about… and if I’d made some terrible mistake.  So I read the warning.  What did it say?  It warned me to unplug the lamp before trying to replace the old switch and install the new one.

I wish I could say that was a joke, but it isn’t.  Are there people out there stupid enough to try to replace a part of an electrical appliance while it’s still plugged in?  Apparently so.  And apparently, the manufacturer was, understandably, trying to reduce the possibility of a lawsuit brought by someone either that stupid or someone extraordinarily callous and opportunistic.  As I was pondering this, putting away my tools, I happened to glance at my comparatively new step-ladder and saw the warning that told me not to stand on the very top step.

Have we dumbed down everything so much that people don’t know that electric current can kill?  Or that standing on top of a ladder is dangerous?  Whatever happened to common sense?  Or have we reached the point that no one has to take responsibility for their own actions, particularly if those actions are stupid?  Or is it that the lawyers have changed the law so much that, effectively, no one is responsible for their own acts?

Whatever the reason, we’re now inundated with warnings and cautions, and often the cautions in a drug commercial take as much time as the rest of the ad.  Look – all medicines do things to your body.  Anything you ingest can.  I certainly read the information on prescription drugs and even over-the-counter medications, but does reading all the cautions aloud in a commercial really help… or does it merely cause most people to tune out the fact that any medication can have deadly side effects for some people?  Those affected are usually a tiny percentage, but that doesn’t make the impact any less severe for them, and that’s why using any drug or medication should be considered carefully.

But… that’s clearly not happening.  Prescription drug use is up, way up, and often not even because people are ill.  For example, there’s almost an epidemic of college students using ADHD medications to help them concentrate and study for exams or to write papers.  And why do they need those meds?  Because they clearly didn’t think ahead.

All the warnings in the world won’t help if people don’t think about what they’re doing; all they do is raise the bar for legal shysters… and, in a perverse way, invite even more warnings and litigation.

 

The World of “Now”

Once upon a time – and I suppose a fairy-tale beginning is appropriate – when young people were asked what they wanted to be when they grew up, the questioner would receive a plethora of answers:  president, a fireman, a police officer, an astronaut, a baseball star, etc.  Today… the most common response is:  “I want to be famous.”

The pop art icon Andy Warhol once said something to the effect that everyone would have fifteen minutes of fame, and whether or not “everyone” will ever have that, Warhol was certainly right about the fifteen-minutes part.

As I see it, though, never has fame been so short-lived, and the problem with this mindset is that the incredibly fleeting nature of present-day fame has also tarnished the value of experience, which is far different from fame.  Fame results from the praise of others;  experience is a combination of knowledge, skills, and understanding gained over time, yet an older practitioner in almost any field is usually regarded as old-fashioned and less able than a younger, more “vital” person.  And frankly, outside of the limited field of athletics, that’s a total fallacy.  Yet the fame and media culture has sold this image, and people, especially the American people, seem to have bought it lock, stock, and barrel.

Of course, the fact that fame is fleeting has always been acknowledged by human beings.  The Romans reputedly had a slave whisper to a conquering general, over and over during his triumphal chariot ride through Rome, that all glory is fleeting.  A.E. Housman wrote the poem “To an Athlete Dying Young,” which includes the lines:

Runners whom renown outran
And the name died before the man…

I can remember a time when there was at least grudging respect for age and experience, before scorn for anything not current came to be expressed in phrases such as “that’s so yesterday.”  Once, American students actually had to know who the past presidents were and what they did.  Once, most actors, not just a fortunate few, had careers lasting more than a handful of years.  Once, executives had to have experience in the business they were running.  As I noted earlier, I doubt it was coincidental that Borders Books failed, given that the company, in its later and declining years, kept hiring CEOs and other executives who had no experience in the book industry, although they were semi-“famous” for accomplishments elsewhere.

Studies of CEOs have shown that, in general, the most effective CEOs are the ones who are the least famous, while the highest-paid ones [who are seldom the best] are the tallest and best-looking.  And yet, with the growing cult of “fame,” companies go for “big names” and impressive appearances, whether or not those hires have the experience and the talent for the position, and at least one major financial company is headed by a big name whose lack of competence has already been publicly demonstrated.

The problem with the “now” culture is that it’s the culture of the moment, and that’s the culture of lemmings, where everyone follows the current fad, the “flavor du jour,” and current fads never last.  Because they don’t, and because they tend to exclude those with experience beyond the present and the accepted, when times change, and they always do, those in control make mistakes that older and wiser heads would caution against.  Or, put another way, there’s a very good reason why Warren Buffett is one of the richest men in the world… and why Donald Trump has had to be bailed out of the majority of his projects, despite the “celebrity apprentice” un-reality reality show.

Fame and public personality are all too often just a flash in the pan, fool’s gold.  So why do so many people seek fame and try to emulate the merely famous, all too often ignoring the people who’ve actually accomplished something lasting more than minutes or months?

 

Fiction

There’s a certain amount of accuracy in the old saying “Truth is stranger than fiction.”  Every professional writer also knows, whether he or she will admit it or not, that the best fiction is far more “true” to life than life is.  Seemingly “impossible” coincidences and occurrences happen in life.  Almost everyone knows of or has experienced one, but, especially if it’s a happy or fortunate one, it won’t ring true in fiction.

Is that because, at heart, we all know that the impossible doesn’t happen most of the time?  There’s a rule of thumb in writing that the only kind of coincidence or improbable happening that will work is one that goes against the protagonist… and even that’s iffy.

The word “fiction” comes from a form of the Latin verb “fingo,” meaning to shape or form, but in Latin there was usually a connotation of falsity attached to its use, such as putting on a brave face, and even a statue was an “imago ficta,” a fashioned image, if you will.  So how has fiction, or at least “serious” fiction, come to the point where it has to be truer than true, or seem more true to life than life is?

For that matter, why is “uplifting” fiction seldom considered “great” literature?  Does fiction have to be depressing, with a dark ending, to be considered “great”… or is it just the critics who think so?

Then, the other day, I ran across a forum entitled something like “Worst/Most Overrated Books.”  I couldn’t help myself – sometimes I do have a morbid streak – and I read through the entries.  I was amazed, because only two of the entries I read described what I would have thought were really “bad” books.  Interestingly enough, both those entries came from librarians.  Almost all the complaints were about books that someone, and sometimes lots of someones and critics, had suggested were good books – books by people like J.D. Salinger, Iain M. Banks, Orson Scott Card, Stephen R. Donaldson, Ernest Hemingway, Victor Hugo, Nathaniel Hawthorne, John Steinbeck, and a number of others.  Two entries even mentioned the Bible and Shakespeare as vastly overrated.  Even the very worst works by any of these writers are vastly superior to much of the true garbage being published today… and yet… why do readers pick on what they think of as the weakest work of good writers?  Because those works don’t meet the readers’ expectations?

And how many of those expectations come from the readers’ perceptions of what “reality” is and by how much the writer fails to portray the “reality” the reader wants?

Of course, if that’s so, it does suggest that most “professional” critics either lead pretty dismal lives or have a rather poor opinion of life and the world in general.

Miscellaneous Thoughts

There’s more than one kind of wisdom.  One way of classifying wisdom is by category: what to do; how to do it; when to do it.  But there’s also the other side:  What NOT to do [i.e., bad idea]; how not to do it [i.e., bad implementation of a good idea]; and when not to do anything [i.e., when to leave well enough alone].

One of the biggest problems in politics today is the fixity with which both politicians and voters hold their ideas.  Those on the far right insist that cutting taxes and spending is always the right thing to do, while those on the far left are all for the opposite. At times, each has been correct, but it’s not just knowing what to do.  It’s knowing how and when to do it… and when to leave well enough alone.  Yet the ideologues insist that there’s only one “right” answer, and that, essentially, it’s right all the time.

There’s also the tinkerer’s philosophy:  If one idea doesn’t work, try something else. That’s even before asking whether the implementation or the timing was good. Unfortunately, while there are times when it works, it’s often corrupted into a version where even when things are going well, the tinkerers decide that they could be better if something else were tried.  I’ve seen all too many organizations, from government to education to private industry, where goals and missions and organizational structure changed so quickly that nothing was going to work.  What’s so often forgotten is that the larger the structure, just like a massive ocean liner, the longer it takes to change course.  Why?  Because any organization that has survived has developed practices and procedures that work.  They may not work as well as other practices, but because they do work in most cases and for most people, changing takes time and explanation, and Americans, in particular, are often far too impatient.

One of the ideas behind the American government is that power must be shared, and that the party in power gets the chance to implement its ideas, and if they don’t work, then the people can vote it out.  For most politicians, though, the idea of sharing anything is a total anathema.  Congressional districts need to be gerrymandered so that the seat always remains with one party.  Political appointees must be kept from the positions to which they have been appointed by a president of the other party, no matter what.  By using a “hold,” a single senator can keep a nomination from ever even being voted on by the Senate – yet, so far as I can tell, that particular procedure appears nowhere in the Senate’s formal parliamentary procedures.

What’s almost fatally amusing about this is that over the past generation, neither party has been exactly effective at improving either government or the living conditions of anyone but the wealthy, and yet each holds to both its ideas and as much power as it can, claiming that if it only had more seats and power, it would fix things.  If asked exactly how, each side falls back on generalities, and when the few politicians who actually want to do something come up with specifics, such as adding a year to the retirement age some ten years from now, or eliminating tax subsidies for billion-dollar corporations with record profits, or suggesting spending federal funds on concrete improvements in infrastructure, the entire political system turns on them.

Looking at it from where I sit, it seems as though most people aren’t happy with things as they are, but they’re even less happy with anyone who wants to change things, and when they do want change, they want it their way, or no way at all… and that’s no way to make things better, at least not in a representative democratic republic.

 

The Charity “Model”

Before and during the holiday season, we were inundated with supplications from various charities, especially the ones to whom we’ve given in the past.  We’ve managed to gently request that most of them stop calling – which has to be done on a charity-by-charity basis, because they’re exempt from the blanket provisions of the “do not call” list – and we’ve also informed them that we will not EVER pledge or respond to telephone solicitations for funds.  Even so, the postal and internet supplications continue ad infinitum… or so it seems.

No matter what one gives, it’s never enough. There are more homeless orphans, political prisoners, third world inhabitants needing medical care, starving refugees, endangered species, abandoned and homeless pets… the list and the needs are truly endless.  I understand that.

What drives me up the wall is that many of those charities and causes in which I believe and which I support seem to increase their petitions – even though my wife and I only give to them once each year and request that they not bother us more than once each year.  Now… I know that almost all fundraisers are taught to “develop” their clientele and press for more funds from those whose donations show they are sympathetic.  For what it’s worth, I’ve served notice that pestering us for more support is more likely to get them less… and that other worthy and less obnoxious causes may well get what they used to receive.

There’s also the question of “gratitude.”  One state university with which my wife and I are acquainted has adopted a de facto policy of not acknowledging “small” contributions, those under $1,000.  Apparently, the development office can’t be bothered.  Interestingly enough, the small “Ivy League” college from which I graduated responds to donations of any size with not only a receipt, but a personal letter, often with a hand-penned personal notation – and in the early years after my graduation, some of my contributions were modest indeed.  Just guess which institution has been more successful in raising funds, and which has an alumni participation rate of over 70%.

In her time as the head of several local non-profit arts/music organizations, my wife has had to raise funds, and she made it a policy to hand-write a thank-you to every single donor.  In every case, the organization was in debt when she took over; in every case, the number of donors rose, and she turned the organization over to her successor with a healthy surplus.  She’s adopted a similar policy as the chair of a national educational music association… and again, recognizing donors has resulted in a significant and healthy increase in donations and support.

Yes, in hard economic times, people often cannot contribute as much or as often as they once did, even though the needs are often greater, but those who give don’t like to be pestered and guilt-tripped, and they would like a little personal recognition for their concern and generosity.

It’s something to think about anyway.

 

More Musings on Morality

What is morality?  Or ethics?  The simple answer is “doing the right thing.”  But the simple answer merely substitutes one definition for another, unless one can come up with a description or definition of what “right” or “ethical” or “moral” might be.  A few days ago, a reader (and writer) asked what would seem to many to be an absurdly abhorrent question along the lines of, “If morality represents what is best for a culture or society, then isn’t what maximizes that society’s survival moral, and under those circumstances, why would a society that used death camps [like the Nazis] be immoral?”

Abhorrent as this type of question is, it raises a valid series of points.  The first question, to my way of thinking, is whether ethics [or morality] exists as an absolute or whether all ethics are relative.  As I argued in The Ethos Effect, I believe that in any given situation there is an absolute, objectively correct moral way of acting, but the problem is that in a universe filled with infinite combinations of individuals and events, one cannot aggregate those individual moral “absolutes” into a relatively simple and practical moral code or set of laws, because every situation is different.  Thus, in practice, a moral code has to be simplified and made relative to something. And relativism can be used to justify almost anything.

Granting, however, that survival on some level has moral value, can a so-called “death camp” society ever be moral?  I’d say no, for several reasons.  If survival is a moral imperative, the first issue is on what level it is an imperative.  If one says individual survival is paramount, then, taken admittedly to the point of absurdity, that would in theory give the individual the right to destroy anyone or anything that might be a threat. Under those circumstances, there is not only no morality, but no need of it, because that individual recognizes no constraints on his or her actions.  But what about group or tribal survival?  Is a tribe or country that uses ethnic cleansing or death camps being “moral” – relative to the survival of that group?

Again… I’d say no, even if I accepted the postulate that survival trumps everything.  Tactics and practices that enhance one group’s survival by the forced elimination or reduction of others within a society – particularly when that elimination is based on whether those eliminated possess, or fail to possess, certain genetic characteristics – are almost certain to reduce the genetic variability of the species and thus run counter to species survival, since a limited genetic pool makes a species more vulnerable to disease and to other global factors, from climate change to all manner of environmental shifts.  Furthermore, the use of “ethnic cleansing” puts an extraordinary premium on physical/military power or other forms of control, and while that control may, in effect, represent cultural/genetic “superiority” in the short run, or in a specific geographic area, it may actually prove counterproductive, as it did for the Third Reich when much of the rest of the world decided it had had enough.  Or it may result in the stagnation of the entire culture, which is also not in the interest of species survival.

The principal problem with a situation such as that created by the Third Reich and others [where so-called “ethnic cleansing” is or has been practiced] is that such a “solution” actually runs counter to species survival.  The so-called Nazi ideal was a human phenotype of a very narrow physical range, and the admitted goal was to reduce or eliminate all other types as “inferior.”  While there’s almost universal agreement that those other types of human beings were not inferior, even had they been, eliminating them would still have been immoral if the highest morality is in fact species survival.

Over the course of primate and human history, various characteristics and capabilities have evolved and proved useful at different times and in differing climes.  The stocky body type and small-group culture of the Neanderthals proved well-suited to glacial times, but did not survive massive climate shifts. For various reasons, other human types also did not survive. As a side note, the Tasmanian Devil is now threatened with extinction, not because of human beings, but because all existing Tasmanian Devils are so genetically alike that all of them are susceptible to a virulent cancer – an example of what can happen when all members of a species become too similar… or “racially pure.”

Thus, at least from my point of view, if we’re talking about survival as a moral imperative, that survival has to be predicated on long-term species survival, not on individual survival or survival/superiority of one political or cultural subgroup.

 

Public Works or Public Boondoggle?

For the past several months, a continually simmering issue at City Council meetings here in Cedar City has been the new aquatic center.  First came the charges and countercharges over the cost overruns, and although most people eventually conceded that the additional work was necessary, there was great debate over the price tag.  Then came the continuing arguments over the operating costs, which most likely resulted in two incumbent city council members being defeated in the municipal election, while the third whose term was up didn’t even run for re-election.  At present, revenues cover only a bit more than sixty percent of the operating costs, and all three of the newly elected councilmen declare that the center should be self-sustaining.

Right!  A survey by one of the state news organizations found that not a single aquatic center in all of Utah had revenues that covered its costs.  One managed to recover almost eighty percent of its annual operating costs, another managed only about fifty percent, and the rest fell in between.  Why?  Because, like it or not, the people who use aquatic facilities are predominantly either families or seniors, and the majority of both have limited funds.  Increasing fees reduces the number of people using the facility, and if fees are considered too high for the local community, total revenue drops even with higher per capita fees.  Add to that the fact that Cedar City is a rural university town located in the county with the lowest family income in the state, and the potential for raising fees is pretty limited.

This debate raises the eternal question about publicly funded projects: which are justified and which are boondoggles?  Comparatively few people seem to complain about public park budgets, for which no out-of-pocket fees are ever collected, but many would say that’s because parks are open to everyone.  Open, yes, but although we have good parks here, I’ve set foot in them only twice in the eighteen years I’ve lived here.  I’m still for them, and for my tax money being used for them, and for the aquatic center, because they make the community a better place.  I’m also for them because I’ve lived all over the USA, and I can see that tax levels here are low, most probably too low, and the local politicians certainly aren’t spendthrifts with the public money.  Sometimes, though, they’re idiots.

Cedar City is home to the Utah Shakespeare Festival, a good regional theatre [it won a Tony some ten years ago as one of the best regional theatres in the United States] based largely on the campus of Southern Utah University.  Founded some fifty years ago, it has grown from a three-day event into a nearly half-year repertory season.  The university, however, has also grown enormously over the past two decades, from around 3,000 students to over 8,000, and there’s really not enough theatre space for both the university’s theatre, dance, and music programs and the Festival.  The Festival professionals have recognized this and for years have been working on an expansion plan that would make the Festival far less dependent on university facilities.  To obtain state and foundation funding, the Festival requested a grant of two million dollars from the local RDA, controlled by the city council, in order to demonstrate the required local support.  Several council members objected, and the entire $20-million-plus expansion project was threatened before reason finally prevailed.

Was that $2 million a boondoggle?  Scarcely.  Economic studies have shown that the Festival generates between thirty-five and forty million dollars annually for local businesses, and it provided a great economic cushion for the town some thirty years ago when the iron mines closed – all with minimal economic support from the town. For fifty years the town has benefited from the University’s support of the Festival.  Yet the decreasing share of state support for the University [as for every higher education institution in Utah] and the need to raise student tuition to compensate have placed the University in a position where it can no longer be so generous to the Festival.  Despite the enormous economic benefit the Festival brings to the town, some politicians would still call a two-million-dollar grant a boondoggle.

A decade ago, local politicians decided the town needed a good local theatre, one independent of the educational institutions… and they built one with almost 1,000 seats, good acoustics, and modest associated convention facilities.  As a consequence, Cedar City has been able to host events from traveling operas to American Idol vocalists and everything in between.  But once again, the new councilmen are demanding that the theatre make money… despite the fact that the previous director [who was forced out by the new council] came very close to doing so.  NO decent performance theatre in a town of 40,000 people can do that [a lot of Broadway theatres can’t, and they charge exorbitant rates, which isn’t possible here].  But what that “borderline” economic performance doesn’t show is the thousands of people who travel to Cedar City from nearby, and sometimes not-so-nearby, rural areas for those shows and other events, and the hundreds of thousands, if not millions, of dollars they spend in town on those trips.  Nor does it count the food and lodging paid for by the performers [and when those performers include a 100-member symphony orchestra, that’s not inconsequential].

Especially in rural areas like Iron County, whether a town or small city prospers or withers depends not just on low taxes, but also on the quality of life, and a “good” quality of life can generate enormous economic benefits, which tend to flow back as tax and other indirect revenues.  Past attention to the quality of life has led to Cedar City being named an outstanding community for both families and retirees, but with the recent rise of Tea Party-type politicians, there’s been a cry for lower taxes and spending, despite the fact that both are already too low.  There’s a huge difference between managing public facilities well and treating the profit-loss figures of single facilities or projects as the measure of their usefulness and “profitability” to the community.

Yes… there are many public boondoggles, and I’ve seen all too many of them, but just because a public facility or expenditure doesn’t cover its operating costs directly doesn’t mean it’s a boondoggle… or that the town isn’t “profiting.”   And that’s something too many people and politicians fail to understand.

 

The Hidden Costs of Transportation

A number of family members visited us over the holidays, and I ended up having to ship gifts, ski clothes, and the like back to them.  Some of them stayed almost a week, which we appreciated, because we live great distances apart and, with everyone working [which, as I’ve mentioned before, requires more and more time and effort from those who have jobs and wish to keep them], we don’t get to see them often.  Staying longer does require a few more clothes, especially in the case of small children, even with our washing machine running much of the time, and more clothes means more weight.  More weight means checked suitcases… and since Southwest doesn’t fly to Cedar City, checked bags add to the cost of travel.

Then I recalled that, a little over ten years ago, a checked bag was not only free, but you could put 60 pounds of clothes and gear in it, rather than the current 50.  That ten-pound reduction doubtless reduced the strain on baggage handlers and most probably produced some fuel savings – and cost savings – for the airlines.  All in all, though, these cost-saving measures for the airlines add to the cost for the traveler.  They also add to the inconvenience, since the overhead bins aren’t adequate for all the carry-ons when a flight is full – and most are these days.  Then, too, there are the charges for seats with slightly more legroom, and the elimination of in-flight meals in coach [often replaced with a “menu” of items priced just short of exorbitant].

Airport security also adds to the time spent traveling – from an additional 30-45 minutes at small airports to more than an hour at major hubs. And time is money, meaning that the more security agents on duty [to reduce waiting], the higher the cost to the government.

Then I discovered that, because December 26th was a holiday this year, all the packages we’d hoped to ship back to the various coasts on Monday had to wait until Tuesday, and one of my sons and I wasted gas and money finding that out, because the local shippers never said they were closed – they just left messages on their telephones saying they were busy and asking us to leave a message or call back.  Now, apart from the various layers of government, the banks, and the stock market, most other businesses – other than the shippers – were open, obviously believing that Sunday, December 25th, was the holiday, and not Monday.

Given the “efficiency,”  “effectiveness,” and self-centeredness of government, banks, and financiers, finding the shippers following their lead left me more than a little disconcerted… and, well… you all know what I think about government, banks, and financiers, not to mention the airline industry.

 

The Difference Between Science and Scientists

Recently, I’ve posted a few blogs dealing with various aspects of personal opinion and confirmation bias and how the combination can, to an outsider, make any individual, in certain circumstances, look like a complete idiot.  That even includes scientists, sorry to say, yet “science” as a whole has an unprecedented record of accuracy over time, regardless of what climate change deniers and creationists say.  If scientists can be as personally biased and opinionated as all the rest of us, how does “science” end up with such a long-term record of accuracy?

There’s one basic reason: the modern structure of science, if you will, requires proof, and all the proof that is submitted is subject to scrutiny and attack from all quarters.  What emerges from this often withering barrage almost always turns out – in time – to be more correct and more accurate than what preceded it.  That’s not to say that, upon occasion, it hasn’t taken the scientific establishment time to get things right, but eventually better techniques and better thought proved that plate tectonics was correct, just as, regardless of the creationists, there’s an overwhelming body of evidence in favor of evolution, and relativity provides a more accurate picture of the universe than Newton or the Ptolemaic theorists did.

But there are several “problems” with the scientific method.  First, establishing more accurate knowledge, information, or theories takes time, and often large amounts of resources, as well as winnowing through a fair amount of uncertainty along the way. Second, it requires reliance on data and proof; mere opinion is not sufficient.  Third, it’s not as set in stone as human beings would like.  The early Greek scientists had a fair idea about the earth and the moon, but their measurements and calculations were off.  As methods, equipment, and techniques improved, so did the measurements. Newton did far better, and his methods and theories yield a high degree of accuracy for most earth-bound measurements and systems, but Einstein and his successors have provided an even more accurate explanation and even more accurate measurements. And fourth, at present, the scientific method isn’t absolutely precise in predicting the specific future results of massive interacting inputs.

That lack of absolute precision in dealing with future events often causes people to doubt science as a whole, even though its record is far better than that of any other predictor or prediction system.  Part of its accuracy comes from the fact that science as a structure adapts as more information becomes available, but some people find this adoption of new data and methods unsettling, almost as if they were asking, “If science is so good, why can’t you get it right the first time?”  An associated problem is that science is far more accurate as a descriptor than as a predictor, and most people subconsciously assume that the two are the same.

Even so, one could easily adapt Churchill’s statement about democracy to science: it’s the poorest way of describing the universe and predicting how things will happen – except for every other way that’s ever been tried.  And that’s because the structure of modern science is greater than any individual scientist.

 

Lateness as a Reflection on the Pool of Self

The other Sunday, I was finishing up my morning walk/run with the crazy sweet Aussie-Saluki some two blocks from home when the church bells rang the hour.  A few minutes later, as we passed the church, I saw cars speeding in and people hurrying inside.  A block later, people were still hurrying toward the church [not my church, since I confess to being a less than diligent congregant at another one]. Once upon a time, I was indeed a most religious young man, president of a church youth group and an acolyte at services every Sunday. Consequently, I had the chance to observe just how many people arrived late to services, and, frankly, late-comers were rare, extraordinarily quiet, and their body posture invariably reflected a certain discomfort. I doubt I saw as many late-comers in all the years I served as an acolyte as I saw on my walk that single recent Sunday morning.

This observation got me thinking about how lateness, or a lack of interest in punctuality, has become an ever more common feature of our society.  When my wife produces an opera at the college, there are always between twenty and fifty attendees who come in after the first break, and that doesn’t count those who straggle in during the overture.  When we attend local concerts, the same thing is true.  More and more college professors I encounter relate tales of students who cannot seem to arrive on time, and some have had to resort to locking doors to avoid disruptions from late-comers.  My wife even got a jury notice emphasizing that, if she were selected for a jury, she needed to be punctual or she could face a stiff fine. This morning, the paper carried a story about a surgeon who was late to a court appearance – and who was imprisoned when the judge was less than impressed.

What exactly has happened to a society where cleanliness was next to Godliness and punctuality was a virtue?  And where even professional people who should know better don’t?

Oh… I know this is a Western European-derived “virtue.”  When my wife did a singing tour of South America, no concert ever started “on time,” and in one case the performance actually began more than an hour after the announced time because of social jostling among the “elite” to see who could be the most fashionably late… as if to announce their power to make others wait.  And I have to confess that I tend to be obsessive about being on time because my father almost never was.

Still… what is it about being late?  Is it because, as our lives have grown more and more crowded [often with trivia], we have trouble fitting everything in?  Is it because, in an internet/instant-communications society, each of us feels more and more like the center of the universe, and our schedule takes precedence over everyone else’s?  Is it merely a way of demonstrating personal power, or indifference to others, or a lack of concern about the inconvenience being late causes them?  Is it a symptom of the growing emphasis on “self” over others?

I don’t have an answer… but I do know that I think most uncharitable thoughts about late-comers to anything – apparently oblivious to, or even enjoying, the scene they create – whose lateness disrupts everyone else’s concentration and enjoyment… or even more important activities, like judicial proceedings.  And I seriously doubt I’m alone in those thoughts.