
Not Wanting to Know

Recently here in Cedar City, there have been several letters decrying the direction of the university as a “liberal arts” institution and complaining about the high cost of tuition.  My initial – and continuing – reaction has been along the lines of what planet are these idiots from? 

The university has always had a liberal arts/teaching focus, from the days of its founding over a century ago, and its tuition is so low that its out-of-state tuition and fees are lower than the in-state fees of many universities in other states.  Now, admittedly, tuition has increased more than the administration would like, entirely because the state legislature has decided to cut per-student funding while mandating enrollment increases, not only for the local university but for most of the state institutions.  Even so, considering the quality of many programs, state tuition here and elsewhere in Utah is a comparative bargain. Here, the music, art, and theatre areas have won national awards against much larger schools; the nursing program is rated as one of the best in the state and region; pre-law and pre-med students have an enviable rate of acceptance at graduate schools; and the physical education program has been so successful that it’s known as the “coaching factory.”

Unfortunately, this disregard for the facts isn’t just about college education here in Cedar City, but is symptomatic of a larger problem.  More and more, I see people ignoring the facts that conflict with what they feel and want.  It’s as if they actively avoid facts and circumstances contrary to their beliefs, as if they simply don’t want to know.  Whether it’s global warming or deficit spending, immigration, income inequality, decreased social mobility, education…or a dozen other subjects… they don’t want to know… and trying to get them to consider “contrary” facts just makes them angrier.

Part of this is an effect of civilization. If, earlier in history, you didn’t want to believe in the perils of the time – predators, floods, fire, famine, and raiders from other tribes, for example – you ended up dead.  Now that civilization has eliminated or limited those perils, and the dangers we face are more indirect and take longer to affect us, people ignore the facts about dangers.  In this regard, global warming is a good example.  I can recall predictions dating back almost twenty years suggesting that weather would get more violent with even modest rises in overall global temperatures.  Temperatures have risen; weather has become more violent; and still people debate whether global warming and its effects are real.

On a personal level, there’s an even more stark and direct example — obesity.  Excessive weight is one of the primary causes of early death and other health hazards.  There’s absolutely no question of that… and yet the United States is the most obese nation on the face of the planet… and Americans scream bloody murder when a politician suggests banning the sale of soft drinks in 32-ounce servings.  For heaven’s sake, does anyone really need a quart of carbonated beverage at one sitting?

But then, I suppose, why anyone would want that much at once is one of those facts I don’t want to know.

 

“Real” Fiction

The New York Times best-selling author Jeannette Walls was quoted in the Times this past weekend as saying, “I’m not a huge fan of experimental fiction, fantasy or so-called escapist literature. Reality is just so interesting, why would you want to escape it?”  This statement represents the kind of blindness that is all too typical of all too many “mainstream” writers and critics.

In fact, the best science fiction, fantasy, and other “escapist” literature puts a reader, in a real sense, “outside” the framework of current society and reality in a way that allows a perceptive individual to see beyond the confines of accepted views and cultural norms. Some readers will see this, and some will not.  As a simple but valid example, take my own book, The Magic of Recluce, in which the “good guys” are ostensibly and initially portrayed as the “blacks.”  Western European-derived cultures, and the United States in particular, equate the color white with purity and goodness, as demonstrated by all too many westerns in which the good guys wear white Stetsons and the bad guys crumpled black hats.  But this is far from a universal norm.  In many cultures, white is the color of death, and other cultures use other colors for purity.  My very deliberate inversion of this western color “norm” was designed to get readers to think a bit about that… and then, when they’d thought a while, I started writing other Recluce books from the “white” perspective, in an attempt to show the semi-idiocy of arbitrarily ascribing “color-values” to people or societies, or values to colors themselves.

I’m far from the only F&SF writer to use the genres to explore such themes or to question values or concepts, and I could list a number of writers who do.  So could most perceptive readers of F&SF.  This fact tends to get lost because fiction is for entertainment: if we as writers fail to entertain, we don’t remain successful professional writers for very long, and, frankly, if we’re extremely successful at entertaining, we tend not to be taken seriously on other levels. Stephen King, for example, is technically a far, far better writer than is generally recognized, largely because of the subjects about which he writes, not because he writes poorly – which he does not.  Only recently has there been much recognition of this fact.

Even among critics within the F&SF genre, there’s a certain dismissal of writers who are “commercially” successful as writers of “mere” popular escapism, as though anything that is popular cannot be good.  Under those criteria, Shakespeare cannot possibly be good or have any depth.  For heaven’s sake, the man wrote about sprites and monsters, faery queens, sorcerers and witches, along with battles, kings, ghosts, and ungrateful children.

Good is good;  popular is popular; and popular can be anything from technically awful to outstanding, although I’d be among the first to admit that works that are both good and popular are far rarer than those that are popular and technically weak or flawed.  And the same holds for so-called escapist fiction, no matter what the mainstream “purists” assert.

Then too, the fact is that all fiction, genre or mainstream, is “escapist.”  The only question is how far the author is taking you… and for what reasons.

Thoughts on Self-Sabotage

Over the years, both my wife and I have encountered quite a number of individuals who had the ability and skills to succeed, and who then proceeded to commit self-sabotage, often when they were on the brink of accomplishing something they said was important to them. Another instance just occurred, and without going into details, the individual in question suddenly stopped going to two required senior-level classes, while attending other classes… and getting good grades in those.  Despite promises to do better, that individual ended up flunking both courses… and being unable to graduate for at least another semester.

It’s easier to understand why people fail if their reach exceeds their abilities, or if accidents or family tragedies occur, or if they become addicted to drugs, or suffer PTSD from combat or violent abuse, or if they suffer from depression or bipolarity, but it’s hard to understand why seemingly well-adjusted people literally throw their future, or even a meaningful life, away.  Some of that may be, of course, that they’re not so well-adjusted as their façade indicates, but I have a nagging suspicion that in at least a few instances, there’s another factor in play.

What might that be?  The realization that what they mistakenly thought was the end of something was just the beginning.  For example, far too many college students have the idea that college is an ordeal to be endured before getting a “real” job that has little to do with what was required in college.  In my wife’s field, and in many others, however, what is required in college is indeed only the beginning, and the demands of the profession increase the longer you’re in it… and some students suddenly realize that what is being asked of them is only the beginning… and they’re overwhelmed.

The same can be true of a promotion. The next step up in any organization usually involves more pay, but today, often the pay increase is minimal compared to the increased workload and responsibilities… and, again, some people don’t want to admit, either to themselves or to others, that they don’t want to work that hard or handle that much responsibility.  So the “easy” way out is self-sabotage… and often blaming others for what happens.

This certainly isn’t the only explanation for self-sabotage, but it does fit the pattern of too many cases I’ve observed over the years… and it also seems to me that cases of self-sabotage are increasing, but then, maybe I’ve just become more aware of them…or maybe the “rewards” for advancement, degrees, etc., just aren’t what they used to be… at least in the perception of some people.

American Politics – Power Now?

In past blogs, I’ve discussed the insidious and potentially deadly long-term effects of the “now” mentality, particularly on American business, and how the emphasis on immediate profits, immediate dividends, or immediate increases in stock prices, if not all three, has had a devastating effect not only on the economy, but all across the society of the United States.  There is another area of American society where the “now” culture has had an even more negative and more immediate effect – and that’s on American politics and government.

Years and years ago, one of my political mentors made the observation that, in running a campaign, you had to give the voters a good reason to vote for a candidate.  Back then, that reason was tacitly assumed to be, except in certain parts of the south, positive.  Today, if one surveys political ads, campaign promises, and the like, that reason is overwhelmingly negative.  Vote for [Your Candidate] because he or she will oppose more federal government, more spending, more gun controls.  Or conversely, vote for [Your Candidate] because he or she will oppose cutting programs necessary for children, the poor, the disadvantaged, the farmer, the environment, etc. 

The synergy between the “now” culture and the ever more predominant tendency of American voters to vote negative preferences is an overlooked and very strong contribution to the deadlock in American politics. People want what they want, and they want it now… and they don’t want to pay for it now, despite the fact that anything that government does has to be paid for in some fashion, either by taxes, deficits, inflation, or decreases in existing programs in order to maintain other existing programs.

 In addition, as a number of U.S. Representatives and Senators have discovered over the past few elections, voters no longer reward members of Congress for positive achievements.  They primarily [pun intended] vote to punish incumbents for anything they dislike.  So a member of Congress, such as former Senator Bob Bennett of Utah, can vote for 95% or more of what the Republicans in Utah want and make two or three votes they don’t like, and be denied renomination. At a time when federal programs are vastly underfunded, the combination of voter desires not to lose any federal benefits/programs, not to pay in taxes what is necessary to support those programs, and to punish any member of Congress who attempts to resolve those problems in a politically feasible way, such as working out a compromise, results in continual deadlock.

Then, add to that the fact that politicians want to be re-elected, that over 90% of all Congressional districts are essentially dominated by one political party, and that thirty-one of the states have both Senators from the same political party, and that means that the overwhelming majority of members of Congress cannot vote against the dictates of their local party activists on almost any major issue without risking not being renominated or re-elected. 

Yet everyone decries Congress, when Congress is in fact more representative of American culture than ever before.  We, as a society, want, right now, more than we’re willing to pay for.  Likewise, our representatives don’t want to pay for trying to fix things because they want to keep their jobs, right now, regardless of the future consequences.  But it’s so much easier to blame that guy or gal in Washington than the face in the mirror.

The Week’s Market “Crash”

Taken together, the drops in the various market indices on Wednesday and Thursday appear to be the largest two-day decline in almost a year and a half.  And what supposedly triggered the sell-off and decline?  The fact that the Federal Reserve indicated that it just might stop buying something like $85 billion in bonds every month.  Duh!

Believing that the continuing purchase of such bonds, otherwise known by the euphemism of “quantitative easing” (or QE), would or could go on forever makes the belief in the tooth fairy, the Wizard of Oz, and moderation by the Taliban look sensible by comparison. The financial “wizards” of Wall Street, including highly paid hedge fund managers, program traders, and various other supposed financial icons, had to know that such a program had to end or be throttled back.  So why, if they knew this, did they go into a panic?

Because they were using the stimulus of QE to run up stocks in the short run to bolster their own bottom lines – and bonuses – and didn’t believe that Chairman Bernanke would signal an end to the artificial bull market so quickly.  Ah yes, and these are the geniuses who are among the most highly paid executives/professionals in the United States.  They’re also the ones who created the mess of the Great Recession… and they’re at it again.

Despite Dodd-Frank, there still is little oversight of these self-proclaimed experts, and no significant real reform of either banking or investment banking. And Congress continues to tie itself in knots over anything requiring real oversight or reform, as witness the scuttling of the legislation that made a very modest attempt at reforming farm subsidies… or the continued hassles over fixing a broken and essentially non-functional immigration system… and we won’t mention, except in passing, the fact that despite overwhelming public support for requiring background checks of firearms purchasers, that, too, never happened.

Just how bad will things have to get before Americans start electing politicians who are more interested in solving problems than getting elected?  I don’t know… but I’m definitely not holding my breath.

On Your Own Terms

There’s a scene in the movie Citizen Kane where Jedediah Leland tells Charles Foster Kane that Kane only wants “love on your own terms.”  It’s a great scene, and true as well as prophetic in a far larger context.

There’s no doubt that, throughout history, human beings have always wanted love, and pretty much everything else, on their own terms.  For most of human history, however, almost no one could get much of anything on their own terms, and this is still true in many parts of the world. If it didn’t rain, people were lucky to get anything to eat, let alone a Big Mac or Chateaubriand with béarnaise.  Even in the reign of Louis XIV, the “Sun King” of France, the most powerful ruler in Europe, there were times when beverages froze on the table at Versailles.

But, with the rise of more advanced technology we’ve become more and more able to get things previously unobtainable, from fresh fruits and vegetables out of season where we live to instant communications pretty much anywhere on earth.  Particularly in the United States, as a society we want everything on our own terms.  We want cheap and abundant electricity.  We want inexpensive clothing.  We want easily affordable personal transportation at our beck and call.  We want the best health care possible, and we’re getting angry that its cost is rising.  The list of what we want and can get on our own terms – or close to them – is large and growing… for the moment.

The problem with all this is that, over time, we don’t dictate the terms:  the physical condition of the world and the underlying laws of the universe do.

The current “we can have it all” answer of so-called responsible environmentalists is natural gas, because it emits roughly half the greenhouse gases of coal as well as far fewer other pollutants.  There are more than a few problems with this “solution,” the first of which is that the numbers backing the “replacement” of coal with natural gas don’t take into account environmental costs that are additional to, and far higher than, those publicized. A number of recent studies show that from 3% to 15% of existing natural gas wells are leaking methane.  A NOAA study of one gas field in eastern Utah found that leaks amounted to 9% of the gas produced.  A Cornell University study also found leakage rates of 9% on a national basis. Studies of gas drilling have shown leakage rates of up to 17% in some basins.  One Canadian study indicated that more than half of the more typical horizontal fracked gas wells had leakage after several years.   While high-pressure fracking wells are still a minority in numbers, they have high initial production rates, and even a small percentage of leakage can result in a significant quantity of methane emissions. Given that there are some 500,000 natural gas wells in the United States alone, if even 3% are leaking, that’s 15,000 wells oozing or spewing methane into the atmosphere, and, given that methane is a greenhouse gas 100 times more potent than CO2 when initially released, and 25 times more potent even when measured over a hundred years, that’s a serious problem. And that doesn’t include the tens of thousands of Canadian wells. Even EPA studies show that leakage rates above eight percent negate any benefits from converting from coal and in fact may even accelerate global warming.  Natural gas doesn’t just leak from wells, either.  Testimony before the Massachusetts state legislature this year cited 20,000 known natural gas leaks in the state, and the U.S. Energy Information Administration estimates that more than 8 billion cubic meters of natural gas are lost each year in leakage.
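
For what it’s worth, the logic behind that eight percent break-even figure can be sketched in a few lines of Python. This is strictly a back-of-envelope illustration, not the EPA’s methodology: it uses the 25-times hundred-year potency figure and the rough “half the CO2 of coal” ratio cited above, and it ignores methane’s far higher short-term potency, which would push the break-even point lower still.

    # Back-of-envelope: at what methane leakage rate does natural gas
    # lose its greenhouse advantage over coal?  All constants are
    # rounded assumptions taken from the figures cited above.

    GWP_100 = 25.0         # methane vs. CO2, measured over 100 years
    CO2_PER_CH4 = 44 / 16  # kg of CO2 from burning 1 kg of methane
    COAL_CO2E = 2.0 * CO2_PER_CH4   # coal: roughly twice the CO2 of gas

    def gas_co2e(leak_fraction):
        """CO2-equivalent per kg of gas delivered: combustion plus leaks."""
        leaked = leak_fraction / (1.0 - leak_fraction)  # kg leaked per kg delivered
        return CO2_PER_CH4 + leaked * GWP_100

    for pct in (3, 9, 15):
        print(f"{pct}% leakage: {gas_co2e(pct / 100):.2f} vs coal {COAL_CO2E:.2f} kg CO2e")

Run as written, the sketch shows gas retaining a clear advantage at 3% leakage, losing most of it at 9%, and becoming worse than coal by 15% – consistent with the range of leakage rates the studies above report.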

I’m not against natural gas.  In fact, I’d much rather have natural gas generating my power and heating my home than any of the conventional alternatives.  BUT… unless the drilling companies and the gas power industry are willing to spend a lot more money and other resources in cleaning up production and transmission systems, there’s not going to be any environmental improvement.  In fact, present practices could make matters worse.  Of course, coal is still cheaper – unless coal-burning power plants are cleaned up to the standards of natural gas plants, in which case, the electricity won’t be any cheaper, but more expensive.  And if we don’t clean up our energy production and usage pollution, we’ll end up frying the planet that much sooner. In short, we can’t keep having cheap energy on our terms.

I could have cited different examples in different areas, but the facts and the conclusions would be similar. Over time, the universe is going to limit what we can have on our own terms… and for how long.

That’s not even a question.  The question is how long it will take us as a society to understand that point.

Stereotypes

Over the past few years the issue of stereotyping has become and remains a hot-button topic with many people, particularly those in groups subjected to the practice. The Oxford dictionary definition of the word “stereotype” is: “a widely held but fixed and oversimplified image or idea of a particular type of person or thing.”  Unfortunately, while most enlightened individuals deplore stereotyping, the fact is that even those who deplore it still engage in it, whether they realize it or not.  For example, while it is considered prejudicial to believe any young black male in a hoodie is a gang member, or up to no good, it’s perfectly all right to call every SUV or large pick-up truck a “gas-guzzler,” even if the owner has occupational or other needs no other vehicle can meet. But both are stereotypes.

At the same time, it is useful to consider that stereotypes exist essentially for one of two reasons: (1) a significant number or percentage of people (or vehicles, or anything else) in a group do in fact fit within the stereotype… OR (2) large numbers of people believe that they do.

It’s fairly obvious that stereotyping people based on misconceptions is prejudicial, but what if there’s a basis in fact?  For example, for centuries, there was, and still is, especially in western European-derived cultures, a stereotype of Jewish men as greedy, stingy moneylenders. 

While ancient Jewish law forbade the charging of excessive interest, and charging interest at all was deplored in some texts, money-lending with interest was allowed by the Judaic faith; by the fifth century, however, the Roman Catholic Church had prohibited the taking of interest, and in 1311, Pope Clement V made the ban on usury absolute.  In effect, all Christians were banned from money-lending; Jews were not. Since there were more than a few bans on what Jews could do in Europe, it wasn’t exactly a surprise that the banking business was initially predominantly Jewish, and Jewish bankers remained active and prominent well into the 20th century. Thus, in point of fact, the stereotype of money-lenders as Jewish was accurate… but it’s highly doubtful that most Jewish money-lenders were anything like the stereotypes portrayed by playwrights and writers [such as Shylock in The Merchant of Venice], simply because acting that way would have been largely counterproductive at a time when Jews were facing continual persecution – not that reality has ever made much impact on prejudice.  Only a concerted effort toward change has been effective.

As for young black men in hoodies… that’s a problem, because, according to Bureau of Justice Statistics figures, one in three black men will serve time in jail, and 40% of young African-American males will spend time in some sort of confinement.  Part of that [possibly a very large part of that] is the result of a criminal justice system that prosecutes a higher percentage of minority youths than white youths and that, for the same offense, gives black youths longer sentences than white youths receive, but… for whatever reason, unhappily, the stereotype applies to a significant percentage of black males… and that means, unhappily, that one needs to be at the least wary of young black men in hoodies on dark city streets.

In the end, there is not one problem with stereotyping, but two.  The first is obvious. Viewing every individual in a particular group as a stereotype of that group is both prejudicial and discriminatory.  The second problem is not as obvious, but just as real.  When any group has a large enough percentage of individuals who fit a negative stereotype, that group, and society as a whole, has a problem that needs to be addressed, and it’s almost certain that not all of that problem is purely prejudice. This problem is not just one for minorities.  Bankers, professional cyclists, the NRA, tea-partiers, American tourists abroad [“ugly Americans”], young male Muslims, college professors [“ivory tower liberals”], and Republicans, among others, all also need to face their stereotypes.

Irrational Economic Values

Ever since Adam Smith, and probably before, economists and philosophers have attempted to reduce the essence of economics to simple principles, coming up with various explanations for various aspects of economics, ranging from “the invisible hand” to “surplus value of labor” all the way to the Laffer Curve, which postulates that there is an optimum rate of taxation, above which actual tax revenues will drop.  This, of course, was the rationale for the Reagan tax cuts.  But for all these theories and studies, economics has been spectacularly flawed in its application to law and public policy, and we now face an economically critical period in history.
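
As an aside, the Laffer Curve’s core claim is easy to illustrate with a stylized example. The sketch below is purely illustrative: the linear shrinking tax base is an assumption chosen for simplicity, not an empirical claim about any real economy.

    # Stylized Laffer curve: revenue = rate * taxable base, where the
    # base shrinks as the rate rises.  The linear base is an assumption.

    def revenue(rate, base=lambda r: 1.0 - r):
        return rate * base(rate)

    for r in (0.2, 0.5, 0.8):
        print(f"tax rate {r:.0%}: revenue {revenue(r):.2f}")
    # 20% -> 0.16, 50% -> 0.25 (the peak), 80% -> 0.16

With this toy base, revenue peaks at a 50% rate and falls off symmetrically on either side; the entire policy argument, of course, is over where the real peak lies.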

Today, throughout the industrialized world, and particularly in the United States, political and economic leaders are faced with a series of economic problems.  First, there is a continuing, and often growing, disparity between the incomes of average individuals and the highest-earning individuals.  Second, growing productivity and profitability are producing greater returns for high earners without significantly increasing employment – certainly not enough to keep up with population growth. Third, an increasing number of governments lack the resources to maintain fiscal and financial stability. Fourth, governments and businesses are increasingly reluctant to spend on societal infrastructure, even as more and more of business and society depend on such infrastructure.  The combination of the first three situations is leading, in many cases (and those instances will likely increase), to political and social unrest, and the fourth situation, unless remedied, is likely to undermine attempts to resolve the first three problems.

The problem with almost all past economic theory and most political solutions proposed to date lies in more than a few questionable assumptions underlying various theories. The two that seem most prevalent and erroneous to me are:  (1) Individuals and organizations behave rationally. (2) Value is determined objectively.

Recent economic studies have shown that true objective rationality, particularly on the part of individual consumers and business owners, is seldom the case.  Part of that lack of rationality lies in the fact that all of us have core sets of beliefs at variance to some degree with the world as it is, and part lies in the fact that we never know enough to consider all the factors. This lack of rationality compounds the problem of determining economic “value.”  From the beginning of economic studies, economists and philosophers have groped to find an answer to the basic economic question of what “value” is and how it is determined in a working society.

The so-called Classical Theory of economics effectively states that the price/value of goods is determined predominantly by the cost of the labor producing them, and in this regard, Marxism is only a variation on classical theory, since, in Das Kapital, Karl Marx asserts that the difference between skilled and unskilled labor does not factor effectively into the creation of worth.  It’s also clear that salaries and wages are determined by what the employer is able and willing to pay, and that is determined in part (and only in part) by what consumers of goods and/or services are willing to pay.

But consider the following questions.  Why are the employees of Target, Costco, and WalMart paid on very different wage scales?  All three companies provide similar goods to large numbers of consumers, and many of those goods are identical. Why do universities pay professors of similar rank and experience widely differing salaries, sometimes dependent on discipline, and sometimes not?  Why do investment banks and hedge funds continue to pay their CEOs and top executives and traders tens of millions of dollars for services that are essentially irrelevant to roughly ninety percent of the population?  And why has no government ever seriously attempted to recoup from those institutions any significant proportion of the immense financial losses they inflicted on national economies… or taxed them more heavily to help pay for the unemployment benefits for those whose jobs they destroyed?

The theoretical answer to all these questions is some variation on paying the “market rate” or “that’s the way business works.”

At the core of this “market” are the so-called laws of supply and demand.  The idea is that when goods and services are plentiful, prices go down, and when they are scarce, prices go up.  This works well, if mercilessly, in terms of commodities, or commoditized services.  After all, one bushel of durum wheat is pretty much like another bushel, but when it comes to people, the idea has more than a few flaws, and the more complex the society, the more the impact of those flaws is magnified.

And “scarcity” doesn’t always translate into higher wages or greater demand. Despite a demand for math and science teachers, there are never enough qualified applicants, although that’s likely because school systems won’t increase wages enough to attract them… which points out that, for all the talk about following a business model, organizations and institutions often only do so when it suits their fancy.

Take the commoditized job of a checker at WalMart, Costco, or Target.  Theoretically, checkers provide the same service for the same pay.  Having stood in many check-out lines, I can assure you that the ability of all checkers is anything but the same.  And this is a comparatively simple job.  But because it is a “simple” job, there are many people who can handle the basics of the task, and because a really good checker who wants better pay can be replaced by someone who can just manage the basics, wages stay low…  except Costco pays better.  Why? Because they’ve learned they get better people?  But that means that service jobs are not the same…something that tends to get overlooked in economic discussions.

The same problem exists, if on a higher level, in elementary and secondary education, with an added complication.  While there are a great many would-be teachers with the proper credentials and the theoretical skills, even after years of study the best analysts can only approximate the factors that make a truly skilled teacher. There are effective and skilled teachers with totally different approaches to exactly the same subject, each of whom can inspire and produce better students who learn more, and while pretty much every prescriptive and descriptive analysis of teaching can describe the basics of what constitutes an effective teacher, once you go beyond that, it’s all speculation, because truly good teaching is an art.  But… pay and value are determined by average competence, and because excellent teaching is an art, so-called “merit pay” systems have largely failed where they’ve been tried — and likely will in the future.  In addition, legal codes and the threat of litigation make it effectively impossible either to identify or to reward outstanding teachers. All this points out not only the problem with Marx’s assumption that essentially all labor in a certain position or field has the same value, but also the problem with the wide-scale application in industrialized societies of the same principle to all jobs with the same description.

To make matters more complex, “commoditization” of jobs often is applied perversely, or not at all. As I’ve noted before, in my wife’s university, the professors in the performing arts fields work longer hours, provide more services to the university, and are paid far less, even when they have more degrees and experience than do professors in the field of business. That’s because the field of “business” is considered more profitable… but the job of professors is education, not business.  Likewise… why do college coaches make more than university presidents?  Because athletics are more “valuable” than administering an institution educating thousands, if not tens of thousands of students?  Then, too, on a society-wide basis, women are paid less than men in the same professional field with equivalent, or even less, time and experience.

Finally, consider the high earners in society… who are they?  I looked at the top fifty names on the Forbes list of billionaires, and 30% came out of the financial sector and 30% from the entertainment and communications sector, followed by 20% in food and retail enterprises, and 10% in natural resources.  What does that say about “value” and “rationality”?  Or about the results of blindly following a so-called market economy?

Perhaps, just perhaps, we really don’t want to pay for either value or rationality, just for what we want when we want it… and then we complain about whatever doesn’t get done because we didn’t want to pay for it.

Keeping Up With the Times

After my encounter with a plethora of unexpected internet viruses, and the comments from readers, my wife the professor made the observation, “The law hasn’t kept up with the internet.”  We both laughed, because it is so obvious as to be beyond dispute.  It’s also laughable in a sadder way, because it’s become apparent that the law will never be able to keep up with the internet… and quite possibly not with other aspects of advanced technology.

For years now, a number of high-tech companies have been trying to protect themselves with not only patents, but with secret and secretive production processes, sometimes, I’ve been told, forgoing patent protection because they believe that a patent is a roadmap for a competitor.  At the same time, we have so-called genetics companies trying to patent genes obtained or derived from people and other organisms, which suggests some fairly frightening future scenarios.

Then there’s the growing reliance on high-speed information technology and information transfer.  As I’ve mentioned earlier, nanoseconds matter in the world of securities trading, and the fact that they do requires almost total reliance on high-speed computers and sophisticated algorithms.  The federal government is pushing for standardized electronic medical records, and pretty much every state government, every major corporation, and every federal department and agency is becoming increasingly reliant on such technologies.

And yet, this increasingly complex and interdependent web of information makes our economy and its underlying infrastructure – and thus every industry and service – ever more vulnerable to technological disruptions, the causes of which could range from massive solar flares to inspired hackers, dedicated and sophisticated cyber terrorists, foreign computer operatives, and unexpected algorithm failures or applications.

We don’t have a legal structure that is designed to deal adequately with either massive electronic misfeasance or malfeasance, and even if we did, we don’t have the means to track down even a fraction of the perpetrators, let alone a way to legally and physically punish them.  And it’s highly unlikely that we ever will. This is not a problem unknown in human history.  In fact, in a sense, we deal with it every day, because no government anywhere can monitor all its people all the time and deal with all the possible violence they could commit. Historically, social codes have been far more important than laws… but social codes only work well with populations that share common values, which raises the overwhelming question, so to speak – what happens when neither laws nor social codes are able to restrict wide-scale information hacking, cyber-sabotage, intellectual property piracy, and out-and-out information systems terrorism? 

It’s clear that some organizations can muster the technology and skills to thwart or counter such attacks; it’s also clear that most of us can’t, not on a continuing basis. Nor, at present, do most nations have adequate back-ups and alternative infrastructure and communications systems ready to take over in the event of information system failures on a national scale.  Yet the push for greater information technology integration continues, again fueled by promises of lower costs and greater efficiencies… or at least greater efficiencies until everything collapses.

Why isn’t anyone looking at this problem seriously?  Because, of course, it’s too expensive to resolve… and I fear that when people suddenly realize that something needs to be done, it will be far too late.

 

One of My Computers is Down…

…and I’m angry, not quite raging rip-down-walls mad, but close. Despite two different reputable anti-virus systems, both current, some virus, or perhaps more than one, has rendered it useless, so much so that I had to turn it over to computer professionals for rehab.  All the files I need, except one, are backed up, and I can reconstruct that one, if necessary, but the time, money, and inconvenience resulting from this sort of event are more than a little irritating.

And for what?  So that some cyber-psycho can get his or her kicks out of destruction, out of wasting people’s money and time?  Or so that some sociopath can make other people’s hardware and software unwitting tools for some grand nefarious project that will victimize even more people?

I’m fortunate; I have back-ups; and I can afford repairs, but there was a time when this sort of thing would have thrown a huge monkey-wrench into life and family finances… and that’s still true for all too many people.

I’m definitely not a Luddite.  I like technology.  But as I’ve posted before, computer technology opens whole avenues of mischief, crime, and destruction.  It also has destroyed borders insofar as criminal activities are concerned.  Among their other misperceptions, the right-wing opponents of immigration reform have totally missed the boat on crime.  Thanks to computers, a lot of the criminals outside our borders don’t even have to enter the U.S. in order to victimize Americans.  Without immigration reform, all we’re doing is making criminals out of those who came here to find work and opportunity and keeping the brightest of the rest of the world from coming here.

Computer viruses, worms, scams, identity theft, and the like are all part and parcel of crimes at a distance, where the perpetrator doesn’t see or have to face the misery he or she has caused, usually is never apprehended, and even if caught is seldom punished in any degree of relation to the harm caused.  It’s just another aspect of the technological desensitization of society, which includes rampant violence in every form of entertainment, sensationalistic news, computer/video games glorifying crime and violence, and the possibility of drone attacks all over the world.

Me, I’d just settle for an active virus-defense system that would immediately wipe the computer of anyone sending me such a virus or worm.

 

The Value of Prevention… and the Question of Responsibility

One of the functions that government performs best – in the sense that it can seldom be performed on a society-wide basis by any other entity – is the one that citizens often have the least understanding of and/or the least willingness to support, with one or two notable exceptions.  That function?  To keep bad things from happening… or from getting worse once they happen.

Examples of such are police protection, trash collection, clean water, and effective sewer systems.  All of these are well-accepted government services with a large element of prevention embodied in their function. Other preventive government functions are not so well understood or accepted.

At one time, environmental standards were fought tooth and nail by industry.  Now various industries fight new or tighter emission standards, but those standards are designed for one function – to prevent the emission of pollutants that harm people.  Before those standards, we had rivers where nothing could live, and some that even caught fire, air that caused hundreds of thousands of deaths, water supplies containing virulent carcinogenic chemicals… and so forth.

More than a few people have complained about the massive federal government spending on counter-terrorism, now estimated at $150 billion annually, just in the United States.  In 2011, the most recent year covered by the Global Terrorism Index, there were 4,564 terrorist incidents worldwide that led to 7,473 deaths.  In the United States, there were actually more terrorist attacks in the 1970s than in any decade since – although the 9/11 attack was the deadliest in U.S. history – and since 9/11 the overall number of successful terrorist attacks has continued to decline, almost certainly due to increased security measures and, some would say, a certain restriction on personal freedoms.  But consider this.  Since 9/11, only 37 people have died from terrorist attacks and assaults in the United States.  While this apparatus didn’t prevent those deaths, how many others has it prevented?  How can anyone tell?

Vaccination is another area of prevention, although some parents still don’t understand vaccination or the need for it, but that’s an area where some quantification does exist. For example, even today, according to the World Health Organization, over 100,000 people die every year from measles, yet there have only been a few hundred cases annually in the United States over the past several decades, almost all of which occurred in unvaccinated individuals.  Before the development of the vaccine, there were often close to a million cases a year in the U.S., and as many as 7,000 deaths.  More recently, nearly a million people died world-wide annually from measles in the years before 1999, when more wide-spread use of vaccines became possible.  Yet in recent years, there have been parents who insist that the vaccine is more deadly than the disease, despite long-standing figures showing that mortality from measles ranges from one death in a thousand cases among healthy and well-nourished individuals to as high as 300 in a thousand (30%) among weakened or malnourished individuals.  By comparison, severe side effects from the vaccine occur in fewer than one in a million doses, and fatal side effects are so rare that they cannot be quantified.  For all that, some parents still insist that their child is safer without being vaccinated.
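
The arithmetic here is stark enough to show in a few lines. A minimal sketch, using only the figures just cited and assuming, for the comparison, that an unvaccinated child actually contracts the disease:

    # Risk comparison using the mortality and side-effect figures cited
    # above.  Assumes the unvaccinated child actually contracts measles.

    measles_mortality_low  = 1 / 1_000       # healthy, well-nourished
    measles_mortality_high = 300 / 1_000     # weakened or malnourished
    vaccine_severe_risk    = 1 / 1_000_000   # severe vaccine side effects

    print(measles_mortality_low / vaccine_severe_risk)    # about 1,000x the vaccine risk
    print(measles_mortality_high / vaccine_severe_risk)   # about 300,000x the vaccine risk

Even if a child’s lifetime chance of contracting measles were only one in a thousand, the expected risk would still favor the vaccine at the low end of the mortality range.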

Another area of successful prevention is that of automobile safety. The all-time high in automobile deaths was almost 55,000 in 1972, when the population was a third lower than it is now, but by 2011, that toll had dropped to 32,367, the lowest total in 60 years. Since 1960, the number of vehicles on the road has tripled and the population has increased by 50%, yet automobile fatalities per 100,000 people have been halved.  The cost?  Adjusted for inflation, the average new car costs roughly 140% more than a 1973 new car, the year the first significant mandated federal safety standards were imposed.   Assuming no such standards had been implemented, a conservative estimate suggests that we would have seen roughly a half-million more deaths than actually occurred, and most likely at least as many additional injuries. The problem with trying to quantify the costs is that it’s impossible to determine how much of the reduction in fatalities comes from improved design and how much from safety features and other factors, such as seatbelt laws.  That preventive measures have had a huge impact isn’t even in doubt, but we still have thousands of deaths a year because drivers don’t wear seatbelts, either because they don’t think it can happen to them or because they’re exercising a perverse form of civil disobedience.
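
That half-million figure is easy to sanity-check with rounded numbers. A crude sketch follows – the population figures are approximations, and the true year-by-year trajectory obviously varied:

    # Hold the 1972 per-capita fatality rate constant and compare it
    # with the 2011 actual.  Populations are rounded assumptions.

    deaths_1972, pop_1972 = 55_000, 210e6
    deaths_2011, pop_2011 = 32_367, 311e6

    rate_1972 = deaths_1972 / pop_1972       # about 26 deaths per 100,000
    projected_2011 = rate_1972 * pop_2011    # roughly 81,000 had nothing changed
    gap_2011 = projected_2011 - deaths_2011  # roughly 49,000 in 2011 alone

    # Even if the gap averaged only a third of its 2011 size across the
    # ~39 intervening years, the excess tops the half-million cited.
    print(round(gap_2011), round(gap_2011 / 3 * 39))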

Similar questions arise in healthcare.  Some critics have pointed out that the largest cause of death in the United States is heart disease, followed by cancer, strokes, and hospital infections.  Yet the most effective form of prevention is a healthy life-style, particularly avoiding obesity, tobacco use, and excessive consumption of alcohol while engaging in regular exercise. For all that knowledge, over forty million people still smoke, over 30% of the population is obese, and excessive consumption of alcohol is a problem faced by at least 15% of the population. The cost of a single day’s hospital treatment for someone with a suspected heart attack can easily exceed $5,000, and the course of treatment for an actual heart attack can run many times that.  As a nation, we – or our insurance carriers, or both – spend an estimated $500 billion in healthcare expenditures that could be greatly reduced if more people made, or were able to make, a greater effort toward a healthier lifestyle.

Some kinds of prevention, such as requiring vaccinations, drastically reduce death rates at a tiny fraction of what burial costs alone would be.  Others, such as automobile safety features, are still obviously cost-effective, but both are effective because the prevention is not only required but can be largely implemented.  Basic environmental standards are clearly cost-effective, but regulators and attorneys continue to argue about the need for tighter or additional environmental regulations and whether the health and environmental improvements justify the costs to those who must comply.  Nonetheless, some kinds of prevention can only be accomplished by government.  No individual, for practical purposes, can prevent air and water pollution, require automotive safety standards, or ensure clean drinking water and safe sewage disposal.

In healthcare, the matter is even stickier.  Healthcare providers – or the government – not only cannot require people to adopt a healthy lifestyle, but are also greatly limited in requiring people who maintain unhealthy lifestyles to pay their full share of the additional healthcare costs such individuals incur. In fact, the current direction of U.S. healthcare is away from requiring individual responsibility, even as a host of government regulations require it in other areas.

Prevention — who pays for it?  Who should?  How much? And to what degree should people be made personally responsible for their own failures to prevent the preventable?

Another Look at U.S. Priorities

A recent Associated Press news story highlighted a fact that we all know – CEO pay has been going up again ever since a brief two-year decline following the initial 2007 economic meltdown, and is now at its highest level ever.  In 2012, according to data from Equilar, an executive pay research firm, the “average” CEO made $9.7 million, up 6.5% from 2011.  By comparison, the pay for all U.S. workers rose an average of 1.3% per year over the last three years.

Even more interesting is the fact that the two highest paid CEOs were from the entertainment and media industry, with the highest compensated CEO raking in just over $60 million [not including deferred stock compensation].  In fact, five out of the top ten were in entertainment and media.  Another interesting fact is that the area with the highest average CEO pay is health care, while the lowest is that of public utility CEOs, not that they’re exactly impoverished with an average pay packet of $7.5 million.

There are a number of conclusions one might draw from this, but the one that stands out, at least to me, is that the highest paid executives come from the field that provides the least tangible value to its consumers.  We need food, water, shelter, power, heat, and medical care.  We don’t physically need packaged entertainment.  While everyone complains about the costs of health care – and in most cases those costs are far too high – especially when one considers the pricing model of the pharmaceutical industry, where U.S. consumers foot the bill, and the rest of the world gets lower-cost prescription drugs – health care does provide a tangible benefit and has improved our lives.  I’m not sure we can say that about the U.S. entertainment industry.

But entertainment – and today’s media is in fact entertainment, including almost all so-called news – obviously fills a psychological need – and one for which people are willing to pay – and one that is extraordinarily profitable – just like the illegal drug industry.  Come to think of it, there’s a certain similarity.  Both have products that make their consumers feel good, and both have negative long-term effects… and the content of both is essentially unregulated… and both are highly profitable for those at the top.

And like it or not, how we as a culture spend our money and reward those who provide goods and services says more about us than we’d like to admit.

Selective Raises in Higher Education

Last week the head of the Utah State Board of Regents proposed pay hikes for all of the college and university presidents in the state system, by as much as 24% in one case.  The reason cited was that the state has trouble keeping good university presidents.  The past two presidents of the University of Utah now make far more heading large universities elsewhere, and the president of Southern Utah University is leaving to take the head position at Eastern Kentucky University at double the salary he made in Utah.  Keeping Utah education “competitive” makes sense, so far as it goes.  The problem is that it stops with the upper administration.

Faculty salaries at state universities were frozen from 2008 to 2010, and faculty members have received raises of one percent per year for the past two years, with another one percent increase scheduled for the coming school year.  This wouldn’t be all that bad, given the current economic climate, except for the fact that the salaries of existing faculty members have been frozen for something like six of the last twenty years, and annual raises have exceeded 2% only in about three of those twenty years – and faculty salaries on average are in the lowest twenty percent nationwide.

A number of Utah universities have dealt with the salary cap by filling the positions of departing or retiring faculty partly with more adjuncts and partly by setting much higher salaries for new faculty, so that longer-term and more loyal faculty effectively get penalized… and so that good professors who don’t have to worry about spouses’ jobs or family connections have a tendency to depart for greener pastures, none of which helps improve faculty morale or higher education.

Yet the Utah legislature, which continually touts education as a priority, spends so little per pupil on elementary and secondary education that even Idaho – the next lowest in the United States – spends nearly  20% more on each student than does Utah.   Utah’s public school spending per pupil is 43% below the national average, and the fact that it has the most crowded classrooms is just one reflection of that.  So is the amount of remedial help high school graduates need when they reach college.

The same sort of mentality applies to the legislature with regard to higher education as well. There’s a great deal of lip service, but a real reluctance to provide funding.  And when university presidents raise money from private donors for needed facilities, the legislature balks at providing the funding for operating and maintaining such facilities.  At the same time, part of the universities’ annual state funding is based on enrollment growth.  So… let’s get this in perspective.  They want more students with less funding for each student, and they require tuition increases, adding to the burden on students, while underpaying faculty, effectively forcing universities to court donors for funds to build needed facilities that the legislature doesn’t want to maintain  … but they want to reward the university presidents.

Does that sound familiar?  Of course.  It’s the current big business model.  As one critic suggested, Utah really shouldn’t be applying the “big business” model to education, not if it wants to improve education.

But then, does the legislature really want that… or just to create the impression that it cares about real educational improvement?  After all, it’s easy to pay the CEO more… and much, much harder and more costly to fix the larger problems.

The Coming Demise of the “Now” Culture?

Human beings have always been creatures of the present, as exemplified by the old saying, “eat, drink, and be merry, for tomorrow we may die.”  Admittedly, that was originally a soldiers’ mantra, but it is beginning to appear that it’s become almost a way of life now, especially in the United States.

The most obvious aspect of this is texting and tweeting, where people literally risk death to get largely meaningless messages right “now.”  Emails have replaced letters, and a plethora of text abbreviations have proliferated because so few want to spell out phrases – or can – and the result, regardless of protests to the contrary [which have also proliferated], is that language has been not only truncated but cheapened as more and more electronic communicators adopt simplistic abbreviations, rather than attempting to take the time to express their own feelings in their own words.  But then, it may be that they’re simply incapable of doing so.

But there’s another physical problem created by the “now” nature of the internet.  More and more, those who use it are turning from text to visual images, not to mention the various real-time streaming features, all of which consume enormous amounts of bandwidth.  In less than a decade, even with all the planned expansions, the entire internet/world-wide-web is likely to come to a screeching overload/traffic jam halt… unless tens of billions of dollars are invested in new and expanded infrastructure or some new compression or routing routines are adopted.  Even so, the speeds of today may soon be a thing of the past.

As I’ve noted a number of times, the business/corporate sector has been totally co-opted by the “now.”   Long-term planning is 18 months.  The value of a corporation is strictly based on current stock prices, revenues, and sales, and at the slightest whiff of news – good or bad – that value instantly changes. Corporations have spent hundreds of millions of dollars, if not billions, to gain an advantage of little more than nanoseconds in securities trading.  Yet, over the past few years, we’ve had several “flash crashes” in the stock market, the last one resulting from phony information fed from a hacked AP account.  And yet, for all these warnings, the need and desire for “profit now” has resulted in even greater reliance on high-speed, algorithm-based computerized program trading… putting the global banking and financial system at even greater risk.

We have a national infrastructure crisis, with tens of thousands of bridges needing repair, a national power grid that has become increasingly overloaded and fragile, scores of cities with inadequate highway and public transportation systems, an air traffic control system that is already antiquated and susceptible to disruptions, a hundred nuclear power plants with nowhere to dispose of their spent nuclear fuel, civic water systems losing billions of gallons annually to leaky pipes and conduits – and a political system that won’t deal with any of it because the politicians are too fearful of a public that opposes any tax increases now, regardless of the long-term costs and implications.

As more than a few readers have noted, we also have a parenting problem.  Because far too many parents don’t want to deal with the present “unpleasantness” of disciplining their children, and because too many of those same parents don’t want anyone else to impose restrictions on those children, we have marginal discipline, if that, in far too many schools, and far too many young people growing up with little or no idea of the conduct requirements necessary to obtain and hold a job – such as showing up for work every day and on time.

Although everyone pays great attention and lip service to education, the emphasis in practice is almost totally on the present.  How do we raise test scores?  How do we get graduation and retention rates up now? Or at the collegiate level, how can we change education so graduates get jobs now?  Everywhere is the complaint that the cost of higher education falls too heavily on the students, but what is the reaction from state legislatures, who used to fund a significant share of the costs of state universities?  No one wants to raise taxes now, so we’ll hire more part-time adjuncts at near-starvation wages and continue to raise tuition.

And what are the other proposed popular solutions to problems in education?  Let’s reward teachers for improvements in testing, graduation, and retention.  Just where is the emphasis on critical thinking?  Or the discussion of what kind of education is relevant for what types of learners?  Or of what type of education will foster students’ ability to keep learning once they’re out of the education system – something that’s particularly relevant given that, according to an NCES study, 40% of Americans are either functionally illiterate or able to read only on the most basic level.  Other studies show that from 33% to 42% of all college graduates will never read another complete book after graduation.

Another aspect of the “now” culture is the inability or unwillingness to look at the implications of current “now” trends.  The other day my wife walked into one of the largest department stores in Salt Lake City, a store that is one of hundreds in a national chain, and walked out, unable to buy anything, because “the computers are down.”  What happens when sales and inventory, and even climate control [the air conditioning was “down,” too], are tied into systems that, because of their increasing complexity, are more prone to fail?   Last month, I had to reschedule a doctor’s appointment because, when I got to the office, I was informed that the doctor couldn’t see me – or anyone else – because the computers were down and no one could access my medical records.  That’s not a big problem for a routine check-up; it’s a huge problem if an emergency room’s access to records goes down.   Banks are trying to become more efficient through greater reliance on electronic banking and ATMs.  What happens when there’s a power failure or a computer failure?  Or a terrorist hacking of the financial system – especially when so many Americans, particularly those under 30 or so, don’t even carry cash and instead rely on their debit or credit card – or their iPhone – to pay for goods and services?

A recent article in the New Yorker featured an interview with the head of the FBI’s cyber-crime unit.  The upshot was that, with a literal handful of exceptions, essentially every computerized system in the United States is vulnerable to current “spear-phishing” information piracy.  This includes power plants and power distribution systems, air traffic control, public utilities, and nearly all corporate headquarters, including those of high-tech and defense contractors. Even classified plans, such as those for the F-35 Joint Strike Fighter, have been pirated, most likely by the Chinese government.  And yet computer systems security is woefully underfunded at a time when everyone is using more and more computers for more and more information transfer.

Unless matters change, and quickly, I worry that the “now” generation may well end up having neither a “now” nor much of a future.  But then, the future’s not now… so almost no one seems to worry as much as I do.

Shameless Self-Promotion

Over the years, in the military, business, and government, I’ve watched those who’ve been successful, and, especially in larger organizations, government, and academia, an inordinate number of those who’ve been successful in advancing have been shameless self-promoters whose acts and accomplishments are far less than what they represent and almost invariably less than those of at least a few of their colleagues.  So why are such individuals so successful?

First, they deceive themselves into believing that they’re better than they are, and, having done so, they have no doubts about themselves, unlike their more honest and introspective colleagues.  This puts the more honest competitors for the same position at a significant disadvantage. Moreover, those who might well do a better job, and often have in fact done so, are frequently reluctant to be ruthlessly self-promoting, because that kind of self-promotion usually requires denigrating others [subtly, of course, in the case of highly skilled self-promoters] and involves a certain degree of intellectual dishonesty.

Now… there’s nothing wrong with blowing one’s own horn, because, all too often, if you don’t, no one else will. But all too many superiors tend to assume that if someone doesn’t blow their own horn, they have no accomplishments to tout… or that if they tout those accomplishments honestly or modestly, such accomplishments are less than those touted with the equivalent of a full brass band.  And, in all too many organizations, quiet and honest self-promotion gets lost in the din.

Shameless self-promoters are also usually masters at minimizing the accomplishments of others, and the best do it with praise, showing a certain “generosity” that suggests that maybe those accomplishments weren’t that great, but that the individuals are devoted and work hard.

The shameless self-promoters tend to offer simplistic and excessively optimistic solutions, and then blame others when the results don’t materialize, again with that “generous” deprecation, such as “the team tried hard, but…” or “the finance types are good people, but they just don’t understand.”  The combination of self-centeredness and simplicity appeals to many harried superiors, because far too many of those superiors don’t want to hear of difficulties, needs for more resources, etc.

The shameless self-promoters are extraordinarily adept at “sucking up” to those above them who can help them rise in the organization and politely ignoring those who cannot… but once they’ve reached a level where those who once helped them can no longer do so, the self-promoter will quickly and quietly move away and find others even higher up to whom he or she can address praise and interest.

Now… there’s no secret to this general pattern or formula of behavior.  It’s been noted for generations.  What I find so amazing is that it continues to work, generation after generation, in culture after culture.

Beliefs, Fundamentals, and Extremes

From what I’ve observed, and from what history reports, the majority of violence wreaked in human history has been caused primarily by two kinds of people – those who are mentally unbalanced, either temporarily or permanently, and those with extreme views of some sort [and some might claim that extreme views are a form of mental imbalance, but I’m not in that camp]. Since I’m not a psychologist or psychiatrist, I’ll forgo, at least for now, commenting on mental instability, save to note that there appear always to have been those who are mentally unstable, and instead make a few observations on extremism.

To begin with, extremism leading to violence always seems to manifest itself in beliefs of some sort.  These beliefs may be religious, political, social, or even in some other secular area.  This does NOT mean that all extremists are prone to violence, but it does mean that the tendency to violence is far more prevalent in those with extreme views. I’ve certainly met some fairly extreme vegetarians and environmentalists.  I’ve even met a few, believe it or not, extreme pacifists. Certain religions at certain times seem to have created more extremists prone to violence. 

I’d submit that extremism is usually an offshoot of fundamentalist tendencies in an individual’s belief structure, again whether religious, political, or secular.  Those with fundamentalist beliefs of whatever sort share the conviction that adhering to a simple, basic belief structure is the only “right” way.  Such fundamentalists can be violent and vicious, sometimes against others who believe only slightly differently.  One has only to look at the Thirty Years’ War, the internecine violence in England in the time of Henry VIII and his immediate successors, or the present violence between various Shiite and Sunni factions in the Middle East.  This can occur along political lines as well.  The Tea Party faction of the Republican Party has been merciless in trying to weed out moderate and liberal Republicans.  I’ve been attacked quite viciously for suggesting a few restrictions on gun ownership, though I’ve in no way proposed taking away all guns.

So why do so many “fundamentalists” of all sorts get so angry – and sometimes violent, especially when challenged?

Someone I respect, who has great insight, suggested that it is because everyone has a core set of beliefs, and that those who become most violent are those who, first, identify most strongly with a simple and basic set of beliefs and, second, feel threatened by those who do not believe as they do.  I also think that such individuals are easily angered when they feel others do not “respect” their views.  A final factor is economic.  Often, extremists feel that those who do not respect their views will force them to conform, or will take either “rights” or property from them… or they feel that such others already have.  This last case was clearly a factor in the rise of both Communism and Nazism, where the political leadership tied both the real and the perceived hardship of the people to specific groups, who then became the focus of group violence.  Certainly, income and social inequality, real and perceived, have fueled a great deal of extremism.

Do extremists with tendencies toward violence become attracted to fundamentalist extremes, or do groups with fundamentalist extremist views influence their believers toward violence?  Or is it even possible to separate the two?

What is clear from a reading of both the present and the past is that there have always been extremists, that the majority of violence comes from them… and that they go to great extremes [of course] to justify both their views and their acts as necessary, either for the greater good or for self-defense.

What else is new?

Famous, Happy… and Making a Difference…

This is the season for high school and college graduations… and a time when the famous and semi-famous are often invited to provide inspiring graduation speeches.  I’ve never been asked to speak at any graduation, because I’m obviously not even semi-famous enough, but I’ve often thought about what I might say.

 Over the years, I’ve heard students, in responding to questions about what they intend to do, express sentiments such as “I want to be famous.” Or they want to be happy or rich.  The more idealistic among them want to do something meaningful or “make a difference.”  And, of course, all too often, graduation speakers talk about “these talented graduates” and how they can change the world.  They offer inspirational advice that implies close to instant achievement… and sometimes more.

 Now, perhaps I’ve been at the wrong graduations at the wrong time, but the ones I’ve attended, and there are more than a few, given the number of offspring we’ve had, often miss one of the most basic points.  I’m sure that some speaker, somewhere, has made this point, but I suspect that it’s fairly rare. 

All the lofty aspirations that too many students and speakers mouth are results, and sometimes, as in the case of being happy or famous, they’re not even goals that anyone can attain directly.  There is no business and no profession that creates happiness or fame directly [Hollywood and the Internet notwithstanding], and there’s not a single profession called “making a difference.” To be happy, you have to take satisfaction in what you do in life and in the people with whom you associate.  That means acquiring significant expertise in a field, and that usually requires long and dedicated effort.  The same is true of relationships; they just don’t happen.

As for doing something meaningful or making a difference, that generally requires even more education and years of effort.  In his book Outliers, Malcolm Gladwell makes the point that success in every field takes not only innate talent but at least 10,000 hours of dedicated and focused high-level effort.  That’s 10,000 hours of practicing piano or singing, always trying to master more and more difficult pieces, not to mention needing a solid mentor and teacher. That’s 10,000 hours of writing computer code, building your own hardware, and programming it.  That’s generally a minimum of ten years of intensive application in a single field, most of it after finishing formal education.  Athletic success has to start earlier, of course, as does most musical performance, because muscles have to be trained as they develop… but it still takes 10,000 hours.

So… all those lofty aspirations… those of you about to graduate can pretty much kick them aside unless you want to work with incredible dedication for the next ten years, and that’s just the beginning!  As for the less lofty aspirations, such as being happy, achieving them still requires an interest in and a dedication to something that you like doing that pays the bills, because, frankly put, no one stays happy for long without food on the table, clothes on their back, and a roof over their head.

 Talent, intelligence, and ideals are just the beginning of the beginning… and that’s something that’s not often emphasized enough.  Not that anyone’s going to ask me to give that speech.

Authority, Civility… and Civilization

Yesterday Ricardo Portillo died in a Salt Lake hospital.  He died from brain injuries caused by a single punch to his temple.  Why?  Because he was a volunteer soccer referee who had yellow-carded a seventeen-year-old for excessively rough play.  While Portillo was writing up the yellow card, the seventeen-year-old walked up and punched him in the side of the head.  Portillo never saw it coming.  I’d like to think that this sort of violence and anger is unusual.  It’s not.

Everywhere I look, I see a growing anger at authority, whether it’s directed at the referees in sports contests, the police, the government, parents, or children.  And this anger, like the punch that killed Ricardo Portillo, is all out of proportion to what seemingly generated it.  In Portillo’s case, the player wasn’t even ejected from the game; that takes a red card or two yellow cards. The soccer game wasn’t even part of a tournament, just a routine local match.  Week after week, there are stories about angry sports contestants, and even more often, stories about out-of-control parents and fans.  Referees in most sports take incredible abuse.  Why?  Why should they be targets?  They’re doing their best, and, in almost all cases, especially on the professional level, they’re impartial.

Every day, there’s another incident of “road rage,” where someone goes berserk because of another driver’s behavior.  Sometimes, frankly, the anger is understandable, especially when someone tries to cut in front of people who’ve been politely and patiently waiting in line, but in both cases, that of the initial offender and that of the outraged driver, the individuals are over-reacting and wanting it “their way” regardless of the impact on others – and the results are often tragically out of proportion to the offense.

We see the same thing in politics and political rhetoric. Day after day, I read and hear the violence in the words of all too many gun owners – about how the government will have to take their guns from their cold, dead hands, about how the government is out to take their freedoms and their guns.  It’s absolutely senseless. The legislation about which they’re getting so enraged deals with banning one class of guns out of hundreds and limiting magazine size – and, as many gun owners have pointedly told me, the magazine size makes little difference.  Obviously, this rage is fueled by fear, but exactly what is there to fear?  The politicians are so cowed by this rage that they aren’t about to do anything, and there has never been anything close to a national consensus, liberals notwithstanding, in the entire history of the United States, for outlawing all individual ownership of firearms.

This rhetorical viciousness is everywhere, and it often goes beyond rhetoric. The anti-abortion extremists have gone so far as to physically threaten and even murder doctors who perform abortions.  Like it or not, there are two sides to the abortion debate, especially when the life of the mother is endangered, or when she is a victim of rape or incest, if not both.  Yet vitriolic, absolutist rage isn’t going to solve anything. It’s just going to engender more rage.

The anger over health care is another example.  The issue is two-sided.  Failure to have health care destroys people and families… and some people simply can’t afford it.  Likewise, many small businesses face crippling financial burdens.  [I’ll admit I don’t have much sympathy for multi-billion-dollar businesses like WalMart, which hire tens of thousands of part-timers to avoid paying for health care… and then cry poor.]  But the vitriol in the rhetoric is astounding.

According to Theodore Roosevelt, we need to struggle for “true liberties which can only come through order.”  He also stated that “the first principle of civilization is the preservation of order.”  There’s also the quote attributed to Jefferson – “Without order, there is no liberty” – but for all the truth behind it, I can’t find any evidence that he actually said it.

On the face of matters, it would seem evident that without order, societies don’t work, and establishing order requires a certain amount of civil authority.  But more and more, Americans, as well as others around the globe, seem to take umbrage when that authority applies to us – or to those we support.  It’s all right to use drones against foreign terrorists, but not against American citizens.  Miranda rights are absolutely necessary for American-born citizens [read WASPs], but not for immigrants or foreign-born citizens.  It’s fine when the referee punishes a player on the other team, but not one on “our” team, and especially not my son or daughter, and I can yell and scream and threaten the ref.  Or, I can text safely while driving, so there shouldn’t be any laws restricting my ability to use electronic devices behind the wheel, and I’ll get really angry if I get a ticket for it.  Or, if I’m a celebrity caught driving drunk, I can threaten the officer who arrests me.

Regardless of who said it or who didn’t, liberty and order are inseparable in any workable society.  Without liberty, the most ordered society will fail, and we’ve seen that happen time and time again in our own lifetimes.  Likewise, without order, there is no society… and no way to protect liberties – except for the strongest and most ruthless.

So why are so many people so enraged at attempts to maintain order? 

The World – A Better Place Today?

If someone had asked that question a century or so ago, in most places in the world there would have been one of two answers.  In the western hemisphere, or in those areas dominated by western hemisphere culture, the answer would most predominantly have been, “Of course.”  And in the remainder of the world, the answer would most likely have been, I suspect, a variation on “Has it changed?”

 The problem with trying to answer that question today is defining what one means by “better.”  If we’re talking about general health, better nutrition, less deadly and widespread violence, then, in general, the world is a better place, that is, if you’re not in Somalia, Taliban-controlled Afghanistan, parts of Africa… and similar locales. But other aspects of “better” aren’t so clear.

More people can theoretically read, if one defines reading as the ability to decipher the meaning of symbols in print… but, at least in the United States, based on what I and all too many others have seen in higher education, high-level comprehending literacy and the ability to concentrate on written material have declined even as technical computer skills have increased. The retained knowledge base of most individuals has also declined, most likely because any fact is easily found through smartphones or computers.  Better or worse?  That depends on the definition… and the priorities behind the definition.

 There are certainly more nations where citizens can vote, and according to various foundations, in general there’s more freedom, but given the political structures in many countries, that “freedom” often means little real choice, which means that matters may be “better” politically, but not nearly so much better as the Pollyannas claim.

In the high-tech western nations, child labor is rare, and air and water pollution are far less than they were a century ago… but in all too much of the world, those conditions are likely worse.  Whether matters are better depends on where you are… and how high – or low – your income is.

 The problem with deciding whether the world is a better or worse place is that most of us decide based on where we live, and no one place is representative of the world.  More troubling than that is the fact that most of those who can make their views known about the state of the world are those who are anything but representative, because in a media intensive world, the vast majority of those who can even participate are the comparatively more affluent and advantaged. This isn’t anything new; it goes back as far as the invention of writing because, then, only the advantaged could write [and even the slaves who served as scribes were more advantaged than most others].

 In the end, it’s a good idea to remember that “better” is a comparative, and that it all depends on what is being compared by whom… and for what reason.

Absolute Rights?

Absolutes?  I’m skeptical of them, if not downright hostile. Sometimes an absolute is a good guide.  After all, as a general matter of principle, it is not a good idea to go around taking other people’s things or shooting people. Or imprisoning them.   But… as I’ve noted on more than a few occasions, human beings have this desire for things to be black or white, absolutely good or absolutely evil.  We don’t live in a black and white world.  We live in a world filled with all shades of color and, for that matter, innumerable shades of gray, and we – and our societies – have to live in that world and, if we want even a modicum of civility and civilization, we have to create customs and governments that recognize that those shades and colors exist.

 The other day I got a posting on the blog insisting that the right to bear arms was a constitutional right and that my proposals to license and regulate firearms would negate that right because a constitutional right could not be restricted or taxed and still remain a “right.”  After I put my jaw back in place, I thought about the naiveté; the lack of understanding of what society is; the lack of knowledge about what the Constitution is and what it established, and what it did not; and the total self-centeredness represented by that comment… and the fact that all too many Americans share those views about “rights.”

 First, we need to start with the Constitution itself, and the first ten amendments, popularly known as the Bill of Rights.  The First Amendment states that the Congress shall make no law “abridging the freedom of speech.”  But more than a score of U.S. Supreme Court decisions have established that the freedom of speech is not absolute, especially where that freedom harms others or has the clear potential to do so.

 The Fourth Amendment prohibits “unreasonable” search and seizure and states that a search warrant cannot be issued without “probable cause,” but again, a number of Supreme Court cases have made clear that there are exceptions to those requirements, i.e., that the Fourth Amendment is not an “absolute right.”

The same is true of the Second Amendment. One of the earliest Supreme Court decisions involving the Second Amendment, issued in 1876, stated that the Constitution does not establish the right to keep and bear arms, but affirms an existing right.  A number of other Supreme Court decisions followed, establishing that the federal and state governments can place reasonable limits on that right, and in 2008 the Heller decision stated that “the right is not unlimited. It is not a right to keep and carry any weapon whatsoever in any manner whatsoever and for whatever purpose…”

 Those who object to the Supreme Court decisions in such cases often complain that the Court is perverting or destroying the Constitution.  Yet the Constitution plainly states that “The judicial Power of the United States shall be vested in one supreme Court…” and that “The judicial Power shall extend to all Cases, in Law and Equity, arising under this Constitution, the Laws of the United States, and Treaties made…”  In short, like it or not, in cases of dispute about what is or is not Constitutional, the Supreme Court decides.

 Now… people can complain about such decisions, and they can try to change the laws or try to keep new laws on such subjects from being enacted, but what they cannot claim – not accurately, anyway – is that such restrictions are “unconstitutional.” Some will then reiterate the idea that any tax or restriction negates a “right.”

What they seem to ignore or forget is that the entire concept of an unfettered “absolute” right is contrary to the entire idea of what we call civilization.  Of course, the fact that so many people want to assert their individual and “absolute” rights in so many areas suggests that civilization itself may be endangered. Take the idea of absolute property rights.  We do not allow individuals totally unfettered rights to property. A business or individual cannot dump whatever trash and toxic chemicals it wants into the river or stream that flows through its property.  As a society, we recognize, at least in theory, that many individual actions can adversely affect or kill others, and we attempt to restrict such actions because it is all too clear that there are too many individuals who will not restrict their own actions for one reason or another. Now… one can complain that there aren’t enough restrictions, or that there are too many, or that those that exist are too onerous, but the fact that some restrictions are necessary for any society to survive has proved, as the founding fathers put it, “self-evident.”

In the end, anyone who declares that he or she has any “absolute” right is merely declaring that their “rights” transcend the rights of others.  “Your right” to free speech through four-hundred-decibel speakers denies your neighbor’s right to a decent night’s sleep.  Your right to dispose of your wastes any way you want fouls the stream and denies those downstream equal rights to clean water.  Your right to smoke in close quarters endangers someone else’s health.

 Anyone who claims an inviolable absolute right either doesn’t understand the requirements of a civilized society… or puts what they think are their “inviolable rights” above everyone else’s inviolable rights.  Either way, it’s dangerous for the rest of us, not to mention being a form of narcissistic denial of reality.