The Presumption of Competence

Once upon a time, competent students got a grade and competent employees got a paycheck. For the student, depending on how many years ago this took place, competent work earned either a “C” or a “B.” For the employee, the paycheck didn’t change with competent work. That was what was expected for competent performance.

In recent years, however, students and younger employees alike seem to want more than mere acceptance of competence. College students’ evaluations of teachers are filled with comments like “didn’t make me feel special” or “expected too much” or “failed to encourage student self-esteem.” In addition, most students seem to think that showing up and presenting merely competent work merits an “A.” An ever-increasing number fail even to buy and read the required textbooks for their classes. And yet grade inflation continues in both high school and college. In many areas of study, such as English literature and writing, students in general know less and write far less capably than their predecessors did. Fewer and fewer business students have any innate sense of estimation, and more and more seem lost without computers and calculators. Part of this is the result of a greater percentage of the student population going on to college, many of them falsely encouraged by too much cheerleading, too little emphasis on competence, and a society that tends to punish teachers who insist on excellence and the mastery of basic skills.

We’ve seen the same thing in the financial community, where so-called excellent performance — that later turned out to be even less than competent — was rewarded with bonuses ranging from the hundreds of thousands of dollars into the millions. The last time I checked, the minimum salary for a professional NFL player was something like $400,000. A recent study cited in the Wall Street Journal observed that, given the structure and requirements of most large public corporations: (1) few CEOs were truly excellent; (2) excellent CEOs could make a slight positive difference greater than merely competent CEOs in a comparative handful of instances; (3) merely competent CEOs were adequate for the job in the majority of cases; and (4) even terrible CEOs took a while to destroy a company, except in a few exceptional cases. Yet corporate boards all presume that their CEO is excellent, and that is seldom the case.

From what I can see, fewer and fewer Americans, especially the younger ones, seem to understand that every job requires basic competence and that doing a job competently shouldn’t have to result in cheerleading, bonuses, constant positive feedback, and continual promotions. Then again, if that’s what it takes to motivate someone to do a job, maybe that’s not what he or she should be doing. Rather than trying to bribe people like that, maybe their superiors should just fire them. As for the students, a lot more Bs, Cs, or even Fs wouldn’t hurt either.

They Did It All by Themselves

The other day I read a short news story about the success of the singers at the local university in a regional competition. The story highlighted each of the singers, and the only mention of their background was the name of the university. I said something about that to the head of vocal studies at the university, and she said, with a rueful smile, “They did it all by themselves.” Her unspoken point was, of course, that these students hadn’t gotten there all by themselves. Each had a professor, or several, who spent hours each week with him or her going over diction, tone, phrasing, etc., not to mention the classes in theory, literature, methods… and all the rest of the curriculum. Then, to top matters off, when the subject came up later among another group, someone else said that the Music Department was so fortunate to have such talented students, as if all their professional education meant nothing at all.

While this is just a small example of a problem that’s much larger, it did get me to thinking. Over the years, I’ve watched various sports, and I’ve found it amazing to see how a quarterback, for example, who’s done well with one team, suddenly doesn’t do so well with another, while another who was considered washed up with a former team shines with his second team. A good part of the answer is that he didn’t do it all by himself. You can be the best passer in the world, but it won’t matter if the offensive line can’t or won’t give you the time to throw.

Likewise, in the financial and business world, a great deal of media focus and excessive salary and other compensation goes to the CEO. But just how much of that is really deserved, and how much goes to all those below the CEO who did all the grunt work that makes things work out well? Microsoft seems to be doing just as well now that Bill Gates isn’t in charge. Might that not indicate that, while he came up with the initial ideas and entrepreneurship, for the last decade or so, he really didn’t do it all by himself?

As an author, which is one of the more solitary occupations these days, I still can’t do it all by myself. I need an editor to catch any stupidity that might linger in the manuscript, a copy-editor to catch the inevitable typos and stupid little mistakes, someone to publish the book, someone else to provide a cover that conveys the idea/impression of what is between the covers, and a whole lot of bookstores and booksellers to carry and sell the books. And that’s what’s needed for a one-person operation, since, contrary to some popular opinion, books do not just produce themselves, walk off the shelves, and carry themselves to the check-out registers.

Yet… in field after field from collegiate activities to professional sports, to education and business, there’s this myth that people “did it all by themselves.”

Did they really? All of them?

The Hidden Costs

Over the past decade, especially, the advocates of “the market system” have pushed the claim that free markets are the best and most efficient way of allocating resources and determining social and political priorities. Their rhetoric is true, yet extraordinarily misleading at the same time. Market systems, even malfunctioning ones, do allocate resources far more effectively than any “command and control” system, as the failure and/or transformation of virtually every dictatorial or government-directed system has demonstrated.

Unfortunately, this “efficiency” is only comparatively better than other systems and certainly not nearly as efficient as its advocates claim. Winston Churchill once commented to the effect that democracy was the worst system of government, except for everything else that had been tried. So, too, is the so-called “free market” system one of the worst ways of allocating resources — except for all the alternatives.

So-called “free market” systems have a number of severe systemic problems. Some have become very obvious over the past year or so. One defect is that prices are determined on the margin at the moment. This means, in real terms, that unregulated prices can spike or crash literally in minutes, and that the effects on society can be devastating. Another is that the balance of supply and demand, if not mitigated by society, can result in millions without jobs or incomes, and a high concentration of wealth in the hands of a few.

A third, largely overlooked, and, I believe, even greater flaw in so-called market pricing is that such pricing is highly inaccurate in assessing the costs of goods and services. This inaccuracy comes from the fact that the prices of goods and services do not reflect the so-called externalities, or, as I would more accurately term them, the hidden costs. The examples of such costs are numerous. Until the creation of environmental laws, factories were allowed to degrade and pollute the environment without restriction, and millions of people either died or had their health permanently injured. Without employment safety laws, employers could, and did, keep costs down by using the cheapest and often the most dangerous equipment and practices and did not have to shoulder any significant fraction of the costs of workplace injuries. These types of externalities have been known now for decades and are commonly recognized, even if the means by which they have been addressed are often deplored by the more conservative advocates of “free markets.”

The problem of hidden costs, however, is far from being completely solved, or even addressed in many cases. In some areas, this is recognized, as on the environmental front, where advocates of the “free markets” continue to oppose measures to deal with global warming and air pollution. In other areas, there’s little or no awareness of such costs.

Take the issue of influenza or swine flu. The causes are known. The means of preventing its spread are relatively well-known as well, but health authorities are becoming more and more concerned about the danger of pandemics. Why should this be a problem? If sick people just stayed home or in the hospital until the disease ran its course, how could they infect others? Except all too many people can’t do that. They don’t have health insurance. They won’t get paid if they don’t work. In this time of downsizing and ever-greater worker efficiency, there’s often literally no one else to do the work. Take a very simple example. A project/report of some sort is due. Because of downsizing, there’s exactly one expert/analyst left who can do it. If the report isn’t done, all sorts of negative events occur… violations of law, penalty costs, loss of revenue-bearing contracts. The key person has a mild case of the flu, comes to work, works through the illness, and gets the job done. In the process, he or she infects three or four other people, one of whom infects an asthmatic colleague or friend who dies. Does that death, or all the other 13,000 flu deaths reported this year so far, ever show up as a negative cost on the business’s or agency’s balance sheet?

How many salmonella deaths have resulted from unsafe food industry practices directly attributable to “cost-minimization”? How many heart attacks from work pressures caused by too few people doing too much work? There have been scores of lawsuits over the past three decades, in which major corporations were found guilty of manufacturing products that led to user deaths or guilty of practices that created deaths or ill health for thousands of people — and yet the same “free market” cost-minimization pressures persist and the same kinds of practices continue.

So… yes, the so-called free market is better than the alternatives so far tried, but let’s not have any more rhetoric about how wonderful it is and how much better it would be if the government just got out of the regulation business. We’ve already been there, and millions of innocents paid the price… and to some degree, millions still are. There’s definitely room for improvement, because, so far, markets don’t capture all the hidden costs of production and operation, and until they do, so-called free markets won’t be nearly so accurate as adherents claim they are in balancing prices and costs.

The Fascination with the “New”

I’m convinced that, with regard to innovation, most human beings tend to fall into three groups — those who are fascinated and intrigued with the newest gadget or technology, those who want nothing to do with it, and those who will employ it if it’s not too much bother to learn to use. All too often, though, especially in the United States, each new tool, gadget, or methodology is over-hyped by its proponents to the point that, initially, it tends to get either adopted willy-nilly or rejected out of hand.

As for my own personal preferences, I admit I’m a tool-user. If the new gadget doesn’t take too long to learn and will accomplish something I need done better and faster, I’ll consider it. If it takes a lot of learning for marginal improvement, chances are I won’t adopt it until there’s something better around… or until I’m forced to do so. One reason for my attitude is simply that almost all new technology doesn’t just do the “old stuff” better and faster [and sometimes it doesn’t even do that], it also incorporates all sorts of other capabilities, and those, in effect, require the individual to do more and more, often faster and faster, and usually for less compensation.

Take the internet and high-speed connections. These days, it’s expected that an author will have a website and a blog and answer at least some email [if only from editors and agents]. By its nature, email almost demands a quick response, and if you don’t respond quickly, you get more email. Having email access, even with the best spam filters, means spending some time deleting spam, if only to allow you to continue receiving the emails you need to receive.

Once an author commits to a website and presence, he or she commits to more time spent on something other than writing for actual income, and that time has to come from somewhere, either from previously personal time, from the full-time job, or from writing. From what I’ve seen, while there is some financial return [one hopes] from exposure to new readers, there’s also the “tar-baby” syndrome. That is, you’re stuck with it, because if you retreat from that presence, you’re ungrateful, or you’ve become isolated or all-too-egocentric, or fame has gone to your head, or…

The electronic forum doesn’t replace all the other aspects of writing. An author still has to produce, edit, and revise. If the author attended conventions, he or she still has to, because the tar-baby effect applies there as well. The electronic world just adds another dimension and another requirement for effort and professionalism — and this is true across most professions requiring paperwork and communications.

So why is it that everyone is so enthusiastic about so many devices and innovations that gnaw away at that most precious of personal resources — time?

The Non-Extrapolated Future

In today’s world, everyone predicts the future. We don’t think of it in that way, of course, but we do. If you go grocery shopping for special pasta to entertain friends on the coming weekend, you’re essentially predicting the future — that you and they will be there and healthy and will enjoy a pasta dinner. Businesses that plan next year’s product line are making predictions about the future. Making contributions to a retirement plan is another form of prediction. In a way, so even is voting for a political candidate. Science fiction writers try to make a living by predicting the future in a fashion that is, hopefully, both intriguing and enjoyable. Economists make their living by trying to predict economic trends.

Most of this kind of prediction is based on extrapolation, on taking existing knowledge and trends and merely extending these trends into the future. Such past-based extrapolation can at times be not only inaccurate, but extremely dangerous, as has been the case with the business and economic types who predicted that good economic times, ever-rising housing and stock prices, and enormous personal deficit financing could continue indefinitely.

Extrapolation can be very effective, if used cautiously, because technology and semi-basic human social patterns normally do not change that quickly, and it’s usually years, if not decades, before “new” technology is fully deployed and adopted throughout society. In addition, most changes are either incremental or cosmetic. For example, most western men wear trousers of some sort most of the time, and it’s been almost two centuries since trousers replaced breeches and stockings. In industrialized nations, the internal combustion engine powers most surface ground transport and has for almost a century. And in most of the world, women remain largely subject to male control and oversight, and in the rest of it, most men — and some women — are resisting further changes in the balance of power between the genders. For all the claims about human adventurism, on balance, we’re a conservative species, and that makes biological sense… until or unless our environment changes radically.

The problem with this mental conservatism is that, when the future cannot be accurately predicted on the basis of extrapolation from past experience, most people, including experts, tend to get it wrong. Conservatism and experience have combined in most people so that for years, the majority tended to be skeptical of global warming and the possible speed of climate change. Many still are, even though the latest measurements of arctic ice and glacier melting in Greenland and Antarctica indicate that the “radical” estimates of the effects on the oceans were far too conservative. The same thing happened with last year’s financial melt-down. But attempts to predict massive and radical change can be equally wrong. Forty years ago, most “experts” were convinced that space travel would be commonplace — and yet, it’s been something like 37 years since any human being even stood on the Moon. And for all the predictions of an “information singularity” or “spike,” it still hasn’t occurred.

As both a writer and as an economist, I’d love to be able to predict accurately beyond the extrapolated future, and so would many, many others, but few ever have, successfully, and perhaps, in some ways, that’s for the best. Cassandra could prophesy beyond the expected, according to Greek myth and the playwright Aeschylus, but her curse was that no one ever believed her, especially when she warned the Trojans against bringing the wooden horse into Troy.