One Person’s Waste [Part I]

During my years in government, then as a consultant dealing with government regulations and environmental and energy issues, and even afterward, I’ve heard thousands of people say that we could just solve the budget problem by getting rid of the “waste” in government.

And when I hear that tired old phrase, I want to strangle whoever last uttered it, because “waste” – at least in the way it’s commonly used – amounts to a tiny fraction of federal or state spending. Now… before you start screaming, let me at least try to explain.

First, I’m defining waste as spending that serves no purpose and accomplishes nothing. Second, I do believe that government spends a great deal of money on programs and projects which have little to do with the basic objectives of government as outlined by the Founding Fathers… and I suspect most intelligent individuals believe something along the same lines.

The problem is that one person’s waste is all too often another person’s gain or livelihood. For example:

The Georgia Christmas Tree Association got $50,000 from the Department of Agriculture for ads designed to spur the buying of natural Christmas trees. To the Christmas tree growers of Georgia, this was not waste, but advertising designed to help them sell trees and make money.

The Department of Agriculture spent $93,000 to “test the French fry potential of certain potatoes.” Do you think the potato growers objected to this?

The Environmental Protection Agency provided $15,000 to create a device that monitors how long hotel guests spend in the shower. Is this so wasteful, given the water crises in the West and Southwest?

And then there’s Donald Trump’s use of a $40 million tax credit to convert the Old Post Office in Washington, D.C. into a luxury hotel. I’m certain that the city would support another tax-paying and revenue-generating hotel.

The Department of Agriculture’s Market Access Program provided $400,000 to the liquor lobby, which used part of those funds to transport foreign journalists to different breweries and distilleries in the southeastern United States. The liquor industry doubtless feels that this will boost liquor exports.

At the same time, there is definite out-and-out waste. According to the Government Accountability Office, in 2014 the federal government made $125 billion in duplicative and improper payments. GAO made 440 recommendations to Congress for fixing these problems. To date, it appears that Congress has addressed none of them.

One waste-watching outfit came up with $30 billion in supposedly wasteful projects for FY 2013, including studies of the threatened gnatcatcher bird species. The only problem with the gnatcatcher “waste” was that such a study is mandated by federal law when an endangered or threatened species may be adversely affected by building or expanding a federal facility.

More to the point, however, is the fact that these self-proclaimed waste-finders came up with only $30 billion worth of waste out of federal outlays totaling $3.5 trillion – so their “waste” amounted to less than one percent of federal spending. Even if Congress addressed all of the GAO’s much more sweeping findings, doing so would reduce federal outlays by less than four percent.
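For anyone who wants to check the arithmetic behind those percentages, a quick back-of-the-envelope calculation, using only the figures cited above and treating $3.5 trillion as 3,500 billion, gives:

\[
\frac{30}{3500} \approx 0.0086 = 0.86\% \qquad\qquad \frac{125}{3500} \approx 0.036 = 3.6\%
\]

Both ratios come in under the “less than one percent” and “less than four percent” marks, respectively.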

Now… I’m not condoning waste in any amount, but when the federal deficit has ranged from $440 billion to $670 billion in recent years, it doesn’t take much brain power to figure out that merely getting rid of all the obvious waste isn’t going to do much to constrain federal spending – and that assumes Congress would agree to get rid of it, which, as an institution, it doesn’t, despite the scores of politicians who claim they’re against waste.

And all those who support a strong national defense should be appalled at some aspects of defense spending. Right now, DOD has stated that as many as 20% of the 523 U.S. military installations are unneeded. This doesn’t even count the more than 700 U.S. bases and facilities outside the United States, yet the present Congress has enacted specific language in the appropriations bill for the current fiscal year that absolutely forbids base closures.

What about my “favorite” airplane, the oh-so-lovely-and-over-budget F-35? A recent report cited DOD officials stating that “essentially every aircraft bought to date requires modifications prior to use in combat.” A plane that isn’t yet combat-ready, to which the government has already committed $400 billion? An aircraft that was outmaneuvered by a much older F-16?

DOD also wants to build a new long-range strike bomber with full stealth capabilities, 100 of them at a projected cost of $565 million each.

As a former Navy pilot, I don’t object to better planes; I do have problems with very expensive aircraft that don’t seem to be better than their predecessors, and especially with attack aircraft that can’t defend themselves. I also have problems with politicians who decry waste but won’t allow DOD to reduce it because that “waste” is in their districts. Those are far more expensive examples of waste than $50,000 studies on laughter or Christmas tree promotions. It reminds me of shell-game misdirection – look at these ridiculous examples of waste, and, for heaven’s sake, don’t look at that man over there behind the curtain… or at the pork in my district. And yet politicians, especially Republican representatives and senators, continue to attack “waste” while doing absolutely nothing meaningful about it… and they get re-elected.

The Religious Selfie

One of the basic underpinnings of religion, almost any religion, is the worship of something or some deity bigger than oneself, and the overt acknowledgment that the individual worshipper is less than the deity worshipped. Some religions even incorporate that acknowledgment as part of liturgy and/or ritual. Such acknowledgments can also be part of “secular religions,” such as Nazism, Fascism, and Communism.

Today, however, there’s a totally different secular religion on the rise, with many of the old trappings in a new form, which might be called the “New Narcissism,” the elevation and exaltation of the individual, or the “self,” to the point where all other beliefs and deities are secondary.

Exaggeration? Not necessarily. What one believes in is reflected in the altars before which one prostrates oneself. Throughout history the altars of the faithful have either held images of a deity, perhaps accompanied by those of lesser deities, or no images whatsoever. While images of private individuals have also existed throughout history, those images or sculptures were created for posterity, or for the afterlife, so that others would have something to remember them by… or to allow them to remember themselves as they were. At one point in time, only the wealthy or the powerful could afford such images. Even until very recently, obtaining an image of one’s self required either the cooperation of others or special tools not particularly convenient to use. This tended to restrict the proliferation of self-images.

The combination of the personal communicator/camera/computer and the internet has changed all that. Using Facebook, Instagram, Twitter, and the rest of the internet, each individual now has the ability to present himself or herself as a virtual deity – and tens, if not hundreds, of millions of people are doing just that, with post after post, selfie after selfie, proclaiming their presence, image, and power to the universe [with all three possibly altered for the best effect].

It’s the triumph of “pure” self. One no longer has to accomplish something for this presence and recognition. One can just proclaim it, just the way the prophets of the past proclaimed their deity. And given what positions and in how many ways people have prostrated themselves before their portable communications devices in order to obtain yet another selfie, another image of self, it does seem to resemble old-fashioned religious prostration.

Of course, one major problem with a culture obsessed with self and selfies is that such narcissism effectively means self is bigger than anything, including a deity or a country, and I have to wonder if and when organized religions will see this threat to their deity and belief system.

Another problem is that selfies have to be current, so everyone involved in the selfie culture is continually updating and taking more selfies, almost as if yesterday’s selfie has vanished [which it likely has] and as if mere memory of the past and past actions means nothing. All that counts is the latest moment and the latest selfie. That, in turn, can easily foster an attitude of impermanence, and such an attitude makes it hard for a society to build for the future when so many people’s attention is focused on the present, with little understanding of the past and less interest in building the future… and more in scrambling for the next selfie.

All hail Narcissus, near-forgotten prophet of our multi-mirrored, selfie-dominated present.

Cultural Appropriation

Over the past several years, there’s been a great deal of talk about the need for “diversity.” So far as I can tell, this means stories set in cultures other than those of white, Western-European males and told through protagonists other than white males. I certainly have no problem with this.

I do, however, have some misgivings about the idea that such stories must always be written by authors from those cultures, and about the equally disturbing idea that when someone who is not a member or a descendant of a culture writes about it, even when it is projected into the future or into a fantasy setting, that is “cultural appropriation,” and a literary sin of the first order. The rationale behind this judgment appears to be that no one who is not a member of a minority or other culture can do justice to representing that culture in a fictional setting.

Leaving aside that fallacious point, what exactly is the point of fiction? Is it just to be culturally accurate? Or to entertain? To make the reader think? And for that matter, how does one determine “cultural accuracy,” especially when there are significant social and even geographic differences within most cultures?

Taken to extremes, one could classify Rob Sawyer’s hominid series, about an alternate world populated by Neandertals, as “cultural appropriation,” since most of us have only a tiny fraction of Neandertal genes. Roger Zelazny’s Lord of Light could easily be classed as cultural appropriation of Hindu beliefs and myths. For that matter, Tolkien certainly used the Elder Edda of Iceland as a significant basis for Lord of the Rings. And I wrote The Ghost of the Revelator even though I wasn’t born in Utah and I’m not LDS [although I have lived here for more than twenty years].

Obviously, writers should take seriously the advice to write what they know, and know what they write, but “non-members” of a minority or another culture may well know and understand that culture as well as or even better than members of that culture. Should they be precluded from writing fiction based on those cultures because editors fear the charge of “cultural appropriation”?

This concern, unfortunately, isn’t just academic. I’ve heard editors talk time and time again about how they want more diversity, but… In one case, the significant other of a Chinese-American born and raised in Hawaii wrote a YA fantasy novel based on Hawaiian myth and offered it to a number of editors. When several agents and editors found out that the writer was not genetically Hawaiian, they decided against considering the book. Several well-known authors have also told me that they wouldn’t have considered the book either, because dealing with Hawaiian beliefs would be too controversial.

Shouldn’t it just be about the book… and not the genetics or cultural background of whoever wrote it?

Teachers

In yesterday’s local paper, there was a front-page article headlining the coming teacher shortage in Utah, to which I wanted to reply, “How could there not be?”

The beginning salary for a Utah teacher in most systems is not far above the poverty level for a family of four, and the average Utah teacher’s salary is the lowest in the United States. Utah spends the least money per pupil in primary and secondary schools of any state in the United States. Nationwide, anywhere from twenty to fifty percent of newly certified teachers drop out of teaching in five years or less [depending on whose figures you trust], and that rate is even higher in Utah. In 2015, half of all newly hired teachers in Utah quit after just one year. Yet studies also show that the longer teachers teach, the more effective they become. Add to that the fact that Utah has on average the largest class sizes in the United States.

The academic curriculum leading to a teaching degree has also become more demanding [at least at the local university], and it often takes even the best students more than the standard four years to complete a course of study that leads to teacher certification, especially if they have to work to help pay for their studies.

Despite the often dismal salaries, study after study shows that comparatively poor pay is well down the list of reasons why teachers walk away from teaching. Almost all prospective teachers know that teaching isn’t a high-paid profession. What they don’t know is just how hostile the teaching environment is to a healthy and balanced life.

Here in Utah, for example, there are state legislators who complain about pampered and lazy teachers. They’re obviously unaware of the unpaid after-school, weekend, and evening workload required to support an eight-hour teaching day. Or of the number of parents who complain about their darling children’s grades – such as the one who wanted to know how his son could possibly flunk an art class [the answer being that said son failed to attend most of the classes and never did a single art activity]. Or of the increasing reliance on testing to determine teaching effectiveness [when the testing itself reduces instructional time, when the test results determine teacher retention and ratings, and when the tests tend to measure factoids and fill-in-the-blank skills rather than thinking or the ability to write even a coherent paragraph].

It also doesn’t help when the local papers are filled with pages and pages about the sports activities of the local high schools, with seldom a word about academic achievements or other successes, such as plays, concerts, engineering competitions, and the like.

Nor is it exactly encouraging when school administrators offer little understanding or support of their teaching faculty. That’s more commonplace than one might realize, and national surveys show it’s a significant factor contributing to teacher drop-out and burnout. Certainly, a number of former students of my wife, the university professor, have mentioned this as a difficulty in their middle school or high school teaching positions.

Finally, what’s also overlooked is that it’s actually more expensive to continually replace a high number of departing teachers than to take the steps necessary to cut the teacher drop-out rate. But given the current public view of education and the unwillingness to make meaningful changes, I don’t see this problem being addressed any time soon. In fact, it’s only going to get worse… far worse.

There’s Always Someone…

I admit it. I did watch the Super Bowl. How could I not, when my grandfather was one of the first season ticket holders back in the days when the Broncos were truly horrible? I can still remember him taking me to a game, and he went, rain, shine, or snow, until he was physically no longer able. I wasn’t able to go with him in those later years, unfortunately, because by then I was working in Washington, D.C.

And yes, I was definitely happy that the Broncos won, particularly since I’ve always felt that Peyton Manning is a class act, but that brings me to the point — Cam Newton’s postgame interview, if it could be called that, which was anything but a class act. Yes, he was disappointed, and he wasn’t the first great quarterback to be disappointed, and certainly won’t be the last.

Newton’s real problem is that he is so physically gifted, and has a mind good enough to use those gifts, that he’s never considered a few key matters. First, in anything, no matter how big you are, how fast you are, how strong you are, how intelligent you are… there’s always someone bigger, faster, stronger, and more intelligent. Second, football is a team game, and the team that plays better as a team usually wins. Third, sometimes you get the breaks, and sometimes you don’t. Fourth, you don’t win just because you have the better record or the better offense – as Denver found out two years ago. Fifth, it is a game, if a very serious one played for high stakes.

Newton also needs to realize that he’s paid extraordinarily well to do exactly the same thing every writer does, except that few of us, indeed, are paid as well as he is. He’s paid to entertain the fans, and while that means winning as much as possible, it also means not pissing everyone off by coming across as a spoiled kid. This is also something writers need to keep in mind.

Given his talent, I’m sure Newton will be a factor for years to come, but it would be nice to see a bit more class when things don’t go well. You don’t have to like losing, but in the end, as even the great Peyton Manning has discovered, we all lose… and the mark of the truly great is to show class both when things go well and when they don’t.