Archive for the ‘General’ Category

Return to the Past?

After finishing a whirlwind tour – seven cities and some of their suburbs in seven days – I’ve seen a trend I noticed years ago becoming even stronger… and more than a little disturbing.  Once upon a time, books were so expensive and hard to come by that only the very wealthy possessed more than a few, and most people had none.  Libraries were few and reserved effectively for the well-off, because few of those less than well-off could read or could manage access to them.

What does that have to do with today or my tour?

Only the fact that, despite such innovations as ebooks and e-readers, in a subtle yet substantive way we’re on a path toward the past insofar as books are concerned.  Yes, millions of books are printed and millions are now available, or soon will be, in electronic formats, but obtaining access to those books is actually becoming more and more difficult for an increasing percentage of the population across the United States.  With the phase-out of small mall bookstores, more than 2,000 bookstores that existed thirty years ago are now gone.  While they were initially replaced by some 1,300 “big-box” bookstores, with the collapse and disappearance of Borders and consolidation by other chains, the number of chain bookstores has now dropped at least 25%, if not more, in the last few years.  Add to that the number of independent bookstores that have closed, and the total shrinkage in bookstores is dramatic.

Unhappily, there’s another aspect of this change that’s far worse.  Overwhelming numbers – over 90%  – of large bookstores in the United States are situated in “destination” locations, invariably near or in wealthy areas of cities and suburbs, reachable easily only by automobile.  At the same time, funding for public and school libraries is declining drastically, and, in many cases, funds for books are slim or non-existent and have been for years.

But what about electronic books… ebooks?

To read an ebook, one needs an e-reader of some sort, or a computer.  In these economically straitened times, adults and children from less affluent backgrounds, especially those near or below the poverty level, have difficulty purchasing an e-reader, let alone ebooks.  Somehow, this fact tends to be overlooked, again, as if reading might not even be considered a problem for the economically disadvantaged.

In the seven cities I visited on my recent book tour, every single chain bookstore or large independent was located in or adjacent to an affluent area.  Not a single major bookstore remains in less affluent areas.  As I mentioned in a much earlier blog, this is not a new pattern, but affluent locations have apparently become almost an absolute requirement for new bookstores.  Yet who can blame the bookstores?  Small mall bookstores aren’t nearly so profitable as trendy clothes retailers, and most mall rents are based on the most profitable stores.  Hard times in the book industry have resulted in the closure of unprofitable stores, and those stores are almost invariably located in less affluent areas.  These economic realities affect WalMart and grocery store book sections as well.  In particular, grocery retailers in less affluent areas are less likely to carry books at all.

But no matter what the reason, what the economic considerations may be, when a city and suburbs totaling more than two million people have fewer than ten major bookstores, with only one major independent, and all of those stores are located in economically well-off areas, I can’t help but worry that we are indeed on a road to a past that we shouldn’t be revisiting.


The Comparative Species

For all our striving as a species to find clear and absolute answers to everything, from what is “right” to the deepest mysteries of the universe, at heart, human beings remain a highly comparative species.  In its best form, this compulsive comparativeness can fuel high achievement in science and technology.  Whether we like it or not, competitive comparativeness fueled the space program that landed men on the moon, the early development of the airplane, even the development of commercial and residential electrification, not to mention untold advancements in many fields.

The worst aspects of comparativeness remind me, however, of the old saying that all comparisons are odious.

In personal affairs, comparisons tend to be subjective and unfair, particularly in politics and business.  The late Richard Nixon was pilloried for taping conversations in the White House, yet White House taping had gone on in several previous administrations.  He resigned under threat of impeachment for covering up the Watergate burglaries, yet cover-ups have occurred in government for generations.  The full extent of the naval oil reserve scandals in the Harding administration didn’t come out for decades, nor did the extent of Jack Kennedy’s philandering in the White House.  While both Kennedy and Nixon had grave faults, in point of fact Nixon had many accomplishments as president, while Kennedy’s sole measurable achievement was averting nuclear war in the Cuban Missile Crisis, yet in popular opinion, there’s essentially no comparison.  The ballyhooed presidential debate between Kennedy and Nixon was another example of the fickleness of comparativeness.  Among those who heard the debate on radio, a significant majority felt Nixon had “won.”  Among those who watched it on television, a majority opted for Kennedy.  Same debate, same words – but physical appearance carried the day.

Likewise, study after study has shown that taller men, regardless of other qualifications, receive more pay and more respect than shorter men, even those with greater ability and achievement, and, interestingly enough, in most U.S. presidential elections the taller candidate has been the winner.

Another example surfaced with the recent deaths of Steve Jobs and Dennis Ritchie.  While the entire world seemed to know about Jobs, and mourn his early and untimely death, only a comparative handful of people seemed to know about Dennis Ritchie, the pioneer who developed the C programming language and co-developed the UNIX operating system – work that made possible the later success of both Steve Jobs and Bill Gates.  Yet news of Jobs’ death appeared everywhere, while Ritchie rated a small paragraph buried somewhere in newspapers, if that.  Although Ritchie’s death was widely mentioned in technical and professional journals, it went almost unnoticed in the popular media.

In the end, the question may be: Is it that comparisons are so odious, or that the grounds on which we make those comparisons are so odious?


Unforeseen Results

Just before I left on this book tour [and yes, I’m writing this on the road, which I usually don’t do], I read an article on how unprepared recent college graduates – and even those receiving advanced degrees – are in terms of what one might call personal preparedness.  The article, by a professional business recruiter, stated that most graduates had little idea of even what to wear to an interview, let alone how to get one.

Then, on one of the airplane jaunts, I read about how college students are moving out of engineering and science courses because “they’re too hard,” despite the fact that the average college undergraduate studies half as much today as the average student did thirty years ago.  To complete this depressing litany, I finished up with an opinion piece by a scientist who lectures occasionally, and who cited figures to show that today’s students have trouble learning anything in science without repeated exposure to the material, because they don’t easily retain what they’ve heard once in the classroom.

But to top it all off, last night I ran into an attorney who teaches part-time at a prestigious southern law school, and we got to talking after the signing at the bookstore.  What she told me was truly astounding.  She set up a class where attorneys in various fields came and discussed the actual practice of law, and where the students, all in their final year of law school, were told to be prepared to ask questions and were given the time and opportunity to do so.  First off, all were buried in their laptops, and not a single one could ask a question without reference to the laptop or notebook.  Second, not a one could ask a follow-up question or one not already prepared on the computer.  Third, not a one engaged in extended eye-to-eye contact with the visiting attorneys.  And fourth, not a single one asked any of the visiting attorneys for a business card, despite the fact that none of them had job offers and all would be looking for positions in six months.  Considering that almost all law firms are becoming very picky about new hires and that many have actually laid off experienced attorneys, none of these law students seemed to have a clue about personal interaction or personal networking.  Oh… and almost none of them actually wore better clothes to that class.

If this is what the new, computerized interactive net-based society has to offer, we’re all in big trouble, and those of us who are approaching senior citizen status may well have to keep working a lot longer for more reasons than economic necessity.


No Objective Truth?

The other day, a commenter on a blog asked if I wanted to write about the growth of a belief structure in American society that essentially denies the existence of “objective truth.”  Actually, I’ve written about aspects of this before, particularly as a subset of the selective use of information to reinforce existing confirmation bias.  But the growing feeling that there is no objective truth, or that scientifically confirmed “facts” remain a matter of opinion – and that everyone’s opinion is equal – strikes me as a disturbing but almost inevitable outcome of the fragmentation of the media along lines corresponding to existing belief structures, as well as of the increasing role that the internet and social media play in the day-to-day life of most people.

The basic ground rule of any successful marketing effort is to get the target audience to identify with your product.  Usually that’s accomplished by positioning the product to appeal to biases and beliefs.  Information – which, outside of stringently peer-reviewed scientific journals, is largely no longer news or factual/objective reporting – apparently need have no more than a tangential relationship to facts or objectivity; its content is manipulated to appeal to its desired target audience.  Now… this is scarcely new.  Modern yellow journalism dates back more than a century, but because the economics of journalistic production limited the number of perspectives that could be specifically pandered to, because the law did have an effect insofar as actual facts were concerned, and because there remained a certain basic integrity among at least some media outlets until comparatively recently, facts were not so routinely ignored or distorted in quite so many ways.

One of the mixed blessings of technology is that millions and millions of people in every high-tech society have access to and the ability to use comparatively sophisticated media techniques (particularly compared to those available even a generation ago) to spread their views and versions of the “facts” in ways that can be appealing and often compelling.  In turn, the multiplicity of ways of presenting and distorting verified facts creates the impression that such facts are not fixed.  The next step for many people is the belief that facts are only a matter of opinion… and since everyone’s opinion is valid… why then, “my” view of which fact or interpretation is correct, or can be ignored, is just as good as anyone else’s.

This “personalization of truth” leads to some rather amazing results.  For example, as the scientific consensus on the issue of global warming has become almost unanimous – first, that such global warming is occurring, and, second, that there is a strong anthropogenic component to such warming – popular agreement with these findings has dropped by almost twenty percent.

Unfortunately, occurrences such as global warming and mechanisms such as oncoming vehicles combined with high-volume earbuds, famines and political unrest, viruses and bacteria, and high-speed collisions are all present in our world.  Consequently, rising sea levels, violent weather changes, starvation, fatalities due to disease among the unvaccinated, and deaths due to failure to wear seatbelts will all take their toll, regardless of the beliefs of those who ignore the facts.

Belief is not a viable defense or preventative measure against climate change, biology, oncoming heavy objects, or other objective impingements upon subjective solipsistic unreality… no matter what or how you believe.


“Downers,” Stereotypes, and Literary Quality

Yesterday, when I was flying back from the World Fantasy Convention in San Diego, I read through two “best-of-the-year” anthologies… and landed in Cedar City thoroughly depressed… somewhat disgusted… and more than a little irritated.  No… I wasn’t irritated that the anthologists hadn’t picked one of my infrequent short stories.  In the first place, I know that it’s unlikely anything I write will find favor with most anthologists [although there have been exceptions].  In the second place, I hadn’t published any short fiction for the year in question.

My anger, irritation, and depression came from the same root cause.  Out of something like fifty stories, all but perhaps ten were downers.  Out of the ten that weren’t, eight were perhaps neutral or bitter-sweet, and only two could be called upbeat.  Now… I don’t have anything against downbeat or depressing stories.  I don’t even have anything against them being singled out as good stories.  Certainly they were all at least better than competently written, and some were indeed superbly written.  And I definitely don’t think a story has to be upbeat to be great or “best-of-the-year,” but after many, many years of writing professionally, and even more of reading all manner of books and stories, ranging from “genre” fiction to the acclaimed classics, it’s clear to me that the excessive citation of depressing stories as literary classics and exemplars of excellence is hardly a mark of intellectual distinction, let alone impartial judgment.

All this, of course, reinforces my feelings about those critics and anthologists who seem to dismiss anything with any upbeat feel or positive aspects… or anything that isn’t “literary mainstream.”

The latest New York Times book section contains a review with an opening along the lines of “A literary novelist writing a genre novel is like an intellectual dating a porn star.”  Supposedly, reviewers who write about books should be able to transcend stereotypes, not reinforce them, but then, snobbery is often based on the enshrinement of stereotypes contrary to the snob’s world view, and all too many critics, reviewers, and even some anthologists are little more than snobs.

A good story is a good story, and a bad story is a bad story, whether it’s “literary” or genre.

More Wall Street Idiocy

I recently discovered that the cable company Hibernia Atlantic is spending $300 million to construct and lay a new transatlantic cable between London and New York [New Scientist, 1 October].  Why?  In order to cut 6 milliseconds from the 65-millisecond transit time, and thereby get more investment trading firms to use their cable.  For 6 milliseconds?  That’s apparently a comparative age when computers can execute millions of instructions in a millisecond, and London traders must think that those 6 milliseconds will make a significant difference in the prices paid and/or received.

And they may well.  Along the same lines, a broker acquaintance of mine pointed out that New York City real estate closest to the New York Stock Exchange computers commands exorbitant rents and prices for exactly the same reason… but I find the whole idea totally appalling – not so much the additional data cable itself as the rationale for its use.  Human beings can’t process much of anything in 6 milliseconds, so the speed advantage is only useful to computers using trading algorithms.  As I’ve noted earlier, the use of programmed and computer trading has led to a shift in the rationale behind trading to almost total reliance on technical patterns, which, in turn, has led to increased volatility in trading.  Faster algorithmic trading can only increase that volatility and, regardless of those who deny it, can also only increase the possibility of yet another “flash crash” like that of May 2010.  Even if the new “circuit-breakers” cut in and work as designed, the results will still disrupt trading significantly and likely penalize the minority of traders without superspeed computers.
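To see why a 6-millisecond saving matters only to machines, here is a minimal back-of-the-envelope sketch; the Python below is purely illustrative, and the reaction times in it are my assumptions, not measured figures.

```python
# Back-of-the-envelope sketch of who can exploit a 6 ms faster cable.
# Reaction times below are illustrative assumptions, not measured figures.

OLD_LINK_MS = 65.0  # one-way London-New York transit time (from the post)
NEW_LINK_MS = 59.0  # transit time after the 6 ms improvement

def loop_ms(reaction_ms: float, link_ms: float) -> float:
    """Time to see a price across the Atlantic and land an order back there."""
    return link_ms + reaction_ms + link_ms  # one way out, react, one way back

for label, reaction_ms in [("trading algorithm (~0.05 ms)", 0.05),
                           ("human trader (~250 ms)", 250.0)]:
    old = loop_ms(reaction_ms, OLD_LINK_MS)
    new = loop_ms(reaction_ms, NEW_LINK_MS)
    print(f"{label}: {old:.2f} ms -> {new:.2f} ms "
          f"({(old - new) / old:.1%} faster)")

# The 12 ms round-trip saving is ~9% of the algorithm's entire loop but
# only ~3% of the human's - and in a winner-take-all race against another
# algorithm still on the old cable, the faster link wins every single time.
```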

Philosophically speaking, the support for building such a cable also reinforces the existing and continually growing reliance on maximizing short-term profits and minimizing longer-term concerns, as if we didn’t already have a society that is excessively short-term.  You might even call it the institutionalization of business thrill-seeking and attention-deficit-disorder.  This millisecond counts; what happens next year isn’t my concern.  Let my kids or grandkids worry about what happens in ten or twenty years.

And one of the problems is that this culture is so institutionalized that any executive who questions it essentially destroys his or her future.  All you have to do is look at those who did so before the last meltdown.

Yes, the same geniuses who pioneered such great innovations as no-credentials-check mortgages, misleadingly “guaranteed” securitized mortgages, banking deregulation, fees-for-everything banking, and million-dollar bonuses for crashing the economy are now going to spend mere hundreds of millions to find another way to take advantage of their competitors… without a single thought about the implications and ramifications.

Isn’t the free market wonderful?


Why Don’t the Banks Get It?

Despite the various “Occupy Wall Street” and other grass-roots movements around the country, banks, bankers, and investment bankers really don’t seem to get it.  Oh, they understand that people are unhappy, but, from what I can tell, they don’t seem terribly willing to accept their own role in creating this unhappiness.

It certainly didn’t help that all the large banks ducked out of the government TARP program as soon as possible so that they wouldn’t be subject to restrictions on salaries and bonuses for top executives – bonuses that often exceeded a million dollars per executive and were sometimes far, far greater.  They all insist, usually off the record, that they feared “losing” top talent, but where would that talent go?  To other banks?

Then, after losing hundreds of billions of dollars on essentially fraudulently rated securitized mortgage assets, they took hundreds of billions of dollars in federal money, but apparently lent very little of it, especially not to small businesses, which are traditionally the largest creators of new jobs in the country.  At the same time, they continue to foreclose on real estate on a wholesale basis, even when ordered not to by judges, states, and regulators, and even in cases where refinancing was feasible for an employed homeowner.

And then… there’s the entire question of why the banks are having financial difficulties.  I’m an economist by training, and I have problems understanding this.  They’re getting money cheaply, in some cases, almost for free, because what they pay depositors is generally less than one percent, and they can obtain federal funds at an even lower rate.

Mortgages are running 4-6%, and interest on credit card debt is in the 20% range and often in excess of 25%.  Yet this vast differential between the cost of obtaining the product and the return on it apparently isn’t sufficient?
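To make that differential concrete, here is a minimal sketch of the gross spread on a hypothetical portfolio, using the rates quoted above; the $100 billion funding base and the mortgage/credit-card mix are illustrative assumptions, not actual bank figures.

```python
# Gross interest spread on a hypothetical bank portfolio, using the rates
# quoted above.  The balances and their mix are illustrative assumptions.

DEPOSIT_RATE = 0.01  # banks pay depositors generally less than 1%

portfolio = {
    "mortgages":    (60e9, 0.05),  # $60B at ~5% (the 4-6% range)
    "credit_cards": (40e9, 0.22),  # $40B at ~22% (the 20-25% range)
}

funding = sum(balance for balance, _ in portfolio.values())  # $100B
earned = sum(balance * rate for balance, rate in portfolio.values())
paid = funding * DEPOSIT_RATE

print(f"interest earned: ${earned / 1e9:.1f}B")           # ~$11.8B
print(f"interest paid:   ${paid / 1e9:.1f}B")             # ~$1.0B
print(f"gross spread:    ${(earned - paid) / 1e9:.1f}B")  # ~$10.8B a year
# ...before operating costs and defaults - the vast differential in question.
```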

And that brings us to the latest bank fiasco.  For years, the banks, all of them, have been urging customers to “go paperless.”  Check your statement electronically; don’t write checks; use your debit card instead.  Then, after the federal government tried to crack down on excessive fees for late payments, overdrafts, and the like, several of the largest banks began floating the idea of a monthly fee for debit card use.  Wait a second!  Wasn’t this the banks’ idea in the first place?  Wasn’t it supposed to reduce costs?  So why are they going to charge depositors more to use their own money?

And the banks still don’t get it?  With all those brilliant, highly compensated executives?

Or don’t they care?

What Is a Cult?

Recently, some members of the Christian right have been suggesting that presidential candidate Mitt Romney is not a “Christian,” but a member of a “cult.”  As a resident of Utah for nearly twenty years, and as a not-very-active Episcopalian who still resents the revision of the King James version of the Bible and the Book of Common Prayer, I find the raising of this issue more than a little disturbing – not so much the question of what Mr. Romney believes as the implication that his beliefs are any stranger or weirder than the beliefs of those who raised the issue.

Interestingly enough, the top dictionary definitions of the word “cult” are “a system of religious rites and observances” and “zealous devotion to a person, ideal, or thing.”  Over the past half-century or so, however, the term has come to be used in a more pejorative sense, referring to a group whose beliefs or practices are considered abnormal or bizarre.  Some sociologists draw the distinction that sects, such as Baptists, Lutherans, Anglicans, Catholics, etc., are products of religious schism and therefore arose from and maintain a continuity with traditional beliefs and practices, while cults arise spontaneously around novel beliefs and practices.  Others define a cult as an ideological organization held together by charismatic relationships and the demand of total commitment to the group and its practices.

Mitt Romney is a practicing Mormon, a member of the Church of Jesus Christ of Latter Day Saints, but does that make him a member of a cult?  Since the LDS faith specifically believes in Jesus Christ, follows many “Christian” practices such as baptism and belief in an omnipotent God and his son Jesus Christ, and rejected the practice of polygamy more than a century ago, can it be said to be a totally “novel” faith, or any more “bizarre” or “abnormal” than any number of other so-called Christian faiths?  Mormonism does demand a high degree of commitment to the group and its practices, but is that degree of commitment any greater than that required by any number of so-called evangelical but clearly accepted Christian sects?

While I’m certainly not a supporter of excessive religious beliefs of any sort, as shown now and again in some of my work, and especially oppose the incorporation of religious beliefs into the structure of secular government, I find it rather amazing that supporters who come from the more radical and even “bizarre” [in my opinion] side of Christianity are raising this question.  What troubles me most is the implication that fundamentalist Christianity is somehow the norm, and that Mormonism, which, whether one likes it or not, is clearly an offshoot of Christianity, is somehow stranger or more cultlike than the beliefs of the evangelicals who are raising the question.

This isn’t the first time this kind of question has been raised, since opponents of John F. Kennedy questioned whether the United States should have a Catholic president, with the clear implication that Catholicism was un-American, and it won’t be the last time.  The fact that the question has been raised at all in this fashion makes me want to propose a few counter-questions.

Why are those politicians who endorse and are supported by believers in fundamentalist Christianity not also considered members of cults?

Are we electing a president to solve pressing national problems or one to follow a specific religious agenda?

Does rigid adherence to a religious belief structure make a more effective president or a less effective one?  What does history show on this score?

And… for the record, I’m not exactly pleased with any of the candidates so far.


The Wrong Message

Social media are here, regardless of whether we like them, dislike them, use them, or don’t use them.  They’re also becoming a part of education, and school districts and colleges and universities across the country are struggling with policies that allow constructive use of social media while curbing abuse.  Some school districts prohibit their use in education entirely, while others range from restricted use to almost unrestricted use.

Time will tell, as with many things, just what uses will be allowed, but there’s one aspect of all of this that, I must say, troubles me greatly.  One educator, cited in a recent article in The Christian Science Monitor, observed that he had to give feedback on assignments to students through Facebook, because students never checked email – email just wasn’t part of their world.

I relayed that comment to my wife the college professor, and she nodded sagely, informing me that a growing percentage of college students simply never check their email or answer telephone messages.  She should know, since her university’s system will inform her whether any email she has sent has even been opened – and many aren’t.  An increasing number of students respond only, and not necessarily reliably, to text messages and Facebook postings.

What?  Since when are students determining what forms of communication will be used in education?  The issue here, it seems to me, is not just whether social media have a place in education, and what that place should be, but also who exactly is setting the standards and the ground rules.

To begin with, for a teacher to reach a student through a social network, the teacher must belong to that network and, depending on the settings, must either ask the student to be accepted as a “friend” or request that the student initiate the contact.  In short, either party can refuse communications, and, in effect, the students are setting the requirements for what communications they’ll receive and how.  I can certainly see students – and parents – rebelling if teachers required communications via FedEx, UPS, or carrier pigeon, but not accepting emails as opposed to Facebook messages?  Email is a non-obligatory electronic communications system far more open to all users and recipients, and it takes no more time or equipment than does Facebook or any other social network.  Also, teachers should be teachers, not “friends,” because even the most brilliant of students should not be encouraged to think of themselves as the equals of their teachers, no matter how far some of them may eventually surpass those teachers.

Again, I may be antiquated, but at this point using social networks for any form of “official” communication, whether educational, governmental, or business, raises questions about security, privacy, scholastic policies, discipline, and propriety that certainly have not been answered.


Too Much Instantness?

Who’s the leading GOP presidential candidate this moment?  Romney? Perry? Cain? Is Christie in or out? What about Palin? The stock market’s up three hundred points – oops, down four hundred, up one hundred, down two hundred…  The latest on Amanda Knox, or whatever celebrity’s hot, bestseller numbers on Amazon reported hourly… commodity reports tracked by the millisecond, commodities and stocks traded by the nanosecond….

Forget about telephone calls.  Keep up with Twitter and its 140-character quick bits, or friend messages and quick text messages on your iPhone.  Forget about so-called instant messages; they’re too slow, and emails… obsolete!

Have we as a society lost our minds?

There’s an old, old saying – act in haste; repent at leisure – and I have the feeling that almost no one has heard it or remembered it.  We’re inundated with instant information, pressured to act and decide instantly.  The worst of it is that, because there’s so much instant communication and information, people often take longer and longer to get around to actual work, because they first have to deal with all that instant information.  In turn, more and more decisions and actions are taken with less and less forethought, because there’s less and less time to actually consider them, and almost everything becomes an instant decision.

For example, when the liquidators took over Borders, they didn’t have “enough time” to consider selling blocks of leases to other bookstores and chains, or to sell book stock in lots.  In the end, I suspect, they raised far less cash than if they’d taken a bit more time to plan things out.

My son and I tried to buy a bathing suit for his daughter, because she’d inadvertently left hers behind.  This was the first weekend in August – still summer, one might think.  We had to try four stores before we could find any bathing suits at all – in the suburbs of Washington, D.C., where the temperature stays above eighty degrees until October.  Why?  Because instant automated decisions insist that the summer buying season is over in mid-July.

Programmed computer trades, made in nanoseconds, have transformed the stock market from a marketplace where fundamentals and logic had a role into a largely “technical” market based on using algorithms to make quick profits, but the result is an extremely volatile market, and one in which the risks of catastrophic losses and meltdowns become more and more probable, even when the underlying fundamentals of many securities are sound.  What’s happening is that the instant information drags the entire market up or down almost in lockstep, regardless of the differentials in values of various stocks.  So “hot” stocks with little behind them behave in much the same way as issues with solid fundamentals. That has turned the market into even more of a casino than it was. We’ve already had one “flash crash” in the market, and I’d be astonished if we don’t have another.

The instant emphasis pervades everything, it seems, even when there’s a question as to whether it makes sense, but, after all, “instant” is so much better.


Dead or Alive?

No… I’m not going to write about “wanted” posters, but about the awareness of being alive.  What sparked this was a New York Times article about how Grand Prix racing had gone from a sport that killed drivers every year on a predictable basis to one that seldom sees fatalities, thanks to the improved safety technology incorporated in the race cars… and how its public profile has dropped in the American media.

Every so often my wife and I may glance at a story or an ad or something that depicts so-called extreme sports.  Almost invariably, even when she says not a word, I know what she’s thinking.  She can’t understand why anyone would engage in something that dangerous, and she thinks they’re idiots for doing so.

My attitude is a bit different. Not only do I think they’re foolish, but I tend to feel sorry for them. Anyone who can only feel alive when risking death and annihilation, or who can only find a thrill or meaning in life in such circumstances, most likely isn’t truly alive most of the time anyway.  Many of those individuals, interestingly enough, claim that the rest of us aren’t truly alive because we don’t understand what it is to be alive in the face of danger.

Obviously, we’re all different, but I’d like to think that it shouldn’t take the imminent threat of death to feel alive, but what bothers both of us even more than that is the apparently growing popularity of such “sports”… where, like the crowds in the Roman Coliseum or the Circus Maximus, everyone roars when there’s a death or a crash.  But then, some Republicans roar when a governor boasts about the executions in his state. I’m all too aware that life can be fragile, and that no one so far has managed to get out of it alive, but I find it a sad commentary on humanity that bystanders and voyeurs can get a thrill or pleasure out of death and destruction.

Oh… I know that tendency has been around throughout history, and that less than two generations ago in parts of the United States, lynching was a spectator sport.  I’m also more than casually aware that death is, sooner or later, potentially all too close to most military personnel… but shouldn’t death be regarded as a reluctant necessity rather than greeted with excitement or treated as entertainment?

And what does it say about us as a culture that the more violent forms of “entertainment” seem to be the most popular?

Dead or alive…?


The Same Book? [And Lots of Spoilers]

For several years now, I’ve been puzzled by the handful of readers/reviewers who insist I write “the same book” over and over.  My first reaction was that they weren’t reading all of what I wrote… but several of these reader-reviewers have clearly read much of what I write.  So my latest reaction tends to be, “If you find what I write so objectionable in its repetition, why do you keep reading my work and repeating your objections?”  If you don’t like it… then don’t read it.  I understand that my work doesn’t appeal to everyone.  No author’s work does.

But perhaps they feel so strongly that they’re compelled to try to persuade others that my work is “repetitious” or the like?  Why?  What’s the point?  I’ll admit that there are books and series that I feel the same way about… but I don’t spend time and ink trying to make that point to those who love those books and series.  If their followers enjoy them, then that’s their pleasure.

This “sameness” criticism has been applied especially to the Recluce Saga, and since several amateur reviewers [who consider themselves superior] continue harping on it, I thought I’d try to take a more analytical look at the saga and see if I could identify persistent areas of “sameness/repetition.”

One charge is that I always write about young people trying to find their way, yet out of the 16 books in the Recluce Saga, only four deal with protagonists younger than 20 [six, if you count the second book in the case of Lerris and Cerryl], and those young people come from very different backgrounds, ranging from being an orphan to being the son of a ruler.  In six of the sixteen books, the protagonists are well-established in their occupations and all over 30. Do they all then go from rags to riches?  In only three cases in all the Saga do the protagonists become absolute rulers – Cerryl, Lorn, and Saryn.  While Cerryl does move from “nothing” to high wizard, Lorn is the son of the fourth most powerful man in Cyad, and takes two books and much effort to reach the top spot. Saryn begins as number two in Westwind and ends up as number one in Lornth. Creslin starts out as the son of a ruler and ends up as one of five members of the ruling council, in roughly the same place after a great deal of trial and tribulation.  Kharl is a prosperous cooper who loses everything and finally manages to become a modestly endowed junior member of the aristocracy.  Dorrin  comes from a prosperous background, is exiled, fights, and ends up as what might be called an engineering tribune who founded Nylan. Justen begins as an engineering mage and ends up as a druid-influenced gray wizard and far from wealthy.  Rahl begins as a scrivener and ends up as the Mage-Guard advisor to the provincial governor. Nylan begins as a ship’s engineer and ends up as a gray mage in Naclos.    So… most of them did somewhat better for themselves, if at rather high costs, but not all did.

Well… maybe the books are stylistically similar.  Of the sixteen, two were written in the first-person past tense, four in the third-person present tense, and ten in the third-person past tense [which is the POV used in about 90% of all F&SF books].  That doesn’t present an overwhelming “similarity” in approach and actually differs greatly from the average.

Then does this purported sameness lie in the plot or the characters?  I’d be the first to admit that there is one definite element of similarity – the main characters all do survive and succeed to some degree, but the degree of their physical success varies considerably.  Creslin and Megaera effectively lose their entire families and end up trying to build a land on a desert isle.  Lerris ends up with no wealth, and no family except his wife.  Lorn becomes emperor, but loses his father and sister, and his remaining sister exiles herself.  Justen spends his life as a wandering gray mage.  Rahl becomes a high-ranking mage-guard and does marry his love.  Kharl loses his wife and children, but eventually gains true love and a small estate.  Nylan gains nothing, except his wife and son, and loses his daughter.  Cerryl gains great power, and will spend the rest of his life looking over his shoulder.  Maybe I’m missing something, but the only similarity I see is that these characters have paid high prices for their survival and success, and the prices they have paid differ in how and when they were paid.

Heinlein once observed that there were only three plots in fiction – the success story and its opposite, the tragedy; the love story; and the story of the person who learned something.  I’ve only written one tragedy [The Forever Hero], and while many of my books incorporate love stories, I will admit that most of my books do center on people who have learned something and who have succeeded to some degree – if generally at a high personal cost.

If some reviewers claim that this is writing the same book again and again, then the same claim could be lodged against  90% of all the books ever written, because every book with a plot will have a basic sameness compared to what came before, and like pretty much every writer, I’m guilty of that sameness.

So what else is new?


All the Fuss About Taxes

With the Presidential nomination sweepstakes and popularity contest already opening up, we’re all going to be treated to another year of claims and counterclaims, and, if the President’s recent remarks and the Republican candidates’ counter-claims are any indication, a good proportion of the rhetoric is likely to center around taxes.

As I understand the respective positions, the Democrats feel that, because wealth has become more and more concentrated, particularly in the last decade, the “wealthy” [however they’re defined] should pay a greater share in taxes, and that would be determined by closing various “loopholes” and creating a higher tax rate for the top income categories, roughly above $250,000.  The Republicans counter by saying that higher rates are counterproductive economically and that those who are above the “middle class” already pay a disproportionate amount of federal income tax.

Statistics need to be viewed with care – having spent many years as an economist, I know that well – but I decided to take yet another look at the IRS statistics in light of the present and likely coming campaign charges, even though few are likely to change their minds based on mere statistics.

According to IRS statistics, during the period from 1951 to 1980, the percentage of Americans who paid no federal income taxes remained essentially stable at 21-22%.  Beginning in the 1980s, the percentage of taxpayers who paid no federal income tax began to rise, hitting 32% in 2004, 47% in 2009, and an estimated 53% in 2010.

At the same time, the percentage of total income taxes paid by the “middle class” [defined as taxpayers making more than the median wage but less than the top 10%] declined from almost 40% of all income tax revenues to about one quarter, while the top ten percent of taxpayers went from paying roughly 45% of all income taxes to paying 70%.

Put another way, 53% of all taxpayers, largely those in the bottom fifty percent in income terms, paid no federal income taxes.  The next third [37%, if we’re being more precise] paid 30% of all income tax revenue, and the top 10% [those with taxable incomes above $115,000] paid 70% of all federal income tax revenues.

At present, the federal deficit is running close to one and a half trillion dollars annually, and federal income tax revenues are bringing in around $850 billion.  The most obvious, and most bandied about, solution is to increase taxes on the rich, but there are a number of problems with this solution.

First, the reformers on the left confuse “wealth” with “income,” and unless Congress changes the tax law, the IRS can only tax income, not wealth.  According to the latest IRS statistics, the eight thousand wealthiest Americans earned a combined total of $239 billion in 2009.  Assuming that Congress sees fit [which it won’t] to increase the marginal tax rate on millionaires and billionaires to 90%, and also assuming that Congress is smart enough to get rid of all the deductions for these individuals, the resulting federal income tax revenues would total a little over $215 billion.  Given that this year’s federal deficit will be roughly $1.4 trillion, eliminating the deficit by taxing the “rich” would also require taxing those somewhat less wealthy.  The 14,000-odd taxpayers who earned between five and ten million dollars a year had a total income of $95 billion, and a 90% cut of their income would raise $85 billion.  But since these two groups already pay close to $100 billion, the additional tax revenues would only be about $200 billion.  That’s still not enough.  In fact, if a 90% rate were applied to all taxpayers with incomes above $1 million, the total additional revenue raised would amount to $300 billion.  That leaves a shortfall of well over a trillion dollars… and the only people left to tax are those who are complaining the most about being overtaxed.  For the 81 million taxpayers who aren’t millionaires, covering the remaining deficit through income taxes would require an average tax increase of over $12,000 per return, as the rough arithmetic sketched below illustrates.
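For anyone who wants to check that arithmetic, here is a minimal sketch in Python using only the rounded figures quoted above; the IRS numbers are as reported in this post, and the 90% rate with zero deductions is the hypothetical being tested, not a policy proposal.

```python
# Rough check of the arithmetic above, using only the rounded figures
# quoted in the post (2009 IRS data as reported there).

DEFICIT_B = 1400.0  # annual federal deficit, in $ billions (~$1.4 trillion)

top_8000_income_b = 239.0      # combined income of the 8,000 wealthiest
five_to_ten_m_income_b = 95.0  # income of the ~14,000 earning $5-10M a year
already_paid_b = 100.0         # roughly what these groups already pay

# Hypothetical: a 90% rate with no deductions on both top groups.
gross_b = 0.9 * (top_8000_income_b + five_to_ten_m_income_b)  # ~ $300B
additional_b = gross_b - already_paid_b                       # ~ $200B

# Per the post, extending the 90% rate to ALL incomes above $1 million
# raises about $300B of additional revenue in total.
all_millionaires_additional_b = 300.0
shortfall_b = DEFICIT_B - all_millionaires_additional_b       # ~ $1,100B

# Spread the shortfall over the ~81 million non-millionaire returns.
per_return = shortfall_b * 1e9 / 81e6

print(f"gross at 90% on the two top groups: ${gross_b:.0f}B")
print(f"additional revenue from those groups: ${additional_b:.0f}B")
print(f"remaining shortfall: ${shortfall_b:.0f}B")
print(f"average increase per non-millionaire return: ${per_return:,.0f}")
# -> roughly $13,600 per return, consistent with "over $12,000" above.
```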

Again, if one only wishes to tax the remaining “rich,” i.e., those making over $200,000 a year, that won’t work either, because taking all their taxable income would just barely cover that remaining trillion dollar deficit.

So… in essence, even a 90% tax rate on everyone earning over $200,000 won’t cover the current federal deficit.  And, of course, that would raise other problems, because, since most state income taxes run around 6% for those making over $200,000, a 90% federal income tax would bankrupt all but those millionaires making more than $5 million annually.

Given a $1.4 trillion annual deficit and the lowest tax rates in more than 70 years, the Republican alternative of continuing lower taxes and slashing federal programs doesn’t seem terribly workable, either, since balancing the federal budget that way would require cutting roughly 30% of all federal programs… which would translate into cutting more than a million jobs at a time of high unemployment.  And given the fact that many of those programs can’t be cut without a massive overhaul of government, either way, neither side makes much sense.


Never in Any Real Danger

The other day I engaged in an activity that my wife deplores – I read another review of one of my books, Arms-Commander – and came across yet another common mistake made all too often by both professional and amateur reviewers.  The reviewer in question made the statement that, because of her abilities, Saryn was never in any real danger.  Outside of the fact that she gets rather banged up and almost dies on several occasions, this reviewer and others – and not just in reviewing my books, by the way – fail to understand that great ability does not guarantee surviving inherently dangerous combat and occupational situations.

Since I do happen to know a bit about flying, I’ll begin with an example from that field.

The greatest combat pilot in the world is still partly at the mercy of mechanical failure, the elements, his/her own failures in judgment, unforeseen circumstances, and luck on the part of an opposing pilot.  As a matter of fact, in World War II, roughly half of all aircraft fatalities occurred in non-combat situations.  The same sets of factors occur anytime anyone of ability is involved in a dangerous situation.  Even the best mountain climbers get killed, and that’s without anyone shooting at them.  In a sword fight, blades can shatter, get caught on something for a moment at an inappropriate time, or the superior fighter can slip on sand or oil – or be distracted in some fashion or another.

Those who are best will also attempt to set up situations where their exposure to the unpredictable is minimized… as does Saryn, but that doesn’t mean they’re not in danger every time they go into battle or combat.  Then consider that even everyday life in the good old USA has a significant element of danger: tens of thousands of people die annually in auto-related accidents, and there are something like 15,000 homicides a year.

In the case of someone like Saryn, whose forces are outnumbered, the best strategy is always to divide and conquer, to attack in ways and with methods that maximize her strengths and neutralize the enemy’s.  She does so… but that doesn’t mean she’s not in danger, as her various injuries and wounds prove… as do the deaths of hundreds of her supporters and allies.

Well… perhaps the reviewer didn’t get the sense that she could be killed.  If injuries, wounds, near-death, the deaths of those close to her, and lots of close calls won’t convince a reader, then the only thing that will is her own death.  But that creates a bit of a problem, because most readers want the hero or heroine to prevail against great odds.  Like it or not, that means that most protagonists will survive, especially in, frankly, commercially successful books, and, as an author, I really can’t afford to write commercially unsuccessful books.  The only question is how badly the protagonists are injured and under what circumstances.  As one of my offspring once observed, “You need to abuse your characters a lot.”  But abuse doesn’t mean that an author has to slaughter 90% of the characters to prove danger.  Even 5-10% death rates suggest dangerous situations.

So… any reviewer who claims that a protagonist who survives trials and tribulations and almost dies along the way is never really in danger is not only an idiot, but hasn’t had much real world experience… because, for any character, death can be just around the corner, just as it is in real life.

Brighter At What?

Recently, in an ABC television interview with Christiane Amanpour, Eric Schmidt, the former CEO of Google and its current executive chairman, observed that the young people graduating from colleges and universities today are brighter than their predecessors, and noted that he’d worked with some of the brightest minds of his generation.  Given Schmidt’s background in electronics and communications technology, I have no doubt that he has indeed worked with some of the brightest minds in his field.

But what exactly have these brilliant minds, especially at organizations like Google and Facebook, given to society and civilization?  They’ve certainly perfected the technological aspects of introspection, fame-seeking, ego-satisfaction, and instant communications over subjects largely meaningless in the larger scope of the problems facing society.  They created a massive search engine that’s most useful for finding the general and trivial… and one of their endeavors, the Google book settlement, may have undermined the entire literary copyright process.  Oh… and they’ve created some jobs and a form of bubble wealth.

I don’t see that these brilliant [and exceedingly well compensated] minds have been terribly successful at stabilizing our financial system.  In fact, in the quest for wealth, their algorithms and quant models have been highly destabilizing and have likely destroyed more companies and wealth than they’ve created.  Nor have the younger generations of bright minds made significant contributions, from what I can tell, to environmental improvement [those were made largely by pre-baby-boomers and early baby-boomers].  And that brilliance has been incredibly successful in revolutionizing the political system, in that the application of technology, money, and data to campaigns has made the results of most elections a foregone conclusion – and resulted in the greatest polarization in American history and potentially the most disastrous political deadlock since the Civil War.

From these observations, I have to ask just what these younger college graduates are so brilliant at.  Developing technologies and systems that make billions of dollars out of the trivial?  Or improving the economic and political and technological infrastructure of the nation?  Or finding new approaches to our health care and energy problems?  Or… [fill in scores of different questions dealing with fundamental improvements to society and the world]?

To my way of thinking, antiquated as it may be, brilliant is as brilliant does, and brilliance in pursuit of the trivial, no matter how remunerative, is merely brilliance in pursuit of mediocrity… and yet, no one seems to point this out.


Rugged Individualist or Cooperative Village?

The other day one of the blog comments cited a preference for even a “fake rugged individualist over some ‘it takes a village’ idiot,” and while I initially appreciated the sentiment, the comment got me to thinking, and the more I thought, the more I decided that the choice represented by the two alternatives was a false one… and another example of the “either/or” polarization that infects our society today.

Why? By way of a slight digression, I’ll explain.

The recent history and culture of the United States as a European offshoot, short as it is, is strongly colored by the myth of the rugged individualist, the pioneer, the superiority of the individual entrepreneur, and a number of other idealized depictions of individual superiority over the group or the masses or the village.

But let’s look at a few aspects of those myths.  First, the majority of the conquest of the “new world” wasn’t accomplished by Europeans and their culture and tools, but by disease.  Second, individuals didn’t create all those superior weapons and tools that led to an industrial and military power by themselves.  The frontiersman with his trusty rifle, his saddle, etc. – all the equipment that allowed the “conquest” of the Americas – was in fact the product of the village, if you will, and of the crafts and skills of those villages.  And many of the great inventions attributed to single individuals, such as the steamboat to Fulton, the steam engine to Watt, the airplane to the Wright brothers, and electricity to Edison, all could have been – and were in fact – accomplished by others at close to the same time.

The fact is that such developments are an outgrowth of the existing culture, and while it may take a bright individualist to make an advance, first, there must always be more than one such individual for the advance to be successful [more about this in a moment], and the culture must need and/or accept that advance.  Progress and success, if you will, require both the individualists and the culture or village.

In Roman Egypt, Hero [Heron] of Alexandria built what appears to have been the first steam engine, as well as employed magnetism in a technical way and built a jet-like pump for fighting fires.  Yet the steam engine vanished from history and did not reappear for more than 1600 years.  Similar advances occurred in early China, and, effectively, the culture turned its back on them.  Being a genius with proven products wasn’t enough, and it never has been.

The term “rugged individualist” conjures the idea of the man or woman living apart from and independent of society, yet human beings cannot survive above the most primitive level without the support and products of society.  Likewise, societies tend to languish, stagnate, and eventually collapse if they crush individuality and creativity.

A vital culture needs to support both individual genius and cooperative effort.  Without both, it has no future… and yet, today, all too many on the left denigrate the contributions of outstanding individuals, and all too many on the right denigrate the role of a productive and cooperative society.

Post-Idea America

Early in August, the author Neal Gabler wrote an article in The New York Times in which he observed that “we are living in an increasingly post-idea world – a world in which big, thought-provoking ideas that can’t be instantly monetized are of so little intrinsic value that fewer people are generating them and fewer outlets are disseminating them, the Internet notwithstanding.”  He contends that this is largely so because we are drowning in information, and that the informational version of Gresham’s Law is at work: the mass of trivial information pushes out significant information.  Moreover, because within that mass of trivia there is so much that confirms what we already think, most people do not look or quest beyond their confirmation biases, so that each of us is actually living in a smaller universe than did previous generations, even though the amount of information is vastly larger.  He concludes by pointing out that society has historically been changed by “big ideas,” such as those of Einstein, Keynes, Freud, and Darwin, and that while there are currently thinkers who offer equally “large” and provocative ideas, those ideas are being lost in the ocean of trivia… and that society is already suffering and will continue to do so.

I don’t dispute any of Gabler’s points, and, in fact, find his observations and assessments, if anything, far too moderate, but I also believe that he minimizes two other aspects of the problem: the fact that the total mass of information acts as insulation, keeping people from having to come to grips with ideas and facts at variance with their beliefs, and the equation of “profitable” ideas with great ones.

Because the mass of information is so great, its very volume encompasses a range of correct data, generalizations, beliefs, anecdotes, examples, falsehoods, misrepresentations, and inaccuracies, the sum total of which creates the impression that all points of view and all ideas on a subject have equal value… or that the individual has every right to pick the information with which he or she is comfortable.  In the past, when information channels and sources were much narrower, a far higher percentage of “new” ideas and information that challenged existing beliefs reached the average person.  While, in many cases, even “correct” new information or ideas were initially rejected, out of those challenges and questions emerged new perspectives and often new ways of looking at society and even the universe.  Now… the new ideas are still out there… more often than not adrift in a sea of trivia and indifference.

And… there is indeed one new “great” idea in American society, although it’s actually anything but new and has rather undergone a rebirth: the thought that no idea is of great worth unless it can be monetized profitably.  This is a central theme of the right wing of American politics today – that the profitability of government and business is paramount.  Unhappily, it’s also an idea at the core of the left wing as well, even as the liberal left denies it.  When liberals argue that the wars in the Middle East should be stopped on monetary grounds, they’re essentially agreeing with the conservatives: they’re stating that social programs should be monetized, and that their worth lies in the amount of money applied to them.  In underlying principle, that’s no different from saying that no product is good unless it’s profitable.

Yet Galileo certainly wasn’t wealthy, nor Copernicus, nor Socrates, nor Freud, nor Einstein, nor Darwin… nor the majority of great thinkers in history.  Very, very few of the great artists were wealthy, either.  And few of the founding fathers of the United States died wealthy, for all their great ideas.  So why do we spend so much time today idolizing the rich and famous?

Have we forgotten what greatness and great ideas are?  Or have we just reached the point where we as a society either fear them or can comfortably ignore them?

 

The More Things Change…

In 1768, the composer Franz Joseph Haydn wrote Lo speziale, an opera depicting a Jewish apothecary; the work was later revived by Mahler and Hirschfeld at the end of the nineteenth century as Der Apotheker [The Apothecary].

In the opera, non-Jews rail against the immigrant Jews for taking the jobs of the locals and blame them for all the misery that befalls them. Of course, in the 1930s in Germany, Hitler used the same theme, and that led to the Holocaust. Today, in the United States, a similar chorus is rising once more, as it did in the nineteenth and early twentieth centuries, first against the Germans, then the Irish, and finally the Italians, citing each immigrant group in turn as the source of crime and social woe – just as many people and politicians are doing today with the U.S. Latino population. Of course, the Jews are scarcely blameless, either, having historically regarded the Moabites and the Samaritans rather disfavorably.

It appears to be an all-too-human trait to blame the “outsider” when matters aren’t going well in a society, and because the United States is facing the highest unemployment since the Great Depression, everyone is looking for someone or something else to blame. Despite this chorus against immigrants, recent studies indicate that manufacturing employment in the U.S. – the economic area where job loss over the past two generations has been the greatest – has been relatively stable for the past several years.  Because the U.S. population is growing, of course, the percentage of manufacturing jobs compared to total employment continues to decline, and because jobs have been cut in all areas of the economy, manufacturing jobs have been cut as well, but such cuts are different from those resulting from basic structural changes.

The structural reasons for the losses in manufacturing employment are various, ranging from the ability to produce goods more cheaply overseas to a growing reliance on automation and robotics.  Regardless of the reasons, however, those seeking to immigrate to the U.S., legally or illegally, did not cause the problems.  They were caused by U.S. citizens acting in response to those great American ideals – the profit motive and the bargain.  Those jobs were lost because Americans want the best goods at the cheapest prices, and all too many goods can be manufactured more cheaply – and more profitably – either through automation or through overseas outsourcing.

Yet all over the country, more and more blame is laid upon the immigrants, both for crime and for the lack of jobs.  More than a few studies have shown that crime rates are far more closely related to poverty than to ethnic origin and that crime rates in poor white communities differ little from crime rates in poor areas of other ethnicities. Poverty and crime go together. Yet the blaming of immigrants continues, despite the fact that in many areas non-immigrants won’t take the lower-paid and often physically more demanding jobs that immigrants will – and despite the even more important fact that the U.S. economy requires fewer and fewer unskilled and semi-skilled jobs and more and more jobs requiring education or additional training.  The days when a semi-skilled auto worker could make more than $100,000 are vanishing, if not gone.  But rather than recognizing these facts, once again we have politicians and demagogues seeking to blame those who aren’t the cause, but who only want what everyone else wants.

 

Bookstores

Over the past few years, especially among book lovers, there’s been a continual undercurrent of dissatisfaction with chain bookstores, and I’d be the first to admit that I have my problems with the big box bookstores.  Certainly, those who’ve followed this site for several years know that I felt from way back that Borders was badly managed.  What I find interesting, though, is that I’ve seen very little on what led to the rise of the mega-bookstore… and it wasn’t just corporate greed. Because I’m an author and because I’ve been to well over a thousand bookstores of all sizes and shapes in almost every state in the United States [excepting five], I may have a slightly different perspective from others.

Over the last thirty years especially, the book business has changed dramatically, the most significant factors, in my opinion, being the collapse and centralization of the wholesale distribution network and the closure of more than 2,000 smaller mall stores. The closure of the mall stores resulted from a failure – by Borders, in particular – to realize exactly what those stores did, which was to increase the reader base while providing a very modest profit.  That modest profit wasn’t enough for the corporate types, unfortunately, and they thought large destination stores would provide higher margins, which they do [if run well, which Borders was not].  But almost everyone who goes to a big destination store is already a dedicated buyer… and the closure of the mall stores left entire areas of major cities with no convenient bookstore. With the centralization of the wholesale distribution networks, most of the bookracks in drugstores and elsewhere vanished, as did the local expertise on what sold where. Together, these factors have reduced the number of readers and buyers, even as they led to the growth of the large book chains, including WalMart’s book sections, and, in turn, to aggressive price discounting on best-sellers – pricing that made the economics unworkable for many small independent booksellers.

Yet for all the woe and hand-wringing by some authors and others, I have very mixed feelings about smaller bookstores.  I love their passion, their love of books, and their dedication to literacy and reading, but… having visited scores of them, one thing stands out in my mind.  Except for a comparative few specialty F&SF stores [fewer than thirty nationwide in 1990, and fewer than a handful today], very few of the small independents carried much fantasy and science fiction.  I’m fortunate if I see more than three or four of my titles in any small independent bookstore, and generally there is only one copy of each. The same holds even for top F&SF best-sellers, if with a few more copies of each title. Now… there are exceptions, such as the small store in my home town, but they’re rare.  On the other hand, the big box chains carry almost all my fantasy titles, and if they didn’t, I’d be looking for a day job or eking out a living on what I’ve saved over the years.  The plain fact is that big-box stores have supported genre fiction far more than have the small independents, and that’s especially true for fantasy and science fiction.  What’s also true is that the old dispersed wholesale rack system supported genre fiction more than the independents did.  So now, the only real outlet for a broad range of genre fiction, especially F&SF, appears to be the big box stores.

Some authors in the field are optimistic that the internet will provide another outlet, besides Amazon and B&N.com, but I have my doubts, simply because most readers don’t want to search author sites and the like – at least not until they know the author and his or her works.

So… like it or not, for now those of us in the F&SF field are pretty much tied to the big box boys and Amazon… because for all of the concern about the independents, much as I like them and their devoted people, the independents alone can’t come close to supporting the field… although that’s something that far too many authors won’t admit publicly.

 

Simplistic Solutions – Again

The other day, my brother sent me a copy of the final column of a retiring columnist [Charlie Reese of the Orlando Sentinel].  If the column is representative of Mr. Reese’s views, I’m glad to see him no longer in print and wish him a very happy retirement.  His view was that all of our ills as a society can be laid to 545 people – the Congress, the President, and the Supreme Court – because not one of the taxes, not one of the federal budgets, not one of the federal regulations, not one of the deficits, and not one of the federal court decisions that have led to the mess we’re in could have taken place without the acts of those individuals… and each and every one of them could have said “no.”

And, in the strictest and most simplistic sense of the word, he’s absolutely right.  But in the larger sense, he’s absolutely wrong… because we live in a representative democratic republic, and we, as voters or non-voters, decide who represents us every two years. As some of you may know, I spent some 18 years in Washington, D.C., first as the legislative director for a congressman, then as the staff director for his successor, then as the head of legislation and congressional relations for the U.S. EPA, and finally as a consultant, i.e., beltway bandit, representing corporations before the Congress and the Executive Branch.  Given that I’ve also worked in private industry and as a small businessman, not to mention as a Navy pilot, I’ve seen how government works and doesn’t work pretty much from all sides.  And it’s anything but simple.

I’ve known personally dozens of representatives and senators, and professionally dealt with hundreds of them… and well over 90% of them faithfully and diligently represented the views of the majority of the voters who elected them.  It’s all well and good to extol the “good old days” when the USA was the economic power of the world, with balanced budgets and prosperity… but that often wasn’t the case.  Even before the Great Depression, there were other brutal depressions and financial collapses, and certainly in World War II the budget was far from balanced.  By the time of the Great Depression, the majority of Americans were ready to move away from unrestrained laissez-faire capitalism, and they showed it in their support of Franklin Roosevelt and of those they elected to Congress.  With unemployment over 25%, breadlines everywhere, and older people in poverty, who could blame them?  They voted for what they thought they wanted, as they did before, and as they have ever since.

Since I left Washington, have my representatives and senators represented my views?  Hell no!  But my views aren’t in the majority where I live.  And because only a little more than half the eligible voters actually vote, especially in off-year elections, it may well be that many senators and representatives do not represent the views of the majority of their constituents, but only the views of the majority of those who vote… but that’s not the fault of the Congress.  It’s the fault of those who fail to vote.

To blame the problems in Washington on a Congress and a President that reflect the views of the majority of voters is not only simplistic, but also taking the easy way out.  Recent elections have shown, more than ever, that any representative or senator who goes against the wishes of the majority of voters in his or her district or state usually gets tossed out.  The plain fact of the matter is that the majority of voters, for better or worse, really don’t want fiscal discipline.  They don’t want cuts in the federal programs that benefit them, only in those that benefit someone else, and they don’t want to pay more taxes, although it might be all right if someone else did.  And Congress has continued to listen to them and reflect their wishes.

Would any of us want a government that didn’t?  That would be even worse than what we have… and what we have isn’t all that wonderful at the moment, but it’s still better than the alternatives.  The problem isn’t the structure, and it isn’t the Congress.  As Pogo said many years ago, “We have met the enemy, and he is us.”