
Freedom: Back to the Basics

The first basic point about freedom is that absolute freedom does not exist and never has. Every object and/or entity is constrained by its environment and by other entities.

The second point is that, above the forager/near-subsistence level of human culture, material improvement is linked to population density and the production of an agricultural/food surplus, i.e., those producing food need to produce more than they consume in order to feed others who design and produce the tools that make life above the subsistence level possible. All technological improvements come from communities, not isolated individuals. Virtually all major advances in human technology have been developed and implemented in urban centers and cultures, or financially and technically supported by those centers.

Third, and increasingly, urban areas and areas with high population density cannot continue to exist without restrictions on human behavior, whether through manners and custom, through laws, or through some combination of the two. Moreover, the greater the density, the greater the need for restrictions on the excesses of human behavior.

At least so far, every advancement of human technology has created more toxic waste products, and the need to manage such wastes requires enforceable rules. If such wastes aren’t managed, then substantial segments of the population have their freedom to a healthy life restricted by the freedom of those benefitting from the sale and use of those goods.

The bottom line is very simple. Functioning societies need restrictions on “freedom.” To remain functional over time, a high-tech, high-consumption society needs more restrictions than a decentralized low-tech society.

Who enacts such restrictions and upon whom? In an autocratic state the ruler does. In a state that has some form of popular government, those elected to govern do.

The greatest problem for either form of government is understanding that everything affects everything else and that simplistic maxims don’t work well in practice. In essence, the struggle over the direction of U.S. government has been the conflict between two “principles.”

That government is best that governs least.

That government is best that strives for the maximum good for the maximum number.

The first is, at best, in effect a defense of the status quo, and at worst a maximization of the power of those who already hold power, wealth, and skills that can be easily monetized or turned into wealth and/or power.

The second, at best, puts the determination of “good” totally in the hands of government, and at its extreme, becomes a socialism that disproportionately rewards those of lesser ability and determination.

A government that governs least is highly unlikely to restrict the abuse of personal freedoms. A government determined to obtain maximum good for the maximum number is just as likely to crush innovation and excellence, and in doing so, bring about its own eventual downfall.

Neither extreme position works, at least not for long, yet the Americans who control the two political parties have lately been polarizing to the extremes, while the majority of Americans in the middle bemoan the lack of middle ground even as they largely vote for the most extreme politician on “their” side, and then attack the few in the middle who try to work out compromises.

A Certain “Freedom” Has Costs

As many pundits and non-pundits have pointed out, freedom isn’t free. And most people would agree. But the costs of various freedoms are seldom discussed.

Freedom of speech, for example, means that I get deluged with unwanted and unordered advertising mail. It means that political demagogues can assert that falsehoods are true. It means I have to spend money in court to stop someone from libeling me, or at the least pay an attorney.

But there’s a new form of “freedom” being extolled that’s also far more costly, especially to others. And that’s the so-called freedom not to obey public health mandates, currently being pushed by anti-vaxxers, particularly COVID anti-vaxxers. Since vaccination reduces the chance of being hospitalized for COVID by close to 90%, those who don’t get vaccinated place enormous costs and strain on the health care system and those who staff it, as well as additional costs on their own families and neighbors.

A study by the Peterson-KFF Health System Tracker calculated that the additional hospital health care costs created by unvaccinated individuals being treated for COVID, just for the period from July 1, 2021, to November 30, 2021, came to almost $14 billion. The cost of hospital treatment for a COVID patient can range from around $11,000 to well over $300,000, but averages between $20,000 and $25,000 per patient, according to various studies. Other reputable studies peg the average cost closer to $40,000 per patient. These figures don’t include follow-up visits or the costs associated with long COVID. In addition, unvaccinated individuals hospitalized with COVID had a 10% higher rate of complications, which increased their costs of treatment.
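As a rough back-of-the-envelope check, taking the $20,000 to $25,000 average, that $14 billion works out to somewhere on the order of 560,000 to 700,000 hospitalizations that vaccination could have prevented in just those five months ($14,000,000,000 ÷ $25,000 ≈ 560,000; $14,000,000,000 ÷ $20,000 = 700,000).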

And these costs don’t just fall on the unvaccinated individuals. Some costs fall on insurers, who will cover those costs by raising premiums on everyone. Other costs will fall on family members because insurance and government programs don’t cover all COVID medical expenses. The costs of treating uninsured or underinsured unvaccinated individuals will require increased fees on others or funding from government sources… or in some cases, closure of the health facility.

Then, when hospitals are filled with largely unvaccinated COVID patients, those hospitals won’t have enough space or staff to treat urgent non-COVID patients, or won’t be able to treat them as quickly or effectively.

The increasing number of unvaccinated COVID patients is also taking a toll on doctors and skilled nursing staff, with a workload and stress level that makes them more vulnerable to breakthrough COVID and other opportunistic infections. In turn, over time, that reduces the level of care for all patients, which means that the comparatively few vaccinated COVID patients, usually older people or immune-compromised individuals, have to suffer more as a result of the “freedom” of the unvaccinated not to be vaccinated.

So…for those of you “freedom-loving” anti-vaxxers, your so-called freedom isn’t free. You’re just imposing the costs on everyone else… and, in my book, that’s called “freeloading,” not freedom.

What Gives?

I’ve lived in Cedar City for close to thirty years, but in the last month I’ve been almost hit three times by drivers blatantly running red lights, compared to once in all the years before. Red lights, not yellow lights that turned red. I’ve also seen four drivers running stop signs, not slowing instead of stopping and then speeding up, but running them at full speed… and not in the middle of the night when no one was around. I know of at least two recent accidents where a driver ignored a red light and caused an accident, one of which resulted in the death of a motorcyclist.

The Utah Highway Patrol has also reported that highway speeds, accidents, and deaths are up dramatically in the past year – the average speeds appear to be the fastest ever, even though speed limits haven’t changed. I’ve been passed on Main Street, when driving the speed limit [25-45 mph, depending on locale], by drivers who had to be going close to sixty, and frequently, not just occasionally. On the interstate, while going 82 mph in an 80 mph zone, I found that to avoid causing a traffic back-up I had to move up to 85 mph, and people were still passing me, going at least 90 mph.

But this isn’t confined to Utah. Neighboring Colorado registered its highest number of highway deaths in twenty years in 2021. And late in 2021, the federal government reported that road fatalities spiked in the first half of 2021 – the largest six-month increase ever recorded in its reporting system’s history, nearly twenty percent above the same period in 2020. Incidents of speeding and not using seatbelts were also found to be higher than before the pandemic.

Then there’s this pandemic, where statistics demonstrate rather conclusively that being vaccinated and wearing a mask reduce your chances of being hospitalized and/or dying. Recent data from New York shows that the unvaccinated were more than 32 times as likely to be hospitalized for COVID as those who were vaccinated, and even more likely than that to die and/or suffer long-term complications.

I just wonder if all those people speeding and running red lights are the same ones who aren’t getting vaccinated, especially here in Cedar City, where only 47% of the eligible population is vaccinated.

Everyone’s Like Me

One of my readers made a telling comment last week – that Republicans believe the election was “stolen” because they cannot believe they’re in the minority.

The first reaction of those who aren’t Republicans is likely disbelief. How can they believe something that’s so manifestly not so?

The answer to that lies in a simple observation. Given any choice in the matter, people tend to surround themselves or join with people who are like themselves, and they also tend to buy houses in places where they feel comfortable. Add to this the combination of the growth of cell phones, the internet, and a range of news services that all allow people to wall out anyone or any news they don’t want to believe in. So they instinctively come to believe that “most people are like me.”

This has almost inexorably led to a mindset whereby they believe that people like themselves are the only ones who count, and, in the case of Republicans, that mindset can be justified by the past, where all those who mattered were essentially white males. Since Republicans find it difficult to believe that there can be large numbers of women and minorities with money and political power, they attack specific individuals, particularly women, as “outliers” and unrepresentative, claiming that these individuals don’t represent “true” American values.

This leads to the dual fallacy that not only are Republicans really in the majority but also that those who don’t believe as they do aren’t truly “real Americans.”

So, if those who aren’t real Americans aren’t in the majority, they must have stolen the election from real Americans.

Of course, that line of thinking ignores the fact that the only real Americans are American Indians, because they were here first, and the ancestors of the Republicans’ “real Americans” stole the United States from those American Indians.

Manners and “Culture”

More than 200 years ago, Edmund Burke made the following observation:

“Manners are of more importance than laws. Upon them, in a great measure, the laws depend. The law touches us but here and there, and now and then. Manners are what vex or soothe, corrupt or purify, exalt or debase, barbarize or refine us… They give their whole form and color to our lives. According to their quality, they aid morals, they supply them, or they totally destroy them.”

Admittedly, the law touches each of us a great deal more now than in Burke’s time, but the essential truth of his observation remains, simply because law cannot encompass everything in social interaction, business practices, government, and personal life – and when it tries, it fails on some and often many levels, even in the most authoritarian states.

All functioning societies have a shared culture, or at times, more than one culture, each shared by a significant fraction of the population, and each culture embodies a standard of manners. Much of what has been historically manifested in the operation of the government of the United States was never codified into law. It was based on manners and custom. Losing candidates accepted their loss, sometimes grudgingly, but they accepted it. Except for Andrew Jackson, Presidents generally accepted Supreme Court rulings they didn’t like, as did Congress.

All this was based on a mannered acceptance of authority.

Then came the 1960s and 1970s and what amounted to a combination of an assault on manners as phony and hypocritical, the Civil Rights movement, which was a slow-burning explosion against the cultural, legal, and long-standing physical repression of black Americans, and the feminist movement, another slow-burning explosion against thousands of years of male dominance. Over the years that followed, these led to significant but delayed changes in the legal system.

But what revolutionaries and reformers have too often failed to understand is that while laws can, immediately after enactment and enforcement, impose different requirements on behavior and conduct, such laws are often in conflict with cultural beliefs and behavior. And cultural beliefs and manners are highly resistant to change, particularly when those in power have a vested interest in resisting change.

We’ve seen this around the world in often futile attempts to change social structures and cultures into societies that are more “democratic” and egalitarian.

Yet we’ve failed to notice that we have the same problem here in the United States. We’ve also failed to notice that since the European invasion of North America [a phrase studiously avoided by almost all politicians and historians], the forms and control of culture, business, and political and governing structures have been and continue to be dominated by white males, but, with the legal changes of the last generation or so, that complete dominance is no longer assured.

And because so much of the American political and social system has been based on cultural acceptance, now that the impact of profound legal changes has truly begun to change the political, social, and economic power structure of the United States, those who believe themselves disadvantaged by those changes, and who feel discriminated against by their relative loss of power and influence, have effectively decided to reject the traditional mannered acceptance of popular political change, since it no longer benefits them. Given that, it appears, unfortunately, that more unrest and violence are likely.

For the People?

I can understand that Republicans feel Democrats spend too much and want to spend even more. I can understand that they feel the “wild left” is pushing gender/sexual politics beyond the law. I can understand why they want more spending on police, rather than less. I can understand their concerns about immigration, concerns that many Democrats share but refuse to acknowledge publicly. I can understand their concerns about excessive government regulation. I can understand, even if I disagree violently, their feelings about abortion. I can even understand [although it’s incredibly difficult] that they want Trump back as President.

Issues such as these, whether we like it or not, are the sort of issues to be decided by Congress, the courts, and the President through Constitutional procedures, not by a mob smashing its way into the U.S. Capitol and not by an authoritarian government.

What I find impossible to accept from Republicans is their belief that the last election was “stolen,” and their failure to accept that the January insurrection was just that – an attempt to overturn the results of an election that even Republican state officials affirmed was fair, particularly at a time when Republicans controlled the majority of state governments.

To me, such Republican stances are the precursors of yet another attempt to force their will upon others, even on issues where over two-thirds of the population opposes the Republican position.

In his Gettysburg Address Abraham Lincoln said that the Civil War was fought so “that government of the people, by the people, for the people, shall not perish from the Earth.”

Today, it’s more than a little clear that the Republicans no longer believe that, but will instead deny facts and ignore the will of all the people in order to create government of Republicans, by Republicans, and for Republicans, and the hell with anyone else, even though Republicans are in fact a minority of Americans.

A “Christian Nation”?

Lately, especially over the last few years, there’s been a great deal of rhetoric from largely conservative sources about the need to stop “the war on Christian America,” a “war” supposedly being waged by “the left.”

Those making such charges claim that liberals and the left want to replace “Christian values” with big government, but those making the charges conveniently ignore history and the Constitution. At the time the Constitution was drafted, Europe had endured hundreds of years of war over which creed and what “Christian values” were to be the law of what land. That was why the Founding Fathers stated in the First Amendment to the Constitution that “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof.”

So… by the words of the Constitution itself, the United States is not legally and should never be a “Christian nation.” Nor should explicitly religious beliefs and practices be enshrined in law. Yet when individuals and groups go to court to challenge local and state laws establishing or promoting religious values, Republicans and many evangelicals paint those individuals as leftist radicals trying to destroy the United States.

What’s ironic about the efforts of the Republicans and evangelicals to paint the left as the enemies of Christianity is that Republicans and too many evangelicals are attempting, through changes in statutory law, especially on the state level, but increasingly on the federal level, to impose mandatory “Christian values” on everyone, whether Christian or not. Currently, a wide range of studies and surveys indicate that roughly 35% of Americans are not Christians. Most of that 35% are either non-believers, atheists, or agnostics.

There’s a clear difference between freedom to practice one’s own faith and enacting laws to force one’s beliefs on others through law, and that difference is ignored more and more, largely, but not exclusively, by Republicans and the far right, not that there’s much difference any more.

Christmas and Planned Obsolescence

We have a long-standing Christmas tradition. Actually, we have quite a few, and I suspect all couples who’ve been married (or together) thirty years have long-standing holiday traditions. But the tradition under discussion is an extensive display of outside Christmas lights – white icicle lights on the gutters on the front and rear of the house, twinkling white lights on artificial garlands affixed to the long rear deck railing, some lighted figures on the front lawn, and various strings of lights on the fitzer hedge flanking the front walk and in the side yard.

Needless to say, a number of hours are required for installation, usually taking much of the weekend after Thanksgiving. The labor is necessary for the enjoyment the lights provide us, occasional visiting family members, and the neighbors – and, of course, the power company, which surely enjoys the increased revenues.

We have, however, noticed a trend in terms of the lights themselves. When we first moved to the house we occupy some twenty-eight years ago, we bought six strands of white-wired white icicle lights for the rear of the house. We still have four strands remaining. On three of them, every light still works. The fourth strand, alas, lost the lights in a three-foot section this winter after two days of winds gusting to 50 mph, likely because several bulbs were smashed. Replacement bulbs for a 28-year-old strand are not available.

The trend we noted is that almost no set of lights manufactured in the last ten years lasts more than two or three years. The other interesting factor is that although my wife scrupulously saves all the spare bulbs, every strand of newly purchased replacement lights has a slightly different bulb design, so that if you need more than two replacement bulbs, you essentially need to buy a new strand. And it’s worse than that, because it takes needle-nose pliers, a surgeon’s touch, and the strength of Samson to replace one of those bulbs.

Which is why, every January, I end up tossing a strand or two of lights, and every late November I buy more, which invariably give out more quickly than their predecessors. I have the feeling that we’re on the way to one-season disposable Christmas lights, and that may be a reason why light displays are becoming limited to those of us who are slaves to our traditions.

The Slippery Slope

In the previous post, I lamented the massive lack of honesty among Republicans, especially among elected office-holders, who are getting to the point where at least some of them will literally do almost anything to hang onto power and position, whether legal or not.

So far, the Democrats are better, but not nearly as much better as they believe, and therein lies the problem.

The public “lying” problem isn’t new. It’s as old as civilization, but when it gets bad enough that most of the public in a country believes that no public official can be trusted, that country is on the brink of revolution or autocracy, if not both. The United States is coming perilously close to that benchmark, given that the majority of Republicans have effectively declared that they don’t believe the administration on factual matters of national importance.

So how did we get here?

It’s really pretty sadly simple. First, Americans have never liked unpleasant truths, and rather than face them, they prefer to blame others, usually the President in power. Second, while Americans have always had a weak understanding of politics and history, the current generations have an even weaker grasp and, moreover, don’t want to improve that understanding. Third, Americans have become slaves to instant gratification, and with that has come a failure to understand or accept that it takes time to fix things. That’s why the first President Bush lost re-election – because he made unpleasant and unpopular financial fixes – and why President Clinton reaped the benefits of those fixes because he didn’t have to make unpopular public policy choices.

Fourth, politicians of both parties have learned that telling unpleasant truths has the immediate consequence of unpopularity and losing the next election. This bleeds over into everything.

For example, although the official inflation rate had for some time, until the last few months, been below 2%, the methodology for calculating inflation isn’t the same as it was fifty years ago, with the result that the impacts of housing, food, energy, and education are now significantly understated. This “adjustment” not only understated the “official” amount of inflation, but also allowed the government to keep down cost-of-living increases in Social Security, military pensions, and various other benefits and programs, which not only reduced federal outlays but effectively acted as a hidden tax on beneficiaries.

So… when these costs suddenly increase, and government economists are saying inflation is “transitory,” even people who aren’t economists definitely get the idea that their government is lying to them. And all those years of misinformation and statistical manipulation are coming home to roost in the form of more and more people losing trust in government… and asking, “What else aren’t they telling the truth about?”

Now, even when a public official is telling the truth, most people are skeptical.

And that’s very, very bad news at a time when there are no good quick fixes available.

The Liars/Hypocrites Party

According to recent polls, roughly 60% of registered Republicans and Republican office-holders have now endorsed the lie that Trump had the election stolen from him by fraud. In addition, the Republican leadership is effectively ostracizing any Republican office-holder who dares to tell the truth.

The real issue is no longer just about that lie. It’s about the fact that the majority of Republicans will not only blatantly lie, but will reject the facts and the truth in that instance, especially when Trump administration officials refuse to testify about what happened on January 6th and when Trump himself attempts to keep records secret which would reveal what happened behind the scenes.

I suppose I shouldn’t be surprised, not when the vast majority of deaths from COVID are now among the unvaccinated [largely Republicans] who refuse to believe the cold hard statistics about who is dying and who is not. When people will die because they refuse the facts, why should I be surprised that they’ll accept another big lie about who really won the election?

Republicans claim that life is sacred, and that’s why they want to ban abortion. So why do they so strongly oppose aid to children born in poverty through no fault of their own? By their logic, children in both situations are faultless and need help, but apparently only the unborn deserve it…and only until they’re born, which is quite convenient for the Republican pocketbook.

And because abortion is still largely governed by state law, poor women in Republican-dominated states would be impacted far more than wealthy women, but, again, Republicans seem immune to the hypocrisy of their policies.

This sort of lying and hypocrisy is exemplified by Senator Rand Paul of Kentucky, who has steadfastly voted against disaster aid to other states on the grounds that such aid is not the role of the federal government, but who is now demanding aid for his home state since it’s been hit by massive tornadoes.

Republicans also theoretically believe in local political control, but only if it’s Republican. That’s why they split Salt Lake City, which is heavily Democratic, into four parts, each part lumped in with and outvoted by heavily Republican suburban and rural interests. So although I live more than two hundred fifty miles from Salt Lake City, and Salt Lake City has very different needs from Cedar City, a part of Salt Lake City is in the second congressional district, and I can almost guarantee that the Republican incumbent’s attention is only marginally focused on the needs of his urban constituency.

I’m not saying that Democrats aren’t hypocritical, but today’s Republicans have taken lying and hypocrisy to an all-time low. Not only that, but they’re shameless about it.

Public Appearance?

I happen to like vests, but it’s clear that, except occasionally with three-piece suits, vests are not currently popular or fashionable in most parts of the United States. But what is fashionable today?

The definition of fashionable is “characteristic of, influenced by, or representing a current popular trend or style,” while stylish is usually defined as “fashionably elegant and sophisticated.”

Now, obviously, with my love of vests [tastefully flamboyant with matching tie when I’m making writing-related appearances, and quite conservative otherwise], dress shirts, and cowboy boots, I’m no slave to current fashion, but what I wear, according to more than a few people, is a style that suits me, in more ways than one. Because I have high arches, cowboy boots are one of the few forms of footwear that don’t destroy my feet, and all of my boots are either solid black or brown.

When I was younger, I sported longer hair and a mustache, partly because my first wife thought both were more fashionable. This was in the 1970s, and 1970s fashions, especially in retrospect, didn’t benefit most people, and I was no exception. I look better with short hair [even if there’s not much of it left on top] and clean-shaven. I also feel better that way.

Fashion trends generally look better on people who are young and painfully thin. Most of us aren’t. And that means, if we want to look our best, we need to choose what looks good on us and what is also practical and comfortable.

What I don’t understand is why so many people, especially younger people [defined loosely as those under forty], particularly men, seem to go out of their way not to look good. Maybe I’m missing something, but when people I know are not poor, or even close to it, show up wearing ripped pants or cargo shorts, dirty shirts, and flip-flops in forty-degree weather, it doesn’t make a lot of sense to me. Nor does wearing shorts that swallow you, or tank tops that show and exaggerate every extra pound.

If dressing like that is making a statement, what exactly is that statement?

Competence

I recently discovered that a number of readers have decided that my books fall into a category that I’d heard in passing over the last several years – “competence porn.”

I don’t have a problem with readers finding my protagonists competent – as well as even some of my villains. My problem lies with the category itself, possibly because I’m definitely old school, and while I can’t object to knowledgeable adults reading and viewing pornography, it’s definitely not my thing. As Marion Zimmer Bradley – who definitely knew pornography – once observed, pornography is mainly concerned with anatomical plumbing. Combining pornography with competence exalts the former and degrades the latter.

And I have a real problem with degrading competence, especially at a time in our history when everyday competence is getting rarer and when fewer and fewer young people can read or write competently. Classifying books with competent main characters as a type of pornography is the last sort of thing we need today.

Part of the idea behind the “competence porn” classification is a failure to understand that competent characters aren’t perfect. Even the most competent individuals make mistakes; it’s just that they seldom make stupid mistakes in their own field, because competence requires knowing your field.

Another problem with the term “competence porn” is the current tendency of far too many readers to denigrate genres, subgenres, styles, and authors that they don’t like. I understand that many readers like and want fallible characters who get into messes because they’re not competent. Some readers want to root for such characters. That’s what they like. But that doesn’t mean that what they don’t like is bad. Sometimes it is; many times it’s just not to that reader’s taste.

I don’t have a problem with that. I do have a problem with anyone who denigrates books that feature excellence and competence. If an author doesn’t write competence well, one can fault the work, or the way it’s handled, but terming books that feature competent main characters “competence porn” is a disservice both to the authors and to the ideal of competence.

The Fragile Generation

This past semester, my wife, the voice and opera professor, has been faced with the most fragile and unprepared group of incoming students that she’s seen in more than fifty years of teaching, although for the past decade or so she’s found that incoming students have become increasingly fragile and less academically prepared.

Not only are the vast majority unable to write a coherent paragraph, but most of them have difficulty reading material that the majority of previous classes could handle. They also have difficulty following class discussions, turning in assignments on time, and attending class regularly. And we’re not talking about minority students, but predominantly western-USA whitebread students.

They consider writing a thousand-word essay a major and unnecessary trial and fifty pages of reading a week excessive.

Every single faculty member in the music department is facing the same issues, as are faculty members in any department that is attempting to actually get students to study and to learn. According to a university staff psychologist, roughly forty percent of the incoming students in the university suffer from depression and/or have anxiety issues.

In the field of music, as in most fields, professional musicians and music teachers have to know the music, the techniques, and the history behind their studies, but these incoming students don’t know how to write or how to learn and memorize music. They’re under the illusion that they can Google everything, and they often get sullen or resentful when they find out that they can’t… and they also can’t be separated from their cellphones. Under university policy, while faculty can ask students to put away cellphones, faculty can’t prohibit them in class. One student in another department even requested that the university’s ADA director certify her cellphone as a psychological necessity after her professor asked her to stop using it incessantly in class.

Many of them break down in tears – and the males tend to be bigger babies than the women – when they discover that they actually have to work to pass a class.

Yet the administration pressures faculty members to do everything they can to keep students in school, even students who’ve missed weeks of classes because they’re too stressed out to attend classes.

Given the way the students are when they arrive at the university, there’s too much they’re not being taught in elementary and secondary schools, and they’re certainly not being taught true self-discipline or accountability. But everyone seems to think it’s the job of college faculty to undo all the damage caused by overindulgent parents and elementary and secondary school teachers bludgeoned into submission to the “self-esteem” requirements forced on them, largely by parents.

The Interface Problem

The first two definitions of “interface” are: (1) the point where two systems, subjects, organizations, etc. meet and interact and (2) a device or program enabling a user to communicate with a computer.

One of the greatest problems with the increasing use of computerized systems is that all too many human/computer interfaces are flawed, on both the human side and the computer side, as the following examples show.

A little over a week ago, the local Walgreens called to remind my wife that she was due for her second shingles shot. She couldn’t do it immediately, but she had time after a dental appointment last Tuesday. So she stopped in at the Walgreens around 5:00 p.m. and went to the pharmacy. There was no one waiting for anything, and two pharmacy technicians and a pharmacist were on duty. She asked for the shot. She was told she had to make an appointment – except the store’s pharmacy telephone information line said that appointments were only necessary for COVID and flu vaccines, and that people could go to the pharmacy without an appointment. The main Walgreens website said the same. She pointed out that when she’d called the store, she was told she didn’t need an appointment for shingles. She came home furious. When she called for an appointment, she was told by the Walgreens central vaccine scheduling office that they could only schedule COVID and flu shots by telephone. Other shots had to be scheduled online. But when she tried that, the Walgreens system wouldn’t schedule anything but COVID and flu. Another call back to Walgreens vaccine scheduling didn’t solve the problem, but the person on the other end suggested a Walgreens scheduling subsite that she could go to directly, a site that wasn’t listed anywhere. That worked… so far as getting the appointment, but that site wouldn’t accept her doctor’s info, which meant more of a wait when she did get the shot.

That’s definitely an example of an interface problem.

Another example is something experienced by a Canadian reader who was trying to obtain a Kindle version of ISOLATE from Amazon.ca [the Canadian Amazon outlet]. He could get the audiobook and the hardcover, but not the Kindle ebook. The same was true for a number of his Canadian friends. I brought the matter to TOR’s attention, and my editor looked into it. Amazon replied to TOR that there was no problem; the links worked fine. Except they didn’t for those Canadians. Paradoxically, my Canadian friend got the Kindle version from Amazon.com [the U.S. Amazon], but he informed me that Amazon.ca still said the Kindle version was unavailable, not only to him, but to a number of others.

I’d like to think that these are isolated examples – but they’re not. Too many organizations have websites that are close to impenetrable even for people with considerable familiarity with computers, not to mention those businesses with semi-AI telephone systems that not only work poorly, but often never allow a caller to talk to a real person, or do so only if the caller spends forever going through menu options and trying to reply to a computerized voice saying “I didn’t get that. Did you mean XXXX,” or the equivalent.

Yet more and more businesses are relying on flawed computerization and voicemail systems that don’t deal with real-world people and their problems… and with the shortage of workers, this problem is likely to get a lot worse.

Naysayers

Now that Isolate is finally published, I’ll be interested to see if reader reviews follow a pattern familiar from my earlier books, a pattern, interestingly enough, that also occurs in the political world.

Once one of my books is published, the first reader reviews are usually mixed, but almost immediately, alongside those who liked the book come those who go to great lengths to find faults of all sorts with it. Those quibblers and naysayers tend to have a greater presence in the days and weeks immediately following publication, but then, over time, those who quibble and carp about what’s in the book and about what’s not (and find the book “boring”) drop off, and later comments tend to be more positive.

What I find interesting about this is that it’s very similar to the reaction to major political events. Whatever the event or occurrence, the naysayers are usually out in force first, whether it was January 6th, Obamacare, walls and immigration, or masks and vaccination.

Part of the similarity, I suspect, lies with the subject matter. Neither politics nor my books are simple, and anyone who’s studied either knows that. Anything that’s complex tends to draw opposition, possibly because saying “no” is always easier than a considered and thoughtful response.

In addition, in dealing with large numbers of people, even the best-crafted regulation or law will have repercussions on someone. If a vaccine is 93% effective (and that’s high for a vaccine), that means it doesn’t work well on 7% of those who receive it.

Likewise, even the best-crafted, thought-provoking book will irritate some people, and as study after study has shown, negative reactions tend to show up earlier and more strongly than positive ones. This has been true in politics as well. The AMA and most businesses were initially dead-set against FDR’s Social Security proposals. Going back a bit farther, the southern states would have blocked the Declaration of Independence and the Constitution had slavery been outlawed from the beginning.

But it doesn’t always happen that way, which is why, sometimes, it’s better to think things over, from books to politics.

Arrogance and Arrogance

In the United States of today, I’ve observed two types of arrogance manifested by those who have ability and power, usually but not exclusively by males. The first type of arrogance is that typical of most elites in most societies – that they’re special and everyone should know it, even if they gained their position, power, and wealth largely aided by factors they had little to do with, such as family economic and social position.

The second kind of arrogance is the assumption that, if they can do it, anyone can, if others only work hard enough. I’ve seen this kind of arrogance manifested far more than a few times, usually by white males. I’m not saying that most of them didn’t work hard to get where they got, because many of those I know did in fact work hard, but all too often hard work, and even hard, smart work, isn’t enough.

What few of them realize, or at least acknowledge publicly, is that many of the aspects of their lives that they take for granted as “normal” are anything but normal for tens of millions of Americans – things like a stable home life growing up, having enough to eat as a child, a decent grade school and secondary education, living in a low-crime area, not having an ethnic/cultural background that makes strangers suspicious, and having good role models.

Another factor that too many “self-made” individuals ignore or minimize is the role of luck and timing. I owe a great deal of my success to what I’ve learned from my wife, yet how we met was statistically effectively impossible.

A publisher once told me that the great success of a particular book/series was made possible by a set of circumstances that existed for only one five-year period ever in the publishing industry. Now, the writer in question had been published previously and could have likely continued as a successful midlist author…and perhaps eventually done better than that, but those circumstances and the fact that the publisher recognized them gave the author far greater success than others who had equal ability, but wrote earlier or later in time.

That writer wasn’t me, but luck played just as large a part in my case. I got my first and long-standing editor as a result of the intersection of three facts – the fact that I’d published a handful of stories in ANALOG, that he read short stories because he compiled anthologies, and that he recognized my last name because he’d known my cousin [with the same uncommon last name] in college. Those were just enough to get him to read my first novel… and to publish it and eventually many others. And it was pure luck, from my point of view, that he then became an editor for a publishing start-up then known as TOR.

Yes, I sold my first stories over the transom to people I’d never met, and I worked hard, damned hard, and I sent that first novel to every F&SF editor whose name and address I could find, but I’ve known lots of other authors who have worked hard and weren’t in the right place at the right time with the right book. And even after that, it took me another ten years to be able to become a full-time writer.

It’s been said by others that great success comes when hard work meets great opportunity, but hard work doesn’t always meet such opportunity. For those reasons, and quite a few others, I find that it’s arrogant when someone says, “If I can do it, anyone who works can do it.” It’s just not that simple… and it never has been.

Dying for Your Beliefs?

The fatality rates of diseases shouldn’t, at least in theory, have any connection with political beliefs. But since this past June, that theory has been proven wrong.

Since Delta began circulating widely in the U.S., COVID has exacted a horrific death toll on counties where Donald Trump received at least 70 percent of the vote, killing 47 out of every 100,000 people since the end of June. In counties where Trump won less than 32 percent of the vote, the number is about 10 out of 100,000.

In October, twenty-five out of every 100,000 residents of heavily Trump counties died from COVID, more than three times the rate in heavily Biden counties (7.8 per 100,000). October was the fifth consecutive month that the gap between the death rates in Trump counties and Biden counties widened.

Is this a lethal political litmus test? In a way, it is. Because of the vast amount of COVID misinformation circulated and accepted by Republicans, along with other factors unique to them, they are far less likely to get vaccinated, and vaccination keeps the vast majority of those who receive it from being hospitalized or dying from COVID.

A late October poll by the Kaiser Family Foundation’s COVID-19 Vaccine Monitor found that 39% of Republican adults remain unvaccinated, while just 10% of Democratic adults were unvaccinated.

Yet depending on the state and the statistics, between 89% and 98% of patients hospitalized for COVID are unvaccinated, and the current number of U.S. COVID fatalities is now at 751,000 and continuing to grow.

What I want to know is why so many Republicans believe that dying for the freedom not to be vaccinated is so glorious.

Election Insanity?

Not all election insanity or power-grabs are national. Last Tuesday, Cedar City held an election for its mayor and for two city council seats. The city’s population is roughly 36,000, and the city’s annual expenditures [if I’ve added all the scattered budgets correctly] are around $42 million, which, besides normal government functions [administration, water, trash, sewage, parks, police, and fire], also includes operating a modest airport, a municipal theatre, and a golf course.

Councilors serve four years and are paid slightly more than $13,000 annually. The mayor has a four-year term and an annual salary slightly above $20,000. The City Council is the body that decides policy, and the mayor has no voting power.

On the surface, the council seat elections were unremarkable, in that the four candidates [all male – after all, this is Utah] all promised that they would be the best at guiding the city forward. The four candidates spent from $5,000 to $15,000 each on their campaigns, for a total of roughly $40,000.

The mayor’s race was another story. The two-term incumbent is a corporate attorney in her very early thirties, married to a doctor, with deep family roots in the area. She was the youngest mayor in city history and the only woman ever elected mayor. She raised over $106,000 from a variety of business and corporate sources, as well as from personal sources, but the majority of contributions came from the business and corporate sources.

Her challenger was a local businessman who had founded an extremely successful plumbing supply business and expanded it over 30 years, who put $130,000 of his own money into his campaign, and who also donated $11,000 each to the two council candidates he favored, effectively allowing them to significantly outspend their opponents.

In the end, money won. The challenger came up a winner by a little over a hundred votes out of a little more than 7,500 cast… but only one of the two council candidates he backed happened to win.

I still have a hard time understanding why the race for a mayor’s position that pays only $20,000 a year and has no voting power ended up costing close to a quarter of a million dollars, except that the mayoral challenger clearly wanted the position.
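The arithmetic bears that out: her $106,000 plus his $130,000 comes to $236,000 for the mayoral campaigns alone, and adding the $22,000 he gave the two council candidates brings the total to roughly $258,000.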

Denial

The other day I read in the next-to-latest edition of BBC History about how the BBC was swamped with complaints about a program that depicted a Roman Legion Commander as dark-skinned. The facts – lots of them – prove that while the Romans were imperialist bastards, they didn’t give a damn about skin color. Their upper class, all across the empire, had various skin tones. So did their slaves. Power and wealth mattered, not skin color.

Yet, to this day, people deny this, along with scores of other matters that go against what they want to believe, and in this time of “pick your own news” and pick only your own facts, this trend of denialism is worsening, certainly in the United States, and apparently in at least some of Europe.

So why do intelligent people believe things that are not so? It’s not primarily a matter of intelligence. In fact, studies have shown that intelligent and well-informed people, whether conservative or liberal politically, are often more likely to use their knowledge in support of matters that are not accurate or true than are less informed individuals.

In psychological terms, denialism is a person’s choice to deny reality as a way to avoid a psychologically uncomfortable truth.

In theory, resolving factual disputes should be relatively straightforward: present strong evidence, or evidence of a strong expert consensus. Unfortunately, when scientific advice presents a picture that threatens someone’s perceived interests or ideological worldview, large numbers of people will reject those facts. The strength of that rejection turns out to be related to a person’s political, religious, or ethnic identity, and to the strength of those identities.

In short, denial is notoriously resistant to facts because it isn’t about facts in the first place. Political and scientific denial, like other forms of denial, is an expression of identity – usually in the face of perceived threats to that identity.

And, unfortunately, right now, a great number of facts threaten a great number of personal identities, and many of those facts don’t appear to be going away.

Hot Off the Press

I was recently featured on Con-Tinual’s “Hot Off the Press” program, along with other authors.

The Facebook version is live now at https://www.facebook.com/james.nettles/videos/6641981779153027 .

The YouTube version should go live in a few days at: https://www.youtube.com/c/ConTinualConvention