Archive for the ‘General’ Category

Living in La-La Land

One of the greatest gifts of the species Homo sapiens is the ability to dream of what might be. Unfortunately, that ability is also one of our greatest curses, because it allows individuals to dream up unworkable and truly terrible beliefs and inspires them to try to impose those beliefs upon others, often by force or deception, if not both. In this, mass media, like all technology, amplifies the human ability to spread and impose such beliefs.

So now we live in a country where the President of the United States believes that a tax bill that confers the majority of its benefits upon the wealthiest one percent of all Americans will improve life for everyone and where a significant percentage of Americans shares that belief. A country where the President and policy makers believe that there’s a workable military solution to the nuclear weapons efforts of North Korea [and there is, that is, if you’re willing to accept the destruction of South Korea and millions of Korean deaths]. A country where roughly half the population believes that the massive proliferation of individual weapons of death actually reduces violence, despite endless and irrefutable [factually, that is] statistics to the contrary.

These sorts of delusions, of course, aren’t limited to the United States, and some other countries are far worse, but even here in the “good ole USA,” I run across personal examples that stagger me, even as I recognize that belief is stronger than fact, stronger than rationality, and more powerful than a speeding locomotive [to totally scramble metaphorical comparisons].

This week, a student we know revealed that her mother had told her not to come home for Christmas because she had set a horrible example for her younger siblings. Her offense? She was dating a young man who was not of her faith. Rejecting your own child for that?

Then I heard the university president claim that over the past twenty years the university had more than doubled in size, but the student/faculty ratio was lower. When the full-time faculty has increased by only thirty percent, but the administration and adjunct faculty have more than tripled, is this self-delusion or deliberate deception?

Here in Utah, President Trump proclaimed that his action to cut the Bears Ears National Monument by more than eighty-five percent would allow native people to have a rightful voice over the “sacred land where they practice their most important ancestral and religious traditions.” Those Native Americans clearly didn’t think much of that, since they supported the original monument size and in fact have so far filed four lawsuits against the Administration. The president also contends that the best Senate candidate for the open seat in Alabama is a confirmed sexual predator of high school girls because the Senate needs that Republican vote, while, of course, Al Franken and John Conyers – both Democrats – should be expelled from Congress for their sexual predation.

A national poll and study revealed that Americans continue to value men for their accomplishments and women for their appearance. And, as I’ve mentioned so many times before, educational bureaucrats and politicians keep claiming education is getting better, and that more students are going to college and graduating. That may be so, but a greater and greater percentage of them can’t learn and synthesize information or write coherent paragraphs.

All this gives me the uneasy feeling that the “true believers,” those who place belief in their political tribe or faith above facts and reality, are winning and that the United States is indeed moving toward becoming even more of a La-La Land, where all that matters is the strength of belief, whatever that belief may be.

Another Statistic

A few days ago, a friend of mine, who was also the husband of a colleague of my wife the professor, became a statistic. He fatally shot himself while his wife, also a professor of music, was at work. We’d been friends, if not the closest of friends, for a number of years, and we even had them over for Thanksgiving dinner, as we have for the past several years.

He had been an on-site construction manager for industrial projects, and some thirty years ago he was badly injured in a construction accident. He was almost completely paralyzed for some time, but managed to regain enough muscular control that he could walk, talk, and handle most everyday tasks, although he did lose what I’d estimate as probably 60-70% of his former muscular strength, especially in his upper body. He lived most of his life after the accident in some degree of pain. With the help of opioids, he managed a normal life as essentially a house-husband – he did the cooking and light cleaning. And this worked for more than 25 years. But several years ago, amid all the furor over opioids, he was essentially denied their use, even though he’d never abused them. Then he fell getting out of bed and shattered his leg, and he had to go into total rehabilitative care because he didn’t have the upper body strength or coordination to use either crutches or a walker. Even after the leg healed, the pain got worse; he lost 60 pounds in a year, and the doctors kept insisting the pain was all in his head.

He tried more physical therapy, to the degree that it was physically possible, and forced himself to take walks three to four times a day. Nothing that the medical profession suggested worked, and to top it all off, at one point doctors even implied that both my friend and his wife were abusing opioids, which was totally ludicrous, given that she abstained from any kind of stimulants or drugs, except coffee, tea, and diet colas… and that he had never turned to illegal drugs or to illegal means of acquiring prescription painkillers.

In the end, when he finally took his own life because the pain overwhelmed him, he became a victim not of opioids, but of the war on opioids. He was an intelligent and highly disciplined man, a devoted vegetarian, who’d never used any drugs except the opioid painkillers, and those never to excess, and who might, very occasionally, have two glasses of wine with dinner. His wit was gentle and kindly, and he was very good company. He’d managed very well for 25 years on a moderate opioid regime, but with all the furor about opioids, that relief was denied to him.

This is not the only story I’ve heard along these lines, but it’s the closest one that I’ve witnessed personally.

As I’ve noted before, it seems as though the policymakers in this country, and possibly elsewhere, are ignoring the problem of pain and are essentially treating everyone who seeks relief of that pain as a potential criminal statistic. If this continues, and I see no sign of it changing, there will be a significant increase in suicides, in the use of illegal drugs, or in the “illegal” possession of legal opioid painkillers, if not all three. And that’s assuming that these increases haven’t already begun.

“Connected” or “Disconnected”?

One of the seemingly unfathomable and comparatively new outlooks my wife the professor has noted among students entering college in the last two to three years is a much lower level of understanding of certain connections and values that past students easily comprehended.

For example, students given full-tuition scholarships, which require at least an even “B” average, are blowing off classes and not doing the work… and so they lose a four-year scholarship worth tens of thousands of dollars. And we’re not talking about well-off students with family money, nor are these students disadvantaged minorities. They come from working- or middle-class families; they have good grades in high school and high SAT/ACT test scores. Some of them will overcommit to part-time work in order to pay for what those of us in an older generation would have considered luxuries, such as newer cars and iPhones, but they’re not using the money to buy textbooks, or even to borrow them, or, in the case of music students, even to purchase the music they’re supposed to be learning as part of their major – all the while complaining that they don’t have the money. It’s almost as if college is an imposition.

At the same time, they pay for everything with plastic, almost as if they have no idea where the money represented by the endless card-swiping comes from.

Then there are those of higher than average intelligence who cannot take a series of events, or pieces of music, or facts and synthesize what they have in common or how they differ. Nor can a majority of them write a coherent paragraph. Far too many of them feel that they have no obligation to learn, and that every professor is under an obligation not only to inspire them, but to spoon feed them what they need to know. This is not helped by an administration whose overt and clearly expressed philosophy is that professors are solely responsible for keeping students in school and that student retention is a higher priority than a good education.

A majority of these students have little or no intellectual curiosity, as well as little knowledge of either American culture or history, let alone the history or cultures of other lands.

Yet, they’re generally good young people, if as self-centered as most teenagers have been in at least the past several generations. They’re not mean or vicious, but they don’t seem able to figure out what work needs to be done unless they’re given specific directions. And when they reach the end of those directions, they stop and look around blankly.

In many ways, for a generation cited as the most connected in history, it’s almost as if they’re totally disconnected from anything but their electronic “reality.” They don’t talk to the people around them. Far too many of them don’t understand deadlines and get upset when professors don’t “understand” that they’re stressed or have emotional issues. They don’t really seem to make a connection between the quality of work and success. They don’t understand, or want to understand, the history that led to where they are.

Too many of the voice students can’t even explain what they feel when they’re singing, and yet they want to be professional singers… and they don’t get the fact that unless they can master their own bodies, and understand the feelings and muscular control necessary, they’ll never make it as singers or teachers of singing. In fact, many actively reject connecting to their physical feelings.

Disconnection may shut out a world they find unpleasant or unimportant… until that world crashes through their electronic bubble and asks them to pay the bills with real physical work requiring meeting standards on someone else’s timetable. And it will… sooner or later.

The Betrayal of Trust?

As I’ve pointed out before, both in this blog and in various novels, public trust is vital for a working civilization on all levels. We trust that there will be water and power. Despite a handful of terrible mass shootings, we trust that, in the vast majority of times, we can walk the streets of our communities without being gunned down. We used to trust the media for comparatively honest reporting, but that trust is rapidly vanishing, and has vanished entirely in the minds of a large segment of the American population.

Because we’re a social species, we instinctively look for individuals in whom we can place trust. Most people don’t trust numbers, and they tend to trust those who try to persuade them with numbers and statistics even less. They want to trust people who are like them and who seem to tell the truth.

But what happens when more and more public figures are revealed not to be truthful in their private lives, or worse, to have engaged in reprehensible behavior that they kept secret through their power? Immediately, people begin to wonder in whom they can put their trust. Americans have already lost faith in most career politicians – one of the reasons why Trump was elected.

More than ever we’re seeing how many more politicians and media figures have engaged in far less than exemplary conduct in their private lives, and the trustworthiness of the media, never that high to begin with in recent decades, is plummeting, as is the public image of business leaders. What we don’t like to admit, either privately or publicly, is that what we’re seeing about public figures isn’t anything new, but merely a revelation of what has gone on all along. What’s different is that the formerly powerless people who used to be abused without recourse now have recourse, and the results are anything but pretty. History has also revealed that revered and beloved leaders often kept secrets that might have driven them from power, had they been revealed, but those revelations usually didn’t come out until much later.

What some powerful people also fail to realize is that, in a mass media and social media society, very little can remain hidden for long, and it’s harder and harder to keep personal shortcomings or abhorrent, potentially illegal, or immoral behavior secret. And, no matter who we are, all of us have deeds or words that could be embarrassing or worse if revealed to the world. This isn’t something new. We want to have leaders better than we are, and we want them to be above reproach in everything. But our leaders don’t come from some spotless heaven; they come from society. Yet we feel betrayed when dirty secrets or sexual harassment charges appear in the media.

And that sense of betrayal makes it harder and harder for leaders to lead, and to reach any sort of consensus, partly because each side doesn’t believe it can trust the other, and partly because, when there is a lack of trust, people want absolute guarantees and, too often, an absolute guarantee for one side totally alienates the other side.

Now… if we want to reduce the magnitude of the “trust” issue in business, government, and the media, there’s one fairly straightforward way to elect politicians less likely to engage in sexual harassment, or to choose news executives and anchors who are less likely to use the “casting couch,” and that’s to put more women in charge. While there are some women who have sexually harassed others, according to EEOC figures, men in power are more than five times more likely to abuse their position than are women. Not that this will sit well with most men… like most unpleasant facts.

Another possible way to deal with the trust issue is to spend more time verifying those facts that can be verified, rather than blindly trusting people we find appealing and likable. Still another is to be more skeptical and to judge political and media figures by what they’ve done… and what they’ve failed to do… and to evaluate what they propose by what the impact will be… and not by what they claim. We may not ever know all the exact details, but when a tax cut has the greatest immediate benefits for the wealthiest one percent, is it really prudent to trust the politicians who claim that it’s a great benefit for the other 99% of the population? When every nation in the world, except the U.S., has taken a stand of some sort against global warming, is it really wise to trust politicians who ignore this, or who claim global warming is a hoax?

When a political party rigs the representation in a state so that by winning less than half the votes in that state, that party controls 60% of the state legislature, can the politicians of that party really be trusted to be truthful?

Maybe, just maybe, it’s time to reduce our emphasis on personality and increase our emphasis on facts and actual accomplishments. We might not be quite so disillusioned then.

Writing What You Know?

Writing what you know is a well-intended piece of advice for aspiring writers that is too often misconstrued or misapplied. First, what we know is the result of our experiences, the good and the bad, the interesting and, frankly, the boring. A long time ago, I spent a year as an industrial market research analyst, and my job was to analyze past sales patterns and forecast future sales of compressed air filters, regulators, lubricators, and valves used primarily in heavy industry. In the more than forty years since I left that job, I have yet to find a way to make it terribly interesting to readers, except as a motivation to escape such detailed and precise boredom.

Yes, sometimes we do have experiences that are exciting, but if all you do with them as a writer is rehash the past, no reader will be interested. I’ve used my years as a Navy pilot not to relive the Vietnam years but as the basis for writing about piloting spacecraft, but even that requires integrating new information with old. What my experience does provide is the “feeling” of being in that sort of situation. The same is true of my years as a senior political staffer in Washington, D.C., where the experience and knowledge I gained became the basis for writing about politics in different governmental settings, in both SF and fantasy.

There are times, however, when using experience is counterproductive in writing fiction, and that’s when popular but inaccurate images and tropes have been fed so thoroughly to people that conveying what your experience has shown to be accurate conflicts with popular images that are anything but. I discovered this with The Green Progression, a then near-future SF mystery thriller I wrote with Bruce Scott Levinson about environmental politics in Washington, D.C. Although it actually got a review from a D.C. paper praising its accuracy in depicting Washington politics, the book was a miserable failure in terms of sales, largely, I suspect, because its depiction of national politics was far more mundanely brutal and cruel than the glamorously exciting, body-filled, last-minute-escape-from-danger images created by both popular thrillers and movies. What’s ironic is that, when I had lunch several years ago with the head of the consulting firm I once worked for, he laughed at the fact that we had essentially written a lightly fictionalized version of what his firm did… and no one believed it. He also paid for the lunch.

Experience does matter, because a wide range of experience makes it far easier to convey in depth different settings, occupations, and environments and how people react to them. How much it matters depends on what you write and how you write it… and the audience for whom you write. Page-turner thrillers, from what I’ve observed, rely on action and more action and require less depth in other areas. Novels that explore character through action generally benefit from more experiential depth.

But as with all advice and all generalizations, there are authors and books that are quite successful… and that go against every observation I’ve just made.

Now is the Time…

…for my occasional rant about when it would be proper to start celebrations and gift buying for Christmas, that is, AFTER Thanksgiving.

Yet in most of the United States, by November first, right after Halloween, Christmas decorations appeared and sales were announced. SiriusXM satellite radio shut down the Billy Joel channel and replaced it with Christmas music, too much of it of the elevator kind. Some Christmas decorations and sale items were appearing in Walmart in October.

While Christmas has historically been the time to celebrate the birth of Christ, only two of the gospels even mention his birth – Luke and Matthew – and neither gives any hint of what time of year it took place. Some early believers thought it was in April, but the Egyptian Christians decided to celebrate it on January sixth, while others favored the winter solstice. Eventually, by the fourth century, the Christian Church had agreed on December 25th.

Even the Christmas tree originally had nothing to do with Christmas; it grew out of non-religious German customs and wasn’t common even in England until Prince Albert made it a custom of the British royal family in the middle of the nineteenth century.

I can understand the uncertainty about the time of Christ’s birth. I can see why, with the exact date unknown, it made political sense some seventeen centuries ago to agree on a date that matched other celebrations in the hopes of gaining converts.

What I don’t accept is the idea that a religious holiday and supposed celebration of faith should become – cancerlike – a massive commercial sales shill that threatens to gobble up [pun-intended] the comparatively non-commercial celebration of Thanksgiving.

But then, what else should I expect from a religious holiday that was moved to appeal to the pagans, now that it’s been swallowed by worship of another pagan deity – either Plutus [the ancient Greek god of wealth, from which comes the term plutocrat] or Mammon, take your pick.

Role of University President?

Before I married my wife the opera singer and university professor, my primary interests in the arts were literature, especially F&SF and poetry, and painting. My principal musical interests were instrumental classical music and “non-twangy” country music. The country music has faded to the background as I’ve also come to appreciate and enjoy opera and musical theatre.

I’ve also moved from being immersed in national politics to being a close observer of university and faculty politics… and have come to realize that in all too many instances, Henry Kissinger was right – university politics are every bit as bad as national politics, or they were until the last year or so.

One question keeps coming to mind: exactly what is the role of a university president? Are university presidents primarily glad-handers and fundraisers? Or are they supposed to set the course and policies of the institution? Or, in the case of state institutions, to lobby the legislature for a share of increasingly hard-to-get state funding?

The current president of the local university is a lawyer, a profession that I’m convinced has created a disproportionate share of the mess in which our federal government finds itself. I have nothing against attorneys in their proper place [and I shouldn’t, given the number of them in my family], but I firmly believe that neither accountants nor lawyers should be in charge of anything. Yes, every CEO needs a good attorney for advice, but attorneys as CEOs or university presidents? Not a good idea. I feel the same way about accountants.

I’m certainly not privy to all this president has done, but I have to say that, from what I’ve seen, his priorities are… shall I say, of dubious value.

Every door in the music department had to be replaced not once, but twice, for legal reasons, because professors couldn’t otherwise be observed teaching, not that there were ever any complaints. On the other hand, there’s still no funding to replace the sixty-year-old, defective, and potentially dangerous lighting system in the recital hall… although plans have finally been discussed, the replacement has been postponed for three years running. When asked about the possibility of replacing the sixty-year-old and overcrowded music building, he told the faculty they needed to find a wealthy donor… and that they “knew what they were getting into” when they became music professors.

The university president religiously attends every football game and touts the football team, which has won the conference title in two of the last three years. He hasn’t commented on the fact that singers from the music department have made it to the national finals of the National Association of Teachers of Singing competition for the past three years. Nor has he noticed the theatre alumni/alumnae who have appeared on Broadway or in national touring productions. He certainly hasn’t noticed the professors who taught and mentored those graduates. But he has forced out the director of the Utah Shakespeare Festival, one of the two men who built it into a Tony Award-winning regional theatre, and eliminated the modest stipend paid to the Festival’s founder. The only music department concert he now attends is the annual choral-rock concert, and he’s inordinately proud of the university’s recent citation as the most “outdoors-oriented” university in the country.

For more than fifty years, the university’s president has served as a board member of the Cedar City Music Arts Association, the oldest all-volunteer arts organization in Utah. Some presidents were more active than others; all at least occasionally attended board meetings [nine a year]. This president never attended and recently resigned.

So far, after three years, he hasn’t managed to land major financial support, nor has he been able to persuade the legislature to come up with significant additional funding, even though the legislature has insisted that the university accept more students every year, so that tuition continues to climb. And despite the increased enrollment, very few additional full-time faculty have been hired, but the number of adjunct faculty has burgeoned.

But the football team is better.

Educational Excellence…and Measuring It

One of the problems with excellence is something that I’ve seldom seen acknowledged, especially by those charged with determining and “measuring” it.

Simply said, excellence is individual, limited, and determined by specific accomplishments or “products.” Mozart and Beethoven wrote specific exceptional musical works. That defined their excellence, and that excellence was independent of their personal behavior, which was, to be charitable, far less than excellent. Einstein’s excellence manifested itself primarily in his theories of special and general relativity and his explanation of the photoelectric effect.

These days, colleges and universities have essentially tied the idea of excellence in education to bureaucratic systems and accountability to rigid standards that miss the mark. Buzzwords like “essential learning outcomes” and “experiential learning” and “detailed rubrics” and “enhanced student retention” all abound. Syllabi have become detailed tomes that need to be written with near-legal precision. All of this and more is presented as both a means to excellence in education and as a way of measuring what constitutes such “excellence.”

And… of course, whether or not anyone will admit it, such methods and systems are failing. They’re failing because no one wants to look at what actually measures excellence in education. Excellence is not measured by how many students stay in school and graduate, nor is it measured by inflated grade-point averages, or by student evaluations, or by the immediate post-graduate earnings of such students. Diplomas, in all too many cases, have become almost meaningless paper credentials. First jobs and accomplishments pale, and touting the earnings of former students says more about an interest in money than about actual accomplishment.

In the end, what represents a college’s excellence is the accomplishments in life of its students, especially in later years. The problem with this measure of excellence is that it’s long-term, and the educrats need immediate and flashy benchmarks to placate and motivate legislators, donors, alumni, and the parents and students paying ever-higher tuition and fees.

All too many universities recognize and honor primarily alumni or alumnae who have either attained celebrity status or donated substantial sums of money. Universities that recognize concrete and significant accomplishments of alumni with as much ballyhoo as they give those who donate enormous sums are rare. Student athletes who become successful professional athletes are touted over former students who become successful professionals in other fields.

The same is also true of faculty recognition. Solid, career-long accomplishments of faculty are seldom lauded. Popular awards, awards that can be used for PR purposes, or accomplishments that gain press or increase enrollment are all too often the faculty “accomplishments” touted by universities… and seldom do they represent excellence. Nor, apparently, do most people, even university alumni, care.

They’re more interested in whether the football team has a winning record.

The Unrecognized Costs of a College Education?

For the past four decades, if not longer, Americans have been told in more ways than one that a college education is the way for a young person to get ahead, in fact, just about the only way. In 2009, 70% of all high school graduates entered college, an all-time high. Today, the figure is around 66%… but only a little more than half of those who enter college actually graduate.

The cost of higher education may be one factor in the recent decline, given that, over the past forty years, college costs to students have risen at an average rate of seven percent per year, roughly twice the rate of inflation. Part of the reason is that state colleges and universities have passed on more and more of the costs to students and their parents, and often neither can actually afford them.
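To put that compounding in perspective, here’s a minimal back-of-the-envelope sketch. The seven percent growth rate and the roughly-half-of-that inflation rate come from the paragraph above; the starting cost is purely a hypothetical placeholder, not an actual tuition figure.

```python
# Back-of-the-envelope compounding, illustrative only.
# Growth rates come from the paragraph above (7% per year for college costs,
# roughly half that, ~3.5%, for general inflation); the $3,000 starting
# annual cost is a hypothetical placeholder, not an actual figure.
years = 40
tuition_growth = 1.07
inflation = 1.035
start = 3_000  # hypothetical starting annual cost

cost_now = start * tuition_growth ** years    # roughly 15x the starting cost
inflation_only = start * inflation ** years   # roughly 4x the starting cost

print(f"Cost after {years} years at 7%/yr:   ${cost_now:,.0f}")
print(f"Same cost grown only at inflation:  ${inflation_only:,.0f}")
print(f"Real (inflation-adjusted) multiple: {cost_now / inflation_only:.1f}x")
```

Run over forty years, the same starting cost ends up roughly fifteen times higher at seven percent a year but only about four times higher at inflation, which is why the burden on students and parents keeps growing even in “real” terms.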

The result? Forty-four million Americans have student loans. Almost 20% of those loans are in default, and the default rate is continuing to rise.

Why? The simple answer is that the former students can’t afford to repay the loans, suggesting that they don’t make enough money to cover both living expenses and loan repayments. That’s one reason why many recent graduates are still living with their parents.

One of the reasons is that, as I’ve mentioned in previous blogs, there aren’t as many jobs out there requiring a college degree – and thus providing the income necessary to pay off substantial loans – as there are graduates seeking those jobs.

Yet education remains “the answer.”

I won’t try to address all the occupations where this is a problem, just one area that I know something about – fiction writing. When and where I graduated from college, there were few degree programs in creative/fiction writing. I could and did take several semesters of creative writing, but had a double major in economics and political science. Today, Master of Fine Arts (M.F.A.) programs in creative writing have proliferated. The first M.F.A. program was established at the University of Iowa in 1936. By 1994, there were 64. By last year, according to the Association of Writers and Writing Programs, there were 381 M.A. or M.F.A. programs in creative writing. Some 3,000-plus students a year graduate with such a degree.

While the Bureau of Labor Statistics (BLS) currently states there are 145,900 “writers and authors” in the U.S., a quarter of them are part-timers, and 56% of them make less than $12,000 annually, which would place them below the federal poverty level for a single person. This isn’t especially surprising, given that Nielsen BookScan reported that of 1.2 million books tracked, only 25,000 – barely more than 2 percent – sold more than 5,000 copies. At current prices and royalty rates, selling 5,000 copies will generate between $12,500 and $15,000 – spread over two years at a minimum. Also consider the fact that, according to Publishers Weekly, the average book sells fewer than 500 copies.
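For what it’s worth, the $12,500-to-$15,000 range works out to roughly $2.50 to $3.00 earned per copy on 5,000 copies sold; here’s a quick sketch of that arithmetic, with the per-copy royalty being my illustrative assumption rather than a figure from the post.

```python
# Rough arithmetic behind the $12,500-$15,000 range above.
# The $2.50-$3.00 per-copy royalty is an illustrative assumption
# (e.g., roughly 10% of a $25-$30 cover price); actual contracts vary widely.
copies_sold = 5_000
royalty_low = 2.50   # assumed dollars earned per copy, low end
royalty_high = 3.00  # assumed dollars earned per copy, high end

print(f"Low estimate:  ${copies_sold * royalty_low:,.0f}")   # $12,500
print(f"High estimate: ${copies_sold * royalty_high:,.0f}")  # $15,000
```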

There are roughly 1,900 members of the Science Fiction and Fantasy Writers of America, and fewer than 10% of them make more than about $30,000 annually, according to a former officer of the association. Noted F&SF editor and author Eric Flint once estimated that only 32 F&SF authors in the U.S. earned a consistently comfortable income, presumably an income above the median family income of $55,000.

Under these circumstances, for how many M.F.A. graduates is the degree really worth the cost? And this isn’t just a problem for would-be authors, but for more than a few other fields, as well. I just happen to know the numbers for writers better.

According to the BLS, there are roughly 800,000 employed lawyers in the U.S. today, and those who are employed make an average of $118,000… BUT the BLS also states that every year there are more unemployed law school graduates because the number of graduates is greater than the number of new positions created. This is also true in higher education, where the job market is so tight in most fields that educators with doctorates from good universities can only find part-time positions as adjuncts.

Such numbers also raise another question. Given the increasing costs of higher education, doesn’t insisting on a college education risk becoming another form of economic segregation, potentially bankrupting those with heavy loans who don’t win the “jobs lottery,” not to mention offering unrealistic hopes to far too many young people?

Incompetence

Earlier this week, I flew back from the World Fantasy Convention in San Antonio, with what I thought would be a comparatively simple itinerary, at least for me, given that getting to and from Cedar City is never a single flight – except to Salt Lake. My first flight was from San Antonio to Salt Lake City on a fairly comfortable aircraft, an Airbus A320.

Boarding was without incident. Then, a few minutes before the scheduled pushback from the gate, the pilot announced that there was a fuel discrepancy that needed to be resolved. It took more than an hour and ten minutes to resolve the “discrepancy” and handle the paperwork.

The pilot had announced that the aircraft had the proper amount of fuel, but that the discrepancy still had to be addressed. So, believing that we had been sitting around for more than an hour just to unravel a bureaucratic paperwork snafu, I inquired into the nature of the discrepancy. One crew member finally told me that the problem wasn’t the amount of fuel, but its location. Apparently, there was a 20,000 pound imbalance of some sort. How this occurred, or whether the crew member had it precisely right, I don’t know, but I do know that an Airbus A320 has two tanks in each wing and a center fuselage tank. To me, a 20,000 pound fuel imbalance sounds serious [especially given that the maximum fuel load is roughly 42,000 pounds], and according to FAA regulations, aircraft are prohibited from taking off with significant fuel imbalances, not that I knew that at the time.

As a result, once we arrived in Salt Lake, despite my sprinting between gates, I missed my connecting flight to St. George by ten minutes… as did at least three others, who were either smart enough or pessimistic enough not to run. That meant a five hour wait for our connection. Several others couldn’t leave Salt Lake City until the next day, while a few “fortunate” souls could sprint and make their connections. I finally got home at close to one in the morning.

While I’m very thankful that the pilot caught the error/problem, the incompetence of the refueling crew cost everyone time and money, and had the problem not been spotted, it’s possible that matters could have been far worse.

I may not like weather delays for aircraft, or Air Traffic Control delays, or even some maintenance and repair delays, but delays created by incompetence are another thing entirely. Now, it could be that I’m getting more curmudgeonly as I get older [although some of my offspring might claim I’ve always been that way], but it appears to me that I’m seeing a great deal more of this kind of sloppiness. My wife sees it in students; an Army lieutenant colonel who’s a battalion commander tells me that new soldiers need much more training and “reminders” about the importance of details, and has the statistics to back up his statement; and our son, who runs a very high-end retail outlet, has had to fire more people in the last two years than in the previous decade for exactly the same reasons.

Yet I see statistics insisting that the young people of today are more intelligent than ever. In my view, intelligent people don’t misfuel aircraft or require continual occupational reminders and babysitting.

And then I got a survey from Delta asking how they could have better handled the situation. My answer won’t be considered, I’m sure. I suggested that passengers who are delayed and inconvenienced by incompetence should be financially compensated, and that such compensation should be funded by deductions from the paychecks of senior airline executives.

The Opioid Mess

The number of deaths from opioid overdoses and misuse continues to climb. All sorts of legislative and regulatory proposals have been floated, almost entirely, from what I can tell, dealing with controlling or restricting the prescription and distribution of opioids. Most recently, the President has signed an executive order purportedly addressing the opioid crisis.

Almost none of these measures will work, just as the measures proposed to deal with illegal drugs have failed miserably. And these “new” approaches will fail for a very similar reason: They don’t address the real problems leading to opioid deaths.

According to the National Institutes of Health, over 100 million Americans suffer from chronic pain, and opioid-related overdose fatalities have doubled over the past ten years to more than 60,000 last year. While the NIH has recognized that pain and overdose deaths are related, and that medical pain treatment methods need to be improved, the underlying problem is incredibly simple… and presently not solvable for the majority of those suffering long term severe pain.

Opioids are the only legal way to relieve pain for most of those individuals suffering long-term severe pain. Continuous use of opioids requires higher and higher dosages to be effective and also makes users increasingly sensitive to pain. In addition, chronic intense pain makes sleep difficult, and sleep-deprived individuals have even more difficulty handling pain. The medical profession has also been successful in “saving” people, at the associated cost of leaving more of them with painful and chronic medical conditions.

While researchers are seeking non-addictive pain remedies, so far as I’ve been able to determine, no non-opioid medication useful on a daily and long-term basis for a range of pain conditions has reached the stage of human clinical trials, and until something meeting those criteria is developed, we’ll continue to face an “opioid crisis.” Restricting prescription painkillers will only drive people in pain to illegal drugs in greater numbers than at present, and that’s frightening, because overdose deaths from illegal synthetic painkillers such as fentanyl are now approaching 20,000 per year, an almost six-fold increase since 2002.

The problem isn’t opioids; the problem is pain. And very little of the rhetoric even acknowledges that.

Who’s Going to Pay?

Early this week, the Department of the Interior announced plans to increase the entrance fees at some seventeen of the nation’s largest national parks in 2018, more than doubling the previous fees during the most crowded times. Among those parks are several here in Utah, including Zion, Bryce, Canyonlands, and Arches.

The local reaction was fierce and immediate, not to mention negative, all along the lines that families can’t afford to pay $70 per car [now $30] or $30 per individual [up from $15] just to get into a national park. And if families can’t or won’t do that, Utah tourism will take a significant hit.

I understand the reaction, even if the proposed fee is far less than a day at Disneyland or Disney World. But I also understand the problems facing the National Park Service, which needs desperately to repair decades-old and damaged infrastructure, an infrastructure that gets damaged more each year by the increasing number of visitors. Currently, the Park system’s maintenance/repair backlog exceeds eleven billion dollars.

What also struck me was that this is the same reaction to all too many government programs, whether it’s SNAP/food stamps, health insurance, Medicaid, Medicare, disaster relief, interstate roads and bridges, tuition and fees at state universities… the list is seemingly endless. The least affluent members of society are hit the hardest by either increasing costs or decreasing services, and because politicians don’t ever want to raise taxes on anyone, either things don’t get fixed, or a few things get some help, and federal spending is financed more and more by increasing deficits.

It’s a national epidemic of “We need this, but we don’t want to pay for it.”

And yet, despite ballooning deficits, the Republican-led Congress and the President are pushing for massive tax cuts, claiming that such tax cuts will fuel growth that will wipe out the deficits. This is political bullshit and voodoo economics erroneously based on the experience of the tax cuts proposed by President Kennedy and signed into law by President Johnson as the Revenue Act of 1964, which reduced the top individual rate from 91% to 70% and the corporate tax rate from 52% to 48%. In fact, there was moderate but significant growth attributed to those tax cuts.

Today, the tax rates are much different, and much lower. The top individual rate is 39.6% [for taxable incomes above $418,000 a year] and 35% for corporations, although the average rate actually paid by corporations is closer to 20% [and some large corporations pay no tax at all]. In addition, statistics show that there’s plenty of unused capital that’s not being invested in new businesses or jobs because the demand isn’t there. Since most of the tax cuts will go to the well-off, they won’t increase spending by the bulk of the population, which is what would be required to stimulate demand significantly.

And that means that the problem of “needs” being greater than the funds to pay for them is only going to get worse. And while many decry the growth of Social Security and Medicare, exactly how else, at present, are we as a nation going to provide for people too old and too infirm to work? Then, too, regardless of political philosophy, meeting some of those needs – our aging infrastructure, an overcommitted military, disaster relief and rebuilding, and yes, the national parks and the environment – is vital to the future of the country.

But no one wants to pay enough for them… or to agree on what spending can be cut.

Slow Writing?

I can’t say, with a few notable exceptions, that I’ve found many books to be slow reading. I’ve found books that I thought were less than well written, books whose action sequences, upon reflection, seemed to have little point, books where I didn’t care about the main character, and books where there was less action, but I didn’t think of them as “slow.” I can only claim to have found one set of books truly slow, the Gormenghast Trilogy, but I know that there are a few readers who don’t find it slow.

One thing I have noticed, though, is that more and more readers are complaining that books are slow. I was astounded to find a huge listing of “slow” fantasy books on Goodreads. Some of those listed as slow included Brandon Sanderson’s Mistborn series, George Orwell’s 1984, the Harry Potter books, Neil Gaiman’s American Gods, Andy Weir’s The Martian, and even George R.R. Martin’s Game of Thrones. The list of “slow popular books” was over 900 books.

As I’ve mentioned before, I try to read a number of new writers every year, and it does seem to me that there are fewer and fewer books with slower pacing every year… and yet the number of readers complaining about slow books seems to be growing.

So… is it about the books? Or is it that more and more readers are used to fast-paced [and often shallow] video-based entertainment and expect books to be “faster” in the same way? Or could it be that more and more Americans have less ability to concentrate and possess reading skills inferior to the readers of previous generations? Given the huge expansion of graphic novels and manga, there certainly seems to be a segment of the “reading” public that prefers fewer words and more pictures. Is this because of declining reading skills or because the expansion of “visual”/video culture has stunted the ability of some portion of the reading public to create mental images of what they read? Perhaps both?

Certainly, the scores of teachers I know and have asked about this all believe that, in general, students in high school or college have more difficulty focusing, tend to avoid reading whenever possible, and complain that reading assignments that would have been considered light or easy a generation ago are too long and too hard – and that includes even students who score high on the SAT or ACT, suggesting that they’re not lacking raw brainpower.

Slow books? Maybe. But I’m inclined to believe that it’s as much poor and slow readers as slow books.

Manners, Value… and the Appeal of Trump

For the most part, the manners of the first half of the twentieth century have been “modernized,” ignored, trashed, or updated. Which word describes one’s assessment depends on the individual and background, and there may well be additional terms better suited in the minds of others.

The social upheaval that began in the mid-1960s attacked manners as hypocritical and dishonest, among other things, and while that doubtless wasn’t the only factor, it was likely the most significant. What the downgrading or even disposal of manners and social custom ignored or disregarded was the role manners played in affecting individual self-worth.

Hypocritical as manners may be and often are, they require effort on the part of individuals. When people say “please” and “thank you,” when they address or refer to people as Mr., Ms., Mrs., or Miss and the appropriate last name, when they don’t crash or crowd lines, when they open doors for people who need help, when they address letters and emails with names and titles, rather than “Em” or “Bob,” it tends to send a message that others have worth. And when millionaires and billionaires dismiss the concerns of the poor, the working class, and minorities or when political figures call the supporters of an opponent “deplorables,” it’s neither accurate nor useful. More important, such behaviors send messages of devaluation.

How does this tie into Donald Trump and the polarization of the United States?

Both “sides” feel that they’re being devalued by the other side, especially by the leaders of the “other side.” There’s no sense of polite disagreement. The other side is “the dark side,” to be attacked and trashed for their “values” or lack of values. The large majority of Trump supporters, in particular, feel that they’ve been devalued and disregarded and that no one was speaking up for them. Ironically, many of them are willing to overlook Trump’s total lack of manners because they didn’t see anyone with manners able to articulate their views and feelings strongly enough.

Hillary Clinton was more mannered, but far less passionate, and it showed. As a result, too many Democrats drew the conclusion that she needed to be more of a gutter-fighter. Add to that the fact that many people seem to equate crudity with honesty and to see manners as a trait of the self-serving elite, and she came across to too many of the undecideds as manneredly dishonest. Trump has proved, rather conclusively, that crudity doesn’t mean honesty. Politeness doesn’t, in itself, mean honesty either, but politeness has a far better record in allowing people to talk over controversial issues.

The more someone feels devalued, the less they’re going to listen to the other side, and the only way to even begin to bridge that gap is for the name-calling and vulgar and incendiary epithets to stop, and for people to address the issues politely. Being polite and mannered doesn’t mean giving up passion. Whether one liked Martin Luther King or not, he was passionate, and he fought for his goals in a mannered fashion. The same can also be said of our greatest Presidents.

Like most social conventions, manners are a tool, one devalued in false service of “honesty” and one whose employment would be most useful today.

Showing… or Telling?

A while back, I got an interesting comment from my editor about a section of a manuscript I’d submitted. Her words were, roughly, “Here, it’s better to tell than show.”

I bring this up because one of the maxims pounded into beginning writers is: “Show; don’t tell.” That means the writer should describe the action as it happens, letting the reader “see” it as it happens, so to speak. In general, that’s good advice, but not everything needs to be shown. Not every step of a fifteen mile march, let alone a hundred mile march, needs to be described. Nor does every part of most love letters need to be read to the reader.

The pertinent wording of a law lends a certain authority if the speaker is an advocate, attorney, or judge… or a trader trying to pull off a shady deal, but what those words are isn’t necessary for a scrivener engaged in copying book after book – unless they bear on the plot specifically or a sentence is used to show how boring the tome truly is.

On the other hand, some excruciating detail in certain situations may be vital. The detailing of woodworking in The Magic of Recluce or of barrel-making in The Wellspring of Chaos is necessary in terms of defining the character and character development of Lerris and Kharl.

And sometimes, there’s no happy medium, as I discovered when Solar Express was published. As a technology-based near-future SF novel, the detail is vital for some readers and drags the story for others, which is why Solar Express is fast-moving for one category of readers and “slloooww” for others. Without the technical detail the story wouldn’t feel real to the first readers, and for those not into such technical intricacies, the details just got in the way. Some readers have been delighted when I’ve gone into the details of food and food preparation…and complained when I didn’t in a later book.

What book was my editor talking about? And what aren’t you ever going to read? I’m not saying. That’s one of the uses of a good editor – to make the book better than it would have been. And I’m not about to show you that it wasn’t as good as it turned out to be.

Law

What’s the point of law? Or law and order?

I’d say that it’s to provide a common set of rules that everyone in a society can understand and accept, ideally to accept as providing a degree of fairness. Others have or might have another concept – law as a hard and fast rule that defines good and evil in terms similar to their theological beliefs – and still others might feel that law is a tool for the elites of a society to control those beneath them. Some lawyers, I know, believe that the law is a tool they use in attempting to obtain justice, meaning a “fair” outcome for their clients, but, of course, what is “fair” always depends on individual viewpoints. From a technical point of view, in the United States, a law is essentially a statement enacted by a governmental body which allows or prohibits certain acts, or imposes certain limitations on them.

And I’m certain there are other definitions of law, but why do we need laws? And why, once we have laws, do we seemingly need more and more of them?

Human societies need laws because there are always individuals who refuse to accept limitations on their acts, even when those acts harm others.

The answers to the second question are manifold. Every law has areas where it lays down absolutes. Every time an absolute is codified into law, it creates situations where the absolute imposition of that law is unfair and unjust, or perceived as such. And someone often wants to remove that unfairness, which requires another law. In addition, every law excludes as well as includes, and people want to “clarify” the law to assure that something heretofore excluded gets included. Then add to that the fact that certain groups want certain laws for their own benefit.

When people who share the same culture enact laws, they see those laws similarly among themselves and in a different way than do people who come from a different culture or economic class. That’s one reason why more egalitarian and homogenous societies tend to have lower crime rates.

In addition, equal penalties or “requirements” under law have differing impacts on people from differing social and/or economic strata.

The entire issue of so-called “voter fraud prevention” laws being pushed by affluent white Republicans in the U.S. provides a good example of this, because those laws are regarded as essentially voter suppression laws by minorities and those with lower incomes.

The difference in viewpoint comes from the difference in situation. For me, a photo ID isn’t a problem. It’s a slight hassle at most, a few hours once every five years, when I renew my driver’s license, and because I travel occasionally internationally, I have a passport as a back-up. Because I live in a moderate sized town, it’s a ten minute drive to the post office or the Department of Motor Vehicles, and because I was educated to the need for certain papers, I’ve always kept copies of things like birth certificates.

That’s all very easy and convenient – for me. My offspring, however, all live in large metropolitan areas where obtaining or renewing a driver’s license – or a passport — can be a lengthy affair, requiring travel and time. But they’re well-off enough that they can arrange the time and deal with the costs… and they had parents who prepared and educated them to those needs.

A minority single parent working a minimum wage job who lives in a state requiring a photo I.D. has a much tougher time of it. First off, most of the offices that can issue an I.D. are only open during working hours, and most minimum or low-wage earners don’t have much flexibility in working hours and often have to forgo paying work to get through the process. Also, the fees for getting such an I.D. take a greater percentage of their income. Then, even before that, they may have to obtain a certified birth certificate – taking more time and money. They are likely renting, rather than owning a home, and that requires more documents to prove where they live.

And the impact of other laws falls harder on the poor. If you don’t have the money to immediately fix a broken tail-light or a faulty muffler, you risk getting a ticket, and the cost of the ticket just adds to the burden. If you can’t drive the car, you may not be able to work. What is a modestly costly and inconvenient repair for a middle-class worker can literally be a disaster for a poor worker.

What so many Americans fail to realize is that “equal” laws, even assuming that they’re enforced equally, which study after study shows they’re not, fall more heavily on the poorer members of society.

In reality… the “law” isn’t the same for everyone, nor is it seen as the same by everyone… but we’d like to pretend that it is… or that it’s stacked against us – and which belief you’re likely to hold depends on where you come from… and, often, how well off you are.

Formality in F&SF

All civilizations have at least two sets of rules. The two most basic sets of rules are laws and custom, and the most obvious subset of custom is manners. With the recent revival/renaissance of Jane Austen and various spin-offs, there are a number of writers who focus on manners and social etiquette, generally in such sub-genres as steampunk or Regency-style fantasies.

But all cultures, in all times and places, have unspoken codes of manners, and they’re not restricted to just attire, although at times, cultures have gone so far as to legally define and restrict what people could wear, based on their wealth and social position, through sumptuary laws, which carried significant penalties.

As one of the older practicing and producing writers, I grew up in a household where manners and custom were drilled into me. Of course, they had to be, because I was, to put it politely, socially oblivious. The majority of human beings have innate social senses. Mine were largely absent. That made little difference to my parents. I was drilled in every possible social grace and situation by my mother, while my father made certain I was more than adequate in sports, particularly those of social value, and both emphasized the importance of achievement in education. For the time, place, and setting in which I grew up, this was the norm.

What tends to get overlooked by a number of younger writers is that such an upbringing is not an aberration in human cultures, and for the majority of human history, those who have ruled and shaped society have had an upbringing that emphasized what was required to succeed. Those who were well-off but not of the elite also did their best to instill such education and manners in hopes that their offspring would have the background and manners to rise economically and socially.

At present, in the United States, the iron requirements of formality that prevailed prior to roughly the 1960s have been relaxed, or battered into scattered remnants of a once-uniform code of elite conduct, just as the former elites have been disparaged and often marginalized.

This situation is unusual for cultures. More social rigidity is the norm, just as the studies of Thomas Piketty have shown that, historically, high levels of income inequality have also been the norm. Whether less rigid standards of manners and social behavior are the result of higher technology remains to be seen, but writers should consider [more carefully than many do, and no, I’m not naming names] whether the manners and social conduct of their characters match the actual culture they’re depicting. The shepherd boy who attains power will never fit in [and such a rise almost never happens, except in fiction], except through brute power. His children might, if his wife/consort is from the elite and is in charge of their upbringing.

Also, contrary to what some believe, manners don’t reflect weakness; they are a way of displaying and reinforcing power. The decline of formal manners in the United States reflects the decline of the old elite structure, and the often enforced casualness of new up-and-comers is meant as a symbol of a new elite – one problem of which is that an apparent lack of manners too easily suggests a lack of control… and a certain level of chaos and uncertainty.

In any case, any culture will have a form of mannered behavior that reinforces whatever elite governs, something that writers should consider.

Diversity… and Diversity… in Fiction?

At present in fantasy and science fiction, and, it seems to me, especially in fantasy, there’s been a great push for cultural and ethnic diversity, particularly in the last few years, at least in part as a reaction to the history of the genre, where stories and books largely focused on white male characters. That’s not to say that there haven’t been quite a number of notable exceptions that dealt with non-European ethnicities or with female characters, or even hermaphroditic characters, as in the case of Le Guin’s The Left Hand of Darkness. But the criticism that the field has been too “white male oriented” definitely has validity.

I certainly applaud works that effectively create or celebrate different racial or ethnic backgrounds, and even those that tastefully explore sexual diversity, but I’d like to sound a note of reality and caution for authors in dealing with “diversity.”

Some writers approach “diversity” by creating and exploring a culture very different from those traditionally depicted in fiction, and that can be enlightening and entertaining, but it’s very different from presenting a civilization/society that contains large numbers of people from diverse ethnicities.

First, all low-tech powerful civilizations [the kind often depicted in fantasy] have been dominated by an ethnic elite. These elites haven’t been all white, either. The Nubian culture conquered and ruled Egypt for a time, and that was definitely not a “white” culture. Most people know about the Mongol culture, and the fact that it ruled China for a time [until the Chinese absorbed the Mongols in China, which has happened more than once]. I could give a substantial list of non-Caucasian empires throughout history, but the point is that these cultures weren’t “diverse.”

They were different in ethnicity from other cultures, but there have been very few successful civilizations that embodied great internal cultural diversity. One could make the point that the United States, for all its failings, is the largest multicultural nation that has ever existed. Now, there have been empires that included different cultures, but those cultures, for the most part, were geographically distinct and were united into the empire by force. About the only places where you might see diversity in any significant numbers were major port cities and the capital city.

Second, diversity within a society creates internal conflicts, sometimes on a manageable level, but if history is any indication, usually not. Even the “melting pot” United States struggles with internal ethnic diversity, and most other nations with significant ethnic minority populations aren’t, for the most part, doing even as well as we are with diversity issues.

That doesn’t mean that a writer shouldn’t write about different cultures. I’m all for that – if it’s well-thought-out and consistent. In reality, however, such stable cultures will likely have a dominant ethnicity/culture, unless, of course, the author intends to explore internal ethnic conflict, or unless the author has some truly “magic” potion that can solve the problems of widespread internal cultural diversity, because past experience shows that any internally diverse culture is likely to be highly fractious. And that’s something that both writers… and almost everybody else… tend to ignore.

The Multiplier Tool or… Not So Fast…

Technology by itself, contrary to popular belief, is neither good nor evil. It is a tool. More precisely, it is a multiplier tool. Technology multiplies what human beings can do. It multiplies the output from factories and farms. It also multiplies the killing power of the individual soldier or assassin. Fertilizers multiply grain and crop yields. Runoff of excess fertilizers ends up multiplying ocean algae blooms and making areas of the ocean inhospitable to most life.

Modern social media makes social contacts and communication more widespread and possible than ever before. Paradoxically, it also multiplies loneliness and isolation. As recent events show, this communication system multiplies the spread of information and, paradoxically, through belief-generated misinformation and “false news,” multiplies the spread of ignorance. Use of fossil fuels has enabled great industrial and technological development, but it has also created global warming at a rate never experienced before.

Those are general observations, but in individual cases, certain technological applications are clearly one-sided. Vaccines do far more good than harm. The harm is almost statistically undetectable, despite belief-inspired opposition. Use of biotechnology to create bioweapons benefits no one. The use of technology to turn rifles into what are effectively machine guns does far more harm than good.

The other aspect of technology that most people fail to understand is that, with each new technology, or technological adaptation or advancement, there is both a learning curve and a sorting-out period before that technology is debugged and predictably reliable – and that period is just beginning – not ending – when the technology or application first hits the marketplace.

So… the late adopters of new technology aren’t technophobes… or old-fashioned. They’re just cautious. But one of the problems today is the feeling by too many that it’s vital to be the first to have and use new technology or new apps. Over the years I’ve seen far more problems caused by rushing to new systems and gadgets than by a deliberate reserve in adopting “the new stuff.” In addition, changing systems every time a manufacturer or systems producer issues an upgrade wastes employees’ time and creates anger and frustration that usually outweigh the benefits of being “early adopters.” Adopted too early or unwisely, technology can also multiply frustration and inefficiency.

Add to that the continual upgrades, and it’s very possible that the “drag effect” caused by the extra time spent educating employees, installing upgrades, and debugging systems either slows productivity growth or actually decreases productivity until reliability finally outweighs the problems caused by the “rush to the new.”

All of which is why I’m tempted to scoff at those individuals who rush to be the first with the newest and brightest gadget. But I don’t. I just wait a while until they’ve stumbled through all the pitfalls and most of the debugging. There’s definitely a place for “early adopters.” It’s just not a place where I need to be.

Truth…

Recently, a reader made an interesting comment to the effect that what I personally believed to be true doesn’t necessarily turn out to be true for others. This is a statement that initially sounds very reasonable, and studies indicate that it’s something that most people believe.

But… it’s also incredibly deceptive and dangerous. Now, I may have been correct, or I may have been incorrect. I may have had my facts wrong, or perhaps they were right. But the idea that correctness, accuracy, or “the truth” of something varies from individual to individual, depending on individual perception, is a very dangerous proposition.

Part of the reason why that proposition is dangerous is the use of the word “truth.” The word “truth” carries a connotation of moral purity and certainty on the part of the individual defining that truth. Facts, on the other hand, simply are. How they’re perceived by individuals obviously varies, and different individuals give different weight to the same set of facts. Different individuals cite different sets of facts in support of or in opposition to policies, proposals, books, laws, or in other settings. But the bottom line should always be whether the facts are indeed accurate, and whether they apply to the situation at hand, not my beliefs about them or someone else’s beliefs about them.

It appears to me that today we’ve gotten into a societal mindset that places what we feel about anything far above determining what is accurate, what is actually so, and what is not. Because we are feeling beings, this tendency has always been a great part of being human, but one of the great drivers of the advancement of human civilization has been the effort to determine verifiable facts and workable scientific theories based on replicable experiments and solid evidence, as opposed to belief based on what cannot be determined to be accurate.

Yes, scientists and true empiricists have beliefs, but they try [and sometimes fail] to base those beliefs on hard evidence.

I’m not dismissing the importance of belief. Every human being needs things or ideals in which to believe, but the idea that what is “true” for one individual is not for another puts individual perception above accuracy. It also tends to support the idea that each set of beliefs is as valid as any other, when time and history and science have shown that “truth” resides far more often in what can be accurately determined and verified than in what cannot.

Despite the fact that in the third century BCE the Greek astronomer Aristarchus of Samos had argued that the Earth revolved around the sun, more than eighteen hundred years later the Christian Church was burning as heretics those who stated that the Earth was not the center of the universe and that it revolved around the sun. The “moral certainty” of faith trumped the facts, at least until science advanced to the point where the proof was irrefutable.

We’ve now reached a point where individuals realize that they must have at least some facts to support the “truth” of their beliefs… and in the welter of “information” that surrounds us, too many individuals pick inaccurate or inapplicable facts in order to support those beliefs.

The idea that the truth of a belief varies from individual to individual is actually an accurate statement of a dangerous proposition – that “individual truth” is superior to verified evidence and facts – when, in fact, the converse is what we should all strive for: that verified evidence and facts support our beliefs, rather than our beliefs forcing us to find facts to support them.

Yet recent study after study shows that the majority of people tailor their facts to support their beliefs, rather than using verifiable facts to change their beliefs. Will we revert to faith over facts, as the Christian Church did centuries ago? Given what I’ve seen over the last few years, it’s anything but an unreasonable question.