The Even Darker Side

Recently, I’ve seen a number of public service spots pointing out that texting or cell phone use while driving is twice as deadly as driving drunk. Not only do I believe it; I’ve seen it, up close and personal. The strangest time was last Sunday morning during my morning walk with the sweet-crazy Aussie-Saluki. We were halfway across a street with a four-way stop when a driver came up the street…and kept going, without even stopping. Fortunately, I’m slightly paranoid and look around when crossing streets, even at stop signs and in crosswalks, and when I suspected what might happen, we sprinted. Even so, the driver barely missed us, and he passed so close that I could see he wasn’t even looking – except at the cell phone he held in one hand. And he was dressed in coat and tie, apparently heading for church.

Not a day goes by that I don’t see text-impaired driving and walking, and at least where we live it’s getting worse. I see mothers with small children in their cars glued more to their cellphones than either their driving or their children. I even occasionally see parents walking with children – wearing earbuds and ignoring those offspring. I see scores of college students driving one-handed with the other hand holding a cellphone to their ear or texting on it.

What has struck me about all this is that it’s an extreme form of narcissism. All of these individuals are so wrapped up in themselves and whatever pleasure or need the texting or phoning fulfills that they don’t and possibly can’t think of the potential consequences of overuse and careless use of instant communications.

Young people, particularly, seem glued to their devices, as if they are prosthetics that they cannot do without. Increasingly, college students are spending more time on social media and less on their studies, but paradoxically, in general, they’re less socially adept because they interact less with others in direct personal contact and restrict themselves to electronic contacts. It even appears that the majority of college students move across campus, earbuds firmly in place, ignoring the other students around them.

It’s as if all these users are electronic/communications druggies, with all the narcissistic faults of alcohol or drug dependency. And no one seems to recognize this… or the increasingly lethal side-effects.

Alternate Views?

One author’s viewpoint of the future, of society, of technology, of anything, in fact, should not preclude another’s view, or the views of any number of other authors. Nor should authors be condemned either for incorporating and imposing a new or “better” view of matters such as gender, ethnicity, and social mores on a past society, or for failing to do so. What should be questioned is their accuracy in depicting the past as it was and, if the work is F&SF, whether the society, technology, cultures, and so forth they depict are workable and believable – and also, for me at least, whether such a culture could actually evolve into what is presented.

Now, that doesn’t mean anyone has to like what an author does. I don’t particularly like the world of Game of Thrones, but as more than a few historians have pointed out, George R. R. Martin’s use of the War of the Roses as a model of sorts for his world certainly does capture the brutality and the almost total lack of morality rampant in that type of culture. I can admire the craft, but I don’t have to like the result.

That would seem obvious, or it should be, but it clearly isn’t to all too many in the F&SF field. Like it or not, in the past, and still in some societies, most positions of power and prestige and most of those in science were and are held by men, regardless of culture or ethnicity. This isn’t good for a great number of reasons, first and foremost being the fact that any society that does this is wasting at least half of its intelligence and abilities, if not a great deal more. But it did happen, and it continues to happen in some places, and likely will for a long time in others.

If an author wants to write in that kind of world, that’s his or her business, but that doesn’t mean that anyone should be required to like what such authors write or grant them awards. Nor does it mean that they should be denied readers or awards, either. Nor should it mean that writers who depict worlds with diverse populations and cultures should automatically expect readers or awards for merely pointing out what hasn’t yet happened in most societies, particularly if their talent in telling the story is submerged by the “message.” I understand this very well, since every so often some reader or reviewer critiques me for being too pedantic, and, in retrospect, at times I may have been. At other times, I suspect the readers and reviewers in question simply didn’t like considering what was behind what I wrote, but that’s a danger all writers face.

Readers largely buy what entertains them, and what entertains the bulk of readers bears less and less resemblance to reality [as I learned more than 20 years ago by publishing a very “real” book that was incredibly unpopular while watching authors who depicted the same milieu most unrealistically rake in millions]. Often what is entertaining not only has little accuracy in depicting human behavior, politics, and technology, etc., but also isn’t even that well-written, but it still sells.

Various literary awards aren’t all that much better in reflecting excellence, either, because they’re either popularity contests, as in the case of the F&SF Hugo awards, or they reflect the tastes of a small panel of judges, as in the World Fantasy Awards or the Pulitzer Prizes or even the Nobel Prize for literature. While such awards may reflect excellence, that view of excellence is highly influenced by the tastes of those doing the judging.

So…all the stone-throwing because authors do or don’t depict something in a given way seems to me irrelevant to how popular a book is or how technically and artistically good it may be.

But then, some people revel in throwing stones, either figuratively or actually.

Another Take on Income Inequality

Sometime around 7500 B.C., people began building clustered mud-brick houses at Catalhoyuk, Turkey. According to detailed archeological studies, for roughly the next thousand years, the same patterns of life persisted, apparently with all families living in the same fashion and with approximately the same level of goods and the same size houses. Analyses of the human remains show that men and women received the same level and type of food as well.

By around 6500 B.C., however, income and status inequality began to develop, and as it did, more violence also began to appear, including a significant number of individuals with healed head injuries. Those wounds suggest, to the archaeologists who have studied the site for more than forty years, that the injuries were inflicted as a means of social control, and that such control was not necessary until pronounced income/resource inequality began to develop. This is, of course, a conclusion drawn by those studying Catalhoyuk, but the society does appear to have been more stable while income levels were similar, with more violence occurring once income inequality emerged.

I have to say that this scarcely surprises me. Historically, countries with high levels of income inequality have often had violent uprisings and/or revolutions, such as the French Revolution, the Russian Revolution, the Spanish Civil War, the Cuban revolution, the more recent violence in the Sudan, and the troubles in Colombia and Venezuela.

In looking at income inequality by country across the world, I was struck by several facts. First, among industrialized/technological nations, the United States has the greatest income inequality. Among all nations, there is a pronounced tendency for countries with high income inequality also to have high levels of societal violence, and that includes the United States.

All of which suggests that pushing for tax cuts on the wealthy and opposing increases in the minimum wage may well have costs beyond the merely monetary.

Retention

This year, the buzzword at the local university is “retention.” What it amounts to for faculty and staff is, essentially, to do anything possible to keep students in school. Act as their friend or their counselor. Give them any way you can to pass courses. Ensure that they get instant positive feedback.

Along with this comes a blizzard of brand-new acronyms, a program to train faculty as emergency counselors and psychologists [because the three new counselors the administration hired are so far behind that they’ll never get through the caseload of students], and the very clear message that university faculty members, and no one else, are responsible for getting students through in five years or less.

Since most entering students have never had to work hard to learn and study, they’re not really prepared for college-level work, and it often seems that they can’t wait to get out of class and return to their smart-phones and ear-buds.

And that doesn’t include the fact that the local university is located in a culture where more than half the students take off two years for a Church mission, where women are pressured to marry and have children young, and where the majority of students feel “crushed” if they get a grade below an “A,” even when they don’t do the work. Nor does it take into account that roughly half of the students are working part-time or full-time, because families averaging five children spaced close together can almost never provide anywhere close to the funds necessary for college.

Then add to that the fact that many classes are taught by underpaid adjuncts who are juggling other jobs and commitments, and that the administrative loads dumped on full-time teaching faculty continue to increase, resulting in longer and longer hours spent providing information and reports to administrators, most of which have very little to do with teaching.

And, of course, it’s absolutely taboo for a faculty member to even hint at asking whether some of these students should even be in college or whether the university is doing those students any favors by trying to keep them in classes as long as possible.

The truly miraculous aspect of it all is that so many faculty members struggle to do their best for students who are seldom grateful, under an administration that’s preoccupied with numbers and thinks that excellence can be quantified by retention statistics.

Analyzing to the Death

I’ve always wanted to understand, and I’ve worked at developing my own abilities to do so, whether the subject happened to be technological, historical, political, or otherwise in nature. One of the many things I’ve learned through these exercises is that while I may think I understand something, there’s always more to be learned… but there comes a point where additional knowledge adds little to understanding. Likewise, understanding is only the first step in resolving problems, and far too many individuals seem to believe that if they just “understand” the situation or problem, it can be solved or resolved.

Years and years ago, A.E. van Vogt wrote about non-Aristotelian [Null-A] thinking, presenting it as a rejection of “single-valued” or straight-line logic and suggesting that a multi-valued/perspective logic structure was better for dealing with problems. That kind of approach sounded good on paper – as a good author can often make something sound – but I had a feeling that there was something inherently flawed in the idea.

Recent interactions have brought to mind that feeling, and I realized exactly what van Vogt had missed. While his proposed Null-A thinking may well work better in solving technological and physical problems, it’s limited, and often useless, in dealing with people problems, because the overwhelming majority of people don’t think that way… and don’t want to. Every individual has his or her own value system, in most cases differing slightly from that of others in his or her society, but those systems are essentially based on “either-or” assumptions. Either something is “good” or it’s not, and when something goes wrong, or is not to their liking, their default feeling is that someone else or something else is wrong or the problem.

Sometimes, that may be largely the problem, but usually, from what I’ve observed, most problems, especially human problems, have multiple causes and contributing factors, and most people reject their own contributing factors and insist that the problem is caused by other people or other factors.

Now… you can analyze this to death and come up with and list all the factors. You can point out all the psychological impediments of those involved with or concerned about the problem. But all that analysis does nothing to solve the problem – because those involved have emotional anchors to their points of view, and, as a number of studies, some of them quite recent, have indicated, those emotional anchors are far more powerful than either facts or logic. Only an emotional impact of some sort will change those views.

And all the analyses and data don’t seem able to change that. Likewise, bashing those who point out that this is an accurate observation of current human nature doesn’t change the fact that the vast majority of human beings are governed by emotionally based, either-or feelings and decision-making.