The Religious Selfie

One of the basic underpinnings of religion, almost any religion, is the worship of something or some deity bigger than oneself, and the overt acknowledgment that the individual worshipper is less than the deity worshipped. Some religions even incorporate that acknowledgment as part of liturgy and/or ritual. Such acknowledgments can also be part of “secular religions,” such as Nazism, Fascism, and Communism.

Today, however, there’s a totally different secular religion on the rise, with many of the old trappings in a new form, which might be called the “New Narcissism,” the elevation and exaltation of the individual, or the “self,” to the point where all other beliefs and deities are secondary.

Exaggeration? Not necessarily. What one believes in is reflected in the altars before which one prostrates oneself. Throughout history the altars of the faithful have either held images of a deity, perhaps accompanied by those of lesser deities, or no images whatsoever. While images of private individuals have also existed throughout history, those images or sculptures were created for posterity, or for the afterlife, so that others would have something to remember them by… or to allow them to remember themselves as they were. At one point in time, only the wealthy or the powerful could afford such images. Even until very recently, obtaining an image of oneself required either the cooperation of others or special tools not particularly convenient to use. This tended to restrict the proliferation of self-images.

The combination of the personal communicator/camera/computer and the internet has changed all that. Using Facebook, Instagram, Twitter, and the internet, each individual now has the ability to create themselves as a virtual deity – and tens, if not hundreds, of millions of people are doing just that, with post after post, selfie after selfie, proclaiming their presence, image, and power to the universe [with all three possibly altered for the best effect].

It’s the triumph of “pure” self. One no longer has to accomplish something to gain this presence and recognition. One can just proclaim it, just the way the prophets of the past proclaimed their deity. And given the positions into which, and the number of ways in which, people have prostrated themselves before their portable communications devices in order to obtain yet another selfie, another image of self, it does seem to resemble old-fashioned religious prostration.

Of course, one major problem with a culture obsessed with self and selfies is that such narcissism effectively means the self is bigger than anything, including a deity or a country, and I have to wonder if and when organized religions will recognize this threat to their deity and belief system.

Another problem is that selfies have to be current, so everyone involved in the selfie culture is continually updating and taking more selfies, almost as if yesterday’s selfie has vanished [which it likely has] and as if mere memory of the past and past actions means nothing. All that counts is the latest moment and selfie. That, in turn, can easily foster an attitude of impermanence, and that attitude makes it hard for a society to build for the future when so many people’s attention is so focused on the present, with little understanding of the past and less interest in building for the future… and more in scrambling for the next selfie.

All hail Narcissus, near-forgotten prophet of our multi-mirrored, selfie-dominated present.

Cultural Appropriation

Over the past several years, there’s been a great deal of talk about the need for “diversity.” So far as I can tell, this means stories set in cultures other than those of white, Western-European males and told through protagonists other than white males. I certainly have no problem with this.

I do, however, have some misgivings about the idea that such stories must always be written by authors from those cultures, and about the equally disturbing idea that when someone other than a member or a descendant of those cultures writes about them, even when projected into the future or into a fantasy setting, that is “cultural appropriation,” and a literary sin of the first order. The rationale behind this judgment appears to be that no one who is not a member of a minority or different culture can do justice to representing that culture in a fictional setting.

Besides that fallacious point, what exactly is the point of fiction? Is it just to be culturally accurate? Or to entertain? To make the reader think? And for that matter, how does one determine “cultural accuracy,” especially when there are significant social and even geographic differences within most cultures?

Taken to extremes, one could classify Rob Sawyer’s hominid series, about an alternate world populated by Neandertals, as “cultural appropriation,” since most of us only have a tiny fraction of Neandertal genes. Roger Zelazny’s Lord of Light could easily be classed as cultural appropriation of Hindu beliefs and myths. For that matter, Tolkien certainly used the Elder Edda of Iceland as a significant basis of Lord of the Rings. And I wrote The Ghost of the Revelator even though I wasn’t born in Utah and I’m not LDS [although I have lived here for more than twenty years].

Obviously, writers should take seriously the advice to write what they know, and know what they write, but “non-members” of a minority or another culture may well know and understand that culture as well as or even better than members of that culture. Should they be precluded from writing fiction based on those cultures because editors fear the charge of “cultural appropriation”?

This concern, unfortunately, isn’t just academic. I’ve heard editors talk time and time again about how they want more diversity, but… In one case, the significant other of a Chinese-American born and raised in Hawaii wrote a YA fantasy novel based on Hawaiian myth and offered it to a number of editors. When several agents and editors found out that the writer was not genetically Hawaiian, they decided against considering the book. Several well-known authors have also told me that they wouldn’t have considered the book either, because dealing with Hawaiian beliefs would be too controversial.

Shouldn’t it just be about the book…and not the genetics/cultural background of who wrote it?

Teachers

In yesterday’s local paper, there was a front-page article headlining the coming teacher shortage in Utah, to which I wanted to reply, “How could there not be?”

The beginning salary for a Utah teacher in most systems is not far above the poverty level for a family of four, and the average Utah teacher’s salary is the lowest in the United States. Utah spends the least money per pupil in primary and secondary schools of any state in the United States. Nationwide, anywhere from twenty to fifty percent of newly certified teachers drop out of teaching in five years or less [depending on whose figures you trust], and that rate is even higher in Utah. In 2015, half of all newly hired teachers in Utah quit after just one year. Yet studies also show that the longer teachers teach, the more effective they become. Add to that the fact that Utah has on average the largest class sizes in the United States. The academic curriculum leading to a teaching degree has also become more demanding [at least at the local university], and it often takes even the best students more than the standard four years to complete a course of study that leads to teacher certification, especially if they have to work to help pay for their studies.

Despite the often dismal salaries, study after study shows that comparatively poor pay is well down the list of reasons why teachers walk away from teaching. Almost all prospective teachers know that teaching isn’t a high-paid profession. What they don’t know is just how hostile the teaching environment effectively is to a healthy and balanced life.

Here in Utah, for example, there are state legislators who complain about pampered and lazy teachers. They’re obviously unaware of the unpaid after-school, weekend, and evening workload required to support an eight-hour teaching day. Or of the number of parents who complain about their darling children’s grades – such as the one who wanted to know how his son could possibly flunk an art class [it turned out that said son failed to attend most of the classes and never did a single art activity]. Or of the increasing reliance on testing to determine teaching effectiveness [when the testing itself reduces instructional time, when the test results determine teacher retention and ratings, and when the tests tend to measure factoids and fill-in-the-blank skills, rather than thinking or the ability to write even a coherent paragraph].

It also doesn’t help when the local papers are filled with pages and pages about the sports activities of the local high schools, with seldom a word about academic achievements or other successes, such as plays, concerts, engineering competitions, and the like.

Nor is it exactly encouraging when school administrators offer little understanding or support of their teaching faculty. That’s more commonplace than one might realize, and national surveys show it’s a significant factor contributing to teacher drop-out and burnout. Certainly, a number of former students of my wife the university professor have mentioned this as a difficulty in their middle school or high school teaching positions.

And finally, what’s also overlooked is that it’s actually more expensive to continually replace a high number of departing teachers than to take the steps necessary to cut the teacher drop-out rate. But given the current public view of education and the unwillingness to make meaningful changes, I don’t see this problem improving any time soon. In fact, it’s only going to get worse… far worse.

There’s Always Someone…

I admit it. I did watch the Super Bowl. How could I not when my grandfather was one of the first season ticket holders back in the days when the Broncos were truly horrible? I can still remember him taking me to a game, and he went, rain, shine, or snow, until he was physically no longer able. I wasn’t able to go with him, unfortunately, because by then I was working in Washington, D.C.

And yes, I was definitely happy that the Broncos won, particularly since I’ve always felt that Peyton Manning is a class act, but that brings me to the point: Cam Newton’s postgame interview, if it could be called that, was anything but a class act. Yes, he was disappointed, and he wasn’t the first great quarterback to be disappointed, and certainly won’t be the last.

Newton’s real problem is that he is so physically gifted, and has a mind good enough to use those gifts, that he’s never considered a few key matters. First, in anything, no matter how big you are, how fast you are, how strong you are, how intelligent you are… there’s always someone bigger, faster, stronger, and more intelligent. Second, football is a team game, and the team that plays better as a team usually wins. Third, sometimes you get the breaks, and sometimes you don’t. Fourth, you don’t win just because you have the better record or the better offense – as Denver found out two years ago. Fifth, it is a game, if a very serious one played for high stakes.

Newton also needs to realize that he’s paid extraordinarily well to do exactly the same thing that every writer does, except few of us, indeed, are paid as well as he is. He’s paid to entertain the fans, and while that means winning as much as possible, it also means not pissing everyone off and coming off like a spoiled kid. This is also something writers need to keep in mind.

Given his talent, I’m sure Newton will be a factor for years to come, but it would be nice to see a bit more class when things don’t go well. You don’t have to like losing, but in the end, as even the great Peyton Manning has discovered, we all lose… and the mark of the truly great is to show class both when things go well and when they don’t.

High Tech – Low Productivity

The United States is one of the high-tech nations of the world, yet our productivity growth has hovered around a measly two percent per year for almost a decade. In the depths of the great recession that made a sort of sense, but the “recovery” from the recession has been anemic, to say the least. With all this technology, shouldn’t we be doing better?

Well… in manufacturing, productivity has to be up, whether the statistics show it or not, considering we’re producing more with fewer workers, and that has to mean greater output per worker. Despite the precipitous drop in the price of crude oil, the oil industry is almost maintaining output with far fewer rigs drilling and far fewer workers.

But perhaps what matters is what technology is productive and how it is used. I ran across an article in The Economist discussing “collaboration,” with statistics indicating that electronic communications were taking up more than half the work-week time of knowledge workers, and that more and more workers ended up doing their “real work” away from work because of the burden of dealing with electronic communications such as email and Twitter. And, unhappily, a significant proportion of that added burden comes under the “rubric” of accountability and assessment. But when you’re explaining what you’re doing and how you’re accountable, you’re not producing.

This is anything but the productive use of technology, and it may provide even greater incentive for businesses to computerize lower-level knowledge jobs even faster than is already happening. It just might be that, if you want to keep your job, less email is better. But then, if your boss doesn’t get that message as well, that puts you in an awkward position. I suppose you could console yourself, once you’re replaced by a computerized system, that your supervisor will soon have no one to badger with those endless emails demanding more and more status reports… before he or she is also replaced by an artificial intelligence.

We’ve already learned, even if too many Americans ignore the knowledge, that texting while driving carries a higher risk of causing fatalities than DUI. Will the supervisory types ever learn that excessive emailing may lead not only to lower productivity, but also to eventual occupational suicide?