Cultural Appropriation

Over the past several years, there’s been a great deal of talk about the need for “diversity.” So far as I can tell, this means stories set in cultures other than those of white, Western-European males and told by protagonists other than white males. I certainly have no problem with this.

I do, however, have some misgivings about the idea that such stories must always be written by authors from those cultures, and about the equally disturbing idea that when someone who is not a member or a descendant of those cultures writes about them, even in a future or a fantasy setting, that is “cultural appropriation,” a literary sin of the first order. The rationale behind this judgment appears to be that no one outside a different or minority culture can do justice to representing that culture in a fictional setting.

Besides that fallacious point, what exactly is the point of fiction? Is it just to be culturally accurate? Or to entertain? To make the reader think? And for that matter, how does one determine “cultural accuracy,” especially when there are significant social and even geographic differences within most cultures?

Taken to extremes, one could classify Rob Sawyer’s Hominids series, about an alternate world populated by Neandertals, as “cultural appropriation,” since most of us have only a tiny fraction of Neandertal genes. Roger Zelazny’s Lord of Light could easily be classed as cultural appropriation of Hindu beliefs and myths. For that matter, Tolkien certainly used the Elder Edda of Iceland as a significant basis for The Lord of the Rings. And I wrote The Ghost of the Revelator even though I wasn’t born in Utah and I’m not LDS [although I have lived here for more than twenty years].

Obviously, writers should take seriously the advice to write what they know, and know what they write, but “non-members” of a minority or another culture may well know and understand that culture as well as or even better than members of that culture. Should they be precluded from writing fiction based on those cultures because editors fear the charge of “cultural appropriation”?

This concern, unfortunately, isn’t just academic. I’ve heard editors talk time and time again about how they want more diversity, but… In one case, the significant other of a Chinese-American born and raised in Hawaii wrote a YA fantasy novel based on Hawaiian myth and offered it to a number of editors. When several agents and editors found out that the writer was not genetically Hawaiian, they declined to consider the book. Several well-known authors have also told me that they wouldn’t have considered it either, because dealing with Hawaiian beliefs would be too controversial.

Shouldn’t it just be about the book…and not the genetics/cultural background of who wrote it?

Teachers

In yesterday’s local paper, there was a front-page article headlining the coming teacher shortage in Utah, to which I wanted to reply, “How could there not be?”

The beginning salary for a Utah teacher in most districts is not far above the poverty level for a family of four, and the average Utah teacher’s salary is the lowest in the United States, as is the state’s per-pupil spending on primary and secondary schools. Nationwide, anywhere from twenty to fifty percent of newly certified teachers leave teaching within five years [depending on whose figures you trust], and that rate is even higher in Utah: in 2015, half of all newly hired teachers in the state quit after just one year. Yet studies also show that the longer teachers teach, the more effective they become. Add to that the fact that Utah has, on average, the largest class sizes in the United States. The academic curriculum leading to a teaching degree has also become more demanding [at least at the local university], and it often takes even the best students more than the standard four years to complete a course of study leading to teacher certification, especially if they have to work to help pay for their studies.

Despite the often dismal salaries, study after study shows that comparatively poor pay ranks well down the list of reasons teachers walk away from teaching. Almost all prospective teachers know that teaching isn’t a high-paying profession. What they don’t know is just how hostile the teaching environment is to a healthy and balanced life.

Here in Utah, for example, there are state legislators who complain about pampered and lazy teachers. They’re obviously unaware of the unpaid after-school, weekend, and evening workload required to support an eight-hour teaching day. Or of the number of parents who complain about their darling children’s grades – such as the one who wanted to know how his son could possibly flunk an art class [it turned out that said son had failed to attend most of the classes and never did a single art activity]. Or of the increasing reliance on testing to determine teaching effectiveness [when the testing itself reduces instructional time, when the test results determine teacher retention and ratings, and when the tests tend to measure factoids and fill-in-the-blank skills rather than thinking, or even the ability to write a coherent paragraph].

It also doesn’t help when the local papers are filled with pages and pages about the sports activities of the local high schools, with seldom a word about academic achievements or other successes, such as plays, concerts, and wins in engineering competitions.

Nor is it exactly encouraging when school administrators offer little understanding or support of their teaching faculty. That’s more commonplace than one might realize, and national surveys show it’s a significant factor contributing to teacher drop-out and burnout. Certainly, a number of former students of my wife, the university professor, have mentioned this as a difficulty in their middle school or high school teaching positions.

Finally, what’s also overlooked is that it’s actually more expensive to continually replace a high number of departing teachers than to take the steps necessary to cut the teacher drop-out rate. But given the current public view of education and the unwillingness to make meaningful changes, I don’t see this problem changing any time soon. In fact, it’s only going to get worse… far worse.

There’s Always Someone…

I admit it. I did watch the Super Bowl. How could I not, when my grandfather was one of the first season-ticket holders back in the days when the Broncos were truly horrible? I can still remember him taking me to a game, and he kept going, rain, shine, or snow, until he was no longer physically able. Unfortunately, I couldn’t go with him in those later years, because by then I was working in Washington, D.C.

And yes, I was definitely happy that the Broncos won, particularly since I’ve always felt that Peyton Manning is a class act, but that brings me to the point: Cam Newton’s postgame interview, if it could be called that, was anything but a class act. Yes, he was disappointed, and he wasn’t the first great quarterback to be disappointed, and he certainly won’t be the last.

Newton’s real problem is that he is so physically gifted, and has a mind good enough to use those gifts, that he’s never considered a few key matters. First, in anything, no matter how big you are, how fast you are, how strong you are, how intelligent you are… there’s always someone bigger, faster, stronger, or more intelligent. Second, football is a team game, and the team that plays better as a team usually wins. Third, sometimes you get the breaks, and sometimes you don’t. Fourth, you don’t win just because you have the better record or the better offense – as Denver found out two years ago. Fifth, it is a game, if a very serious one played for high stakes.

Newton also needs to realize that he’s paid extraordinarily well to do exactly the same thing that every writer does, except few of us, indeed, are paid as well as he is. He’s paid to entertain the fans, and while that means winning as much as possible, it also means not pissing everyone off and coming off like a spoiled kid. This is also something writers need to keep in mind.

Given his talent, I’m sure Newton will be a factor for years to come, but it would be nice to see a bit more class when things don’t go well. You don’t have to like losing, but in the end, as even the great Peyton Manning has discovered, we all lose… and the mark of the truly great is to show class both when things go well and when they don’t.

High Tech – Low Productivity

The United States is one of the most high-tech nations in the world, yet our productivity growth has hovered around a measly two percent per year for almost a decade. In the depths of the Great Recession that made a sort of sense, but the “recovery” from the recession has been anemic, to say the least. With all this technology, shouldn’t we be doing better?

Well… in manufacturing, productivity has to be up, whether the statistics show it or not, considering we’re producing more with fewer workers, and that has to mean greater output per worker. Despite the precipitous drop in the price of crude oil, the oil industry is almost maintaining output with far fewer rigs drilling and far fewer workers.

But perhaps what matters is which technology is productive and how it is used. I ran across an article in The Economist discussing “collaboration,” with statistics indicating that electronic communications were taking up more than half of knowledge workers’ work weeks, and that more and more workers ended up doing their “real work” away from the office because of the burden of dealing with electronic communications such as email and Twitter. And, unhappily, a significant proportion of that added burden comes under the rubric of accountability and assessment. But when you’re explaining what you’re doing and how you’re accountable, you’re not producing.

This is anything but the productive use of technology, and it may provide even greater incentive for businesses to computerize lower-level knowledge jobs even faster than is already happening. It just might be that, if you want to keep your job, less email is better. But then, if your boss doesn’t get that message as well, that puts you in an awkward position. I suppose you could console yourself, once you’re replaced by a computerized system, that your supervisor will soon have no one to badger with those endless emails demanding more and more status reports… before he or she is also replaced by an artificial intelligence.

We’ve already learned, even though too many Americans ignore the knowledge, that texting while driving carries a higher risk of causing fatalities than driving drunk. Will the supervisory types ever learn that excessive emailing may lead not only to lower productivity but also to eventual occupational suicide?

They Can’t Listen

Some of the complaints that the older generation has about the younger generation have been voiced almost as far back as there has been a way of recording them, and they’re all familiar enough. The young don’t respect their elders; they don’t listen to their elders; they have no respect for tradition; they think they deserve something without really working for it, etc., etc. And, frankly, there’s some validity to those complaints today, as there always has been. That’s the nature of youth: to be headstrong, self-centered, and impatient with anything that hampers what they want.

But being adjacent, shall we say, to a university, I’m hearing what seems to be a variation on an old complaint, except it’s really not a variation, but a very troubling concern. What I’m hearing from a significant number of professors is that a growing percentage of their students can’t listen. They’re totally unable to maintain any focus on anything, often even visual presentations, for more than a few seconds – even when they seem to be trying. When they’re asked what they heard or saw, especially what they heard, they can’t recall anything in detail. We’re not talking about lack of intelligence – they do well on written multiple-guess tests – but an apparent inability to recall and process auditory input.

Unless something is of extraordinary interest, their attention darts from one thing to another in a few seconds. Whether this is the result of a media-driven culture, earlier teaching methods pandering to learning in sound bites, a lack of discipline in enforcing focus, or some combination of these and other factors, I can’t say. But, whatever the reason, far too many students cannot focus on learning, especially auditory learning.

Unfortunately, the response of higher education has been to attempt to make learning “more interesting” or “more inspiring” or, the latest fad, “more experiential.” Learning through experience is an excellent means for attaining certain skills, provided the student has the background knowledge. But when a student hasn’t obtained that background knowledge, experiential learning is just meaningless and a waste of time and resources. And, generally speaking, learning has to begin with at least some listening.

Furthermore, in the “real world,” employers and bosses don’t provide “experiential learning.” They give instructions, usually spoken, and someone who can’t listen and assimilate knowledge from listening is going to have problems, possibly very large ones.

Despite all the academic rhetoric about students being unable to learn from lectures, lectures worked, if not perfectly, for most of human history. That suggests that much of the problem isn’t with the method but with the listener. And it’s not just with professors; students can’t listen to each other, either. That’s likely why they’re always exchanging text messages. If this keeps up, I shudder to think what will happen if there’s a massive power loss, because they apparently can’t communicate except through electronic screens.