“Downers,” Stereotypes, and Literary Quality

Yesterday, when I was flying back from the World Fantasy Convention in San Diego, I read through two “best-of-the-year” anthologies… and landed in Cedar City thoroughly depressed… somewhat disgusted… and more than a little irritated.  No… I wasn’t irritated that the anthologists hadn’t picked one of my infrequent short stories.  In the first place, I know that it’s unlikely anything I write will find favor with most anthologists [although there have been exceptions].  In the second place, I hadn’t published any short fiction for the year in question.

My anger, irritation, and depression came from the same root cause.  Out of something like fifty stories, all but perhaps ten were downers.  Out of the ten that weren’t, eight were perhaps neutral or bitter-sweet, and only two could be called upbeat.  Now… I don’t have anything against downbeat or depressing stories.  I don’t even have anything against them being singled out as good stories.  Certainly they were all more than competently written, and some were indeed superbly written.  And I definitely don’t think a story has to be upbeat to be great or “best-of-the-year,” but after many, many years of writing professionally, and even more of reading all manner of books and stories, ranging from “genre” fiction to the acclaimed classics, it’s clear to me that the excessive citation of depressing stories as classics and exemplars of excellence is hardly a mark of intellectual distinction, let alone impartial judgment.

All this, of course, reinforces my feelings about those critics and anthologists who seem to dismiss anything with any upbeat feel or positive aspects… or anything that isn’t “literary mainstream.”

The latest New York Times book section contains a review with an opening along the lines of “A literary novelist writing a genre novel is like an intellectual dating a porn star.”  Supposedly, reviewers who write about books should be able to transcend stereotypes, not reinforce them, but then, snobbery is often based on the enshrinement of stereotypes contrary to the snob’s world view, and all too many critics, reviewers, and even some anthologists are little more than snobs.

A good story is a good story, and a bad story is a bad story, whether it’s “literary” or genre.

More Wall Street Idiocy

I recently discovered that the cable company Hibernia Atlantic is spending $300 million to construct and lay a new transatlantic cable between London and New York [New Scientist, 1 October].  Why? In order to cut 6 milliseconds from the 65 millisecond transit time and thereby persuade more investment trading firms to use their cable.  For 6 milliseconds?  That’s apparently a comparative age in a world where computers can execute millions of instructions in a few milliseconds, and London traders must think that those 6 milliseconds will make a significant difference in the prices paid and/or received.
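The back-of-the-envelope arithmetic behind that 6-millisecond advantage can be sketched as follows; the latency figures come from the article, but the instruction rate is my assumption (roughly a billion instructions per second, conservative for hardware of the era):

```python
# Rough arithmetic for the 6 ms advantage described above.
# Latency figures are from the article; the instruction rate is an
# assumption, not a figure from the text.

old_latency_ms = 65          # current London-New York transit time
new_latency_ms = 59          # transit time once the new cable shaves 6 ms
instructions_per_second = 1_000_000_000  # assumed CPU throughput (~1 GHz-class)

saved_ms = old_latency_ms - new_latency_ms
# Instructions a trading computer could execute inside the saved window:
instructions_in_window = instructions_per_second * saved_ms // 1000

print(saved_ms)                # 6
print(instructions_in_window)  # 6000000
```

In other words, by this rough estimate an algorithm gains millions of instruction cycles of head start on every round trip, which is imperceptible to a human but decisive between competing computers.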

And they may well.  Along the same lines, a broker acquaintance of mine pointed out that New York City real estate closest to the New York Stock Exchange computers commands exorbitant rents and prices for exactly the same reason… but I find the whole idea totally appalling – not so much an additional data cable, but the rationale for its use. Human beings can’t process much of anything in 6 milliseconds, so the speed advantage is useful only to computers using trading algorithms.  As I’ve noted earlier, the use of programmed and computer trading has led to a shift in the rationale behind trading to almost total reliance on technical patterns, which, in turn, has led to increased volatility in trading.  Faster algorithmic trading can only increase that volatility, and, regardless of those who deny it, can also only increase the possibility of yet another “flash crash” like that of May 2010, and, even if the new “circuit-breakers” cut in and work as designed, the results will still disrupt trading significantly and likely penalize the minority of traders without superspeed computers.

Philosophically speaking, the support for building such a cable also reinforces the existing and continually growing reliance on maximizing short-term profits and minimizing longer-term concerns, as if we didn’t already have a society that is excessively short-term. You might even call it the institutionalization of business thrill-seeking and attention-deficit-disorder. This millisecond counts; what happens next year isn’t my concern.  Let my kids or grandkids worry about what happens in ten or twenty years.

And one of the problems is that this culture is so institutionalized that any executive who questions it essentially destroys his or her future. All you have to do is look at those who did before the last meltdown.

Yes, the same geniuses who pioneered such great innovations as no-credentials-check mortgages, misleadingly “guaranteed” securitized mortgages, banking deregulation, fees-for-everything banking, and million-dollar bonuses for crashing the economy are now going to spend mere hundreds of millions to find another way to take advantage of their competitors… without a single thought about the implications and ramifications.

Isn’t the free market wonderful?


Why Don’t the Banks Get It?

Despite the various “Occupy Wall Street” and other grass-roots movements around the country, banks, bankers, and investment bankers really don’t seem to get it.  Oh, they understand that people are unhappy, but, from what I can tell, they don’t seem terribly willing to accept their own role in creating this unhappiness.

It certainly didn’t help that all the large banks ducked out of the government TARP program as soon as possible so that they wouldn’t be subject to restrictions on salaries and bonuses for top executives – bonuses that often exceeded a million dollars an executive and were sometimes far, far greater.  They all insist, usually off the record, that they feared “losing” top talent, but where would that talent go?  To other banks?

Then, after losing hundreds of billions of dollars on essentially fraudulently rated securitized mortgage assets, they took hundreds of billions of dollars in federal money but apparently haven’t lent much of it, especially not to small businesses, which are traditionally the largest creators of new jobs in the country. At the same time, they continue to foreclose on real estate on a wholesale basis, even when ordered not to by judges, states, and regulators, and even in cases where refinancing was feasible for an employed homeowner.

And then… there’s the entire question of why the banks are having financial difficulties.  I’m an economist by training, and I have problems understanding this.  They’re getting money cheaply, in some cases, almost for free, because what they pay depositors is generally less than one percent, and they can obtain federal funds at an even lower rate.

Mortgages are running 4-6%, and interest on credit card debt is in the 20% range and often in excess of 25%.  Yet this vast differential between the cost of obtaining the product and the return on it apparently isn’t sufficient?
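The scale of that differential can be sketched with the approximate rates quoted above (the mortgage midpoint is my assumption):

```python
# A rough sketch of the interest-rate spread described above,
# using the approximate rates quoted in the text.

deposit_cost = 0.01       # banks pay depositors under ~1%
mortgage_rate = 0.05      # mortgages running 4-6%; midpoint assumed
credit_card_rate = 0.20   # credit card interest ~20%, often above 25%

mortgage_spread = mortgage_rate - deposit_cost   # ~4 percentage points
card_spread = credit_card_rate - deposit_cost    # ~19 percentage points

print(round(mortgage_spread * 100))  # 4
print(round(card_spread * 100))      # 19
```

Even at the low end, the banks are earning roughly four times what they pay for the money on mortgages, and nearly twenty times on credit card balances.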

And that brings us to the latest bank fiasco.  For years, the banks, all of them, have been urging customers to “go paperless.”  Check your statement electronically; don’t write checks; use your debit card instead. Then, after the federal government tried to crack down on excessive fees for late payments, overdrafts, and the like, several of the largest banks began floating the idea of a monthly fee for debit card use.  Wait a second!  Wasn’t this the banks’ idea in the first place?  Wasn’t it supposed to reduce costs?  So why are they going to charge depositors more to use their own money?

And the banks still don’t get it?  With all those brilliant, highly compensated executives?

Or don’t they care?

What Is a Cult?

Recently, some members of the Christian right have been suggesting that presidential candidate Mitt Romney is not a “Christian,” but a member of a “cult.” As a resident of Utah for nearly twenty years, and as a not-very-active Episcopalian who still resents the revision of the King James version of the Bible and the Book of Common Prayer, I find the raising of this issue more than a little disturbing, not so much for the question of what Mr. Romney believes as for the implication that his beliefs are any stranger or weirder than the beliefs of those who raised the issue.

Interestingly enough, the top dictionary definitions of the word “cult” are “a system of religious rites and observances” and “zealous devotion to a person, ideal, or thing.”  Over the past half-century or so, however, the term “cult” has come to be used in a more pejorative sense, referring to a group whose beliefs or practices are considered abnormal or bizarre.  Some sociologists make the distinction that sects, such as Baptists, Lutherans, Anglicans, Catholics, etc., are products of religious schism and therefore arose from and maintain a continuity with traditional beliefs and practices, while cults arise spontaneously around novel beliefs and practices. Others define a cult as an ideological organization held together by charismatic relationships and the demand of total commitment to the group and its practices.

Mitt Romney is a practicing Mormon, a member of the Church of Jesus Christ of Latter Day Saints, but does that make him a member of a cult?  Since the LDS faith specifically believes in an omnipotent God and his son Jesus Christ, follows many “Christian” practices such as baptism, and rejected the practice of polygamy a century ago, can it be said to be a totally “novel” faith, or any more “bizarre” or “abnormal” than any number of other so-called Christian faiths?  Mormonism does demand a high degree of commitment to the group and its practices, but is that degree of commitment any greater than that required by any number of so-called evangelical but clearly accepted Christian sects?

While I’m certainly not a supporter of excessive religious beliefs of any sort, as shown now and again in some of my work, and especially oppose the incorporation of religious beliefs into the structure of secular government, I find it rather amazing that supporters who come from the more radical and even “bizarre” [in my opinion] side of Christianity are raising this question.  What troubles me most is the implication that fundamentalist Christianity is somehow the norm, and that Mormonism, which, whether one likes it or not, is clearly an offshoot of Christianity, is somehow stranger or more cultlike than the beliefs of the evangelicals who are raising the question.

This isn’t the first time this kind of question has been raised, since opponents of John F. Kennedy questioned whether the United States should have a Catholic president, with the clear implication that Catholicism was un-American, and it won’t be the last time.  The fact that the question has been raised at all in this fashion makes me want to propose a few counter-questions.

Why are those politicians who endorse and are supported by believers in fundamentalist Christianity not also considered members of cults?

Are we electing a president to solve pressing national problems or one to follow a specific religious agenda?

Does rigid adherence to a religious belief structure make a more effective president or a less effective one?  What does history show on this score?

And… for the record, I’m not exactly pleased with any of the candidates so far.


The Wrong Message

Social media are here, regardless of whether we like them, dislike them, use them, or don’t use them.  They’re also becoming a part of education, and school districts and colleges and universities across the country are struggling with policies that allow constructive use of social media while curbing abuse.  Some school districts prohibit their use in education entirely, while others range from restricted use to almost unrestricted use.

Time will tell, as with many things, just what uses will be allowed, but there’s one aspect of all of this that, I must say, troubles me greatly.  One educator, cited in a recent article in The Christian Science Monitor, observed that he had to give feedback on assignments to students through Facebook because students never checked email, since email just wasn’t part of their world.

I relayed that comment to my wife the college professor, and she nodded sagely, informing me that a growing percentage of college students simply never check their email or answer telephone messages. She should know, since her university system will inform her whether any email she has sent has even been opened – and many aren’t.  An increasing number of students only respond, and not necessarily reliably, to text messages and Facebook postings.

What?  Since when are students determining what forms of communication will be used in education?  The issue here, it seems to me, is not just whether social media have a place in education, and what that place should be, but also who exactly is setting the standards and the ground rules.

To begin with, for a teacher to reach a student through a social network, the teacher must belong to that network and, depending on the settings, must either request that the student accept the teacher as a “friend” or ask the student to initiate contact and grant acceptance. In short, either party can refuse communications, and, in effect, students are setting the requirements for what communications they’ll receive and how.  I can certainly see students – and parents – rebelling if teachers required communications via FedEx, UPS, or carrier pigeon, but not accepting emails as opposed to Facebook messages?  Email is a non-obligatory electronic communications system far more open to all users and recipients, and it takes no more time or equipment than does Facebook or any other social network. Also, teachers should be teachers, not “friends,” because even the most brilliant of students should not be encouraged to think of themselves as the equals of their teachers, no matter how much greater some of them may eventually become.

Again, I may be antiquated, but at this point using social networks for any form of “official” communication, whether educational, governmental, or business, raises questions about security, privacy, scholastic policies, discipline, and propriety that certainly have not been answered.