Once More… Getting It Right… Sort Of…

Once upon a time, there was an author who wrote a near-future science fiction thriller about a former military officer who had pioneered a technique for evaluating product placement in entertainment.  In case you haven’t guessed, I was that author, and the book was Flash, which was published in September 2004.  Well… last week, Entertainment Weekly [on EW.com] published a story on the Brandcameo Product Placement Award Winners for 2010.  Yes, there’s actually a series of awards about the effectiveness of product placement in movies.

At the time I wrote Flash, product placement was just taking off, and I thought that, once various devices that let viewers flash past television commercials became more common, product placement would be the advertising of the future… and it still is just that, the advertising of the future, because people are still watching television commercials; in fact, commercials are becoming a form of entertainment, at least for some viewers.  Where product placement has really taken off is in the movies.

The movie Iron Man 2 won the award for the most product placements, with 64 different ones, while Wall Street: Money Never Sleeps won the dubious award for the worst product placement.  And Apple won an award for the most appearances in hit films, with Apple products showing up in ten (or roughly 30%) of the 33 films that were number one at the U.S. box office in 2010, outstripping any other single brand for the year.  Somehow, that doesn’t surprise me.

Obviously, I wasn’t as far ahead of events as I thought I was.  In fact, I was behind the times in some ways, because when I checked into the product placement awards, I discovered that they’ve been given out since 2001, three years before Flash was published and two before I even wrote it – and I’d never heard of the awards until this year.

On the other hand, I’m still ahead of the times in terms of what I postulated, because product placements haven’t yet replaced commercials on television and… so far, unlike my hero Jonat deVrai, no one has figured out how to measure the effectiveness of a given product placement.

Still… I’ll take being partly right any day, especially in regard to television and its commercialism.


The Glorification of…?

Over the past few weeks, there have been two news stories whose juxtaposition has fascinated me, and I suspect they’re not the ones most readers would think of – the Wisconsin teachers’ protests [along with associated demonstrations across the country] and the hoopla surrounding the movie The Social Network, which claimed four Oscars at the recent Academy Awards ceremony.

What is so intriguing, horrifying to me, in fact, about this juxtaposition is the values behind each and the way they’re playing out in the press and politics.  The Social Network is “only” a movie, but it portrays how an egocentric and brilliant young man, with few ethics and fewer scruples, forged a multi-billion-dollar corporation by pandering to Americans’ need to be recognized at almost any cost and by creating the social media structure that so many Americans, especially young Americans, seem unable to function without.  In practice, it’s about the glorification of self and the exaltation of emptiness in those who seem unable to function without continual affirmation by others.  What’s also disturbing about the film is the support it has received from the “younger” generation, who seem oblivious to the issues behind both Facebook and its creator.

The Wisconsin teachers’ protest is about a Republican governor who wants to remove rights and benefits from public school teachers because the state and the body politic cannot “afford” to continue to fund those benefits.  This is happening at a time when almost every public figure is talking about, or giving lip service to, the idea that the future of the United States depends on education. And yet, across the nation, as I’ve noted more than a few times, teachers get little recognition for what so many do right, and whenever budgets are tight, education gets cut.

So… on the one hand, our great media structure is heaping still more rewards on the monument to self-promotion and inner emptiness represented by Facebook and other social networks, while, on the other, a branch of our political structure is punishing those individuals who are supposed to be the ones on whom our future depends.

To me, this appears paradoxical.  It sends the message that Facebook is great, despite the fact that its social benefits are dubious and those who created it even more so, and that they have made billions off such pandering, while a self-serving governor in Wisconsin and politicians across the nation, generally but not exclusively Republicans, make political capital by castigating teachers for benefits and salaries negotiated in legal and proper ways over generations… and by firing thousands of them along the way.  Are all teachers and public employees perfect?  Heavens no!  But to glorify those who made money by capitalizing on vanity and by setting ethics aside, while penalizing those who earn far, far less under far more onerous conditions, certainly sends a message as to what we as a nation really think is important.

And yet, I haven’t seen anyone else point out this juxtaposition.  I wonder why not.

Reversion Under Stress

The other day, my wife, the opera singer and professor of voice, was lamenting an all too common student problem – the fact that when singers, particularly young or inexperienced singers, get stressed, they tend to revert to their old bad habits and ways of singing.  Almost every voice teacher has to deal with this problem at one time or another, but I realized, perhaps far later than I should have, that it’s not just a problem for singers.  It’s a problem for societies and civilization.

It’s no accident that most social and legal progress tends to happen in times of prosperity, and that in times of economic and cultural stress, societies and individuals tend to regress.  For example, in World War II, the land of the free, the good old USA, got so fearful that the vast majority of individuals of Japanese descent on the west coast of the United States were packed off to relocation camps, their lands and property seized, much of it never restored.  Fear and stress made a hash of the Bill of Rights and due process.  Under the fear of communism in the early Cold War, Joe McCarthy ruined the lives of thousands of law-abiding Americans who made the mistake of believing in free speech.

At the beginning of the twentieth century, most Jews believed Germany was one of the most progressive countries and one that granted them the most freedom, but thirty years later, in the depths of the Great Depression, stress and fear gave rise to the Third Reich, and we all know where that led.

Under stress, most of us revert to old habits we thought we’d left behind, and those habits usually aren’t the best, because the habits we’ve worked to leave behind are the ones we’re not too proud of… or the ones that have worked against us in school or in whatever occupation we’re involved in.  Unhappily, that seems to be the case in societies as well.  Under stress, dictators and rulers who’ve been showing a softer side revert to weapons and violence.  Under stress, Americans who’ve been talking about the need for better education collectively decide to cut billions from education, but not from farm and industry subsidies.  Under stress, political and religious discussions get uglier, less compromising, and less understanding.  Under stress…

I hope you get the idea.

The problem with all this is that the old bad habits of societies and individuals are even less productive and useful in poor political and economic times… and yet, time after time, generation after generation, this pattern repeats itself.

It’s not surprising, I suppose, given the trouble my wife has with students on this very point.  They can hear that they sing better with their new technique.  They know it… but when they get stressed, most of them subconsciously retreat to the “comfort” of their old, poorer technique, and then they don’t do as well in recitals and competitions.  It’s only the very best who can surmount their fears and stresses.

So… which will we be?  The ones who surmount fear and stress and progress… or those who collapse under it and revert?

“True” Knowledge is Not an Enemy of Faith

But all too often “true beliefs” are the enemy of knowledge – and that sometimes even occurs within the so-called hallowed halls of science and academia.  True believers exist in all fields, and all of them are characterized by an unwillingness to change what they believe as knowledge and understanding of the world and the universe improve.

Human beings are far from knowing everything, but both as individuals and as a species, we are, so far, continuing to learn.  What we believe about the world is largely based on what we have observed and what we have heard or read.  The more we learn and advance, the more our beliefs should reflect that change, and yet more and more people seem to think that the opposite is true, even though the largest problem with “belief” and with “true believers” occurs when what people believe is at variance with what is.  Or, as the old saying goes, “It isn’t what you don’t know that hurts you so much as what you know that isn’t so.”

From the reaction to the last blog post… and to others in the past, I’m getting the impression that at least some of my readers feel that I’m opposed to “faith” or religion.  I’m not.  I’m opposed to those versions of religion that deny what is, and what has been proved to be.  When some die-hard fundamentalist insists that the Earth was created in 4004 B.C., given the wealth of scientific evidence and facts to the contrary… well, rightly or wrongly, I don’t think that such a view should be given public credence, nor should it be allowed to impede the teaching of science, which has an array of demonstrated facts to show that the universe is somewhere around 14 billion years old, while the fundamentalist has only scripture and faith.

Some branches of certain religions “honestly” believe that women are not the equal of men.  While one would be a fool not to accept that there are differences between the sexes, including the fact that, for a given body weight, men generally have more muscle mass, in most highly industrialized economies it has become very clear that women do at least as well as men in almost all ranges of occupations.  The fact that women are now surpassing men in academic honors in most fields of higher education in the United States should prove that, in general, women are at least equal, if not superior, to men in intellect.  Yet such statistics and achievements have little impact in changing the views of such religious “true believers.”

Another problem with “true beliefs” at variance with what can be proved or demonstrated, particularly those that get enshrined in legal codes and laws, is that they create moral conflicts for honest and less doctrinaire individuals.  For example, if a law, as did Tennessee’s law at the time of the Scopes trial, prohibits the teaching of evolution, then a teacher must either teach a falsehood or not teach what he or she knows to be accurate in order to obey the law.  If the teacher obeys the law, then the teacher is essentially false to the very goal of education.  If the teacher is true to the goal of education, the teacher breaks the law.  This dilemma is far from new; essentially the same kind of conflict led to the death of Socrates over 2,400 years ago. 

Is there a God?  At present, there’s no scientific proof one way or the other, and I really don’t care if you believe in a greater deity or you don’t.  What I do care about is how you act and how whatever you believe affects me, those I love, and others in society.  All throughout history, beliefs that have been at variance with what is have resulted in oppression, repression, tyranny, and violence, not to mention a lack of progress and human improvement.  And given the fact that we’ve tendencies in those directions anyway, the last thing we need as a species is the support and encouragement of such misguided “true believers.”

The Problem of Proof/Truth

The other day I happened to catch a few minutes of the movie Inherit the Wind [the 1960 Spencer Tracy version], a film which is essentially a fictionalized version of the Scopes trial of 1925, in which a Tennessee public school teacher was convicted of teaching evolution in the public schools, in violation of then state law.  In the film and in the actual trial, the presiding judge forbade the defendant’s attorney from calling witnesses from the scientific community on the grounds that the science was not relevant to the charge, because the question was not whether the law was accurate, but whether the defendant had violated that law.  Scopes was found guilty and fined $100 [equivalent to roughly $1,250 today], but the verdict was later overturned on appeal on a technicality, and Scopes was never re-tried.  In 1968, the Supreme Court ruled that prohibiting the teaching of evolution was unconstitutional because it represented the favoring of one religious view over others [a fact seemingly overlooked or forgotten in the forty years following].

What struck me, however, about both the trial and the film was the underlying problem faced by the scientific community whenever a scientific theory, factual finding, or discovery conflicts with popular or religious beliefs.  All too often, the popular reaction is a variation on “shoot the messenger” who bears bad or unpleasant news.  The plain fact, which tends to be overlooked, is that a significant proportion, if not an overwhelming majority, of deeply religious individuals who identify themselves as Christians do not trust scientists, or indeed, anyone who does not share their beliefs.

This viewpoint is certainly not limited to Christians, and there are more than a few scientists who do not trust the ability of deeply believing Christians to guide public policy, especially in regard to science and education.  The radical factions of Islam are unlikely to trust western secularists on much of anything, and all stripes of militants are going to be skeptical of those who do not share their views of how the world should be.

In essence, one person’s “truth” can all too often be another’s heresy, even when there is overwhelming factual evidence of that truth.  That overwhelming factual evidence can be denied is most easily seen in dealing with hard science [regardless of belief, there is far too much evidence of the development of the universe to allow any credibility to the idea that the earth and the cosmos were created in 4004 B.C.], but the problem exists in all areas of human endeavor. 

Simply put… how do we know whether what someone says is accurate or truthful?  Generally speaking, we weigh what is said against what we know and believe, but how do we know whether what we know and believe is accurate?

The “traditional” answer to that question was the basis for so-called liberal education, where an individual studied a wide range of subjects, questioning and experimenting with facts and ideas and obtaining a broader range of knowledge and perspective.  Unfortunately, the increasing complexity and technological basis of modern civilization has resulted in a growing class of individuals who are highly educated in narrower and narrower fields of knowledge, and who believe that they are “knowledgeable” in areas well beyond their education and experience.  Some indeed are.  Most aren’t.

Nonetheless, the problems remain.  How can society educate its citizens so that they can distinguish more accurately between what actually is and was and what they wish to believe that cannot be supported by facts, observation, and verifiable technology and science?  And how should society deal with those who wish society’s rules to be based upon beliefs that can be factually shown to be false or inaccurate?