Technology, Society, and Civilization

In today’s industrial states, most people accept the proposition that the degree of “civilization” is fairly directly related to the level of technology a society employs.  Whether as a result of that belief or as an article of faith in itself, each new technological gadget or invention is hailed as an advance.  But… how valid is that correlation?

In my very first blog [no longer available in the archives, for reasons we won’t discuss], I made a number of observations about the Antikythera Device, essentially a clockwork-like mechanical computer dating to roughly 100 B.C. that tracked and predicted the movements of the five known planets and the moon, lunar and solar eclipses, and even the future dates of the Greek Olympics.  Nothing so sophisticated was developed by the Roman Empire, or anywhere else in the world, until more than 1,500 years later.  Other remarkably sophisticated devices were developed in Ptolemaic Egypt, including steam-powered mechanisms that opened temple doors remotely and magnetically levitated statues in those temples.  Yet both Greece and Egypt fell to the more “practical” Roman Empire, whose most “advanced” technologies were likely concrete – particularly concrete that hardened under water – and military organization.

The Chinese had ceramics, the iron blast furnace, gunpowder, and rockets a millennium before Europe, yet they never combined their metal-working skill with gunpowder to develop – and keep developing – firearms and cannon.  At one point they possessed the largest and most advanced naval fleet in the world… and burned it.  Effectively, they turned their backs on developing and implementing higher technology, yet for centuries they were, without doubt, the most “civilized” society on earth.

Hindsight is always more accurate than foresight, but it can often reveal and illuminate possible paths to the future, particularly the ones best avoided.  The highest level of technology in Ptolemaic Egypt was employed in support of religion, most likely to reinforce the existing social structure, and was never developed in ways that any sizable fraction of society could use for productive ends.  The highest levels of Greek technology and thought were occasionally applied to warfare but were generally reserved for a comparatively small elite.  For example, records suggest that only a handful of Antikythera devices were ever created.  The widest-scale use of gunpowder by the early Chinese was for fireworks – not weapons or blasting powder.

Today, particularly in western industrial cultures, more and more technology is concentrated on entertainment, often marketed as “communications,” but when one considers how much time is spent on such devices and how many of their applications are games and diversions, the majority of that technology is effectively entertainment-related.  In real terms, the amount spent on basic research and immediate follow-up in the United States has declined gradually, but significantly, over the past 30 years.  As an example, NASA’s budget is, in real terms, less than half of what it was in 1965, and in 2010 its expenditures will constitute the smallest fraction of the U.S. budget in more than 50 years.  For the past few years, NASA’s annual budget has been running around $20 billion.  By comparison, sales of Apple’s iPhone over nine months exceeded the annual NASA budget, and Apple is just one producer of such devices.  U.S. video game software sales alone exceed $10 billion annually.
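To put those figures side by side, here is a minimal back-of-the-envelope sketch in Python, using only the approximate numbers cited above; since the exact iPhone sales total isn’t given here, the nine-month figure is taken as a lower bound equal to the NASA budget it reportedly exceeded:

    # Rough comparison using the essay's approximate figures only;
    # these are illustrative, not audited budget or sales data.
    nasa_budget = 20.0    # NASA annual budget, $ billions
    video_games = 10.0    # U.S. video game software sales, $ billions/year

    # iPhone sales over nine months "exceeded the annual NASA budget";
    # treat that as a lower bound and annualize it.
    iphone_nine_months = nasa_budget             # lower bound, $ billions
    iphone_annualized = iphone_nine_months * 12 / 9

    print(f"One phone line, annualized (lower bound): ${iphone_annualized:.1f}B")
    print(f"Video game software, annual:              ${video_games:.1f}B")
    print(f"NASA, annual:                             ${nasa_budget:.1f}B")
    ratio = (iphone_annualized + video_games) / nasa_budget
    print(f"Just these two entertainment lines vs. NASA: {ratio:.1f}x")

Even with a deliberately conservative lower bound, two entertainment product lines alone outweigh the nation’s civil space budget by nearly two to one.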

By comparison, the early Roman Empire concentrated on using less “advanced” technology for economic and military purposes.  Interestingly enough, when technology began to be employed primarily for such purposes as building the Colosseum and flooding it to stage naval battles with gladiators, subsidized by the government, Roman power, culture, and civilization began to decline.

More high-tech entertainment, anyone?

Sacred? To Whom?

I’ll admit right off the top that I have a problem with the concept that “life is sacred” – not that I don’t feel that my life, and those of my wife, children, and grandchildren, are sacred to me.  But various religions justify their positions on social issues on the grounds that human life is “sacred.”  I have to ask why human life, as opposed to other kinds of life, is particularly special – except to us.

Once upon a time, scientists and others claimed that Homo sapiens was qualitatively different from, and superior to, other forms of life.  No other form of life made tools, it was said.  No other form of life could plan logically, or think rationally.  No other form of life could communicate.  And, based on these assertions, most people agreed that humans were special and their life was “sacred.”

The only problem is that, the more we learn about life on our planet, the more every one of these assertions has proved to be wrong.  Certain primates use tools; even New Caledonian crows do.  A number of species think and plan ahead, if not with the depth and variety that human beings do.  Research has shown, and continues to show, that other species communicate, from primates to gray parrots.  Research also shows that some species have a “theory of mind,” again a capability once thought to be restricted to human beings.  But even if one considers just Homo sapiens, the most recent genetic research shows that a small but significant fraction of our DNA actually comes from Neandertal ancestors, and that research also indicates that Neandertals had the capability for abstract thought and speech.  The same research shows that, on average, both Neandertals and earlier Homo sapiens had slightly larger brains than people today.  Does that make us less “sacred”?

One of the basic principles of economics is that scarce goods are more valuable, and we as human beings follow that principle, one might say, religiously – except in the case of religion.  Human beings are the most common large species on the planet – six billion plus and growing.  Tigers and pandas number in the thousands, if that.  By the very principle we follow every day, shouldn’t a tiger or a panda be more valuable than a human?  Yet most people put their convenience above the survival of an endangered species, even while they value scarce goods, such as gems and gold, more than common ones.
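The arithmetic of that scarcity argument is stark.  A toy sketch in Python, using the essay’s rough population figures – the tiger and panda counts below are assumed round numbers in the low thousands, not census data:

    # Toy "scarcity index": how many times rarer each species is than
    # humans, using the essay's rough figures (illustrative only).
    populations = {
        "humans": 6_000_000_000,   # "six billion plus"
        "tigers": 4_000,           # assumed: "number in the thousands"
        "pandas": 2_000,           # assumed: "number in the thousands"
    }
    baseline = populations["humans"]
    for species, count in populations.items():
        print(f"{species}: {baseline / count:,.0f}x scarcer than humans")

By the logic we apply to gems and gold, a panda would come out some three million times more “valuable” than a human being.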

Is there somehow a dividing line between species – between those that might be considered “sacred” and those that are not?  Perhaps… but where might one draw that line?  A human infant possesses none of the characteristics of a mature adult.  Does that make the infant less sacred?  A two-year-old chimpanzee has more cognitive ability than a human child of the same age, and far more than a human infant.  Does that make the chimp more sacred?  Even if we limit the assessment of species to fully functioning adults, is an impaired adult less sacred than one who is not?  And why is a primate who can think, feel, and plan less sacred than a human being?  Just because we have power… and say so?

Then there’s another small problem.  Nothing living on earth can survive without consuming, in some form or another, something else that is or was living.  Human beings do have a singular distinction there – we’re the species that has best managed to avoid being eaten by other species.  Yes… that’s our primary distinction… but is it adequate grounds for claiming that our lives, compared to the lives of other thinking and feeling species, are particularly special and “sacred”?

Or is a theological dictum that human life is sacred a convenient way of avoiding the questions raised above, and elsewhere?

Making the Wrong Assumption

There are many reasons why people, projects, initiatives, military campaigns, political campaigns, legislation, friendships, and marriages – as well as a host of other endeavors – fail, but I’m convinced that the largest and least recognized reason for such failures is that those involved make incorrect assumptions.

One incorrect assumption that has bedeviled U.S. foreign policy for generations is that other societies share our fundamental values about liberty and democracy.  Most don’t.  They may want the same degree of power and material success, but they don’t endorse the values that make our kind of success possible.  Among other things, democracy is based on sharing power and on compromise – a fact, unfortunately, that all too many U.S. ideologues fail to recognize, and that failure may in fact destroy the U.S. political system as envisioned by the Founding Fathers and as developed by their successors… until the last generation.  Theocratically based societies neither accept nor recognize compromise or power-sharing – except as a last resort, to be abandoned as soon as possible.  A related assumption is that peoples can act and vote in terms of the greater good.  While this is dubious even in the United States, it’s an insane assumption in a land where allegiance to the family or clan is paramount and where children are taught to distrust anyone outside the clan.

On a smaller scale, year after year, educational “reformers” in the United States assume, tacitly and by their actions, that the decline in student achievement can be reversed solely by testing and by improving the quality of teachers.  This assumption is fatally flawed because student learning requires two key factors – those who can and will teach, and those who can and will learn.  Placing all the emphasis on teachers and testing assumes that a single teacher in a classroom can and must overcome the pressures of the media, the peer pressure to do anything but learn, the idea that learning should be fun, and all the other societal pressures that are antithetical to the work required to learn.  There are a comparative handful of teachers who can work such miracles, but basing educational policy and reform on those who are truly exceptional is both poor policy and doomed to failure.  Those who endorse more testing as a way to ensure that teachers teach the “right stuff” assume that the testing itself will support the standards – which it won’t, if the students aren’t motivated – not to mention the fact that more testing leaves less time for teaching and learning.  So, by de facto assumption, not only does the burden of teaching fall upon educators, but so do the burdens of motivating the unmotivated and disciplining the undisciplined, at a time when society has effectively removed the traditional forms of discipline without providing any effective replacements.  Yet the complaints mount, and American education keeps failing, even as the “reformers” keep assuming that teachers and testing alone can stem the tide.

For years, economists used what can loosely be termed the “rational person” model for analyzing the way various markets operate.  This assumption has proved to be horribly wrong, as recent studies – and economic developments – have demonstrated, because in all too many key areas individuals do not behave rationally.  Most people refuse to cut their losses, even at the risk of losing everything, and most continue uneconomic behaviors not in their own interests, even when they perceive such behaviors in others as irrational and unsound.  Those who distrust the market system assume that regulation, if only applied correctly, can solve the problems, and those who believe that markets are self-correcting assume that deregulation will solve everything.  History and experience suggest both assumptions are wrong.
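The refusal to cut losses is easy to illustrate.  Here is a minimal Python sketch – a toy market with a slight downward drift, every parameter invented purely for illustration – comparing an investor who sells at a fixed stop-loss with one who holds and hopes:

    import random

    def average_outcome(stop_loss=None, runs=10_000, steps=250, start=100.0):
        """Average final value of a position in a toy market.
        stop_loss=None means 'hold and hope'; otherwise sell if the
        price falls to that level.  All parameters are illustrative."""
        total = 0.0
        for i in range(runs):
            rng = random.Random(i)        # reproducible per-run randomness
            price = start
            for _ in range(steps):
                # daily return drawn uniformly, with a mild negative drift
                price *= 1.0 + rng.uniform(-0.031, 0.029)
                if stop_loss is not None and price <= stop_loss:
                    price = stop_loss     # cut the loss and walk away
                    break
            total += price
        return total / runs

    print("hold and hope:   ", round(average_outcome(), 2))
    print("cut losses at 90:", round(average_outcome(stop_loss=90.0), 2))

In a market drifting against the position, the disciplined rule preserves more on average – yet, as the behavioral studies noted above suggest, most people choose to hold.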

In more than a few military conflicts over recent centuries, military leaders have assumed that superior forces and weapons would always prevail.  And… if the military command in question does indeed have such superiority and is willing to employ it efficiently to destroy everything that might possibly stand in its way, then “superiority” usually wins.  This assumption fails, however, in all too many cases where one is unable or unwilling to carry out the requisite slaughter of the so-called civilian population, or when military objectives cannot be quickly attained, because in virtually every war of any length a larger and larger fraction of the civilian population becomes involved on one side or another, and “superiority” shifts.  In this regard, people usually think of Vietnam or Afghanistan, but the same sort of shift occurred in World War II.  At the outbreak of the war in 1939, the British armed forces had about 1 million men under arms, the U.S. 175,000, and the Russians 1.5 million.  Together, the Germans and Japanese had over 5 million trained troops and far more advanced tanks, aircraft, and ships.  By the end of the war, those ratios had changed markedly.
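The arithmetic of that shift is worth a glance.  A quick sketch using only the troop figures quoted above (approximations, in millions):

    # Force ratio at the outbreak of WWII, from the essay's figures.
    britain, us, russia = 1.0, 0.175, 1.5    # millions under arms, 1939
    future_allies = britain + us + russia
    axis = 5.0                               # Germany + Japan, "over 5 million"
    print(f"Axis advantage in 1939: {axis / future_allies:.1f} to 1")

A nearly two-to-one initial advantage in trained troops – and it still reversed once the far larger populations of the Allied nations were drawn into the war.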

While failure can be ascribed to many causes, I find it both disturbing and amazing that the basic assumptions behind bad decisions are so seldom brought forward as causal factors… and I have to ask, “Why not?”  Is it because, even after abject failure, or a costly success that didn’t have to be so costly, no one wants to admit that their assumptions were at fault?

Ends or Means

By the time they reach their twenties, at least a few people have been confronted, in some form or another, with the question of whether the ends justify the means.  For students, that’s usually in the form of cheating – does cheating to get a high grade in order to get into a better college [hopefully] justify the lack of ethics?  In business, it’s often more along the lines of whether focusing on short-term success, which may result in a promotion or bonus [or merely keeping your job in some corporations], is justified if it creates long-term problems or injuries to others.

On the other hand, I’ve seldom seen the question raised in a slightly different context.  That is, are there situations where the emphasis should be on the means? For example, on vacation, shouldn’t the emphasis be on the vacation, not on getting to the end of it?  Likewise, in listening to your favorite music, shouldn’t the emphasis be on the listening and not getting to the end?

I suppose there must be some few situations where the end is so vital that the means don’t matter, but the older I get, the fewer such examples I can cite, because I’ve discovered that the means so affect the ends that you can seldom accomplish the ends without a disproportionate cost in collateral damage.

This leads to those situations where one needs to concentrate on perfection in carrying out the means, because, if you don’t, you won’t reach the end at all.  Among such instances are piloting, downhill ski racing, and Grand Prix driving [or driving in Los Angeles or Washington, D.C., rush-hour traffic], as well as all manner of professional tasks, such as brain or heart surgery, law enforcement, or firefighting.

The problem that many people, particularly students, have is a failure to understand that, in the vast majority of cases, learning the process is as critical as the result [if not more so].  Education, for example, despite all the hype about tests and evaluations, is not about tests, grades, and credentials [degrees/certification].  Even if you get the degree or certification or other credential, unless you’ve learned enough in the process, you’re going to fail sooner or later – or you’ll have to learn all over again what you should have learned the first time.  Unfortunately, because many entry-level jobs don’t require the full skill set that the educators were attempting to instill, that failure may not come for years… and when it does, the results will be far more catastrophic.  And, of course, some people will escape those results, because there are always those who do… and, unfortunately, those “evaders” are almost invariably the ones whom people who don’t want to do the work pick as examples of why they shouldn’t have to learn the processes behind the skills.

Studies done on college graduates two generations ago “discovered” that such graduates earned far more income over their lifetimes than those without a college degree.  Unfortunately, the message became that the degree was what mattered, not the skills it represented, and ever since then people have focused on the credential rather than the skills – a fact emphasized by rampant grade and degree inflation and documented by the noted scholar Jacques Barzun in his book From Dawn to Decadence: 500 Years of Western Cultural Life, 1500 to the Present, where he observed that one reason for the present and continuing decline of Western civilization is that our culture now exalts credentials over skills and real accomplishments.

One of the most notable examples of this is the emphasis on monetary gain, as exemplified by developments in the stock and securities markets over the past two years.  The “credential” of the highest profit at any cost has so distorted the process of underwriting housing and business investment that the profit levels reaped by various sectors of the economy bear no relationship to their contribution to either the economy or the culture.  People whose decisions in pursuit of ever higher and unrealistic profit levels destroyed millions of jobs are rewarded with the “credential” of high incomes, while those who police our streets, fight our fires, protect our nation, and educate our children face salary freezes and layoffs – all because the ends supposedly justify any means.

Hypocrisy… Thy Name Is “Higher” Education

The semester is over, or nearly over, in colleges and universities across the United States, and in the majority of those universities another set of rituals will be acted out.  No… I’m not talking about graduation.  I’m talking about the return of “student evaluations” of professors and instructors.  The entire idea of student evaluations is a largely American phenomenon that caught hold sometime in the late 1970s, and it is now a monster that not only threatens the very concept of improving education but also serves as a poster child for the hypocrisy of most college and university administrations.

Now… before we go further, let me emphasize that I am not opposing the evaluation of faculty in higher education.  Far from it.  Such evaluation is necessary and a vital part of assuring the quality of faculty and teaching.  What I oppose is the use of student evaluations in any part of that process.

Take my wife’s music department.  In addition to holding advanced degrees, the vast majority of its faculty have professional experience outside academia.  My wife has sung professionally on three continents, played lead roles in regional operas, and directed operas for over twenty years.  The other voice professor left a banking career to become a successful tenor in national and regional opera before returning to school and obtaining a doctorate in voice.  The orchestra conductor is a violinist who has conducted in both the United States and China.  The band director spends his summers working with the Newport Jazz Festival.  The piano professor won the noted Tchaikovsky Award and continues to concertize worldwide.  The percussion professor performs professionally on the side and has several times been part of a group nominated for a Grammy.  This sort of expertise in a music department is not unusual but typical of many universities, and I could cite similar kinds of expertise in other university departments as well.

Yet… on student evaluations, students rate their professors on how effective the professors are at teaching, whether the curricula and content are relevant, whether the amount of work required in the course is excessive, and so on.  My question – and point – is simple: exactly how can 18- to 24-year-old students have any real idea about any of the above?  They have no relevant experience or knowledge, and obtaining both is presumably why they’re in college.

Studies have shown that the strongest correlate of high student evaluations is easy grading: the professors with the easiest courses and the highest percentages of As get the best evaluations.  And since evaluations have become near-universal, college-level grades have experienced massive inflation.  In short, student evaluations are merely student Happiness Indices – HI!, for short.
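What such studies measure is a simple correlation.  A minimal Python sketch of the calculation – the data below are synthetic, invented purely to show the computation, not drawn from any actual study:

    # Synthetic, illustrative data: share of As awarded in a course (%)
    # paired with the course's mean student-evaluation score (5-point scale).
    from statistics import correlation   # requires Python 3.10+

    pct_As      = [10, 20, 30, 40, 55, 70, 85]
    mean_rating = [2.8, 3.0, 3.3, 3.6, 4.0, 4.4, 4.7]

    # A Pearson r near +1.0 reflects the pattern the studies report:
    # the easier the grading, the happier the evaluations.
    print(f"Pearson r = {correlation(pct_As, mean_rating):.2f}")

On these toy numbers, r comes out near 1.0; the studies alluded to above report the same direction of effect, if not quite so tidy a line.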

So why have the vast majority of colleges and universities come to rely on HI! in evaluating professors for tenure, promotion, and retention?  It has little to do with teaching effectiveness or the quality of education provided by a given professor and everything to do with popularity.  In the elite schools, student happiness is necessary to keep retention rates up, because retention is one of the key factors used by U.S. News & World Report and other rating groups, and the higher the rating, the more attractive the college or university is to the most talented students – the students most likely to be successful and eventually boost alumni contributions and the school’s reputation.  For state universities, it’s a more direct numbers game.  Drop-outs and transfers represent lost funds and prompt inquiries from the state legislatures that provide some of the funding.  And departments that are too rigorous in their attempts to maintain or [heaven forbid] upgrade the quality of education often either lose students or fail to grow as fast as other departments, which results in fewer resources for those departments.  Just as Amazon’s reader reviews greatly boosted Amazon’s book sales, high HI! scores boost the economics of colleges and universities.  Professors who try to uphold or raise standards face an uphill and usually unsuccessful battle – as evidenced by the growing percentage of college graduates who lack basic skills in writing and logical understanding.

Yet, all the while, administrations talk about the necessity of HI! [sanctimoniously disguised as thoughtful student evaluations] in improving education, when it’s really about economics and their bottom line… and, by the way, in virtually every university and college across the country over the past 20 years, the percentage growth in administration size has dwarfed the growth in full-time, tenure-track, and tenured faculty.  But then, why would any administration want to point out that perceived student happiness trumps academic excellence every day and in every way, or that all those resources are going more and more to administrators, while faculties, especially at state universities, have fewer and fewer professors and more and more adjuncts and teaching assistants?