National Identity and Anger

A recent poll from The Associated Press-NORC Center for Public Affairs Research showed that seventy percent of Americans felt the country was “losing its identity.” Unfortunately, the poll also revealed that Americans couldn’t agree on which components of that “identity” matter most.

Although Democrats, Republicans, and independents agree on certain aspects of the country’s identity, such as a fair judicial system and the rule of law, the freedoms enshrined in the Constitution, and the ability to get good jobs and achieve the American dream, recent political developments make it clear that this consensus is overshadowed by the differences.

Fifty-seven percent of Republicans thought one of the most important parts of the national identity was a Christian belief structure, as opposed to twenty-nine percent of Democrats. On the other hand, sixty-five percent of Democrats thought that the mixing of global cultures in the U.S. was important, compared to thirty-five percent of Republicans.

According to the poll, seventy-four percent of Democrats say that the ability of immigrants to come to the U.S. to escape violence and persecution is very important, as opposed to fifty-five percent of Republicans. Forty-six percent of Republicans agreed the culture of the country’s early European immigrants was very important, versus twenty-five percent of Democrats.

Putting these findings together suggests that, in general, Republicans think the national identity should be based on an enshrined Christian faith and the Anglo-centric patriarchal culture of the first immigrants, while Democrats emphasize a more global culture, one welcoming to immigrants and more concerned with the present than the past. Obviously, that’s an oversimplification, but there’s still a basic conflict, almost between the past and the present.

That conflict was definitely revealed in the last election, with the Republicans essentially claiming that the country was turning away from its white, European, and totally Christian roots, and that such a turn was destroying and/or diminishing not only the United States but also the position of middle-class white American males.

As both the AP-NORC poll and the Women’s March on Washington [with millions of women in hundreds of cities and towns across the country] showed, this Republican “traditional” society is not endorsed by a significant percentage of the country.

Yet the Founding Fathers attempted to hold together thirteen colonies with very different belief structures, some with the [to me] abhorrent idea that slavery was morally acceptable, and they crafted a government based on shared principles that did not require a specific religious belief, or indeed any belief in a supreme deity at all. For the time, this was an extraordinarily radical enterprise, so radical that the American Revolution equally merits the title of the Anglo-American Civil War.

So why is there so much disagreement about national identity and national priorities?

The election results and the vitriolic rhetoric from the right reflect, among other things, the fact that there are fewer and fewer well-paid unskilled and semi-skilled jobs, and that the jobs already lost to outsourcing and technology, but mainly to technology, removed some eight million largely white men from the middle class. Those men and their families and relatives look back to a past of more secure and prosperous employment and believe that the country has lost its way… and its traditional identity, and they’re angry.

On the other hand, there are over forty million African Americans in the U.S., and while the Civil War that resulted in their freedom ended over 150 years ago, blacks still face discrimination and other barriers to rights equal to those of white ethnicities. After 150 years they’re angry, and getting angrier, especially given the number of young black males killed and incarcerated, particularly when study after study shows that discrimination still exists and that blacks receive harsher jail sentences than whites do for the same offense… among other things.

Educated women of all ethnicities are angry that they do not receive even close to equal pay for the same jobs as men and that the male-imposed glass ceilings in business, government, and politics remain largely unbroken.

Since women and minorities are getting more and more vocal, and since minorities are becoming a bigger and bigger share of the American population, I foresee some very “interesting” years ahead, and I’d suggest that the largely white male Congress consider those facts very carefully.

The “Other” Culture

There are several definitions of “culture.” One is the development of microorganisms in an artificial medium. Another is “the refinement of mind, morals, or taste.” A third is “the cultivation of plants or animals.” But there are two other definitions that tend to get overlooked: (1) the specific period or stage in the development of a civilization and (2) the sum total of the attainments and learned behavior patterns of any specific period or group of people regarded as expressing a way of life. The second of those is the definition that tends to get overlooked in government and politics, and yet the problems caused by the “learned behavior patterns” of smaller groups within a society represent one of the principal reasons for societal unrest.

That is largely because quite a few nations, including the United States, are in fact composed of various subcultures. In the U.S., those subcultures, especially those disliked by the majority, are often minimized or denigrated in racial or religious terms. An important point, and one consistently ignored by ideologues, businesses, and particularly politicians, is that “culture,” as exemplified by learned patterns of behavior, trumps “race” or religion. By that I mean that the good or bad traits of a group or subgroup of people have virtually nothing to do with their religion or their skin color or ethnicity. What determines how people act is their “learned patterns of behavior.”

And while religion is definitely a learned behavior, how people of a certain religion act can and does vary enormously from cultural group to cultural group. It also varies over time. Some 500 years ago, good “Christian” countries in Europe were slaughtering each other in a fashion even more brutal than the way the Sunni and Shia factions of Islam are slaughtering each other now. Yes, religion is a critical part of “culture,” but it ranges from being the primary determinant of a culture to being merely one of many factors, and in the history of certain civilizations, the impact of a religion can change, and has changed, the culture drastically.

As I’ve also noted before, likely more than a few times, history is filled with examples of both great and failed societies and nations identified as being predominantly of one race or religion. There have been great empires in all parts of the world [except, so far, Antarctica], and there have been failed societies everywhere in the world, regardless of race or religion.

Certain cultural practices seem to work better than others; for example, cultures that allow religion to control society tend to stagnate and become ever more brutal. Cultures with great income inequality tend to be more oppressive, and a greater percentage of them seem to have either de jure or de facto polygamy. A good sociologist could likely carry this much further, but the basic point is that it’s not only morally wrong to claim that a given race or ethnicity or religion is “stupid” or “inferior” (or any number of other pejorative terms), but such unthinking “type-casting” also totally misses the point. Culture – not race, genes, skin color, or religion – determines how people behave. More to the point, one can change a toxic culture [although it takes time], and a beneficial culture is always only a cultural change or two away from becoming toxic.

The Threat from Radical Islamic Terrorists

I’m fed up with the propaganda from the White House about the “overwhelming” danger to U.S. citizens from radical Islamic terrorists. Yes, there are radical Islamic terrorists, but here in the United States, radical Islamic terrorists are far less of a threat than home-grown right-wing terrorists. Overseas, that’s another question, which is why so many law-abiding members of the Islamic faith want to get to the U.S. – or did before Donald Trump became President.

While consensus on hard numbers is difficult to come by, and the numbers vary by source, whatever the source, they suggest that radical Islamic terrorists are not the major threat to Americans – not even close. Other Americans are.

In terms of terrorist attacks in the United States, the numbers are lopsided, to say the least. According to a study by the United States Military Academy’s Combating Terrorism Center, domestic right-wing extremists averaged 337 attacks a year in the U.S. in the decade after 9/11, accounting for 254 fatalities, while, depending on the study and the definitions, a total of between 20 and 24 terrorist attacks in the U.S. were carried out by Islamic radicals, with between 50 and 123 fatalities.

In the last several years, the vast majority of police deaths have come from domestic extremists. An ADL report states that of the forty-five police officers killed by extremists since 2001, ten were killed by left-wing U.S. extremists, thirty-four by right-wing U.S. extremists, and one by domestic Islamic extremists.

As far as Trump’s proposed travel ban goes, not one terrorist attack on U.S. soil in the last four decades has been carried out by citizens of any of the seven countries on Trump’s ban list. Of the 240,000 Americans who have been murdered since the attacks on the Twin Towers in 2001, exactly 123 of those deaths were linked to Muslim-American extremists. In other words, roughly 0.05 percent of the murders in the United States over a sixteen-year period were carried out in the name of radical Islam. Even figures from the right-wing National Review list only 88 deaths in the U.S. from radical Islamic terrorists since 2009.
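As a quick check on that arithmetic, here’s a minimal sketch using the figures cited above [the totals themselves, as noted, vary by source]:

    # Quick arithmetic check on the figures cited above; the totals
    # themselves vary by source, as noted.
    total_murders = 240_000    # U.S. murders since the 2001 attacks
    islamist_linked = 123      # deaths linked to Muslim-American extremists

    share = islamist_linked / total_murders * 100
    print(f"{share:.3f} percent of all murders")  # -> 0.051 percent of all murders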

Yet at the same time that Trump is citing the danger from radical Islamic terrorists, reports have surfaced that he plans to shut down Homeland Security’s watch list for domestic extremists. Not only that, but bowing to the NRA, he decided to void an Executive Order by former President Obama that would have put people declared to be mentally incompetent by a court on a do-not-buy list for firearms. The NRA argued that mentally incompetent people should have the same right to firearms as anyone else.

And we’re worrying about Islamic terrorists?

Education and the Business Model

More and more state legislators across the country are demanding that education be run in a more business-like fashion. While greater efficiency is likely necessary in many educational systems, running higher education “like a business” is not only counterproductive but likely to create more problems than it purports to solve – and business-like approaches so far haven’t shown much success at the university level.

One of the tools employed both by business and by educational systems run on the “business model” is to reduce costs by reducing the number of employees and their compensation in relation to “output.” In business, this has given us outsourced or high-tech automated manufacturing and, in retailing, lots and lots of underpaid part-time employees without benefits. In education, a similar change is occurring, particularly in higher education, where university faculties have shifted from ones made up primarily of dedicated full-time professors to ones where the majority of teaching faculty are part-time adjuncts, many of them far less qualified or experienced than seasoned full-time faculty. At the same time, administrations spouting the “business model” mantra have burgeoned.

At virtually all public universities, administrative overhead and full-time administrative positions have increased two- to threefold over the past twenty-plus years, while full-time faculty positions have decreased. The exceptions are some smaller state universities that have expanded so massively that the number of full-time positions has actually grown somewhat, even as the percentage of full-time positions has fallen to the same level as at other state universities, if not lower.

The chief reason for this emphasis on part-time teaching positions is cost. As I’ve noted before, fifty years ago, on average, state legislatures supplied the bulk of higher education funding. In 1974, states provided 78% of the cost of educating a student. Today, although total state funding is actually higher, because almost four times as many students attend college, state funding per student averages around 20% of that cost, although it varies widely by state, and in some cases it is around 10%.

For example, until almost 1970, California residents could attend the University of California [Berkeley] tuition-free. Today, tuition and fees for in-state students are around $15,000 a year. This trend, if anything, is accelerating. Just since 2008, state funding of higher education has dropped by 20% per student.
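A minimal sketch shows how total state funding can rise even as per-student support collapses. The per-student cost and enrollment figures below are purely illustrative; only the 78% and roughly 20% shares come from the numbers above.

    # Illustrative only: per-student cost and enrollment are hypothetical;
    # the 78% (1974) and roughly 20% (today) state shares come from the text.
    cost_per_student = 20_000      # assumed annual cost of educating one student

    students_1974, share_1974 = 100_000, 0.78
    students_now, share_now = 400_000, 0.20   # almost four times as many students

    total_1974 = students_1974 * cost_per_student * share_1974  # 1.56 billion
    total_now = students_now * cost_per_student * share_now     # 1.60 billion

    # Total state dollars are slightly higher, yet the state's share of each
    # student's cost has fallen from 78% to 20%, shifting the rest to tuition.
    print(f"{total_1974:,.0f} vs {total_now:,.0f}")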

The response by legislatures, predictably, is to push for more efficiency. Unhappily, that has translated into “get lower costs however you can.” The problem is that the emphasis, no matter what administrators say, is on turning out the most graduates at the lowest cost. Universities also tend to phase out departments with small enrollments or high costs and to expand departments with large enrollments and low costs, even if students who major in those areas have difficulty getting jobs.

In addition, political pressure, both to “keep” students in school for budgetary reasons and to graduate a higher percentage of students, has inexorably decreased the academic rigor of the majority of publicly funded universities and colleges. This, in turn, has led more and more businesses and other employers to demand graduate degrees or other additional qualifications, which further increases the tuition and debt burden on students. That’s scarcely “economic” on a societal basis, because it pressures students to aim for high-income professions or high-income specialties within a profession, rather than for what they’re good at doing and what they love. It’s also created an emphasis on paper credentials rather than on the ability to do a job. On top of that, it’s meant that more of the most highly qualified individuals avoid professions such as teaching, library science, music, art, government civil service, and others; and those professions, especially teaching, are being filled by a greater percentage of less qualified individuals.

The end result of the combination of stingy state legislatures and the “business model” is less rigorous academic standards and watered-down curricula at the majority of public colleges and universities, skyrocketing student debt, a smaller and smaller percentage of highly qualified, excellent, and dedicated full-time professors, and a plethora of overpaid administrators, the majority of whom heap even more administrative requirements on the full-time teaching faculty.

No efficient business actually operates this way, and why higher education gets away with calling what it’s doing “the business model” has baffled me for more than two decades.

Economics, Politics, Business, and Regulation

To begin with, economics and business are not the same, although they share much of the same terminology, because the economics of business centers on business, while the study of economics, at least in theory, encompasses all of society, not just business, even though some business types have trouble comprehending that a nation’s economy consists of more than business, or more than government and business.

And no matter what they claim, most business people really don’t understand economics, or choose not to. Likewise, very few economists really understand business. Politicians, for the most part, understand neither, and most Americans understand even less than the politicians. This is, I submit, one of the fundamental problems facing the U.S. today.

Let’s just look at why in terms of fundamentals. Supposedly, the basis of economics and business rests on the interaction of supply and demand. In general terms, “supply” means the amount of a good sellers are willing to provide at a given price; “demand” is the amount buyers will purchase at a given price. In a relatively free market [there are no totally free markets, and never can be, a point too many business types fail to acknowledge publicly], the going price of a good or service is set where supply and demand meet. If demand rises or supply shrinks, prices usually rise. If demand falls, or supply increases significantly, prices usually fall, again in a relatively free economy.
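As a minimal sketch of that intersection, take hypothetical linear supply and demand curves [the coefficients are purely illustrative]:

    # Minimal sketch of a market-clearing price, using hypothetical linear curves.
    def supply(price):
        return 10 * price            # sellers offer more as the price rises

    def demand(price):
        return 1000 - 15 * price     # buyers purchase less as the price rises

    # Equilibrium where supply equals demand: 10p = 1000 - 15p, so p = 40.
    p = 1000 / (10 + 15)
    print(f"equilibrium price: {p:.0f}, quantity traded: {supply(p):.0f}")
    # -> equilibrium price: 40, quantity traded: 400

Raise demand [a larger constant] or cut supply [a smaller coefficient], and the clearing price moves up, exactly as described above.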

Of course, no economy is completely free because to have a working economy requires a working society, and human beings have yet to create a working society without government, and government, for various reasons, always imposes restrictions on the market. Some of those restrictions, given human nature, are necessary. Why? Because of the intersection of the way business operates and human nature.

As some have pointed out, the price of a good or service is not necessarily its cost plus remuneration to the supplier, but over time, price has to consist, at the least, of the amount necessary to cover costs of production plus enough above that to keep the supplier or business going. But the devil is in the details, and one of those details is how one defines “costs of production.”

There are all sorts of costs – fixed costs, marginal costs, operating costs, external diseconomies [otherwise known as negative externalities], etc. The costs that matter most to a business are those it is required to pay by the demands of the marketplace (i.e., supply and demand) and by the government. If a business has to pay taxes, that’s a cost imposed by government. So are wage, benefit, safety, and environmental standards.

So… by what right, in a supposedly free market economy, is government imposing those costs on business?

Government acts because (1) the marketplace doesn’t include all the costs of production and (2) a totally “free” marketplace creates wage levels and working conditions that virtually all western governments have declared unacceptable; therefore, governments have set minimum standards for wages, safety, and worker health.

In addition, some of those government taxes provide for the highways and airways on which business goods are transported, for the national defense that protects businesses and everyone else from enemies who would seize businesses and property and that allows U.S. businesses to conduct operations elsewhere in the world, for the regulation and continuance of a stable banking system, for public safety, and so forth, all of which make the operation of businesses possible.

One of the reasons that, years ago, the Cuyahoga River next to the Republic steel mill in Cleveland caught fire was that the marketplace cost, and thus the price of a good, didn’t include the costs passed on to others in society in the form of polluted air or water. Thus, any manufacturer who did restrict the emission of pollutants incurred higher costs than producers who didn’t. Consequently, marketplace “discipline” effectively encouraged pollution, or at the very least certainly didn’t discourage it. Costs inflicted on others are usually termed negative externalities [the older term is external diseconomies], but such terms tend to gloss over the fact that pollution and other environmental degradation caused by manufacturing is not reflected in the cost of production unless government requires it to be.
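A minimal sketch with purely hypothetical per-unit numbers shows why an unregulated market rewards the polluter:

    # Hypothetical per-unit numbers: why the market rewards the polluter
    # when nothing forces the externality back into the price.
    base_cost = 100.0         # production cost every producer pays
    abatement = 15.0          # extra cost of controlling emissions
    pollution_damage = 25.0   # cost dumped on society (the externality)

    clean_producer = base_cost + abatement           # 115.0, borne by the producer
    dirty_producer = base_cost                       # 100.0, wins on price
    true_social_cost = base_cost + pollution_damage  # 125.0, paid by everyone else

    # The dirty producer undercuts the clean one unless regulation [or a tax]
    # pushes the 25.0 of damage back into the dirty producer's costs.
    print(clean_producer, dirty_producer, true_social_cost)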

So, when a manufacturer claims that environmental or worker safety regulations are stifling the economy, what that manufacturer really is saying is that he or she can’t compete with manufacturers in other countries that have fewer environmental regulations, and thus, often lower costs of production… and when that manufacturer demands less regulation, it is a demand to allow more pollution so that the manufacturer can make more money – or even stay in business.

Balancing economic output and worker and environmental health and safety is a trade-off. Although some regulations have been ill-thought-out, in general, stricter regulations result in a better environment for both workers and society, but if the rest of the world has lower levels, those U.S. industries competing in a global market will suffer higher costs, unless they have other cost advantages, such as better technology or far more productive workers. Because environmental control technology is expensive, most industries tend to oppose regulations requiring more technology.

In certain industries, workers, such as coal miners, often oppose environmental rules because those rules raise costs, and higher costs may result in the loss of their jobs. The question in such cases is whether continuing such jobs is worth the environmental and health damage, both to workers and to others. The Trump administration is working to remove an Obama administration rule that put stricter limits on how close to watercourses coal mining and chemical wastes could be placed, claiming that the rule will cost jobs, which it likely would to some degree. But the rule would also cut the number of coal and chemical industrial storage and waste disposal sites near rivers and streams in an effort to eliminate slurry and waste accidents such as the one along the Elk River in West Virginia in 2014 that fouled miles of streams and rivers, poisoned hundreds of people who drank the water unknowingly, and left more than 300,000 people without drinkable water for months.

History has shown convincingly, for all who are willing to look at the facts, the actual deaths, poisonings, and worse, that, without government regulation, a significant proportion, sometimes all, of the manufacturers in an industry will commit unspeakable wrongs in the search to maximize profit. Remember when the Ford Motor Company tried to cover up the faulty design of the gas tank in the Ford Pinto, deciding that it was cheaper to pay legal costs for deaths [which Ford estimated at $49 million] than to produce a more expensive gas tank, which would have cost $113 million? Ford decided against the fix on a cost-benefit basis, then ended up paying out much more in legal settlements, in addition to a costly recall.
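The calculation itself is trivially simple, which is part of the problem. Here is a minimal sketch using the dollar figures cited above:

    # The Pinto-style calculation, using the dollar figures cited in the text.
    estimated_liability = 49_000_000   # Ford's estimate of legal costs for deaths
    fix_cost = 113_000_000             # cost of producing the safer gas tank

    # The "rational" business-model choice: pay whichever number is smaller.
    decision = "skip the fix" if estimated_liability < fix_cost else "fix the tank"
    print(decision)   # -> skip the fix

    # What the estimate left out: settlements well above the projection, a costly
    # recall, and reputational damage, none of which the simple comparison priced.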

This kind of business cost-benefit analysis continues today, and that’s why the “business model” can’t be allowed to operate without oversight and regulation. The question is not whether to regulate, but how much regulation is appropriate in what circumstances. Or, put another way, is your business more important than my health? Except that business owners would say that an increase in regulations will kill their business and probably won’t measurably improve your health. Both sides are likely exaggerating, and that’s why verifiable facts – scientific, financial, and economic – are critical, and why political slogans and political pressure brought by outside interests have no place in determining whether a regulation is necessary, and if so, the degree of regulation required.