Law

What’s the point of law? Or law and order?

I’d say that it’s to provide a common set of rules that everyone in a society can understand and accept, ideally accepting them as providing a degree of fairness. Others have, or might have, a different concept – law as a hard and fast rule that defines good and evil in terms similar to their theological beliefs – and still others might feel that law is a tool for the elites of a society to control those beneath them. Some lawyers, I know, believe that the law is a tool they use in attempting to obtain justice, meaning a “fair” outcome for their clients, but, of course, what is “fair” always depends on individual viewpoints. From a technical standpoint, in the United States, a law is essentially a statement enacted by a governmental body which allows or prohibits certain acts, or imposes certain limitations on them.

And I’m certain there are other definitions of law, but why do we need laws? And why, once we have laws, do we seemingly need more and more of them?

Human societies need laws because there are always individuals who refuse to accept limitations on their acts, even when those acts harm others.

The answers to the second question are manifold. Every law has areas where it lays down absolutes. Every time an absolute is codified into law, it creates situations where the absolute imposition of that law is unfair and unjust, or perceived as such. And someone often wants to remove that unfairness, which requires another law. In addition, every law excludes as well as includes, and people want to “clarify” the law to ensure that something heretofore excluded gets included. Add to that the fact that certain groups want certain laws for their own benefit.

When people who share the same culture enact laws, they see those laws similarly among themselves and differently from people who come from a different culture or economic class. That’s one reason why more egalitarian and homogeneous societies tend to have lower crime rates.

In addition, equal penalties or “requirements” under law have differing impacts on people from differing social and/or economic strata.

The entire issue of so-called “voter fraud prevention” laws being pushed by affluent white Republicans in the U.S. provides a good example of this, because those laws are regarded essentially as voter suppression laws by minorities and those at lower income levels.

The difference in viewpoint comes from the difference in situation. For me, a photo ID isn’t a problem. It’s a slight hassle at most, a few hours once every five years when I renew my driver’s license, and because I occasionally travel internationally, I have a passport as a back-up. Because I live in a moderate-sized town, it’s a ten-minute drive to the post office or the Department of Motor Vehicles, and because I was educated to the need for certain papers, I’ve always kept copies of things like birth certificates.

That’s all very easy and convenient – for me. My offspring, however, all live in large metropolitan areas where obtaining or renewing a driver’s license – or a passport – can be a lengthy affair, requiring travel and time. But they’re well-off enough that they can arrange the time and deal with the costs… and they had parents who prepared and educated them to those needs.

A minority single parent working a minimum-wage job who lives in a state requiring a photo ID has a much tougher time of it. First off, most of the offices that can issue an ID are only open during working hours, and most minimum- or low-wage earners don’t have much flexibility in working hours and often have to forgo paying work to get through the process. The fees for getting such an ID also take a greater percentage of their income. Then, even before that, they may have to obtain a certified birth certificate – taking more time and money. And they’re likely renting rather than owning a home, which requires more documents to prove where they live.

And the impact of other laws falls harder on the poor. If you don’t have the money to immediately fix a broken tail-light or a faulty muffler, you risk getting a ticket, and the cost of the ticket just adds to the burden. If you can’t drive the car, you may not be able to work. What is a modest, inconvenient repair cost for a middle-class worker can literally be a disaster for a poor worker.

What so many Americans fail to realize is that “equal” laws, even assuming that they’re enforced equally, which study after study shows they’re not, fall more heavily on the poorer members of society.

In reality… the “law” isn’t the same for everyone, nor is it seen as the same by everyone… but we’d like to pretend that it is… or that it’s stacked against us – and which belief you’re likely to hold depends on where you come from… and, often, how well off you are.

Formality in F&SF

All civilizations have at least two sets of rules. The two most basic are laws and customs, and the most obvious subset of custom is manners. With the recent revival/renaissance of Jane Austen and various spin-offs, there are a number of writers who focus on manners and social etiquette, generally in such sub-genres as steampunk or Regency-style fantasies.

But all cultures, in all times and places, have unspoken codes of manners, and they’re not restricted to just attire, although at times, cultures have gone so far as to legally define and restrict what people could wear, based on their wealth and social position, through sumptuary laws, which carried significant penalties.

As one of the older practicing and producing writers, I grew up in a household where manners and custom were drilled into me. Of course, they had to be, because I was, to put it politely, socially oblivious. The majority of human beings have innate social senses. Mine were largely absent. That made little difference to my parents. I was drilled in every possible social grace and situation by my mother, while my father made certain I was more than adequate in sports, particularly those of social value, and both emphasized the importance of achievement in education. For the time, place, and setting in which I grew up, this was the norm.

What tends to get overlooked by a number of younger writers is that such an upbringing is not an aberration in human cultures, and for the majority of human history, those who have ruled and shaped society have had an upbringing that emphasized what was required to succeed. Those who were well-off but not of the elite also did their best to instill such education and manners in hopes that their offspring would have the background and manners to rise economically and socially.

At present, in the United States, the iron requirements of formality required prior to roughly the 1960s have been relaxed, or battered into scattered remnants of a once-uniform code of elite conduct, just as the former elites have been disparaged and often minimized.

This situation is not usual for cultures. More social rigidity is the norm, just as the studies of Thomas Piketty have shown that, historically, high levels of income inequality have also been the norm. Whether less rigid standards of manners and social behavior are the result of higher technology remains to be seen, but writers should consider [more carefully than many do, and no, I’m not naming names] whether the manners and social conduct of their characters match the actual culture that they’re depicting. The shepherd boy who attains power will never fit [and this almost never happens, except in fiction], except through brute power. His children might, if his wife/consort is from the elite and is in charge of their upbringing.

Also, contrary to what some believe, manners don’t reflect weakness, but are a way of displaying and reinforcing power. The decline of formal manners in the United States reflects the decline of old elite structure, and the often enforced casualness of new up-and-comers is meant as a symbol of a new elite, one problem of which is that an apparent lack of manners too easily suggests a lack of control… and a certain level of chaos and uncertainty.

In any case, any culture will have a form of mannered behavior that reinforces whatever elite governs, something that writers should consider.

Diversity… and Diversity… in Fiction?

At present in fantasy and science fiction, and, it seems to me, especially in fantasy, there’s been a great push for cultural and ethnic diversity, particularly in the last few years, at least in part as a reaction to the history of the genre, where stories and books largely focused on white male characters. That’s not to say that there haven’t been quite a number of notable exceptions that dealt with non-European ethnicities or with female characters, or even hermaphroditic characters, as in the case of Le Guin’s The Left Hand of Darkness. But the criticism that the field has been too “white male oriented” definitely has validity.

I certainly applaud works that effectively create or celebrate different racial or ethnic backgrounds, and even those that tastefully explore sexual diversity, but I’d like to sound a note of reality and caution for authors in dealing with “diversity.”

Some writers explore “diversity” by creating and exploring a culture very different from those traditionally depicted in fiction, and that can be enlightening and entertaining, but that’s very different from presenting a civilization/society which contains large numbers of people from diverse ethnicities.

First, all low-tech powerful civilizations [the kind often depicted in fantasy] have been dominated by an ethnic elite. These elites haven’t been all white, either. The Nubian culture conquered and ruled Egypt for a time, and that was definitely not a “white” culture. Most people know about the Mongol culture, and the fact that it ruled China for a time [until the Chinese absorbed the Mongols in China, which has happened more than once]. I could give a substantial list of non-Caucasian empires throughout history, but the point is that these cultures weren’t “diverse.”

They were different in ethnicity from other cultures, but there have been very few successful civilizations that embodied a great diversity in cultures. One could make the point that the United States, for all its failings, is the largest multicultural nation that has ever existed. Now, there have been empires that included different cultures, but those cultures, for the most part, were geographically distinct and united into the empire by force. About the only places where you might see diversity in any significant numbers were major port cities and the capital city.

Second, diversity in a society creates internal conflicts, sometimes on a manageable level, but if history is any indication, usually not. Even the “melting pot” United States struggles with internal ethnic diversity, and the rest of those nations with significant ethnic minority populations aren’t, for the most part, doing even as well as we are with diversity issues.

That doesn’t mean that a writer shouldn’t write about different cultures. I’m all for that – if it’s well-thought-out and consistent. In reality, however, such stable cultures will likely have a dominant ethnicity/culture, unless, of course, the author is going to explore internal ethnic conflict or has some truly “magic” potion that can solve the problems of widespread internal cultural diversity, because past experience shows that any internally diverse culture is likely to be highly fractious. And that’s something that both writers… and almost everybody else… tend to ignore.

The Multiplier Tool or… Not So Fast…

Technology by itself, contrary to popular beliefs, is neither good nor evil. It is a tool. More precisely, it is a multiplier tool. Technology multiplies what human beings can do. It multiplies the output from factories and farms. It also multiplies the killing power of the individual soldier or assassin. Fertilizers multiply grain and crop yields. Runoff of excess fertilizers ends up multiplying ocean algae blooms and making areas of oceans inhospitable to most life.

Modern social media makes social contacts and communication more widespread and possible than ever before. Paradoxically, it also multiplies loneliness and isolation. As recent events show, this communication system multiplies the spread of information, and, paradoxically, through belief-generated misinformation and “false news” multiplies the spread of ignorance. Use of fossil fuels has enabled great industrial and technological development, but it’s also created global warming at a rate never experienced before.

Those are general observations, but in individual cases, certain technological applications are clearly one-sided. Vaccines do far more good than harm. The harm is almost statistically undetectable, despite belief-inspired opposition. Use of biotechnology to create bioweapons benefits no one. The use of technology to turn rifles into what are effectively machine guns does far more harm than good.

The other aspect of technology is a failure of most people to understand that, with each new technology, or technological adaptation or advancement, there is both a learning curve and a sorting-out period before that technology is debugged and predictably reliable – and that period is just beginning – not ending – when the technology or application first hits the marketplace.

So… the late adopters of new technology aren’t technophobes… or old-fashioned. They’re just cautious. But one of the problems today is the feeling by too many that it’s vital to be the first to have and use new technology or new apps. Over the years I’ve seen far more problems caused by rushing to new systems and gadgets than by a deliberate reserve in adopting “the new stuff.” In addition, changing systems every time a manufacturer or systems producer upgrades wastes employees’ time and creates anger and frustration that usually outweigh the benefits of being “early adopters.” Adopted too early or unwisely, technology can also multiply frustration and inefficiency.

Add to that the continual upgrades, and it’s very possible that the “drag effect” caused by extra time spent educating employees, installing upgrades, and debugging systems either reduces productivity gains or actually decreases productivity until reliability exceeds the problems caused by the “rush to the new.”

All of which is why I’m tempted to scoff at those individuals who rush to be the first with the newest and brightest gadget. But I don’t. I just wait a while until they’ve stumbled through all the pitfalls and most of the debugging. There’s definitely a place for “early adopters.” It’s just not a place where I need to be.

Truth…

Recently, a reader made an interesting comment to the effect that what I personally believed to be true doesn’t necessarily turn out to be true for others. This is a statement that initially sounds very reasonable, and studies indicate that it’s something that most people believe.

But… it’s also incredibly deceptive and dangerous. Now, I may have been correct, or I may have been incorrect. I may have had my facts wrong, or perhaps they were right. But the idea that correctness, accuracy, or “the truth” of something varies from individual to individual, depending on individual perception, is a very dangerous proposition.

Part of the reason why that proposition is dangerous is the use of the word “truth.” The word “truth” embodies a connotation of moral purity and certainty on the part of the individual defining that truth. On the other hand, facts are. How they’re perceived by individuals obviously varies, and different individuals give different weight to the same set of facts. Different individuals cite different sets of facts in support or opposition to policies, proposals, books, laws, or in other settings. But the bottom line should always be based on whether the facts are indeed accurate, and whether they apply to the situation at hand, not upon my beliefs about them or someone else’s beliefs about them.

It appears to me that today we’ve gotten into a societal mindset that places what we feel about anything far above determining what is accurate, what is actually so, and what is not. As feeling beings, we have always had this tendency, but one of the great drivers of the advancement of human civilization has been the effort to determine verifiable facts and workable scientific theories based on replicable experiments and solid evidence, as opposed to belief based on what cannot be determined to be accurate.

Yes, scientists and true empiricists have beliefs, but they try [and sometimes fail] to base those beliefs on hard evidence.

I’m not dismissing the importance of belief. Every human being needs things or ideals in which to believe, but the idea that what is “true” for one individual is not for another puts individual perception above accuracy and tends to support the idea that each set of beliefs is as valid as any other set of beliefs, when time and history and science have shown that “truth” resides far more often in what can be accurately determined and verified than in what cannot.

Despite the fact that in the third century BCE the Greek astronomer Aristarchus of Samos had proposed that the Earth revolved around the sun, more than 1500 years later the Christian Church was burning as heretics those who stated that the Earth was not the center of the universe and that it revolved around the sun. The “moral certainty” of faith trumped the facts, at least until science advanced to the point where the proof was irrefutable.

We’ve now reached a point where individuals realize that they must have at least some facts to support the “truth” of their beliefs… and in the welter of “information” that surrounds us, too many individuals pick inaccurate or inapplicable facts in order to support those beliefs.

The idea that the truth of a belief varies from individual to individual is actually an accurate statement of a dangerous proposition – that “individual truth” is superior to verified evidence and facts. In fact, the converse should be what we all strive for: that verified evidence and facts support our beliefs, rather than our beliefs forcing us to find facts to support them.

Yet recent study after recent study shows that the majority of people tailor their facts to support their beliefs, rather than using verifiable facts to change their beliefs. Will we revert to faith over facts, as did the Christian Church of the 1500s? Given what I’ve seen over the last few years, it’s anything but an unreasonable question.