I happened to glance at a recent issue of Forbes [yes, I read both Forbes and The New York Times, not to mention The Economist, New Scientist, Scientific American, and even occasionally those left-wing publications like Sierra and Mother Jones] and ran across a poll that asked fifty billionaires to what factor(s) they most attributed their success. While a number mentioned more than one factor, the leading factor given was “discipline and hard work” (cited by 35), followed by “willingness to take risks” (24), “education and intelligence” (20), and, oh, yes, “luck” (14).
The funny thing is that I know and have known quite a few people who are intelligent, educated, disciplined, and hard-working, and out of the several hundred I’ve met well enough to make some personal observations, only two are multi-millionaires. The idea that all it takes to become a multi-millionaire, let alone a billionaire, is intelligence, drive, and persistence is in fact an American myth, and a rather damaging one at that. Now, I don’t deny that the overwhelming majority of multi-millionaires are reasonably intelligent, work hard, and persist in a disciplined fashion. They’d have to have those characteristics to succeed, but what I strenuously doubt is that any one or two factors can make someone that successful. It takes a whole constellation of factors, including but not limited to having good ideas, being in the right place at the right time, having or making the right contacts, being able to raise the necessary investment, and a certain amount of luck. And a great number of those factors are environmental, and so obvious that they’re taken for granted, such as a stable home life and decent schools while growing up. A single factor just doesn’t cut it.
The same principle applies to other situations as well. Most industrial accidents, especially major ones, aren’t the result of a single factor going wrong, but of a combination of at least two, if not more, problems. The same is definitely true of aviation accidents; even though the NTSB often cites pilot error, it’s almost always a mechanical or weather problem, or something else, combined with pilot error. Most automobile accidents involve two factors, if not more.
So why do we persist as a society in trying to identify the single factor, or the “key” factor, when life is so much more like a jigsaw puzzle, where every piece plays a part? Are we trying to make things too simple? Or is it just intellectual laziness?
I think, fittingly, there’s more than one factor involved. When successful people are asked the reason for their success, they use the opportunity to find a reason to feel good about themselves; a narrative of struggle and triumph lets them do that. It’s another version of the fundamental attribution error.
The desire for simple answers is a different thing, one that operates not only in human narratives like these but in just about anything we think about. Generally, uncertainty is unpleasant and hence avoided. Maybe it has something to do with the two thinking systems that Kahneman talks about.
I have an MS in statistics, so I should know better, but I still find myself settling for one simple answer all the time, even in domains where it ought to be obvious that a single answer won’t do.
I think the theme song of twenty-first-century living (and certainly of twenty-first-century science) is: “It’s more complicated than we thought.” Science (including the social sciences and medicine) used to be about looking for the rules that explained most of what was happening; now it is all about looking for the exceptions and the circular and cumulative causations that explain why the rules sometimes don’t work. Even a decade ago people talked about looking for “the” gene that “controlled” whatever; now we feel lucky to find something as simple as a 53-gene composite that affects response to adversity via multiple changed expressions (Fredrickson et al., PNAS 110, 13684-13689).
But we’re wired to look for heuristics that enable us to make quick decisions (probably via a multi-thousand-gene complex) that are mostly right: good enough for day-to-day living, but mostly not good enough to do better than 50:50 at predicting the outcomes of what we do.
There are also problems with Belief in a Just World, but that’s a subject for another day.
Occam’s Razor is a highly seductive idea because it quite often gives the proper (right? true?) answer. The problem is that when it is wrong, it can be very wrong (sometimes devastatingly so).
Perhaps the primary difficulty arises when Occam’s Razor is oversimplified in application. The only deficiency of the principle itself is that it provides a temptation toward that abuse.
“Plurality must never be posited without necessity” (the English version of what he actually wrote) allows for all the complexity needed to construct a useful (and effectively true, pending the discovery of further information) model, but no more than that.
As was stated, this concept applies to many topics, and the one I personally find most problematic is health statistics, along the lines of “Statistically, people who did or ate ‘X’ are far healthier than people who didn’t.” This almost always overlooks the possibility that if a person is taking time out of their day to follow one of the ‘in fashion’ health fads, they probably take other measures to improve their well-being too.
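To make that confounding concrete, here is a minimal sketch in Python (with invented numbers, purely for illustration) of how a latent “health-conscious” trait that drives both the fad and other genuinely useful habits can make the fad look beneficial even when it does nothing by itself:

import random

random.seed(42)

def simulate_person():
    # Latent trait: "health-conscious" people are more likely both to adopt
    # the fad and to keep other genuinely helpful habits (sleep, exercise, diet).
    health_conscious = random.random() < 0.5
    does_fad = random.random() < (0.7 if health_conscious else 0.1)
    other_habits = random.random() < (0.8 if health_conscious else 0.2)
    # Health depends only on the other habits; the fad itself has zero effect.
    health = 50 + (20 if other_habits else 0) + random.gauss(0, 5)
    return does_fad, health

people = [simulate_person() for _ in range(100000)]
fad = [h for f, h in people if f]
no_fad = [h for f, h in people if not f]
print("mean health, fad group:    %.1f" % (sum(fad) / len(fad)))
print("mean health, no-fad group: %.1f" % (sum(no_fad) / len(no_fad)))
# The fad group comes out visibly "healthier" even though the fad does
# nothing; the entire gap comes from the confounding trait.

The same pattern shows up in observational health studies generally: unless the comparison controls for who chooses to do ‘X’ in the first place, the statistic tells you more about the people than about ‘X’.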