As it happens, most of the events upon which this book is based either happened in my lifetime or are well-known historical events, such as Chamberlain meeting Hitler. Thus, I have some perspective on these issues. All in all, this is a hard book to summarize, in part because it is basically clinical psychology, which is a soft science at the best of times.
The book wraps its arguments around several well-known cases and events. It opens and closes with Sandra Bland, the young black woman who was pulled over in Texas (she had driven down from Chicago) for "failing to signal", ended up being manhandled out of her car, cuffed and arrested, and then hanged herself in jail.
The first case described in detail concerns a high-ranking CIA official who turned out to be a Cuban spy. The second is Chamberlain meeting Hitler. The basic message is that we suck at spotting liars. Ample evidence is provided that a computer evaluating parole decisions from raw data scores much better results than a judge who can look the perp in the eye and judge their demeanor. This does not surprise me one bit. As a lifelong poker player, I have determined that I suck at it too, and that I would be better off simply following the odds.
We rely way too much on our ability to tell when we are being lied to. And once we make the decision to trust, we become easier marks, because we will defend the indefensible far too long. We are too trusting; we "default to the truth". I think a philosopher would call this the Principle of Charity. It is, as the book acknowledges, both a very risky thing to do and a very necessary one: it is the grease that keeps society running.
In this vein, we tend to think of others as simple and ourselves as nuanced. E.g.: a cop might see a nuanced response to stimulus from you as proof that you are prevaricating. This is exactly what happened to Amanda Knox (another case study in the book). She reacted in a way the prosecutors saw as irrational, and therefore she was guilty, despite the fact that the physical evidence pointed elsewhere. The case went all the way to the Italian Supreme Court before it was finally tossed out for the rubbish it was. What reaction set the cops off? Basically, she was cool, calm, collected and quiet. If she had cried like a girl, or some such, she would have walked.
The book describes these encounters as meetings of two "mismatched" people. We might say "talking or acting at cross purposes".
Another case with which I was very familiar is Harry Markopolos and his takedown of Bernie Madoff. The SEC "defaulted to truth" and believed Madoff. Markopolos is unusual in that he does not "default to truth", but he has paid a price: paranoia.
"Transparency is a myth." This line is repeated and emphasized and I agree with it. Transparency is the idea that if we can see what is going on on the outside, we can tell what is going on on the inside. On TV, we see people say and do things, and their faces and their words match exactly. This is called emoting. In real life it is rare. We do not wear ourselves on our sleeves. We are not transparent. But we tend to think others are.
Another interesting point is that the facial expressions and other tells that we think are universal are not. A smile is not just a smile. It depends on the culture you are in.
The book examines the Sandusky pedophile case; a well-known fraternity sexual assault involving loads of alcohol; the KSM Guantanamo Bay torture story; and several others. It also delves into policing a fair bit. The bottom line of the policing analysis is that cops are incredibly reluctant to give up their supposed god-like ability to see into men's souls and pick out the good guys and the bad guys. When science gets involved and shows them incapable of doing what they think they can do, then and only then do they, ever so slowly, recognize the truth.
Which leads back to the opening story of Sandra Bland.
I read this book to the end because it did keep my interest. My overall feeling is that it is far too broad, and overly simplistic in its analysis. I am always skeptical when people are broken down into groups too broad to be useful. The book does not really give advice on what to do about any of it.
Chamberlain was played by Hitler, who was really good at it. For some people, lying is their first option, and they do it with remarkable ease. Hitler, Bernie Madoff and my ex-wife have these traits in common. I "defaulted to truth" with her, as I do with most people. But I am a skeptic, and I would love to know how I can tell whether my listener and I are "mismatched".
Other cases are also explainable without appealing to the "mismatched" perspective of the book. Bernie's success turned on greed, and he was an expert con man. And Sandra Bland can be explained by bad policing policy (pull over everyone you can, give out as many tickets as you can, dig for a reason to doubt "them", abuse your authority) and bigoted, poorly trained cops. IMHO, Bland was pulled over for being black. When she got uppity, the cop lost his sense of perspective, she died, and, thankfully, the asshole cop was fired.
The ideas in this book are useful, but I see very little in the way of practical advice other than the skeptic's mantra: doubt is the handmaiden of truth, and above all else, doubt yourself and your ability to detect lies.
Chamberlain had the facts… he was just a pussy who did not doubt his ability to read people.
George W. Bush made the same mistake with Putin (remember "I looked the man in the eye"?), and Donald Trump, the stupidest person ever to occupy the White House, is doing it again.
The pages of this book are small and the leading loose, so it is a fast read, and I did read it. Perhaps it is my disdain for clinical psychology (which has done a lot of damage over the years), but I cannot recommend this book unless you are a student of the subject.
When pondering disaster relief, it is helpful to know what natural disasters we should prepare for.
War is hardly natural, and we seem to always be on the brink of it somewhere in the world, so I ignore it.
There are two major issues to deal with: how big might it be, and what are the chances of it happening? A crude breakdown of “how big” is Local vs Global.
Here is the list off the top of my head (Disaster; Size, Probability):
Only “Pandemic” stands out. It has the potential to go global; and since it has happened in the past (the Spanish Flu, Swine Flu, SARS) and, due largely to air travel, is even more likely to happen in the future, the probability is high.
Covid-19 was not bad luck; it was inevitable. A moment's thought, which is about as much as I put into this piece, will convince anyone that the smart money should go to the prevention of pandemics.
Therefore, ergo, ipso facto, and thus, Donald Trump disbanded the pandemic teams, cut the budget for the CDC, and tried to build a wall to keep out Mexicans. He has turned recklessness into an art form. Of course, he did not create Covid-19, but he has made it much worse, and is still doing so.
I always want to end on a happy thought. This can be fixed.
1) Reduce the number of people on the planet by 3 billion at least;
2) Reduce the number of Donald Trumps by one (ideally, remove the whole family tree); and
3) Stop treating every square inch of the planet as a tourist destination.
The following paper was written by my friend Dale Beyerstein for the journal Humanist Perspectives (https://www.humanistperspectives.org/issue211/index.html).
People who debunk false or nonsensical claims sometimes tend to specialise – e.g., concentrating on pseudohistorical claims such as the 9/11 conspiracy, paranormal claims such as those involving extraterrestrial UFOs, alternative health claims such as the canard that the MMR vaccine causes autism, or the claims of particular religions. This is understandable, because the one thing that unites modern-day critics of these intellectual travesties is their commitment to thoroughness in researching and analysing claims before accepting or debunking them. Of course, this is not to say that skeptics have always lived up to this ideal. But when one skeptic does not, fellow skeptics are generally just as critical of that lapse as they are towards any paranormalist.
But despite this tendency towards depth of knowledge, many skeptics also show a wide breadth. Many are quite well versed in what’s wrong with many different pseudosciences – from different versions of alternative or complementary medicine to astrology to UFOs. In addition, some are well versed in other areas as well, such as religion, politics, or anything you may wish to discuss around the watercooler. As well as being able to discuss these topics, they can also give a coherent account of scientific reasoning and critical thinking. Is this because these people are polymaths? Well, some are, but I think the main reason has to do with their curiosity.
This is not the way many people view skeptics. Many see skeptics as, in the words of U.S. President Richard Nixon’s Vice President Spiro Agnew, “nattering nabobs of negativism.” This is because skeptics reject quite a few popularly held beliefs. But skeptics hold a number of beliefs, albeit provisionally. Sit down with a group of people that includes at least one skeptic, and you will find a genuine interest in ideas. More important, you will hear a lot of “Why is that?”, “What’s the evidence for that?”, or “But what about…”. Skeptics are curious, but they practise a special type of curiosity, which I shall call guided curiosity.
What am I on about here? Well, everyone holds that curiosity is a good thing; but if you think about it, unbridled curiosity can actually inhibit understanding. Take conspiracy theorists, for example. Being curious about how every girder split from its mountings in the World Trade Center on 9/11 wouldn’t seem to be a bad thing. However, when you think about it, it’s obvious that such detailed knowledge about an event like this just isn’t going to be available. Who would be wandering around the buildings as they were collapsing to gather it? And if anyone did, would they have survived long enough to tell us? And it begs the question about a conspiracy to assume that anyone was figuring these things out in advance in order to plan the attack. It’s obvious that flying a plane into an iconic building will cause chaos; the perpetrators didn’t need to know exactly how much, or exactly how it would happen. So why should we be worried that we lack that information? But this lack of information is what starts some conspiracy theorists down the rabbit hole. Why don’t we have it? Who is hiding it? Why? Religious believers fall into the same trap. What happened before the Big Bang? Obviously, since we have no answer at present, we must conclude that there must have been a God to cause it. Asking questions that admit of no answers, or formulating them improperly in a way that prevents a sensible answer from being given, is a surefire method for generating false, and sometimes ridiculous, beliefs. So there need to be limits on our curiosity.
I’m not suggesting limiting the range of one’s curiosity to what is ‘practical’, or of immediate interest, or to easily answerable questions. Rather, the point of limiting curiosity is this: the wider one draws the curiosity net, the more information one receives, everything else being equal. But one will also pick up more flotsam and jetsam. So the point is to maximise knowledge, or at least justified belief, while minimising the amount of nonsense or outright falsehood.
The method which does the most to achieve this is best stated by the eighteenth-century philosopher David Hume: “A wise man, therefore, proportions his belief to the evidence” (Hume 1955). Let’s refer to this as Hume’s Rule. Sexism aside, the rule states that the stronger the evidence, the more confidence one should have in one’s belief; and the opposite holds as well: less evidence should result in a weaker belief. A corollary of this principle is that one should not hold a belief at all until one has examined the evidence. Following Hume’s Rule is the basis of guided curiosity. In what follows I shall give some examples from religion and paranormal belief to show how guided curiosity keeps us from falling for nonsense.
The consequences for religious belief of Hume’s Rule are readily apparent. Very few religious believers, when pressed, will hold that the evidence for religious belief is very strong. This is where the argument typically takes a turn: to hold that there is, after all, an exception to Hume’s Rule, which applies only to religion. Religious belief is grounded on faith, not reason or evidence; and Hume’s Rule applies only to beliefs based on evidence. Faith is, of course, precisely belief in the absence of evidence, and is, according to the religionist, a virtue which not only elevates the religionist who possesses it, but shows the simple-mindedness and shallowness of thought and character of the atheist or agnostic who lacks it, and instead asks for evidence.
By the time the atheist has defended her character from the above charge, there probably won’t be much time left to return to the question why faith is only appropriate for religious claims. After all, faith has a companion in mundane affairs, gullibility – which is also belief in the absence of evidence – which is decidedly not considered a virtue in those who invested in Bernie Madoff’s Ponzi scheme. But if you do find the time to pursue this point with the religionist, it’s very unlikely that you will receive an informed answer. Instead you will probably be told that this comparison is insulting, and the debate will end there. Perhaps it is; but the fact that a person may be insulted by being told that he has a long nose doesn’t by itself prove that the claim is wrong.
Hume’s Rule has another important implication for religion, atheism and agnosticism. To see it, let’s introduce just a bit of probability theory. Since evidence is what makes a belief probable, a claim whose probability on the evidence is less than .5 (or 50%) should be disbelieved. This is because the probability of a belief p and the probability of its denial (Not-p) must add up to 1. The sentence “It will either rain or not rain on my house today” has a probability of 1 – in other words, it expresses a certainty (the Law of the Excluded Middle in logic). According to the app on my phone, the probability that it will rain here today is 40%, or .4. When a belief p has a probability of .4, its denial Not-p has a probability of .6 (1 minus .4). So I should believe that it won’t rain today – but, applying Hume’s Rule, my belief shouldn’t be very strong, and I should be prepared to change it. Ditto my belief in a god, except that I assign the probability of there being a god to be much lower.
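Stated compactly, using the essay’s own rain example (a symbolic restatement, nothing added):

$$P(p) + P(\lnot p) = 1, \qquad P(\text{rain}) = 0.4 \;\Rightarrow\; P(\lnot\text{rain}) = 1 - 0.4 = 0.6 > 0.5,$$

so disbelief in rain is the rational, if weakly held, position.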
A claim with a probability of exactly .5 should be neither believed nor disbelieved. Thus, agnostics must be holding that the evidence for belief in a god is just as compelling as that for disbelief, or, in other words, a probability of .5 for each. A probability of less than .5 is grounds for atheism, for the reason just given. But most people who call themselves agnostics do not really believe that the probabilities are equal. They concede that the probability that there is a god is less than that of the belief that there isn’t one, but they stick to the claim that we cannot be sure that a god doesn’t exist. But this is just to miss the point of one of the basic axioms of probability theory given above. It is important to remember that the denial of the existence of something does not require evidence that the probability of its existence is 0. Most agnostics have no difficulty in dismissing the existence of Santa Claus or the Tooth Fairy, despite not having checked every claim of where Christmas presents or quarters under the pillow came from, and therefore not being in the position to say that the probability of their existence is 0.
It might be thought that the agnostic has an answer to this. Given that God is supposed to be transcendent, completely outside the realm of human experience, no evidence is possible for belief or disbelief, because there is no evidence at all. With no evidence leading us in either direction, suspension of belief (agnosticism) seems to be the only reasonable position. But this rebuttal isn’t conclusive.
The best reason for suspending belief is that we are awaiting further evidence that might require us to change our minds. Now let’s return to the agnostic’s strong point that we are considering the transcendent, for which no empirical evidence can be found. If this is so, then there would be no reason to suspend belief pending further evidence – the supposition is precisely that there won’t be any. Now add this to another corollary of Hume’s Rule: the onus of proof is always on the person who puts the idea forward. When a claim is presented without any evidence to support it, Hume’s wise person would disbelieve it. This is because there are always more ways of getting something wrong than of getting it right. Take, for example, guessing the day of the week on which a total stranger was born. If you guess Thursday, you have one chance in seven of getting it right, and the smart money will be on you getting it wrong. So in the debate between the atheist and the agnostic, where both agree that there is no evidence available about the transcendent (literally the world for which no empirical evidence is possible), the onus of proof is on the believer, and the believer has none. Therefore, the claim should be disbelieved, and the atheist wins by default. In addition, remember that the theist claims to believe in some god or another, one with certain properties (even if she admits that there is no empirical evidence for those properties). But now go to a second theist, who believes in another god with somewhat different properties. Both of these theists implicitly recognise the onus of proof, since they apply it to each other: the first will deny the second’s god on the grounds that the evidence is insufficient, and vice versa. Repeat this a few thousand times, and we have the point made so well by Richard Dawkins (2006): “We are all atheists about most of the gods that humanity has ever believed in. Some of us just go one god further.”
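The birthday guess, put in the same probabilistic terms as above (again just a restatement of the essay’s numbers):

$$P(\text{Thursday}) = \tfrac{1}{7} \approx 0.14 < 0.5, \qquad P(\lnot\text{Thursday}) = \tfrac{6}{7} \approx 0.86,$$

so, by Hume’s Rule, the unsupported guess should be disbelieved; the onus is on the guesser to supply evidence.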
Pseudoscientific claims fudge the onus of proof too. In fact, they do so with such regularity that we might consider this error one of the defining characteristics of pseudoscience. Take conspiracy theories, for example. Why is all the evidence about how “they” killed Kennedy, or placed the explosives in the World Trade Center towers to supplement the work of the airplanes, completely hidden? That it is missing is just the evidence that skeptics are supposedly too dense to see; its absence shows how clever and powerful “they” are that “they” can hide it so well. You can diagnose the informal fallacy involved here as failing to respect the onus of proof; or you can equally well call it begging the question (appealing to the very claim you are attempting to prove as evidence for that claim). Or you can call it the argument from ignorance, which involves saying that because you cannot disprove my claim, I must be right – whether or not I have presented any evidence. But the main point is that they have managed the impossible feat of creating something out of nothing. This, by the way, is the thing about informal fallacies: they are like cockroaches, in that if you spot one you can be assured that there are a bunch more lurking where you can’t see them.
A corollary of Hume’s Rule is the requirement that we search not only for the evidence that supports our belief, but for that which goes against it. Looking only for supporting evidence is confirmation bias. Religious, conspiracy-theory and paranormal believers, on the other hand, are notorious for this kind of cherry picking. The Christian apologist, searching for miracles, concentrates on the one little baby that survives the plane crash, but ignores the 200 others who perish. The believer in her own telepathic powers zeroes in on the few times she ‘knows’ what her friend will say next, while remaining blindly oblivious to the many more times she guesses wrong. The 9/11 conspiracy theorist pays attention to any problem, no matter how insignificant, in the received account of how the towers collapsed, while not being troubled at all by the fact that there is no evidence whatsoever of anyone planting any explosives in the buildings beforehand. The believer in any sort of complementary or alternative medicine will keep track of every cold that cures itself after taking echinacea while ignoring the ones that cure themselves without it. The graphologist (handwriting analyst) keeps track of every case where a handwriting sample shows large loops on the descenders of fs, gs, js, ps, qs and zs and its writer has a better than average libido, and ignores those with large libidos but with handwriting that cannot be described this way, as well as those whose handwriting has thin or small descenders but who are nevertheless quite libidinous. (If you think I’m making this up, check Paterson (1980: 11), and don’t ask how she measured libidinousness.)
Let us look at one more rule of guided curiosity. It’s not only important to pick up new information; it’s also essential to compare that new information with what you already have. Is the new information consistent with what you already believe? If not, you have work to do to reconcile these beliefs. Perhaps you will have to reject the new information; perhaps you will have to modify it to a certain degree. Or maybe you will have to do one or both of these things to your present beliefs to achieve a fit. This shuffling process will always be with us, as long as we are gaining new information. (For a better account of this process, see Quine and Ullian (1978).) If the new observation is consistent with our old beliefs, then you can ask how it enhances your understanding of what you already know. What implications are there from the new belief to other hitherto unsuspected claims? Do these implications suggest further claims which can be tested?

An example will show what I mean. Therapeutic Touch (TT) is a healing modality in which practitioners manipulate the “energy field” that surrounds our body without actually touching the body itself. (This has led me to describe TT as “neither therapy nor touch.”) One shape of the field is healthy; others make us sick. The crucial thing to note about TT theory is that practitioners work on just this energy field, not on the body underneath it. So, let’s take these claims at face value. If healers can manipulate the field, they must be able to discern its presence somehow, without simply inferring it from the presence of a body. But if we could perceive its presence through sight, smell, touch or hearing (probably taste isn’t an option here), then everyone should be aware of it. We are not; only TT practitioners are. Well, that must be because there is another sensory mechanism which not all of us have – only those who would make good therapeutic touch practitioners have it. The inference from the claim that the energy field can be worked with to the claim that practitioners must be able to recognise its presence is not one that is often made by TT believers. But it was made by a nine-year-old from Loveland, Colorado, Emily Rosa. And though no TT practitioner had thought of doing this, the young Ms Rosa thought about how to test the claim. For her science project she set up a solid barrier dividing a table in half, with just enough room for a TT practitioner to pass her hand underneath; whether the experimenter’s hand was held beneath it was determined by a randomizer, and the practitioner had to say whether it was there. If the TT practitioner could do better than chance on a test designed to rule out the other sensory modalities, this would be evidence of an energy field that some gifted people could detect. Needless to say, Ms Rosa’s experiment did not confirm this hypothesis, but it did lead to her becoming the youngest person ever to be published in a top-rank medical journal (Rosa et al. 1998).
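For the curious, here is a minimal sketch in Python of the logic of such a test; the function name, trial count and hit rate are my illustrative choices, not figures from the Rosa paper:

```python
import random

def simulate_tt_test(n_trials=280, p_correct=0.5, seed=1):
    """Simulate a blinded Therapeutic Touch test: on each trial a
    randomizer decides whether the experimenter's hand is present,
    and the practitioner says yes or no. p_correct is the
    practitioner's true hit rate: 0.5 models pure guessing (no
    detectable energy field); a genuine ability would exceed 0.5."""
    rng = random.Random(seed)
    hits = sum(rng.random() < p_correct for _ in range(n_trials))
    return hits / n_trials

# A chance-level practitioner hovers around 50% over many trials,
# which is roughly what the published experiment reported.
print(simulate_tt_test())
```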
There is another important implication of Hume’s Rule. The rule tells us that belief should be based on the preponderance of the evidence, or on the probability that the evidence confers on the belief. But the evidence for or against a belief is continually shifting as more of it becomes available, and with it the probabilities fluctuate. Remembering this is how the skeptic following guided curiosity avoids dogmatism, even when she has a fairly strongly held belief. She is always ready to modify her beliefs, and in some cases to switch from belief to disbelief or vice versa, as new evidence comes in. It is also why a skeptic should not only state her beliefs, but state them along with her degree of confidence – her estimate of the likelihood that more evidence will require her to revise or abandon them. Or, better yet, always be prepared to state the belief along with the evidence for it. With these qualifications, there is no harm in provisionally stating a belief with a probability not much higher than .5, or a disbelief even when the probability is a bit less than .5. Taking a belief seriously confers the benefit that, once it is stated along with the evidence for it, it can be examined and implications drawn from it, which in turn can lead to new understanding. Dismissing it because the probability is not much above .5 forgoes this possibility. The important thing to remember about dogmatism is that what is wrong with it is not the forceful stating of a belief, but concentrating on the belief rather than the evidence for it, or the unwillingness to budge from it when new evidence comes to light.
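One standard way to make “proportioning belief to the evidence” concrete is Bayes’ rule – an editorial gloss, not machinery the essay itself invokes. A minimal sketch in Python, with a hypothetical function name and made-up numbers:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | E): the revised probability of hypothesis H after
    seeing evidence E. One formalisation of Hume's 'proportion your
    belief to the evidence' (a gloss, not the essay's own example)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# A weakly held belief (prior 0.55) meets evidence three times more
# likely under the rival hypothesis; the holder should switch sides.
print(round(bayes_update(0.55, 0.2, 0.6), 2))  # ~0.29, now below 0.5
```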
Some skeptics will be disappointed that I have gone all this way without mentioning one of the cardinal principles of skepticism, that there are times – quite a lot of them, actually – when you shouldn’t express a belief at all; you should straightforwardly admit that you do not know. There are two advantages to this admission. The first is that it serves as a stimulus to curiosity: having admitted that you don’t already know gives you a good reason to try to find out. Second, it prevents you from misleading others (and yourself). When they think you know and you don’t, they might follow you when they shouldn’t.
There are two situations where one should say that one doesn’t know. The first is when this expression is simply a substitute for “I don’t care.” It is tantamount to admitting that you have very little evidence and are not prepared to gather any more. For example, just by reading the headlines and deciding that I had no interest in the articles beneath them, I couldn’t help finding out that Prince Harry, the Duke of Sussex, and his wife, Meghan, recently had a son, whom they named “Archie”. But his birthweight? I don’t know – meaning, in this case, that I don’t care. After all, one cannot expect to have time to look into everything; one must prioritise.
The second situation is where the evidence you have at present is about equally compelling for belief and disbelief, and it is possible to get more. In this situation it makes sense to suspend belief and wait for further evidence. But there’s an important exception here: sometimes waiting isn’t a viable option; circumstances require an immediate response. Fortunately, these special cases are quite rare; so withholding judgement while awaiting more evidence is an available option, and a good one.
Otto Neurath (1921) compared our belief system to a leaky ship at sea. We are continually replacing rotten planks with fresh ones, but we are never able to replace the whole bottom at once, given that we wish to remain afloat. Thus we will never have a perfect set of beliefs; there will always be some false ones in there that we haven’t found yet. The best we can hope for is gradual improvement. To continue his metaphor, when we find a particularly strong plank that doesn’t fit very well with the old ones already in place, going to all the work of making it fit may result in a much less leaky boat overall. Similarly, encountering a new belief that is inconsistent with some old ones, but that has a lot of evidence backing it up, may require the modification of several old beliefs at once. The result may be a more coherent belief system overall – but not a perfect one. Non-skeptics may find this disconcerting. They are like sailors who aren’t in it for the pleasure, but just want to get somewhere – anywhere – and go along for the ride. But the true skeptic enjoys the sailing for its own sake.
Dawkins, Richard 2006: The God Delusion. Boston: Houghton Mifflin Company.
Hume, David 1955: “Of Miracles”, in An Inquiry Concerning Human Understanding. Indianapolis: The Bobbs-Merrill Company Inc.
Paterson, Jane 1980: Know Yourself Through Your Handwriting. Montreal: Reader’s Digest Association.
Quine, W.V.O. and J.S. Ullian 1978: The Web of Belief. New York: Random House.
Rosa, L., E. Rosa, L. Sarner and S. Barrett 1998: “A Close Look at Therapeutic Touch”. Journal of the American Medical Association 279(13): 1005–1010.
Neurath, Otto 1921: Anti-Spengler. Munich: G.D.W. Callwey.
Gwynne Dyer is a Canadian war researcher who wrote this book in 1985. From what I can gather, it is well respected. This edition has been revised and updated.
The book tours war historically, examining its roots and its evolution over the millennia.
The opening comments discuss some big-picture history. For example, major battles prior to the 20th century would generate casualty rates as high as 50-60%, with an average of around 20%. In modern battles, this figure rarely exceeds 1%. The change was due largely to technology (weapons got better), which in turn changed the nature of soldiers and their mission. In the old days, soldiers did not get combat fatigue (shell shock; today, PTSD) because they would die before it could ever happen. Today, all armies recognize that troops can only take so many combat days (in WWII, the figure was 240) before they fall apart.
Dyer has always impressed me, especially with regard to basic training, which he discusses in a chapter called "Anybody's Son Will Do". Get them when they are young. Break them down to the same level, making them all equals, and then build up a small-group dynamic in which each man relies on the others to help him stay alive. Drill sergeants are masters of psychological manipulation, and they know it.
In his opening chapter, Dyer makes this startling revelation: in the European theatre of WWII, only 15% of combat infantry riflemen ever fired their weapon in anger! Most soldiers would never have revealed such a fact until they were told they were not at all alone. More than four fifths of combat soldiers got through the war without killing anyone, and without firing a shot! More research has backed this up. Gettysburg was a hugely costly battle for Americans because there were Americans on both sides. More than half of the muskets recovered from the battlefield were loaded with more than one round, and only 5% were ready to be fired when they were abandoned or dropped. Six thousand had as many as 10 rounds in the barrel. In other words, many of the fighters were spending their time reloading an already loaded gun and, one assumes, ducking. No one is suggesting these men were cowards. Many were simply principled and did not want to kill.
The upshot of this is startling: if you could get the non-firers to enter the fray, the other side wouldn't stand a chance. And that is what happened. New training got the numbers up to 50% in Korea and 80% in Vietnam. The US had figured out how to get men to kill automatically. Dehumanizing the enemy was a big part of it.
War goes back a long way. Chimps make war on each other, and so do we. We did not invent war; we inherited it. Dyer examines hunter-gatherer groups and their interactions; ritualized warfare in groups like the New Guinea highlanders; and other evidence of our more primitive past. Some suggest that ritualized warfare is not the "real deal". It does generate casualties at a low rate, but that meant it could be done often, which in turn meant high casualty rates over time. Hobbes, Rousseau and Darwin are invoked and examined for their viewpoints.
The birth of war was driven by human lifestyle choices. Hunting and gathering can keep you alive. So can farming. But a new way of living had emerged: pastoralism (i.e., nomads living by herding domesticated animals). Inevitably, these groups clashed. The nomads would win fights because they were mobile, which meant they could concentrate their forces when needed. To combat this, walls were built around towns. When horses were domesticated, things accelerated. It is worth noting that Egypt was largely spared from nomadic attacks by its geography.
The Sumerians hit upon the idea that religion could be a better way of settling disputes. It worked for a while. Priests liked it because it gave them power. But as we know, religion is not a cure for war, but more often an excuse or a direct cause.
Aside: Women were equal partners in life until civilization and agriculture came along, driving the need for a power structure which eschewed women because it could.
Sargon was the world's first emperor over a militarized society (circa 2300 BC). Around then came a major technological innovation: the composite bow. The bow and arrow has killed more people than any other weapon, ever. Unlike the English longbow (yet to be invented), the composite bow was short, powerful, and could be fired from horseback.
Other inventions over time include the chariot (a fast, hit-and-run archery platform); the pike and phalanx; war galleys; improved, harder metals; bigger horses; the saddle and stirrup (700 AD); gunpowder (1300 AD); and organized armies (which gave the powers-that-be the willies).
Aside: Japan had a warrior-based culture. When the musket was introduced, the samurai were appalled: a samurai could be killed by a commoner! Unacceptable… so they simply stopped making muskets. That obviously changed later in history.
Armies got bigger, the percentage of soldiers dying went up, battles were huge but infrequent; and the average citizen was left alone.
Armies had to be trained and fed. You could not just conscript someone and throw them into battle. Standing armies were a part of every major European power from 1700 on.
What happened next was the introduction over time of the concept of "total war".
New weapons were being developed at a quick pace. It was soon discovered that a few men behind cover with guns could stop a large number of advancing troops. Tactics changed. WWI introduced the concept of the continuous front. The tank was used for the first time, giving professional soldiers hope that it might end wars of attrition. It did, but it also made the continuous front mobile. This, along with aircraft, took a terrible toll on civilians. The tank eventually spawned "blitzkrieg", or mobile "lightning" war. The world wars were the first in which civilian deaths outnumbered the deaths of combatants.
The next chapter in the book documents the history of nukes. There have been a lot of ideas over the years on this subject, but the prevailing idea is that you only need enough to destroy the other guy. This was known as MAD (Mutually Assured Destruction). And so far, this math has kept the nuclear peace for 75 years. The advent of nukes gave rise to the phrase "conventional war".
The birth of the professional army came in Prussia in 1803 with the creation of the Kriegsakademie. The approach of using well-trained professional soldiers was soon adopted by others. One hundred German soldiers in WWII were a match for 125 British or 250 Russian troops. Why? Because the Germans had ten times as many good generals, who knew how to get the most from men and equipment. The old game is now protected turf, and generals everywhere want to keep it that way (the start of the military-industrial complex?). Technology has changed the equation a great deal. One example: a Spitfire in WWII cost 5,000 pounds to make. The supersonic Tornado, its third-generation successor, cost 17 million pounds: 172 times more after adjusting for inflation.
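A rough check of that figure (the roughly 20x inflation factor is implied by the book's numbers, not stated):

$$\frac{17{,}000{,}000}{5{,}000} = 3{,}400 \ \text{(nominal ratio)}, \qquad \frac{3{,}400}{172} \approx 19.8 \ \text{(implied inflation factor)}$$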
The last chapter speculates about the future of war:
Baboons are nasty creatures. The males are obsessed with status and fight all the time. But in one troop something unusual happened: the aggressive males all died off at once from eating infected meat. Overnight, the troop settled into a much more egalitarian society, and it stayed that way even after the demographics of the troop returned to normal! The ritualized warfare of the New Guinea highlanders killed a large percentage of combatants, one at a time, year after year. The government stepped in and told them this had to stop. They all agreed enthusiastically and never looked back! It seems both groups were caught in a local stability point that they could not get out of without a nudge. Perhaps there is hope in those two stories.
While not as eye-opening a book as Guns, Germs and Steel, this book is a must-read for anyone who wishes to grapple with these thorny issues. War has been with us for all of our history. But perhaps we can first ritualize it, and then dump it as a bad idea. I doubt it. The New Guinea highlanders were offered an alternative to their wars: essentially, third-party arbitration. That is something the top dogs will never agree to.
One final tidbit: have you ever wondered whether you were safer in combat as an officer or as an enlisted soldier? In WWII, the answer was "soldier". In Vietnam, it did not matter.
Dyer himself has served, so he has seen some of this from the inside.
Lee Moller is a life-long skeptic and atheist and the author of The God Con.