This book is in fine company. It is primarily a book about skepticism. The author and I have a little in common, in that we were or are members of a skeptics group. Prothero, a PhD geologist, is with the Pasadena Skeptics. I note that he is a PhD because he warns of books written by people who flaunt their PhDs.
The book covers geology-related subjects that are a decent subset of all the crazy ideas that are out there. Young Earthers are trashed, as are flat Earthers, hawkers of crystals, Atlantis, dowsers, and moon-landing deniers.
Aside: Andy Kaufman died because he rejected modern medicine and relied instead on crystal healing.
It was a quick read and a good addition to my library on subjects (like Ley lines) that I would otherwise have to research.
One thing that comes across very clearly is that scientific illiteracy in the US is driven largely by the cesspool of the internet. In fact, by my count, he called the internet a "cesspool" four times. Ironically, the internet was created to serve scientists and promote data exchange. He speaks highly of, and quotes often, Carl Sagan. As a long-time skeptic myself, I am familiar with the arguments about scientific literacy, basic logical arguments, human biases, and such.
I only know one person personally who is foolish enough to posit a 10,000-year-old (or younger) Earth. I have had several exchanges with him over the years. One argument that gets repeated a lot is that his belief in god is no different than my belief in Newton's gravity and other scientific ideas like evolution. I often reply to this attack by explaining that I use the word "believe" in a different way (based on probability) than he does. The book suggests a different language, the gist of which is below:
Science has only one "belief"… namely, that the world is understandable. I do not "believe" in Newton's law, but rather I accept it, based on, in this particular case, an overwhelming preponderance of the evidence (Newton's laws got us to the moon and back). I like this language better, as it is easier to justify.
I also enjoyed the obvious fact that the author likes movies. He mentions several, including the worst SF film ever made (as voted by geoscientists), The Core.
The Abruzzo, Italy, earthquake resulted in many deaths, and six seismologists were convicted of manslaughter for not predicting the quake! After 5,000 seismologists wrote letters, the conviction was overturned. It's hard to be a scientist sometimes.
The book has a well-researched chapter on The Flood. The details of how Gilgamesh and the various versions of the Old Testament were woven together into a mish-mash of "god's word" are very interesting. No one who understands how the Bible came to be can believe that it is the actual word of god, because it comes from several different sources, and it contradicts itself and reality… a lot. The most charitable one can be is to say that the Bible might reflect god's wishes, as filtered and understood by man. But that is thin gruel at best.
This book has a lot of fine photos and illustrations. It discusses basic skeptical issues like reserving judgment and human bias. And many of its topics are historical in nature, so there is a lot here for a newcomer to the skeptical world to absorb.
If you have any interest in geology and the basics of skepticism, this is a good book for you.
I do not have a lot to say about this book. A friend suggested it and I read it. It is well written, engrossing at times, and has many insights into the terrible struggle to survive that England went through in the first years of the war. There are a few very colorful characters, and a few dark ones as well:
The image of Churchill walking around naked, drinking brandy and champagne, barking orders, and smoking his cigars is almost a cliché.
Lord Beaverbrook, the most interesting personality in the book, almost single-handedly ramped up airplane production and won the Battle of Britain. The Battle of Britain was the air war during the summer of 1940. The Blitz was the bombing of London and other civilian sites, and it went on much longer. The Battle of Britain ended when the Blitz began.
Lindemann is a character I knew a bit about from other readings. He was Churchill's right-hand man, science advisor, and general all-round dickhead... hated by everyone. "Often wrong but never in doubt" sums him up pretty well. While he may have contributed to the war effort in many positive ways, the reverse was also true. For example, a few years later, he dismissed the idea of the V2 as a physical impossibility. He was very, very wrong.
Randolph was Churchill's ne'er-do-well son (his other children were daughters). A drunken, philandering gambler… he spent about a million dollars a year (in today's money) on booze, broads, and gambling!
Rudolf Hess was the most interesting German character that was explored. I learned a few new things about his ill-fated trip to England to make peace. He would remain in Spandau prison, its last inmate, until he killed himself at age 93.
Adolf Galland is another interesting German character. He was an ace fighter pilot. I read his biography in my 20s. By the war's end, he was a general and ran the entire fighter defense of Germany. Later on, he was a technical consultant on the making of the movie "Battle of Britain". He was the only pilot personally allowed by Hitler to smoke cigars and fly at the same time. He smoked 20 a day.
The other principal characters are the remaining members of the Churchill family. A low-key Peyton Place.
Politically, the first years of the war were about survival and getting the US into the fray. The Lend-Lease Act was a big part of that. The pressure of constant bombing had eased up by mid '41, as Hitler turned his eye to the East. Within a week of Pearl Harbor (Dec 7, '41), the US was at war with both Japan and Germany. Germany declared war on the US, and the US reciprocated. The US was the only country that Germany declared war on in WWII! This is where the book ends.
This is a long book, but an easy read. If you skip a sentence or two, you don't miss much. I normally scribble notes into the books I read, and then summarize them afterward. I did that here, but only scribbled about a dozen times. In other words, I learned very little worth knowing. But if you want to get a feel for the gestalt at the time; the attitudes and feelings of both the government players and the people; and the nature of the suffering they went through, I would recommend it.
Another very enjoyable book from Christie Blatchford. I have always liked reading her columns in the NP. Her earthy style of writing is restrained in newsprint, but not so in her books. It is a worthy successor to Helpless, the story of the OPP and the feds turning their backs on the small town of Caledonia, ON. Truth be told, the sins of the system as described in Life Sentence do not hold a candle to the system allowing politics and ambition to trump even the most basic tenets of the rule of law in Caledonia.
A nice type size and good leading means a fairly quick read.
The book is broken into several large chunks: an anecdotal review of her career, then four long chapters on the big cases: R v. Abreha, Elliott, Bernardo, and Ghomeshi.
In the opening chapter, she recounts some fun moments, like when the Special Investigative Unit that investigates police shootings hired a hot homicide detective only to discover that he was a fraud; or when the government hired a race relations specialist who told lawyers that the Holocaust was not racist because no black people were involved. She notes as well that, after years of legal wrangling, Duffy is back in the Senate sucking on the same teat as before. And more importantly, she asks why judges do not get the same scrutiny as senators. She points out that judges work for us, that it is within our rights to criticize them, and that they have a duty to disclose expenses just like everybody else.
In Abreha, Christie rails against the condescending treatment of jurors. In fact, we just had the Oland case pitched due to an issue of jury instruction. Jurors seem to be unable to get even the most trivial of research sources themselves, like access to a dictionary. It is assumed that jurors are incapable of, for example, separating past misdeeds from current misdeeds, but it is inherently assumed that lawyers and judges are capable of such feats, as well as many others that mere mortals can only aspire to. Blatchford quotes one juror who called "the arrogance of the judicial system doling out just enough information to keep us pure" intolerable. I agree. In some cases, judges have actually lied to jurors. Actually, they all lie to jurors, because they all say the same thing at the end of the trial… "You have now heard all the evidence," and that is almost always a lie. If you say that is not right, you will get a lecture on "probative value versus prejudicial effect". IMHO: If we are going to have juries, they should have all the facts.
The Elliott case focused on a judge, Cosgrove, who went right off the rails during the trial. To make a long, weird tale short, Cosgrove was incompetent. He threw his weight around illegally, and, at the end of the day, still did not acknowledge his misdeeds. Cosgrove was a patronage appointment. The Canadian Judicial Council was involved and actually debated whether "incompetence" should be tolerated in judges, untouchable as they are once appointed. Camp is another judge recently in the news who actually used "ignorance of the law" as an excuse for his errors as a judge! The appointment process is totally screwed up in Canada, but the good news is that it is getting better.
Reading about Bernardo again is hard. The facts of the case are stomach-turning. The Bernardo trial was totally screwed up by the prosecution. Innocent lawyers were trashed by the system. Politics, optics and expediency ruled the court's decision-making processes. The crown made a deal with the devil (Homolka) when they definitely should not have. But worse for the legal system, victims were granted de facto status in the court, with their own attorney, whom the crown then tasked to do things that were clearly in conflict. This mess resulted in some really dumb stuff. The press was not allowed to see the Homolka tapes (due to the victims' weight in the court), but could hear them. But the sound was bad, so the crown provided a transcript that they could not read, but the cops could read it to them. So the reporters had to scribble the text from the readings of the cops while listening to a tape, which they could not understand, of a video they were not allowed to see. The Bernardo case saw the legal system turn on itself, and it was ugly. This rise of the victim does not bode well, and we are seeing the impacts today. The victim should have no say in the evidence presented at trial, but in Bernardo, they ruled the roost. The state even went after reporters for breaching court orders WRT banned information, information that they had made public earlier. In one instance, the OPP fabricated evidence to get at a lawyer who had crossed the Province's AG, who was hip deep in conflict issues.
Finally, the Ghomeshi trial is discussed, and it too was a fiasco. Once again, the victims rose up, screwed up everything, and disappeared. The details of the Ghomeshi trial are still fresh in most people's minds, but if you want more, read the book.
This was a good read. The system is not broken. I am sure 95% of convictions are routine and well handled. But it seems the bigger the trial, the more it seems like the lunatics are running the asylum. We recently had a literal show trial and it showed us that the judge was a screw-up.
The Philippines are a large group of islands in the Pacific. The western part of the island group is largely open to the ocean. It is easy to enter the inner seas from the west. The east is a different matter. Luzon to the north and Mindanao to the south are the largest islands in the group. Samar and Leyte, essentially one island, are to the south and east of Luzon and form the eastern shores of the Philippines along with Mindanao.
In late October, 1944, a major landing was underway at Leyte Gulf in anticipation of MacArthur's return and the liberation of the Philippines. Leyte Gulf was filled with helpless transports and an attack by the Japanese was expected.
There are three ways to get to Leyte Gulf. From the west (the Japan side), one can go through the Philippines via the Sibuyan Sea and exit on the east side of the Philippines through the San Bernardino Strait, which separates Luzon from Samar and Leyte, and then turn south towards Leyte Gulf. Or one can approach from the south, taking the Surigao Strait north of Mindanao, which opens onto Leyte Gulf. The only other approach is from the eastern Pacific, an area controlled by the US. This is the field of battle for the largest naval conflict in history.
Japan was going all-in. Either they beat the US back, or Japan's naval dominance in the Pacific would be over, and Japan's fate sealed. Prior battles, especially the "Great Marianas Turkey Shoot", had all but wiped out Japan's naval air power. They had carriers, but few planes and fewer crews to man them. However, they still had the two largest battleships ever built: the Yamato and the Musashi.
The US was fighting far from home. They were stretched thin on ammo and fuel. But by all measures, they outgunned the Japanese.
This was a very complex battle that took place over a few days. I will give a 50,000 foot description.
The Japanese were in three groups (JN, JM, JS).
JN (for North) hung off the Philippines to the north and east. It was a carrier fleet, with almost no planes. It included the Zuikaku, the last Pearl Harbor flat top still afloat. It would not survive this battle. This fleet was assigned to throw itself at the northerly American ships as a feint to draw the Americans away from Leyte. They expected to get cut to pieces… and they were.
JM (for Middle) went through the Sibuyan Sea and out the San Bernardino Strait. Its job, as was JS's, was to sink American ships and stop the invasion.
JS (for South) approached from the south through the narrow Surigao Strait.
The Americans were in three groups (AN, AM, AS).
AN was Bull Halsey's group. Their primary task was to guard the San Bernardino Strait. Halsey would split his force to create AM.
AM included a group returning from refits that were not completed. It steamed into the middle off Samar Island.
AS was to guard the Surigao Strait and the Leyte landing.
The Americans hit JM first by air in the middle of the Sibuyan Sea (the Battle of the Sibuyan Sea), sinking the Musashi (the first Japanese battleship to be sunk by air power alone) and hurting the Japanese. The US overestimated the damage done.
Then the Americans waited for JS at the top of the Surigao Strait and beat them back decisively.
The Battle of Cape Engaño (located off the north end of Luzon) followed, with AN destroying what was left of JN. And finally the Battle of Samar, JM vs AM, finished the encounter. This battle saw the introduction of the kamikaze. The US took a lot of damage in this battle, mostly to escort carriers.
A lot of ink has been spilled on AN and Halsey. He did manage to finish off Japanese naval air power, but it was already all but dead. He spent most of his time steaming toward a fight rather than fighting. He left his post guarding the San Bernardino Strait to get JN. This left AM unable to handle JM when it slipped through the San Bernardino Strait unnoticed.
An equal amount of ink has been spilled on why JM decided to bug out. The Battle of Samar was in Japan's favor. AM was on its knees, but the Japanese did not know that, nor did they know that JN's feint had actually worked. Poor communications. Had they continued to steam south to Leyte Gulf, they would have wreaked carnage. But instead they turned back through the San Bernardino Strait.
The Americans outgunned the Japanese and expected a win. Halsey almost reversed that. Poor intelligence on both sides led to poor decisions. Both sides also had command issues. There was no overall commander on either side. This led to poor coordination of attacks.
There was much carnage to follow before the war would end, but the Japanese would never again pose a serious naval threat (this does not count kamikazes).
The book is a fairly quick read. Lots of details on who did what, when, and why. It illustrates the old maxim that no battle plan ever survives contact with the enemy. It certainly underscores the need for good communications. In 1944, most ships ran "silent" and transmitted only short coded radio messages. This led to delays and errors.
When pondering on disaster relief, it is helpful to know what natural disasters we should prepare for.
War is hardly natural, and we seem to always be on the brink of it somewhere in the world, so I ignore it.
There are two major issues to deal with: How big might it be, and what are the chances of it happening. A crude break down of “how big” is Local vs Global.
Here is the list off the top of my head (Disaster; Size, Probability):
Only “Pandemic” stands out. It has the potential to go global; AND since it has happened in the past (Spanish Flu, Swine Flu, SARS), and, due largely to air travel, is more likely to happen in the future… the probability is high.
Covid-19 was not unlucky; it was inevitable. A moment's thought, which is about as much as I put into this piece, will convince anyone that the smart money should go to prevention of pandemics.
Therefore, ergo, ipso facto, and thus, Donald Trump disbanded the pandemic teams, cut the budget for the CDC, and tried to build a wall to keep out Mexicans. He has turned recklessness into an art form. Of course, he did not create Covid-19, but he has made, and is making, it much worse.
I always want to end on a happy thought. This can be fixed.
1) Reduce the number of people on the planet by 3 billion at least;
2) Reduce the number of Donald Trumps by one (ideally, remove the whole family tree); and
3) Stop treating every square inch of the planet as a tourist destination.
The following paper was written by my friend Dale Beyerstein for the journal Humanist Perspectives (https://www.humanistperspectives.org/issue211/index.html).
People who debunk false or nonsensical claims sometimes tend to specialise – e.g., concentrating on pseudohistorical claims such as the 9/11 conspiracy, paranormal claims such as those involving extraterrestrial UFOs, alternative health claims such as the canard that MMR vaccine causes autism, or the claims of particular religions. This is understandable, because the one thing that unites modern day critics of these intellectual travesties is their commitment to thoroughness in researching and analysing claims before accepting them or debunking them. Of course, this is not to say that skeptics have always lived up to this ideal. But when one skeptic does not, fellow skeptics are generally just as critical of that lapse as they are towards any paranormalist.
But despite this tendency towards depth of knowledge, many skeptics also show a wide breadth. Many are quite well versed in what’s wrong with many different pseudosciences – from different versions of alternative or complementary medicine to astrology to UFOs. In addition, some are well versed in other areas as well, such as religion, politics, or anything you may wish to discuss around the watercooler. As well as being able to discuss these topics, they also can give a coherent account of scientific reasoning and critical thinking. Is this because these people are polymaths? Well, some are, but I think that the main reason has to do with their curiosity.
This is not the way many people view skeptics. Many people see skeptics as, in the words of U.S. President Richard Nixon’s Vice President Spiro Agnew, “nattering nabobs of negativism.” This is because skeptics reject quite a few popularly held beliefs. But skeptics hold a number of beliefs, albeit provisionally. Sit down with a group of people that includes at least one skeptic, and you will find a genuine interest in ideas. But more important, you will hear a lot of “Why is that?”, “What’s the evidence for that?”, or “But what about…”. And they are curious, but they practise a special type of curiosity, which I shall call guided curiosity.
What am I on about here? Well, everyone holds that curiosity is a good thing; but if you think about it, unbridled curiosity can actually inhibit understanding. Take conspiracy theorists, for example. Being curious about how every girder split from its mountings in the World Trade Center on 9/11 wouldn’t seem to be a bad thing. However, when you think about it, it’s obvious that such detailed knowledge about an event such as this just isn’t going to be available. Who would be wandering around the buildings as they were collapsing to gather it? And if anyone did, would they have survived long enough to tell us? And it begs the question about a conspiracy to assume that anyone was figuring these things out in advance in order to plan the attack. It’s obvious that flying a plane into an iconic building will cause chaos; the perpetrators didn’t need to know exactly how much or exactly how it would happen. So why should we be worried that we lack that information? But this lack of information is what starts some conspiracy theorists down the rabbit hole. Why don’t we have it? Who is hiding it? Why? Religious believers fall into the same trap. What happened before the Big Bang? Obviously, since we have no answer at present, we must conclude that there must have been a God to cause it. Asking questions that admit of no answers, or improperly formulating them in a way that prevents a sensible answer from being given, is a surefire method for generating false, and sometimes ridiculous, beliefs. So there need to be limits on our curiosity.
I’m not suggesting limiting the range of one’s curiosity to what is ‘practical’, or of immediate interest, or to easily answerable questions. Rather, the point of limiting curiosity is this: The wider one draws the curiosity net, the more information one receives, everything else being equal. But on the other hand, one will pick up more flotsam and jetsam as well. So the point is to maximise knowledge, or at least justified beliefs, while minimising the amount of nonsense or outright falsehood.
The method which does the most to achieve this is best stated by the eighteenth-century philosopher David Hume: “A wise man, therefore, proportions his belief to the evidence” (Hume 1955). Let’s refer to this as Hume’s Rule. Sexism aside, the rule is stating that the stronger the evidence, the more confidence one should have in one’s belief, and the opposite holds as well. Less evidence should result in a weaker belief. And a corollary of this principle is that one should not have a belief at all until one has examined the evidence. Following Hume’s Rule is the basis of guided curiosity. In what follows I shall give some examples from religion and paranormal belief to show how guided curiosity keeps us from falling for nonsense.
The consequences for religious belief of Hume’s Rule are readily apparent. Very few religious believers, when pressed, will hold that the evidence for religious belief is very strong. This is where the argument typically takes a turn: to hold that there is, after all, an exception to Hume’s Rule, which applies only to religion. Religious belief is grounded on faith, not reason or evidence; and Hume’s Rule applies only to beliefs based on evidence. Faith is, of course, precisely belief in the absence of evidence, and is, according to the religionist, a virtue which not only elevates the religionist who possesses it, but shows the simple-mindedness and shallowness of thought and character of the atheist or agnostic who lacks it, and instead asks for evidence.
By the time the atheist has defended her character from the above charge, there probably won’t be much time left to return to the question why faith is only appropriate for religious claims. After all, faith has a companion in mundane affairs, gullibility – which is also belief in the absence of evidence – which is decidedly not considered a virtue in those who invested in Bernie Madoff’s Ponzi scheme. But if you do find the time to pursue this point with the religionist, it’s very unlikely that you will receive an informed answer. Instead you will probably be told that this comparison is insulting, and the debate will end there. Perhaps it is; but the fact that a person may be insulted by being told that he has a long nose doesn’t by itself prove that the claim is wrong.
Hume’s Rule has another important implication for religion, atheism and agnosticism. To see it, let’s introduce just a bit of probability theory. Since evidence is what makes a belief probable, it follows that a claim whose probability on the evidence is less than .5 (or 50%) should be disbelieved. This is because the probability of a belief p and the probability of its denial (Not-p) must add up to 1. The sentence “It will either rain or not rain on my house today” has a probability of 1, or, in other words, expresses a certainty (the Law of the Excluded Middle in logic). According to the app on my phone, the probability that it will rain here today is 40%, or .4. When a belief p has a probability of .4, its denial, Not-p, has a probability of .6 (1 minus .4). So I should believe that it won’t rain today – but, applying Hume’s Rule, my belief shouldn’t be very strong, and I should be prepared to change it. Ditto my belief in a god, except that I assign the probability of there being a god to be much lower.
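The arithmetic in that paragraph is simple enough to check mechanically. Here is a minimal Python sketch of the complement rule and of Hume's Rule read as a crude decision procedure (the function names and the .4 rain figure from the example above are mine, for illustration only):

```python
# Complement rule: P(p) + P(not-p) = 1, so P(not-p) = 1 - P(p)
def complement(p: float) -> float:
    """Probability of the denial of a claim whose probability is p."""
    assert 0.0 <= p <= 1.0
    return 1.0 - p

# Hume's Rule as a decision procedure over the probability of a claim
def verdict(p: float) -> str:
    if p > 0.5:
        return "believe"
    if p < 0.5:
        return "disbelieve"
    return "suspend judgment"   # exactly .5: neither believe nor disbelieve

p_rain = 0.4                         # the forecast from the example
print(round(complement(p_rain), 2))  # 0.6 -> "no rain" is the better bet
print(verdict(p_rain))               # disbelieve -- but, per Hume, not strongly
```

The strength of the verdict is not captured here; Hume's Rule would also have confidence scale with how far the probability sits from .5.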
A claim with a probability of exactly .5 should be neither believed nor disbelieved. Thus, agnostics must be holding that the evidence for belief in a god is just as compelling as that for disbelief, or, in other words, a probability of .5 for each. A probability of less than .5 is grounds for atheism, for the reason just given. But most people who call themselves agnostics do not really believe that the probabilities are equal. They concede that the probability that there is a god is less than that of the belief that there isn’t one, but they stick to the claim that we cannot be sure that a god doesn’t exist. But this is just to miss the point of one of the basic axioms of probability theory given above. It is important to remember that the denial of the existence of something does not require evidence that the probability of its existence is 0. Most agnostics have no difficulty in dismissing the existence of Santa Claus or the Tooth Fairy, despite not having checked every claim of where Christmas presents or quarters under the pillow came from, and therefore not being in the position to say that the probability of their existence is 0.
It might be thought that the agnostic has an answer to this. Given that God is supposed to be transcendent, completely outside the realm of human experience, no evidence is possible for belief or disbelief, because there is no evidence at all. With no evidence leading us in either direction, suspension of belief (agnosticism) seems to be the only reasonable position. But this rebuttal isn’t conclusive.
The best reason for suspending belief is that we are awaiting further evidence that might require us to change our minds. Now let’s return to the agnostic’s strong point that we are considering the transcendent, for which no empirical evidence can be found. If this is so, then there would be no reason to suspend belief pending further evidence – the supposition is just that there won’t be any. Now, add this to another corollary of Hume’s Rule: The onus of proof is always on the person who puts the idea forward. When the claim is presented without any evidence to support it, Hume’s wise person would disbelieve it. This is because there are always more ways of getting something wrong than of getting it right. Take, for example, guessing the day of the week on which a total stranger was born. If you guess Thursday, you have one chance in seven of getting it right, and the smart money will be on you getting it wrong. So in the debate between the atheist and the agnostic, where both agree that there is no evidence available about the transcendent (literally the world for which no empirical evidence is possible), the onus of proof is on the believer, and the believer has none. Therefore, the claim should be disbelieved, and the atheist wins by default. In addition, remember that the theist claims to believe in some god or another, one with certain properties (even if she admits that there is no empirical evidence for those properties). But now go to a second theist, who believes in another god with somewhat different properties. Both of these theists will be implicitly recognising the onus of proof, since they apply it to each other: the first will deny the second’s god on the grounds that the evidence is insufficient, and vice versa. Repeat this a few thousand times, and we have the point made so well by Richard Dawkins (2006): “We are all atheists about most of the gods that humanity has ever believed in. Some of us just go one god further.”
Pseudoscientific claims fudge the onus of proof too. In fact, they do so with such regularity that we might consider this error as one of the defining characteristics of pseudoscience. Take conspiracy theories for example. Why is all the evidence about how “they” killed Kennedy, or placed the explosives in the World Trade Center towers to supplement the work of the airplanes, completely hidden? That it is missing is just the evidence that skeptics are supposedly too dense to see; its absence shows how clever and powerful “they” are that “they” can hide it so well. You can diagnose the informal fallacy involved here as failing to respect the onus of proof; or you can equally well call it begging the question (appealing to the very fact you are attempting to prove as evidence for the very fact that you are attempting to prove). Or you can call it the argument from ignorance, which involves saying that because you cannot disprove my claim, I must be right – whether or not I have presented any evidence. But the main point is that they have managed the impossible feat of creating something out of nothing. This, by the way, is the thing about informal fallacies: They are like cockroaches, in that if you spot one you can be assured that there are a bunch more lurking where you can’t see them.
A corollary of Hume’s Rule is the requirement that we search not only for the evidence that supports our belief, but also for that which goes against it. Looking only for the supporting evidence is confirmation bias. The religious, conspiracy-theory and paranormal believer, on the other hand, is notorious for this cherry picking. The Christian apologist, searching for miracles, concentrates on the one little baby that survives the plane crash, but ignores the 200 others who perish. The believer in her own telepathic powers zeroes in on the few times she ‘knows’ what her friend will say next, while remaining blindly oblivious to the many more times she guesses wrong. The 9/11 conspiracy theorist pays attention to any problem, no matter how insignificant, in the received account of how the towers collapsed, while not being troubled at all by the fact that there is no evidence whatsoever of anyone planting any explosives in the buildings beforehand. The believer in any sort of complementary or alternative medicine will keep track of every cold that cures itself after taking echinacea while ignoring the ones that cure themselves without taking it. The graphologist (handwriting analyst) keeps track of every case where a handwriting sample shows large loops on the descenders of fs, gs, js, ps, qs and zs and its writer has a better than average libido, and ignores those with large libidos but with handwriting characteristics that cannot be described this way, as well as those whose handwriting has thin or small descenders but who are nevertheless quite libidinous. (If you think I’m making this up, check Paterson (1980:11), and don’t ask how she measured libidinousness.)
Let us look at one more rule of guided curiosity. It’s not only important to pick up new information; it’s also essential to compare that new information with what you already have. Is the new information consistent with what you already believe? If not, you have work to do to reconcile these beliefs. Perhaps you will have to reject the new information; perhaps you will have to modify it to a certain degree. Or maybe you will have to do one or both of these things to your present beliefs to achieve a fit. This shuffling process will always be with us, as long as we are gaining new information. (For a better account of this process, see Quine and Ullian (1978).) If the new observation is consistent with your old beliefs, then you can ask how it enhances your understanding of what you already know. What implications are there from the new belief to other hitherto unsuspected claims? Do these implications suggest further claims which can be tested? An example will show what I mean. Therapeutic Touch (TT) is a healing modality which involves practitioners manipulating the “energy field” which surrounds our body without actually touching the body itself. (This has led me to describe TT as “neither therapy nor touch.”) One shape of the field is healthy; others make us sick. The crucial thing to note about TT theory is that practitioners work on just this energy field, not on the body underneath it. So, let’s take these claims at face value. If the healers can manipulate the field, they must be able to discern its presence somehow without simply inferring it from the presence of a body. But if we could perceive its presence through sight, smell, touch or hearing (probably taste isn’t an option here), then everyone should be aware of it. But we are not. Only TT practitioners are. Well, that must be because there’s another sensory mechanism which not all of us have: only those who would be good Therapeutic Touch practitioners have it.
The inference from the claim that the energy field can be worked with to the claim that practitioners must be able to recognise its presence is not one that is often made by TT believers. But it was made by Emily Rosa, a nine-year-old from Loveland, Colorado. And though no TT practitioner had thought of doing so, the young Ms Rosa worked out how to test this claim. For her science project she set up a solid barrier dividing a table in half. The barrier left just enough room for a TT practitioner to pass her hand underneath, and for the experimenter to place her own hand beneath it, or not; whether she did so on any given trial was determined by a randomizer. So, if the TT practitioner could do better than chance on a test designed to rule out the other sensory modalities, this would be evidence of an energy field that some gifted people could detect. Needless to say, Ms Rosa’s experiment did not confirm this hypothesis, but it did lead to her becoming the youngest person ever to be published in a top-rank medical journal (Rosa et al., 1998).
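The statistics behind such a chance test are simple enough to sketch. The snippet below (in Python; the helper name is my own, and the 123-correct-out-of-280-trials figure is the roughly 44% success rate reported in Rosa et al. (1998)) computes how likely a score is under pure guessing:

```python
from math import comb

def p_at_least(hits, trials, p_chance=0.5):
    """One-sided binomial tail: the probability of getting at least
    `hits` correct answers in `trials` attempts by guessing alone."""
    return sum(
        comb(trials, k) * p_chance**k * (1 - p_chance)**(trials - k)
        for k in range(hits, trials + 1)
    )

# A practitioner who merely guesses is right about half the time, so a
# score at or below the chance level is no evidence of an extra sense.
print(p_at_least(123, 280))  # a large probability: guessing alone explains it
```

Only a score whose tail probability is very small (conventionally below .05 or .01) would count as evidence that something beyond guessing is going on.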
There is another important implication of Hume’s Rule. It tells us that belief should be based on the preponderance of evidence, or on the probability that the evidence confers on the belief. But the evidence for or against a belief is continually shifting as more of it becomes available, and along with it the probabilities will fluctuate. Remembering this is how the skeptic following guided curiosity avoids dogmatism even when she has a fairly strongly held belief. She is always ready to modify her beliefs, and in some cases to switch from belief to disbelief or vice versa, as the new evidence comes in. It is also why a skeptic should not only state her beliefs, but state them along with a degree of confidence: her estimate of the likelihood that more evidence will require her to revise or abandon those beliefs. Or, better yet, she should always be prepared to state the belief along with the evidence for it. With these qualifications, there is no harm in provisionally stating a belief with a probability not much higher than .5, or a disbelief even when the probability is only a bit less than .5. Taking a belief seriously confers a benefit: once a belief is stated along with the evidence for it, it can be examined, and implications drawn from it, which in turn can lead to new understanding. Dismissing it because the probability is not much above .5 forgoes this possibility. The important thing to remember about dogmatism is that what is wrong with it is not the forceful stating of the belief, but concentrating on the belief rather than the evidence for it, and the unwillingness to budge from it when new evidence comes to light.
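The essay does not prescribe a formal rule for this continual shifting of probabilities, but Bayes' theorem captures the gist: each new piece of evidence nudges the probability of a belief up or down, and confidence grows without ever reaching certainty. A minimal sketch (the function name and the numbers are illustrative, not from the text):

```python
def update(prior, likelihood_if_true, likelihood_if_false):
    """Bayes' rule: revise the probability of a belief after one new
    observation, given how likely that observation is if the belief
    is true versus if it is false."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Start nearly agnostic and fold in three observations, each twice as
# likely if the belief is true as if it is false. Confidence climbs,
# but never hits 1.0; contrary evidence would push it back down.
p = 0.5
for _ in range(3):
    p = update(p, likelihood_if_true=0.8, likelihood_if_false=0.4)
print(round(p, 3))  # about 0.889
```

The same function handles disconfirming evidence: an observation more likely under the belief being false simply lowers the probability, which is exactly the readiness to "switch from belief to disbelief" described above.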
Some skeptics will be disappointed that I have gone all this way without mentioning one of the cardinal principles of skepticism, that there are times – quite a lot of them, actually – when you shouldn’t express a belief at all; you should straightforwardly admit that you do not know. There are two advantages to this admission. The first is that it serves as a stimulus to curiosity: having admitted that you don’t already know gives you a good reason to try to find out. Second, it prevents you from misleading others (and yourself). When they think you know and you don’t, they might follow you when they shouldn’t.
There are two situations where one should say that one doesn’t know. The first is when this expression is simply a substitute for “I don’t care.” Here the expression is tantamount to admitting that you have very little evidence and you are not prepared to gather any more. For example, just by reading the headlines, having decided that I had no interest in the articles beneath them, I couldn’t help finding out that Prince Harry, the Duke of Sussex, and his wife, Meghan Markle, recently had a son, whom they named “Archie”. But his birthweight? I don’t know; meaning, of course, that I don’t care. After all, one cannot expect to have time to look into everything; one must prioritise.
The second situation is where the evidence you have at present is about equally compelling for belief and disbelief, and it is possible to get more. In this situation it makes sense to suspend belief and wait for further evidence. But there’s an important exception here: sometimes waiting isn’t a viable option; circumstances require an immediate response. Fortunately, these special cases are quite rare; so withholding judgement while awaiting more evidence is an available option, and a good one.
Otto Neurath (1921) compared our belief system to a leaky ship at sea. We are continually replacing rotten planks with fresh ones, but we are never able to replace the whole bottom at once, given that we wish to remain afloat. Thus we will never have a perfect set of beliefs; there will always be some false ones in there that we haven’t found yet. The best we can hope for is a gradual improvement. To continue his metaphor, when we find a particularly strong plank which doesn’t fit very well with the old ones already in place, going to all the work of making it fit may result in a much less leaky boat overall. Similarly, encountering a new belief that is inconsistent with some old ones, but with a lot of evidence backing it up, may require the modification of several of the old beliefs at once. But the result may be a more coherent belief system overall. Not a perfect one, though. Non-skeptics may find this disconcerting. They are like sailors who aren’t in it for the pleasure of sailing, but just want to get somewhere, anywhere, and go along for the ride. But the true skeptic enjoys the sailing for its own sake.
Dawkins, Richard 2006: The God Delusion. Boston, Houghton Mifflin Company.
Hume, David 1955: “Of Miracles”, in An Inquiry Concerning Human Understanding. Indianapolis, The Bobbs-Merrill Company Inc.
Paterson, Jane 1980: Know Yourself Through Your Handwriting. Montreal, Reader’s Digest Assn.
Quine, W. V. O. and J. S. Ullian 1978: The Web of Belief. New York, Random House.
Rosa, L., E. Rosa, L. Sarner and S. Barrett 1998: “A Close Look at Therapeutic Touch”. Journal of the American Medical Association, Vol. 279(13): 1005–1010.
Neurath, Otto 1921: Anti-Spengler. Munich, G. D. W. Callwey.
Dyer is a Canadian war researcher who wrote this book in 1985. From what I can gather, it is well respected. This edition has been revised and updated.
The book tours war historically, examining its roots and its evolution over the millennia.
The opening comments discuss some of the big-picture history. For example, major battles prior to the 20th century would generate casualty rates as high as 50-60%, with an average of about 20 percent. In modern battles, this figure rarely exceeds 1 percent. This was due largely to changes in technology (weapons got better), which in turn changed the nature of soldiers and their mission. In the old days, soldiers did not get combat fatigue (shell shock, or today: PTSD) because they would die before it ever happened. Today, all armies recognize that troops can only take so many combat days (in WWII, it was 240) before they fall apart.
Dyer has always impressed me, especially with regard to basic training, which he discusses in a chapter called "Anybody's Son Will Do". Get them when they are young. Break them down to the same level, making them all equals, and then build up a small-group dynamic where each man relies on the others to help him stay alive. Drill sergeants are masters of psychological manipulation, and they know it.
In his opening chapter, Dyer makes this startling revelation: in WWII in the ETO, only 15% of combat infantry riflemen ever fired their weapon in anger! Most soldiers would never have revealed such a fact until they were told they were not at all alone. More than four fifths of combat soldiers got through the war without killing anyone, and without firing a shot! More research has backed this up. Gettysburg was a hugely costly battle for Americans because it was Americans on both sides. More than half of the muskets recovered from the battle were loaded with more than one round, and only 5% were ready to be fired when they were abandoned or dropped. Six thousand had as many as 10 rounds in the barrel. In other words, most of the fighters were spending their time reloading a loaded gun and, one assumes, ducking. No one is suggesting these men were cowards. Many were simply principled and did not want to kill.
The upshot of this is startling: if you could get the non-firers to enter the fray, the other side wouldn't stand a chance. And that is what happened. New training methods got the firing rate up to 50% in Korea and 80% in Vietnam. The US had figured out how to get men to kill automatically. Dehumanizing the enemy was a big part of it.
War goes back a long way. Chimps war on each other, and so do we. We did not invent war; we inherited it. Dyer examines hunter-gatherer groups and their interactions; ritualized warfare in groups like the New Guinea highlanders; and other evidence of our more primitive past. Some suggest that ritualized warfare is not the "real deal". It generates casualties at a low rate, but that meant groups could engage in it often, which in turn meant high casualty rates over time. Hobbes, Rousseau and Darwin are invoked and examined for their viewpoints.
The birth of war was driven by human lifestyle choices. Hunting and gathering can keep you alive. So can farming. But a new way of living had emerged: pastoralism, the nomadic herding of domesticated animals. Inevitably, these groups clashed. The nomads would win fights because they were mobile, which meant they could concentrate their forces when needed. To counter this, towns built walls. When horses were domesticated, things accelerated. It is worth noting that Egypt was largely spared from nomadic attacks by its geography.
The Sumerians hit upon the idea that religion could be a better way of settling disputes. It worked for a while. Priests liked it because it gave them power. But as we know, religion is not a cure for war, but more often an excuse or a direct cause.
Aside: Women were equal partners in life until civilization and agriculture came along, driving the need for a power structure that excluded women because it could.
Sargon was the world's first emperor over a militarized society (circa 2300 BC). Around then came a major technological innovation: the composite bow. The bow and arrow has killed more people than any other weapon, ever. Unlike the English longbow (yet to be invented), the composite bow was short, powerful, and could be fired from horseback.
Other inventions over time include the chariot (a fast, hit-and-run archery platform); the pike and phalanx; war galleys; improved, harder metals; bigger horses; the saddle and stirrup (700 AD); gunpowder (1300 AD); and organized armies (which gave the powers-that-be the willies).
Aside: Japan had a warrior based culture. When the musket was introduced, they were appalled. A samurai could be killed by a commoner! Unacceptable… so they just stopped making them. That obviously changed later in history.
Armies got bigger, the percentage of soldiers dying went up, battles were huge but infrequent; and the average citizen was left alone.
Armies had to be trained and fed. You could not just conscript someone and throw them into battle. Standing armies were a part of every major European power from 1700 on.
What happened next was the introduction over time of the concept of "total war".
New weapons were being developed at a quick pace. It was soon discovered that a few men behind cover with guns could stop a large number of advancing troops. Tactics changed. WWI introduced the concept of the continuous front. The tank was used for the first time, giving professional soldiers hope that it might end the wars of attrition. It did, but it also caused the continuous front to become mobile. This, together with aircraft, took a terrible toll on civilians. The tank eventually spawned "blitzkrieg", or mobile "lightning" war. The world wars were the first in which civilian deaths outnumbered those of combatants.
The next chapter in the book documents the history of nukes. There have been a lot of ideas over the years on this subject, but the prevailing idea is that you only need enough to destroy the other guy. This was known as MAD (Mutually Assured Destruction). And so far, this math has kept the nuclear peace for 75 years. The advent of nukes gave rise to the phrase "conventional war".
The birth of the professional army came in Prussia in 1803 with the creation of the Kriegsakademie. This approach of using well-trained professional soldiers was soon adopted by others. One hundred German troops in WWII were a match for 125 British or 250 Russian troops. Why? Because Germany had ten times as many good generals, men who knew how to get the most from troops and equipment. The old game is now protected turf, and generals everywhere want to keep it that way (the start of the military-industrial complex?). Technology has changed the equation a great deal. One example: a Spitfire in WWII cost 5,000 pounds to make. The supersonic Tornado, its third-generation successor, cost 17 million pounds, 172 times more after adjusting for inflation.
The last chapter speculates about the future of war:
Baboons are nasty creatures. The males are obsessed with status and fight all the time. But in one troop something unusual happened. The aggressive males all died off at once from eating infected meat. Overnight, the troop settled down to a much more egalitarian society. And it stayed that way even after the demographics of the troop returned to normal! The ritualized warfare of the New Guinea highlanders killed a large percentage of combatants, one at a time, year after year. The government stepped in and told them this had to stop. They all agreed enthusiastically and never looked back! It seems both groups were caught in a local stability point that they could not get out of without a nudge. Perhaps there is hope in both those stories.
While not as eye-opening a book as Guns, Germs and Steel, this book is a must-read for anyone who wishes to grapple with these thorny issues. War has been with us for all of our history. But perhaps we can first ritualize it, and then dump it as a bad idea. I doubt it. The New Guinea highlanders were offered an alternative to their wars… essentially, third-party arbitration. This is something the top dogs will never agree to.
One final tidbit: Have you ever wondered if you were safer as an officer in combat, or as a soldier? In WWII, the answer was "soldier". In Vietnam, it did not matter.
Dyer himself has served, so he has seen some of all this from the inside.
A Very Stable Genius: Donald J. Trump's Testing of America; Philip Rucker & Carol Leonnig; 2020; Penguin Press; 416 pgs; notes, index
This was a much-awaited book when it showed up. It covers the Trump history from his getting elected (when the first thing he did was lie and claim he had won the popular vote) to just after the infamous July 25th phone call to the president of Ukraine. Like books from other reporters, such as Bob Woodward, this book is heavy on basic facts. It is clearly written and, as the title implies, the authors know about whom they are writing. Unlike Woodward's books, which I found dry and dull, I rather enjoyed reviewing the events of the last few years in one compact reference.
Trump fired Comey, who he knew was on the west coast, by sending a letter through one of his henchmen. He also torpedoed him on Twitter. When told that he had screwed up in sending the letter as he did, he replied, "I know, fucking incompetence. Drives me crazy!" (referring to his staff). Trump never errs. Period.
The material in the book is largely familiar to anyone who has followed the election of the Mango Mussolini. If you had negative opinions of him (and who doesn't?), this book will be satisfying and scary at the same time. It sheds some light on the Mueller report and why it fell so flat. It also illuminates the mind of Trump.
He is a petty, pompous, pugnacious pinhead (and that is just the "P"s) in charge of a country he does not understand. Every person who has been in contact with him for any length of time has walked away covered in bullshit and fleas. Some still serve, but most have been arrested or driven from office.
In years to come, this book will become a reference for the times. The only unsatisfying aspect of the book is that it ends before the story is over.
As a skeptic, I have read a lot about cults. We had at least two ex-cult members lecture to the BC Skeptics, and I spoke with them in person. The difference between a cult and a religion is often subtle. In this case, we are talking about a political figure who demands utmost fealty and believes he can do no wrong. There is nothing subtle about Trump, and the parallels between the Trump movement and the rise of Nazi Germany continue to trouble me.
This was a read of a different nature. I wanted to learn a little more about feminism and a friend recommended this and one other book. It is different for me because it views the world from a perspective that I might try to empathize with, but cannot experience. It is worth noting that the book is 30 years old now, and some things have changed. I also note that Ms Wolf is very "beautiful", or at least she was in 1990. How much beauty-bias she personally suffered is hard to say.
The book has six chapters: Work, Religion, Culture, Violence, Sex, and Hunger.
There are many examples in every chapter (too many at times) of the various burdens women must bear… some external and some internal. For example: the intro points out that the average American woman would rather lose 10-15 lbs than achieve any other goal. That is really sad. It certainly has a strong external component to it. You can blame Twiggy et al for that, but surely part of this issue is internal.
She introduces the concept of the PBQ (Professional Beauty Qualification). She argues that the PBQ is a way to discriminate against women in a safe, litigation-proof way. The standards for on-air female personalities are quite different from those for men. One woman lost a case in which her employer said she was too old, too unattractive, and not deferential enough to men. Yikes! The PBQ, Wolf argues, is the currency of womanhood.
Here is a shocking statement from the book: young professional women spend up to 1/3 of their income on their appearance. Or, to put it a bit more crudely: want to keep your job? Get your boobs done.
Culture obviously has shaped women. Women's culture is driven by women's magazines and the advertisers who support them. The advertisers are constantly telling women how they should or could look, and assure them that if they buy their goop, all their woes will go away. In fact they create the woes first, and then fix them. And they deliberately promote competition between women. The author suggests that "adornment" is a huge part of female culture, and I am sure she is right. But I wonder if that aspect of female culture is not part of the problem. Putting such a heavy emphasis on adornment seems rather shallow to me, but I am a guy.
Women and weight is always an issue. Weight Watchers tells women: "Always wear your makeup. Even to walk the dog. You never know whom you are going to meet." That says a lot about appearance issues and weight. The author's Religion chapter focuses primarily on the cult aspects of beauty and make-up. She spends quite a bit of time on the bullshit of skin creams that promise rejuvenation. One marketing line caught my eye: "A lipstick you can have a lasting relationship with." I wonder if it comes with batteries.
Cults are something I know a little about. And it is all (well, most of it) there. Chanting, purifications, confessions and other mind control techniques are on full display. If the woman is also hungry, that helps as it will impair her reasoning.
Sex will always be an issue. And men control the issue. The book cites several cases where women were raped or brutalized, only to have the legal system tut-tut them, or suggest it was all in good fun. Who can forget the Canadian judge who asked a female victim why she didn't just keep her knees together? Ms Wolf does not approve of pornography and she may have a point, but these are issues that can be resolved academically, which is my only real complaint about this book… It could have used more science to back up its conclusions.
There is no doubt that weight and conditions like bulimia and anorexia are major concerns for women. The author quotes a number of statistics on the subject. If one takes the worst numbers that have been put forward, then 1 in 10 college-age women is anorexic and 5 more are bulimic. If the true figures are even close to that, it is very troubling indeed. The rail-thin, heroin-chic skinny look came in with Twiggy and has never left. Porn and women's magazines are part of the problem, to be sure. They both make people of both sexes feel that their bodies are not as good as they could be.
Cosmetic surgeries have gone through the roof. Joan Rivers had more and more done until the last one killed her. The author argues that this is a form of violence toward women, and it is hard to deny that she has a point. Doctors invade diseased bodies as a last resort. Cosmetic surgeons call healthy bodies sick and then invade them. The industry is huge and largely unregulated. Things may have changed in 30 years, but in 1990, doctors could not tell patients the risks of cosmetic surgery because they did not know the risks themselves.
I found the book over-long on examples, and short on analysis and statistics. But these are quibbles. I can say this: it is more complicated being a woman in our society than a man. We have a long way to go as a society to level the playing field. And women (and men) need to learn how to be happy with themselves and others.
I recently visited my sister-in-law and complimented her on her looks. I had caught her in a rare moment: au naturel (no makeup). She thought I was ribbing her. That should not happen.
A few things you should know before reading this email from my aunt:
This is the story of me becoming an atheist.
In the evening our mother told us stories written by H.C. Andersen.
Among them was the story (The Tinderbox) about the soldier, a tinderbox and three big dogs. And on Sundays, I went to the Sunday school and heard stories about Jesus and his disciples.
And then, in school one day, the teacher rolled down a big map of Palestine and he said: This is where Jesus was walking with his disciples.
I was shocked.
Like I would be, if the teacher had taken us to a tree with a big hole in it and declared that:
Here was the tree where the soldier killed the witch and got the tinderbox.
But I was living in the 1930s, and unlike now, people went to church, so I kept my opinion to myself, until one day I openly declared myself a nonbeliever, no longer a member of the church (I saved taxes), and none of my children, grandchildren, and great-grandchildren are members.
Churches in Denmark are mostly empty; only a few old people, mostly women, sit there, and some churches are used for other kinds of social events.
And I need no medicine
And remember: No herring on white bread! (no white food at all).
Lee Moller is a life-long skeptic and atheist and the author of The God Con.