The following paper was written by my friend Dale Beyerstein for the journal Humanist Perspectives (https://www.humanistperspectives.org/issue211/index.html).
People who debunk false or nonsensical claims sometimes tend to specialise – e.g., concentrating on pseudohistorical claims such as the 9/11 conspiracy, paranormal claims such as those involving extraterrestrial UFOs, alternative health claims such as the canard that the MMR vaccine causes autism, or the claims of particular religions. This is understandable, because the one thing that unites modern-day critics of these intellectual travesties is their commitment to thoroughness in researching and analysing claims before accepting or debunking them. Of course, this is not to say that skeptics have always lived up to this ideal. But when one skeptic does not, fellow skeptics are generally just as critical of that lapse as they are towards any paranormalist.

But despite this tendency towards depth of knowledge, many skeptics also show a wide breadth. Many are quite well versed in what’s wrong with many different pseudosciences – from different versions of alternative or complementary medicine to astrology to UFOs. In addition, some are well versed in other areas as well, such as religion, politics, or anything you may wish to discuss around the watercooler. As well as being able to discuss these topics, they can also give a coherent account of scientific reasoning and critical thinking. Is this because these people are polymaths? Well, some are, but I think that the main reason has to do with their curiosity.

This is not the way many people view skeptics. Many people see skeptics as, in the words of U.S. President Richard Nixon’s Vice President Spiro Agnew, “nattering nabobs of negativism.” This is because skeptics reject quite a few popularly held beliefs. But skeptics hold a number of beliefs, albeit provisionally. Sit down with a group of people that includes at least one skeptic, and you will find a genuine interest in ideas. But more important, you will hear a lot of “Why is that?”, “What’s the evidence for that?”, or “But what about…”. Skeptics are curious, but they practise a special type of curiosity, which I shall call guided curiosity.

What am I on about here? Well, everyone holds that curiosity is a good thing; but if you think about it, unbridled curiosity can actually inhibit understanding. Take conspiracy theorists, for example. Being curious about how every girder split from its mountings in the World Trade Center on 9/11 wouldn’t seem to be a bad thing. However, when you think about it, it’s obvious that such detailed knowledge about an event like this just isn’t going to be available. Who would be wandering around the buildings as they were collapsing to gather it? And if anyone did, would they have survived long enough to tell us? And it begs the question about a conspiracy to assume that anyone was figuring these things out in advance in order to plan the attack. It’s obvious that flying a plane into an iconic building will cause chaos; the perpetrators didn’t need to know exactly how much, or exactly how it would happen. So why should we be worried that we lack that information? But this lack of information is what starts some conspiracy theorists down the rabbit hole. Why don’t we have it? Who is hiding it? Why? Religious believers fall into the same trap. What happened before the Big Bang? Obviously, since we have no answer at present, we must conclude that there must have been a God to cause it.
Asking questions that admit of no answers, or formulating them improperly in a way that prevents a sensible answer from being given, is a surefire method for generating false, and sometimes ridiculous, beliefs. So there need to be limits on our curiosity. I’m not suggesting limiting the range of one’s curiosity to what is ‘practical’, or of immediate interest, or to easily answerable questions. Rather, the point of limiting curiosity is this: the wider one draws the curiosity net, the more information one receives, everything else being equal. But on the other hand, one will pick up more flotsam and jetsam as well. So the point is to maximise knowledge, or at least justified beliefs, while minimising the amount of nonsense or outright falsehood. The method which does the most to achieve this is best stated by the eighteenth-century philosopher David Hume: “A wise man, therefore, proportions his belief to the evidence” (Hume 1955). Let’s refer to this as Hume’s Rule. Sexism aside, the rule says that the stronger the evidence, the more confidence one should have in one’s belief; and the opposite holds as well: less evidence should result in a weaker belief. A corollary of this principle is that one should not have a belief at all until one has examined the evidence. Following Hume’s Rule is the basis of guided curiosity. In what follows I shall give some examples from religion and paranormal belief to show how guided curiosity keeps us from falling for nonsense.

The consequences of Hume’s Rule for religious belief are readily apparent. Very few religious believers, when pressed, will hold that the evidence for religious belief is very strong. This is where the argument typically takes a turn: to hold that there is, after all, an exception to Hume’s Rule, one which applies only to religion. Religious belief is grounded on faith, not reason or evidence; and Hume’s Rule applies only to beliefs based on evidence. Faith is, of course, precisely belief in the absence of evidence, and is, according to the religionist, a virtue which not only elevates the religionist who possesses it, but shows the simple-mindedness and shallowness of thought and character of the atheist or agnostic who lacks it and instead asks for evidence. By the time the atheist has defended her character from the above charge, there probably won’t be much time left to return to the question of why faith is appropriate only for religious claims. After all, faith has a companion in mundane affairs – gullibility, which is also belief in the absence of evidence, and which was decidedly not considered a virtue in those who invested in Bernie Madoff’s Ponzi scheme. But if you do find the time to pursue this point with the religionist, it’s very unlikely that you will receive an informed answer. Instead you will probably be told that this comparison is insulting, and the debate will end there. Perhaps it is; but the fact that a person may be insulted by being told that he has a long nose doesn’t by itself prove that the claim is wrong.

Hume’s Rule has another important implication for religion, atheism and agnosticism. To see it, let’s introduce just a bit of probability theory. Since evidence is what makes a belief probable, it follows that a claim whose probability on the evidence is less than .5 (or 50%) should be disbelieved. This is because the probability of a belief p and the probability of its denial (not-p) must add up to 1.
The sentence “It will either rain or not rain on my house today” has a probability of 1, or, in other words, expresses a certainty (the Law of the Excluded Middle in logic). According to the app on my phone, the probability that it will rain here today is 40%, or .4. When a belief p has a probability of .4, its denial, not-p, has a probability of .6 (1 minus .4). So I should believe that it won’t rain today – but, applying Hume’s Rule, my belief shouldn’t be very strong, and I should be prepared to change it. Ditto my belief in a god, except that I assign the probability of there being a god to be much lower.

A claim with a probability of exactly .5 should be neither believed nor disbelieved. Thus, agnostics must be holding that the evidence for belief in a god is just as compelling as that for disbelief – in other words, a probability of .5 for each. A probability of less than .5 is grounds for atheism, for the reason just given. But most people who call themselves agnostics do not really believe that the probabilities are equal. They concede that the probability that there is a god is less than the probability that there isn’t one, but they stick to the claim that we cannot be sure that a god doesn’t exist. But this is just to miss the point of one of the basic axioms of probability theory given above. It is important to remember that the denial of the existence of something does not require evidence that the probability of its existence is 0. Most agnostics have no difficulty in dismissing the existence of Santa Claus or the Tooth Fairy, despite not having checked every claim of where Christmas presents or quarters under the pillow came from, and therefore not being in a position to say that the probability of their existence is 0.

It might be thought that the agnostic has an answer to this. Given that God is supposed to be transcendent, completely outside the realm of human experience, no evidence is possible for belief or disbelief, because there is no evidence at all. With no evidence leading us in either direction, suspension of belief (agnosticism) seems to be the only reasonable position. But this rebuttal isn’t conclusive. The best reason for suspending belief is that we are awaiting further evidence that might require us to change our minds. Now let’s return to the agnostic’s strong point that we are considering the transcendent, for which no empirical evidence can be found. If this is so, then there is no reason to suspend belief pending further evidence – the supposition is precisely that there won’t be any. Now add to this another corollary of Hume’s Rule: the onus of proof is always on the person who puts the idea forward. When a claim is presented without any evidence to support it, Hume’s wise person would disbelieve it. This is because there are always more ways of getting something wrong than of getting it right. Take, for example, guessing the day of the week on which a total stranger was born. If you guess Thursday, you have one chance in seven of getting it right, and the smart money will be on you getting it wrong. So in the debate between the atheist and the agnostic where both agree that there is no evidence available about the transcendent (literally the realm for which no empirical evidence is possible), the onus of proof is on the believer, and the believer has none. Therefore, the claim should be disbelieved, and the atheist wins by default.
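To make the arithmetic the article appeals to explicit, here is a minimal sketch in Python (not part of the original essay) of the complement rule and a toy reading of Hume’s Rule; the probability attached to the god claim below is an arbitrary illustrative number, not a figure from the text.

```python
def humes_verdict(p: float) -> str:
    """A toy reading of Hume's Rule: believe a claim when the evidence makes it
    more probable than its denial, disbelieve when it is less probable, and
    suspend judgement at exactly 0.5."""
    if p > 0.5:
        return "believe, with confidence proportional to p"
    if p < 0.5:
        return "disbelieve, with confidence proportional to 1 - p"
    return "suspend judgement"

# The complement rule: P(p) + P(not-p) = 1, so P(not-p) = 1 - P(p).
for claim, p in [("rain on my house today", 0.4),   # the forecast used in the article
                 ("a god exists", 0.05)]:            # arbitrary illustrative number
    print(f"{claim}: P = {p}, P(not) = {1 - p:.2f} -> {humes_verdict(p)}")
```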
In addition, remember that the theist claims to believe in some god or another, one with certain properties (even if she admits that there is no empirical evidence for those properties). But now go to a second theist, who believes in another god, with somewhat different properties. Both of these theists will be implicitly recognising the onus of proof, since they apply it to each other: the first will deny the second’s god on the grounds that the evidence is insufficient, and vice versa. Repeat this a few thousand times, and we have the point made so well by Richard Dawkins (2006): “We are all atheists about most of the gods that humanity has ever believed in. Some of us just go one god further.”

Pseudoscientific claims fudge the onus of proof too. In fact, they do so with such regularity that we might consider this error one of the defining characteristics of pseudoscience. Take conspiracy theories, for example. Why is all the evidence about how “they” killed Kennedy, or placed the explosives in the World Trade Center towers to supplement the work of the airplanes, completely hidden? That it is missing is just the evidence that skeptics are supposedly too dense to see; its absence shows how clever and powerful “they” are that “they” can hide it so well. You can diagnose the informal fallacy involved here as failing to respect the onus of proof; or you can equally well call it begging the question (appealing to the very claim you are attempting to prove as evidence for that claim). Or you can call it the argument from ignorance, which involves saying that because you cannot disprove my claim, I must be right – whether or not I have presented any evidence. But the main point is that they have managed the impossible feat of creating something out of nothing. This, by the way, is the thing about informal fallacies: they are like cockroaches, in that if you spot one you can be assured that there are a bunch more lurking where you can’t see them.

A corollary of Hume’s Rule is the requirement that we search not only for the evidence that supports our belief, but for that which goes against it. Looking only for the supporting evidence is confirmation bias. The religious believer, the conspiracy theorist and the paranormal believer, on the other hand, are notorious for this cherry-picking. The Christian apologist, searching for miracles, concentrates on the one little baby that survives the plane crash, but ignores the 200 others who perish. The believer in her own telepathic powers zeroes in on the few times she ‘knows’ what her friend will say next, while remaining blindly oblivious to the many more times she guesses wrong. The 9/11 conspiracy theorist pays attention to any problem, no matter how insignificant, in the received account of how the towers collapsed, while not being troubled at all by the fact that there is no evidence whatsoever of anyone planting any explosives in the buildings beforehand. The believer in any sort of complementary or alternative medicine will keep track of every cold that cures itself after taking echinacea while ignoring the ones that cure themselves without it.
The graphologist (handwriting analyst) keeps track of every case where a handwriting sample shows large loops on the descenders of fs, gs, js, ps, qs and zs and its writer has a better-than-average libido, and ignores those with large libidos but with handwriting that cannot be described this way, as well as those whose handwriting has thin or small descenders but who are nevertheless quite libidinous. (If you think I’m making this up, check Paterson (1980: 11), and don’t ask how she measured libidinousness.)

Let us look at one more rule of guided curiosity. It’s not only important to pick up new information; it’s also essential to compare that new information with what you already have. Is the new information consistent with what you already believe? If not, you have work to do to reconcile these beliefs. Perhaps you will have to reject the new information; perhaps you will have to modify it to a certain degree. Or maybe you will have to do one or both of these things to your present beliefs to achieve a fit. This shuffling process will always be with us, as long as we are gaining new information. (For a better account of this process, see Quine and Ullian (1978).) If the new observation is consistent with your old beliefs, then you can ask how it enhances your understanding of what you already know. What implications are there from the new belief to other hitherto unsuspected claims? Do these implications suggest further claims which can be tested?

An example will show what I mean. Therapeutic Touch (TT) is a healing modality in which practitioners manipulate the “energy field” which surrounds our body without actually touching the body itself. (This has led me to describe TT as “neither therapy nor touch.”) One shape of the field is healthy; others make us sick. The crucial thing to note about TT theory is that practitioners work on just this energy field, not on the body underneath it. So, let’s take these claims at face value. If the healers can manipulate the field, they must be able to discern its presence somehow without simply inferring it from the presence of a body. But if its presence could be perceived through sight, smell, touch or hearing (probably taste isn’t an option here), then everyone should be aware of it. But we are not. Only TT practitioners are. Well, that must be because there’s another sensory mechanism which not all of us have – only those who would be good therapeutic touch practitioners have it.

The inference from the claim that the energy field can be worked with to the claim that practitioners must be able to recognise its presence is not one that is often made by TT believers. But it was made by a nine-year-old from Loveland, Colorado, Emily Rosa. And though no TT practitioner had thought of doing this, the young Ms Rosa thought about how to test this claim. For her science project she set up a solid barrier dividing a table in half. The barrier left enough room for a TT practitioner to pass her hand underneath, with just enough height for the experimenter to hold her own hand beneath it, or not; whether she did was determined by a randomizer. So, if the TT practitioner could do better than chance on a test designed to rule out the other sensory modalities, this would be evidence of an energy field that some gifted people could detect. Needless to say, Ms Rosa’s experiment did not confirm this hypothesis, but it did lead to her being the youngest person ever to be published in a top-rank medical journal (Rosa et al., 1998).
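As a side note on what “better than chance” amounts to here (this sketch is not part of the original essay), the calculation one could run on such an experiment is a simple one: under the null hypothesis of pure guessing, each trial is a 50/50 call, so the number of correct calls follows a binomial distribution. The trial counts below are made up for illustration; the real figures are in Rosa et al. (1998).

```python
from math import comb

def p_at_least(hits: int, trials: int, p_guess: float = 0.5) -> float:
    """One-sided binomial tail: the probability of getting `hits` or more
    correct calls by pure guessing, with per-trial success probability `p_guess`."""
    return sum(comb(trials, k) * p_guess**k * (1 - p_guess)**(trials - k)
               for k in range(hits, trials + 1))

# Hypothetical trial counts, for illustration only:
print(p_at_least(7, 10))    # ~0.17: 7 of 10 is easily explained by guessing
print(p_at_least(70, 100))  # a tiny number: 70 of 100 would be hard to explain away
```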
There is another important implication of Hume’s Rule. It tells us that belief should be based on the preponderance of evidence, or on the probability that the evidence confers on the belief. But the evidence for or against a belief is continually shifting as more of it becomes available, and along with it the probabilities will fluctuate. Remembering this is how the skeptic following guided curiosity avoids dogmatism even when she has a fairly strongly held belief. She is always ready to modify her beliefs, and in some cases to switch from belief to disbelief or vice versa, as the new evidence comes in. And it is also why a skeptic should not only state her beliefs, but state them along with her degree of confidence – her estimate of the likelihood that more evidence will require her to revise or abandon those beliefs. Or better yet, always be prepared to state the belief along with the evidence for it. With these qualifications, there is no harm in provisionally stating a belief whose probability is not much higher than .5, or a disbelief even when the probability is only a bit less than .5. Taking a belief seriously confers the benefit that, once it is stated along with the evidence for it, it can be examined and implications drawn from it, which in turn can lead to new understanding. Dismissing it because the probability is not much above .5 forgoes this possibility. The important thing to remember about dogmatism is that what is wrong with it is not the forceful stating of the belief, but concentrating on the belief rather than the evidence for it, or the unwillingness to budge from it when new evidence comes to light.

Some skeptics will be disappointed that I have gone all this way without mentioning one of the cardinal principles of skepticism: that there are times – quite a lot of them, actually – when you shouldn’t express a belief at all; you should straightforwardly admit that you do not know. There are two advantages to this admission. The first is that it serves as a stimulus to curiosity: admitting that you don’t already know gives you a good reason to try to find out. Second, it prevents you from misleading others (and yourself). When they think you know and you don’t, they might follow you when they shouldn’t.

There are two situations where one should say that one doesn’t know. The first is when this expression is simply a substitute for “I don’t care” – tantamount to admitting that you have very little evidence and are not prepared to gather any more. For example, just by reading the headlines and deciding that I have no interest in the articles they head, I couldn’t help finding out that Prince Harry, the Duke of Sussex, and his wife, Meghan, recently had a son, whom they named “Archie”. But his birthweight? I don’t know; meaning…. After all, one cannot expect to have time to look into everything; one must prioritise. The second situation is where the evidence you have at present is about equally compelling for belief and disbelief, and it is possible to get more. In this situation it makes sense to suspend belief and wait for further evidence. But there’s an important exception here: sometimes waiting isn’t a viable option; circumstances require an immediate response. Fortunately, these special cases are quite rare; so withholding judgement while awaiting more evidence is usually an available option, and a good one.

Otto Neurath (1921) compared our belief system to a leaky ship at sea.
We are continually replacing rotten planks with fresh ones, but we are never able to replace the whole bottom at once, given that we wish to remain afloat. Thus we will never have a perfect set of beliefs; there will always be some false ones in there that we haven’t found yet. The best we can hope for is a gradual improvement. To continue with his metaphor, when we find a particularly strong plank which doesn’t fit very well with the old ones already in place, going to all the work to make it fit may result in a much less leaky boat overall. Similarly, encountering a new belief that is inconsistent with some old ones, but with a lot of evidence backing it up, may require the modification of several of the old beliefs at once. But the result may be a more coherent belief system overall. Not a perfect one, though. Non-skeptics may find this disconcerting. They are like sailors who aren’t in it for the pleasure, but just want to get somewhere – anywhere – and go along for the ride. But the true skeptic enjoys the sailing for its own sake.

Dawkins, Richard 2006: The God Delusion. Boston, Houghton Mifflin Company.
Hume, David 1955: “Of Miracles”, in An Inquiry Concerning Human Understanding. Indianapolis, The Bobbs-Merrill Company Inc.
Neurath, Otto 1921: Anti-Spengler. Munich, G. D. W. Callwey.
Paterson, Jane 1980: Know Yourself Through Your Handwriting. Montreal, Reader’s Digest Assn.
Quine, W. V. O. and J. S. Ullian 1978: The Web of Belief. New York, Random House.
Rosa, L., E. Rosa, L. Sarner and S. Barrett 1998: “A Close Look at Therapeutic Touch”. Journal of the American Medical Association, Vol. 279(13): 1005–1010.
Dyer is a Canadian war researcher who wrote this book in 1985. From what I can gather, it is well respected. This edition has been revised and updated. The book tours war historically, examining its roots and its evolution over the millennia.

The opening comments discuss some of the big-picture history. For example, major battles prior to the 20th century would generate casualty rates as high as 50-60%, with an average of 20%. In modern battles, this figure rarely exceeds 1%. This is due largely to changes in technology (weapons got better), which in turn changed the nature of soldiers and their mission. In the old days, soldiers did not get combat fatigue (shell shock, or today, PTSD) because they would die before it ever happened. Today, all armies recognize that troops can only take so many combat days (in WWII, it was 240) before they fall apart.

Dyer has always impressed me especially with regard to basic training, which he discusses in a chapter called "Anybody's Son Will Do". Get them when they are young. Break them down to the same level, making them all equals, and then build up a small-group dynamic where each man relies on the others to help him stay alive. Drill sergeants are masters of psychological manipulation, and they know it.

In his opening chapter, Dyer makes this startling revelation: in WWII in the ETO, only 15% of combat infantry riflemen ever fired their weapon in anger! Most soldiers would never have revealed such a fact until they were told they were not at all alone. More than four fifths of combat soldiers got through the war without killing anyone, and without firing a shot! More research has backed this up. Gettysburg was a hugely costly battle for Americans because it was Americans on both sides. More than half of the recovered muskets from the battle were loaded with more than one round, and only 5% were ready to be fired when they were abandoned or dropped. Six thousand had as many as 10 rounds in the barrel. In other words, most of the fighters were spending their time reloading a loaded gun and, one assumes, ducking. No one is suggesting these men were cowards. Many were simply principled and did not want to kill. The upshot of this is startling: if you could get the malingerers to enter the fray, the other guys wouldn't stand a chance. Which is what armies eventually did: new training got the numbers up to 50% in Korea and 80% in Vietnam. The US had figured out how to get men to kill automatically. Dehumanizing the enemy was a big part of it.

War goes back a long way. Chimps war on each other, and so do we. We did not invent war; we inherited it. Dyer examines hunter-gatherer groups and their interactions; ritualized warfare in groups like the New Guinea bushmen; and other evidence of our more primitive past. Some suggest that ritualized warfare is not the "real deal". It does generate casualties at a low rate, but that means it can be done a lot, which in turn means high casualty rates over time. Hobbes, Rousseau and Darwin are invoked and examined for their viewpoints.

The birth of war was driven by human lifestyle choices. Hunting and gathering can keep you alive. So can farming. But a new type of living was now being made: pastoralism (i.e., nomads). They lived by herding domesticated animals. Inevitably, these groups clashed. The nomads would win fights because they were mobile, which meant they could concentrate their forces when needed. To combat this, walls were built around towns. When horses were domesticated, things accelerated.
It is worth noting that Egypt was largely spared from nomadic attacks by its geography. The Sumerians hit upon the idea that religion could be a better way of settling disputes. It worked for a while. Priests liked it because it gave them power. But as we know, religion is not a cure for war, but more often an excuse or a direct cause. Aside: women were equal partners in life until civilization and agriculture came along, driving the need for a power structure which eschewed women because it could.

Sargon was the world's first emperor over a militarized society (circa 2300 BC). Around then, there was a major technological innovation: the composite bow. The bow and arrow has killed more people than any other weapon, ever. Unlike the English longbow (yet to be invented), the composite bow was short, powerful and could be fired from horseback. Other inventions over time include the chariot (a fast, hit-and-run archery platform); the pike and phalanx; war galleys; improved, harder metals; bigger horses; the saddle and stirrup (700 AD); gunpowder (1300 AD); and organized armies (which gave the powers-that-be the willies). Aside: Japan had a warrior-based culture. When the musket was introduced, they were appalled. A samurai could be killed by a commoner! Unacceptable… so they just stopped making them. That obviously changed later in history.

Armies got bigger, the percentage of soldiers dying went up, battles were huge but infrequent, and the average citizen was left alone. Armies had to be trained and fed. You could not just conscript someone and throw them into battle. Standing armies were a part of every major European power from 1700 on. What happened next was the introduction, over time, of the concept of "total war". New weapons were being developed at a quick pace. It was soon discovered that a few men behind cover with guns could stop a large number of advancing troops. Tactics changed. WWI introduced the concept of the continuous front. The tank was used for the first time. It gave professional soldiers the hope that it might end the wars of attrition. It did, but it caused the continuous front to become mobile. This, and aircraft, took a terrible toll on civilians. The tank eventually spawned "blitzkrieg", or mobile "lightning" war. The world wars were the first where civilian deaths outnumbered the deaths of combatants.

The next chapter in the book documents the history of nukes. There have been a lot of ideas over the years on this subject, but the prevailing idea is that you only need enough to destroy the other guy. This was known as MAD (Mutually Assured Destruction), and so far this math has kept the nuclear peace for 75 years. The advent of nukes gave rise to the phrase "conventional war".

The birth of the professional army came in 1803 Prussia with the creation of the Kriegsakademie. This approach of using well-trained professional soldiers was soon adopted by others. One hundred Germans in WWII were a match for 125 British or 250 Russian troops. Why? Because they had 10x as many good generals who knew how to get the most from men and equipment. The old game is now protected turf, and generals everywhere want to keep it that way (the start of the military-industrial complex?). Technology has changed the equation a great deal. One example: a Spitfire in WWII cost 5,000 pounds to make. The supersonic Tornado, its third-generation successor, cost 17 million pounds – 172 times more after adjusting for inflation.

The last chapter speculates about the future of war. Baboons are nasty creatures.
The males are obsessed with status and fight all the time. But in one troop something unusual happened. The aggressive males all died off at once from eating infected meat. Overnight, the troop settled down to a much more egalitarian society. And it stayed that way even after the demographics of the troop returned to normal! The ritualized warfare of the New Guinea bushmen killed a large percentage of combatants, one at a time, year after year. The government stepped in and told them this had to stop. They all agreed enthusiastically and never looked back! It seems both were caught in a local stability point that they could not get out of without a nudge. Perhaps there is hope in both those stories.

While not as eye-opening a book as Guns, Germs and Steel, this book is a must-read for anyone who wishes to grapple with these thorny issues. War has been with us for all of our history. But perhaps we can first ritualize it, and then dump it as a bad idea. I doubt it. The New Guinea bushmen were offered an alternative to their wars… essentially third-party arbitration. This is something the top dogs will never agree to. One final tidbit: have you ever wondered whether you were safer as an officer in combat, or as a soldier? In WWII, the answer was "soldier". In Vietnam, it did not matter. Dyer himself has served, so he has seen some of all this from the inside.

This was a much-awaited book when it showed up. It covers the Trump history from getting elected (when the first thing he did was lie and claim he won the popular vote) to just after the infamous July 25th phone call to the president of Ukraine. Like books from other reporters like Bob Woodward, this book is heavy on basic facts. It is clearly written and, as the title of the book implies, the authors know about whom they are writing. Unlike Woodward's books, which I found dry and dull, I rather enjoyed reviewing the events of the last few years in one compact reference.

Trump fired Comey – who he knew was on the west coast – by sending a letter through one of his henchmen. He also torpedoed him on Twitter. When told that he had screwed up in sending the letter as he did, he replied, "I know, fucking incompetence. Drives me crazy!" (referring to his staff). Trump never errs. Period.

The material in the book is largely familiar to anyone who has followed the election of the Mango Mussolini. If you had negative opinions of him, and who doesn't, this book will be satisfying and scary at the same time. It sheds some light on the Mueller report and why it fell so flat. It also illuminates the mind of Trump. He is a petty, pompous, pugnacious pinhead (and that is just the "P"s) in charge of a country he does not understand. Every person who has been in contact with him for any length of time has walked away from him covered in bullshit and fleas. Some still serve, but most have been arrested or driven from office. In years to come, this book will become a reference for the times. The only unsatisfying aspect of the book is that it ends before the story is over.

As a skeptic, I have read a lot about cults. We had at least two ex-cult members lecture the BC Skeptics, and I spoke with them in person. The difference between a cult and a religion is often subtle. In this case, we are talking about a political figure who demands utmost fealty and believes he can do no wrong. There is nothing subtle about Trump, and the parallels between the Trump movement and the rise of Nazi Germany continue to trouble me.
This was a read of a different nature. I wanted to learn a little more about feminism, and a friend recommended this and one other book. It is different for me because it views the world from a perspective that I might try to empathize with, but cannot experience. It is worth noting that the book is 30 years old now, and some things have changed. I also note that Ms Wolf is very "beautiful", or at least she was in 1990. How much beauty-bias she personally suffered is hard to say.

The book has six chapters: Work, Religion, Culture, Violence, Sex, and Hunger. There are many examples in every chapter (too many at times) of the various burdens women must bear… some external and some internal. For example, the intro points out that the average American woman would rather lose 10-15 lbs than achieve any other goal. That is really sad. It certainly has a strong external component to it. You can blame Twiggy et al for that, but surely part of this issue is internal.

She introduces the concept of the PBQ (Professional Beauty Qualification). She argues that the PBQ is a way to discriminate against women in a safe, litigation-proof way. The standards for on-air female personalities are quite different than for men. One woman lost a case where her employer said she was too old, too unattractive, and not deferential enough to men. Yikes! The PBQ, Wolf argues, is the currency of womanhood. Here is a shocking statement from the book: young professional women spend up to 1/3 of their income on their appearance. Or to put it slightly more crudely: want to keep your job? Get your boobs done.

Culture has obviously shaped women. Women's culture is driven by women's magazines and the advertisers who support them. The advertisers are constantly telling women how they should or could look, and assure them that if they buy their goop, all their woes will go away. In fact they create the woes first, and then fix them. And they deliberately promote competition between women. The author suggests that "adornment" is a huge part of female culture, and I am sure she is right. But I wonder if that aspect of female culture is not part of the problem. Putting such a heavy emphasis on adornment seems rather shallow to me, but I am a guy. Women and weight is always an issue. Weight Watchers tells women: "Always wear your makeup. Even to walk the dog. You never know whom you are going to meet." That says a lot about appearance issues and weight.

The author's Religion chapter focuses primarily on the cult aspects of beauty and make-up. She spends quite a bit of time on the bullshit of skin creams that promise rejuvenation. One marketing line caught my eye: "A lipstick you can have a lasting relationship with." I wonder if it comes with batteries. Cults are something I know a little about, and it is all (well, most of it) there: chanting, purifications, confessions and other mind-control techniques are on full display. If the woman is also hungry, that helps, as it will impair her reasoning.

Sex will always be an issue, and men control the issue. The book cites several cases where women were raped or brutalized, only to have the legal system tut-tut them, or suggest it was all in good fun. Who can forget the Canadian judge who asked a female victim why she didn't just keep her knees together? Ms Wolf does not approve of pornography, and she may have a point, but these are issues that can be examined academically – which brings me to my only real complaint about this book: it could have used more science to back up its conclusions.
There is no doubt that weight, and conditions like bulimia and anorexia, are major concerns for women. The author quotes a number of statistics on the subject. If one takes the worst numbers that have been put forward, then 1 in 10 college-age women is anorexic and 5 more are bulimic. If the true figures are even close to that, that is very troubling indeed. The rail-thin, heroin-chic look came in with Twiggy and has never left. Porn and women's magazines are part of the problem, to be sure. They both make people of both sexes feel that their bodies are not as good as they could be.

Cosmetic surgeries have gone through the roof. Joan Rivers had more and more done until the last one killed her. The author argues that this is a form of violence toward women, and it is hard to argue that she doesn't have a point. Doctors invade diseased bodies as a last resort. Cosmetic surgeons call healthy bodies sick and then invade them. The industry is huge and largely unregulated. Things may have changed in 30 years, but in 1990 doctors could not tell patients the risks of cosmetic surgery because they did not know themselves.

I found the book over-long on examples, and short on analysis and statistics. But these are quibbles. I can say this: it is more complicated being a woman in our society than a man. We have a long way to go as a society to level the playing field. And women (and men) need to learn how to be happy with themselves and others. I recently visited my sister-in-law and complimented her on her looks. I had caught her in a rare moment: au naturel (no makeup). She thought I was ribbing her. That should not happen.

A few things you should know before reading this email from my aunt:
Enjoy...

Hi Lee,

This is the story of me becoming an atheist. In the evenings our mother told us stories written by H. C. Andersen. Among them was the story (The Tinderbox) about the soldier, a tinderbox and three big dogs. And on Sundays, I went to Sunday school and heard stories about Jesus and his disciples. Stories. And then, in school one day, the teacher rolled down a big map of Palestine and said: This is where Jesus was walking with his disciples. I was shocked. Like I would be if the teacher had taken us to a tree with a big hole in it and declared: Here was the tree where the soldier killed the witch and got the tinderbox. Stories.

But I was living in the 1930s, and unlike now, people went to church, so I kept my opinion to myself, until one day I openly declared myself a nonbeliever, no longer a member of the church (I saved taxes), and none of my children, grandchildren, and great-grandchildren are members. Churches in Denmark are mostly empty – only a few old people, mostly women, are sitting there – and some churches are used for other kinds of social events.

Else
Age 95
And I need no medicine
And remember: No herring on white bread! (No white food at all.)

I often wear a t-shirt that reads "Boo I am an atheist". I think it is funny because, at least in the US, atheists are feared more than terrorists. Why? Because atheists do not fear ultimate punishment, and are therefore free to run amok. The atheist might argue, "Hey, I am only trying to blow up your beliefs... terrorists want to blow up your house and family!" That logic seems lost on the believer.
Compare this to Catholics. If a Catholic sins (or runs amok), he can talk to a priest, who is bound to silence, and be forgiven by God. They do not even have to apologize to the victims of their sins. Nor do they have to turn themselves in to the police if that might be required. Nope... the church gives them a get-out-of-hell-free card. And so does the rest of the world... with a few exceptions like Cardinal Pell (may he rot in jail forever, the bastard). So what is the difference? This is it: a believer gets to walk away, relieved of his burden forever; an atheist must carry their burdens until they die. Of the three groups mentioned (atheists, believers and terrorists), I fear believers the most, mostly because most terrorists are believers, and none that I know of are atheists. Irony is a bitch.
Author: Lee Moller is a life-long skeptic and atheist and the author of The God Con.