Noam Chomsky on Postmodernism

Since I’ve been thinking about postmodernism recently, I thought I’d share this fascinating interview from the youtubes. The point Chomsky brings up that was most unfamiliar to me is the story about Bruno Latour and the ancient Egyptian tuberculosis death (read more about it here). Basically, Latour argued that since tuberculosis was not “constructed” until the era of modern medicine, it could not have existed in ancient Egypt! (Starts at 3:48 in the video.)

5 Historical Documents on Universal Truths

A couple of weeks ago, I wrote a post criticizing the strong form of moral relativism, namely the idea that nobody, or no culture, is right or wrong. In this post, to continue the objective vs subjective truth discussion, I will look at five historical documents that have explicitly acknowledged universal truths. Moreover, all of these documents proclaim non-empirical truths, i.e. they are not documents of science that can be tested by the scientific method. (I include this caveat because it’s easy for a relativist to acknowledge that science can have universal truths but then claim arbitrarily that other subjects work differently than science and shouldn’t have universal or objective truths. So, I am addressing the claim that nonscientific truths cannot be universal.)

1. Euclid’s Elements (~300 BC)

euclid-elements

The Elements is one of the most influential books of all time, not just in mathematics but in the entire Western way of thinking. For this post, math is considered separate from science, in that math does not operate by the scientific method. It instead operates by a strictly logical method that was largely formalized by Elements. The steps of this deductive method, in contrast with the inductive scientific method, consist of:

  1. Listing axioms, or self-evident truths.
  2. Listing basic assertions, which also should be self-evident.
  3. Combining the axioms and assertions to obtain theorems, which are the end result.

(For a list of the axioms and assertions, see the wiki page.)
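
To see the shape of this method in miniature, here is a schematic sketch in the Lean proof assistant. The propositions P and Q below are placeholders of my own, not Euclid’s actual postulates; the point is only the structure: axioms and assertions are declared without proof, and a theorem is whatever can be derived from them.

    -- Schematic only: P and Q stand in for "self-evident" statements.
    axiom P : Prop                -- an axiom, taken as given
    axiom Q : Prop
    axiom p_holds : P             -- a basic assertion, also taken as given
    axiom p_implies_q : P → Q     -- an assumed connection between them

    -- A theorem is obtained purely by combining the assumptions above.
    theorem q_holds : Q := p_implies_q p_holds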

In Elements, the first “postulate,” or axiom, is that a straight line can be drawn from one point to any other point. This seems obvious enough. Clearly if we imagine two points, we can also imagine a straight line between them. Another seemingly obvious claim is the last “common notion,” or assertion, which states that the whole is greater than the part.

But to what extent are these axioms really self-evident or universal? On what basis do we judge their universality or objectivity? The last axiom, for instance, known as the parallel postulate, does not even hold in certain geometries. These are questions that have been debated for centuries.

2. The Declaration of Independence (1776)

Trumbull-Declaration-of-independence

“We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”

Note that “We hold these truths to be self-evident” sounds like something Euclid would have written two thousand years earlier. In fact, the similarity is likely more than coincidence. Thomas Jefferson was a reader of Euclid, as evidenced in a letter to John Adams: “I have given up newspapers in exchange for Tacitus and Thucydides, for Newton and Euclid; and I find myself much the happier.” Furthermore, the Declaration reads much like a mathematical proof in the style of Euclid:

  1. The introduction (“When in the Course of human events… a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation”) establishes the desire for the “dissolution of political bands” and then acknowledges the need to declare the causes for it, i.e. the need for a proof.
  2. The preamble establishes the self-evident truths.
  3. The indictment contains the various violations by the King of the self-evident truths.
  4. The denunciation gathers the above together and says a “therefore,” showing that the proof has been concluded: “We must, therefore, acquiesce in the necessity, which denounces our Separation, and hold them, as we hold the rest of mankind, Enemies in War, in Peace Friends.”
  5. The conclusion notes that the proof has been completed; therefore, they will act on the result of the proof: “That these united Colonies are, and of Right ought to be Free and Independent States; that they are Absolved from all Allegiance to the British Crown.”

More can be found in a talk given by Noam Elkies. The interesting question is how universal these self-evident truths really are. Is it objectively true, for example, that all men are created equal? Or is this view just a Western and/or Enlightenment construction? I would argue it is not (but that is for a different post).

3. Pride and Prejudice (1813)

pride-and-prejudice

The reason I have included Pride and Prejudice over any other work of literature is the opening sentence: “It is a truth universally acknowledged, that a single man in possession of a good fortune must be in want of a wife.”

Yet again, we have a declaration of universal truth, though this time it is used in fiction to establish the setting for the story. In contrast with its use in the Elements and the Declaration of Independence, universal truth is deployed by Austen in a more sarcastic manner.

Indeed, literature in general tends to question truths that are universally held. In this context, Pride and Prejudice is special because it acknowledges this explicitly. The statement, of course, is patently false, but it raises the question of whether there are any universal truths in social relations. And what would “universal” even mean? If something applied to a certain group in early 19th century England but not to anyone else, is it still universal?

4. The Communist Manifesto (1848)

Karl_Marx

Back to serious documents, we have the strong claim by Marx and Engels that “The history of all hitherto existing society is the history of class struggles.” The key word is “all,” which again proclaims a universal truth, at least over a sufficiently large scope (“hitherto existing society”). By the nature of their argument, it is not meant to be an absolute universal in the sense of applying to all time: success would mean a classless society, in which class struggles would no longer exist.

This example and Austen’s example are both social/historical universals. Marx argues that history can be understood by looking at class struggles, but again, on what basis can we support this? The modern view is that history is complex and can be partially understood through many different means, not just through modes of production.

On the other hand, Euclid’s is a mathematical universal, and Jefferson’s is a moral universal, in acknowledging the rights of man.

5. The Universal Declaration of Human Rights (1948)

Flag_of_the_United_Nations

The United Nations Universal Declaration of Human Rights is among the most significant documents of the twentieth century, and it too is based on presumed universal truths. Its preamble consists of seven “whereas” clauses that establish several self-evident assertions, much like the introduction to the US Declaration of Independence. These are:

“Whereas recognition of the inherent dignity and of the equal and inalienable rights of all members of the human family is the foundation of freedom, justice and peace in the world,

Whereas disregard and contempt for human rights have resulted in barbarous acts which have outraged the conscience of mankind, and the advent of a world in which human beings shall enjoy freedom of speech and belief and freedom from fear and want has been proclaimed as the highest aspiration of the common people,

Whereas it is essential, if man is not to be compelled to have recourse, as a last resort, to rebellion against tyranny and oppression, that human rights should be protected by the rule of law,

Whereas it is essential to promote the development of friendly relations between nations,

Whereas the peoples of the United Nations have in the Charter reaffirmed their faith in fundamental human rights, in the dignity and worth of the human person and in the equal rights of men and women and have determined to promote social progress and better standards of life in larger freedom,

Whereas Member States have pledged themselves to achieve, in co-operation with the United Nations, the promotion of universal respect for and observance of human rights and fundamental freedoms,

Whereas a common understanding of these rights and freedoms is of the greatest importance for the full realization of this pledge….”

These set up the basis for the 30 articles, which are the “self-evident” truths or axioms. The first three articles, for example, are:

“Article 1. All human beings are born free and equal in dignity and rights. They are endowed with reason and conscience and should act towards one another in a spirit of brotherhood.

Article 2. Everyone is entitled to all the rights and freedoms set forth in this Declaration, without distinction of any kind, such as race, colour, sex, language, religion, political or other opinion, national or social origin, property, birth or other status. Furthermore, no distinction shall be made on the basis of the political, jurisdictional or international status of the country or territory to which a person belongs, whether it be independent, trust, non-self-governing or under any other limitation of sovereignty.

Article 3. Everyone has the right to life, liberty and security of person.”

Note that the UN did not feel the need to prove any of these. They were simply obvious or self-evident. The theorems, however, are all implicit. It is implied that if these axioms are violated, the UN has the authority to intervene on behalf of human rights.

We could spend a long time debating which particular articles are true or false, but the big picture question is, Can any of them be objectively true? Is the discussion of them even meaningful? The intuitive answer is yes.

To be continued…

The Flaming Laser Sword

I recently stumbled upon Mike Alder’s article “Newton’s Flaming Laser Sword, Or: Why Mathematicians and Scientists don’t like Philosophy but do it anyway.” It was quite relevant to my view of philosophy. From a mathematical and scientific perspective, plenty of philosophical issues seem strange.

Alder uses the example of when an unstoppable force meets an immovable object. From a scientific perspective, the thought process is something like, “We’ll test it: if the object moves, then it wasn’t immovable, and if it doesn’t, then the force wasn’t unstoppable.” Anyway, this is something I might talk about more later on.

The Construction of Social Progress: Can Civilization Move Forward?

International_Space_Station

In the past year, I have used the term “social progress” in 6 different blog posts. It referred to various topics, including LGBT rights, women’s rights, and views on race, not to mention advances in medicine and technology. Implicit were the assumptions that civilization can move forward, and that having a more equal society does constitute social progress.

Progress and Postmodernism

As it turns out, this type of thinking is not a given. Under postmodernist thought (whatever that phrase means), the idea of social progress is treated with skepticism and questioned. Granted, the questioning is done with the noblest of intentions. Postmodernists argue that metanarratives of progress have, in the past, led to the cruelties of European colonialism, Fascism, and Communism. In each case, those who thought they were more civilized, or who thought they could bring about a more civilized society, ended up being brutal tyrants. Progress was thus a tool by which rulers controlled the oppressed. Progress was and is, in the extreme view, nothing more than a social construct.

I wonder if this fervent skepticism toward social progress is an overreaction. While I could write an entire post (or more) specifically about this, I reject postmodernism overall and consider myself under post-postmodernism, remodernism, metamodernism, or whatever word you prefer for the cultural state after postmodernism. Admittedly, I recognize that my own thoughts cannot be fully disentangled from postmodernist thought (which is itself a postmodernist way of thinking), but I can try to move forward.

The reason I bring this up is that postmodernism and progress are more intricately tied than just a loose sentiment that progress doesn’t exist. Postmodernism also rejects objective truth (either to some degree or often all-out); if you have been in an English class, you’ve probably learned that all truth is subjective. Herein lies another issue: the concept of progress entails that society is objectively moving forward, that there is some objective truth, which conflicts with postmodernism.

To add one more grain to the heap, there is a modernist vs postmodernist dichotomy between prescription and description. The significance of this is that modernism and progress are inherently compatible: modernism tried not only to describe the world, but also to prescribe that we should try to achieve social progress (even if it did not reveal how). Postmodernism, however, as a purely descriptive framework, is incompatible with the concept of progress; it could not advocate for social progress even if progress were not a social construction. (This leads to a chicken-and-egg problem: Does postmodernism reject progress because it rejects prescription, or does it reject prescription because it rejects progress?)

The Existence of Social Progress

Despite the postmodern rejection of progress, it is very easy to show that progress does exist. Ask any postmodernist if they would rather contract polio or measles or chicken pox right now, or not contract any of them. Clearly, everyone agrees there is some objective truth and an objective scale of progress in health and medicine. “But that’s falling into the technology trap,” one might object, “you cannot tie together technology and progress because of nukes.” But this is like saying Einstein’s 1905 paper on special relativity was the cause of the Cold War. This type of thinking misses the big picture, and it misses the fact that technological advancements have made the world a much better place.

Even then, supposing you remain against technology despite medical and other technological advances, say you are not a heterosexual white male. Would you rather live in the United States of 2014 or of 1814? Does your answer not signify the existence of progress?

And even if you are a heterosexual white male, would you rather live in the England of 2014 or of 1314? That is, would you rather live in a society with the homicide rate of 1314, or in a society with a 95% lower homicide rate? (p. 61 of this book)

Here is the Social Progress Index, which ranks countries based on aggregate scores on Basic Human Needs, Foundations of Wellbeing, and Opportunity:

SocialProgressIndex
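
To make the aggregation concrete, here is a minimal sketch in Python. It assumes the overall score is just a simple average of the three dimension scores; the real index has its own component indicators and methodology, and the numbers below are made up for illustration.

    # Minimal sketch of an aggregate index as an average of three dimension scores.
    # The real Social Progress Index uses its own indicators and methodology;
    # the example values below are made up.
    def social_progress_score(basic_needs, wellbeing, opportunity):
        """Return a simple 0-100 aggregate of the three dimension scores."""
        return (basic_needs + wellbeing + opportunity) / 3

    # A hypothetical country with uneven performance across the three dimensions.
    print(social_progress_score(basic_needs=90.0, wellbeing=75.0, opportunity=60.0))  # 75.0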

Using numerical data is a modernist approach, and a caricature postmodernist might flinch at seeing the United Kingdom considered more “progressed” than Nigeria. Of course, we must be very cautious about how we interpret this data. For instance, the UK’s higher position than Nigeria does not constitute grounds for invasion and colonization, as it might have in the modernist era. But these numbers do form grounds for critical analysis.

Yes, much of progress is socially constructed. Many of the earlier (i.e. modern) approaches were naive and led to atrocious results. But the solution is not to forsake progress altogether; it is to gain a more mature understanding of it. This first step towards true progress requires the acceptance of progress and the rejection of postmodernism.

Making Use of the Armchair: The Rise of the Non-Expert

As with all news, when I heard about the Sochi skating controversy last week, I read multiple sources on it and let it simmer. From the comments that I saw on Facebook, Reddit, and the news websites themselves, however, one thing struck me—nearly everyone seemed to have extensive knowledge of Olympic figure skating, from the names of the spins to the exact scoring rubric.

How could this be? Was I the only person who had no idea who Yuna Kim was, or that Russia had not won in the category before?

Much of this “everyone is an expert” phenomenon is explained by selection bias, in that those with more knowledge of skating were more likely to comment in the first place; therefore, most of the comments that we see are from those who are the most knowledgeable.
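
The effect is easy to see in a toy simulation (a sketch with made-up numbers, not data about actual commenters): give everyone a random knowledge score, let the probability of commenting rise steeply with knowledge, and the comments we actually see end up far more knowledgeable than the population they came from.

    # Toy simulation of selection bias among commenters (all numbers made up).
    import random

    random.seed(0)
    population = [random.random() for _ in range(100_000)]          # knowledge scores in [0, 1]
    commenters = [k for k in population if random.random() < k**3]  # knowledgeable people post far more often

    print(f"average knowledge of everyone:      {sum(population) / len(population):.2f}")  # ~0.50
    print(f"average knowledge among commenters: {sum(commenters) / len(commenters):.2f}")  # ~0.80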

But it’s unlikely that there would be hundreds of figure skating experts all commenting at once. Moreover, when you look at the commenting history of the people in the discussion, they seem to be experts on every other subject as well, not just figure skating. So another effect is in play.

Namely, the Wikipedia effect (courtesy of xkcd):

xkcd Extended Mind

Of course, this effect is not limited to skating in the Olympics. When Newtown occurred, masses of people were able to rattle off stats on gun deaths and recount the global history of gun violence in the late 20th and early 21st centuries.

Even so, not everyone does their research. There are still the “where iz ukrane????” comments, but undoubtedly the average knowledge of Ukrainian politics in the United States has increased drastically in the past few days. If you polled Americans on the capital of Ukraine, many more would be able to answer “Kiev” today than one week prior. For every conceivable subject, the Internet has allowed us all to become non-expert experts.

Non-Expert Knowledge

The consequences of non-expert knowledge vary from subject to subject. The main issue is that we all start with an intuition about something, but with experience or training comes a better intuition that can correct naive errors and uncover counterintuitive truths.

  • An armchair doctor might know a few bits of genuine medical practice, but might also throw superstitious remedies into the mix and harm the patient more than help. Or they might google the symptoms but come up with the wrong diagnosis and a useless or damaging prescription.
  • Armchair psychologists are more common, and it is easier to make up things that sound legitimate in this field. It is possible that an armchair psychologist will help a patient, though through empathy rather than formal training.
  • Armchair economist. Might say some insightful things about one economic trend they read about, but could completely miss other trends that any grad student would see.
  • Armchair physicist. Might profess to have discovered a perpetual motion machine, to be dismissed by a real physicist because the machine actually has positive energy input and is hence not perpetual. Or, might read about the latest invisibility cloak and be able to impress friends by talking about the bending of electromagnetic waves around an object by using materials with negative refractive index, but has no idea that it only works for a particular wavelength, thus making it practically useless (for now).
  • Armchair philosopher. Perhaps the most common, the armchair philosopher notices the things that happen in life and takes note of them. The article that you are currently reading is armchair philosophy, as I basically talk about abstract stuff using almost zero cited sources, occasionally referencing real-world events but only to further an abstract discussion.

Going back to the physics example, we normal people might observe the drinking bird working continuously for hours and conclude that it is a perpetual motion machine. An armchair physicist might go further and claim that if we attach a motor to it, we could generate free energy.

Drinking Bird

A real physicist, however, would eventually figure out the role of evaporation and the temperature differential it maintains, and conclude that it is not a perpetual motion machine.
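
For what it’s worth, the physicist’s conclusion can be made quantitative with a back-of-the-envelope bound. The temperatures below are assumed for illustration, not measured: treating the bird as a heat engine running between its room-temperature body and its evaporatively cooled head, the Carnot limit caps its efficiency at well under one percent, and even that only lasts while there is water left to evaporate.

    # Back-of-the-envelope Carnot bound for the drinking bird as a heat engine.
    # Temperatures are assumed for illustration, not measurements.
    T_body = 293.0  # kelvin, room temperature (about 20 C)
    T_head = 291.0  # kelvin, head cooled a couple of degrees by evaporation

    carnot_limit = 1 - T_head / T_body
    print(f"maximum possible efficiency: {carnot_limit:.2%}")  # roughly 0.68%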

Five minutes of reading Wikipedia will not allow you to match an expert’s knowledge. But having non-expert knowledge sometimes does help. It opens up the door to new information and ideas. If everyone spoke only about what they were experts in, the world would become boring very quickly.

Talking About Topics Outside of Your Expertise

In everyday speech, any topic is fair game except for, ironically, the one topic that everyone is deemed to be an expert in even without Wikipedia—(their) religion. But I digress. The point is, the way we talk about things on a day-to-day basis is very different from the way experts talk about them in a serious setting.

Some differences are very minor and just a matter of terminology. For instance, I was discussing the statistics of voter turnout in the 2012 election one time, and I had phrased it as “percentage of eligible people who voted.” At the time, I did not know that “turnout” was a technical term that meant precisely what I had just said; I thought it was just a loose term that didn’t necessarily consider the difference between the electorate and the total population, which is why I phrased it so specifically. In this example, the statistics I presented were correct, and thus the conclusion was valid, but the terminology was off.

Other differences are more significant. In the case of medical practice, a lack of formal understanding could seriously affect someone’s health. Using Wikipedia knowledge from your smartphone to treat an unexpected snake bite in real time is probably better than letting it fester before help arrives. But it’s probably safest to see a doctor afterwards.

A non-expert discussion in a casual setting is fine, as is an expert discussion in a serious setting. But what about a non-expert discussion in a serious setting? Is there anything to be gained? If two non-physicists talk about physics, can any meaning be found?

My answer is yes, but you need to discuss the right things. For example, my training is in math, so it would be pretty futile for me to discuss the chemical reactions that occur when snake venom is injected into the human body. However, provided I had done my research properly, I might be able to talk about the statistics of snake bites with as much authority as a snake expert. Of course, it would depend on the context of my bringing up the statistics. If we were comparing the rise in snake bite deaths to the rise in automobile deaths, I might be on equal footing. But if we were comparing snake bite deaths between different species of snakes, a snake expert probably has the intellectual high ground.

But even this example still requires you to use some area of expertise to relate to the one in question. On the contrary, you can still have a legitimate discussion of something outside your area of expertise even without relating it to an area of expertise that you already have. You only need to make a claim broad enough, abstract enough, or convincing enough to have an effect.

Among all groups of people, writers (and artists in general) have a unique position in being able to say things with intellectual authority as non-experts. Politicians are next, being able to say anything with political power as non-experts. However, I’m interested in the truth and not what politicians say, so let’s get back to writers. F. Scott Fitzgerald was not a formal historian of the 1920s, but The Great Gatsby really captures the decade in a way no history textbook could. George Orwell was not a political scientist, but Nineteen Eighty-Four was very effective at convincing people that totalitarian control is something to protect against.

The Internet and the Non-Expert

On the other hand, Nineteen Eighty-Four was not crafted in a medium limited to 140 characters or to one-paragraph comments. If George Orwell were alive today and, instead of writing Nineteen Eighty-Four, wrote a two-sentence anti-totalitarian comment on a news story about North Korea, I doubt he would have the same effect.

It is usually hard to distinguish an expert from a non-expert online. Often, an expert will preface their comment by explicitly saying, “I am an expert on [this topic],” but even this is to be taken skeptically. I could give a rant on the times people claiming to have a Ph.D. in economics had no grasp of even the most basic concepts.

In addition to putting the sum total of human knowledge just a click away (well, maybe not all knowledge), the Internet allows us to post knowledge instantaneously and share it with millions of other users. We have not only the public appearance of non-expert knowledge, but also its virus-like proliferation. Since the dawn of the Internet, people have been able to acquire knowledge about anything, but there was long a great divide between the few content providers and the many consumers. Only recently have we become the content makers ourselves. What is the role of armchair philosophy in the age of information?

Conclusion

Now is a more important time than ever to be an armchair philosopher, or an armchair thinker, precisely because of the overwhelming amount of information available to us. To deal with the data overload requires an abstract way to categorize information, to filter out the useless from the useful, the wrong from the less wrong, the less true from the true.

We are expected to deal with areas outside of our expertise, and as our knowledge of these areas grows from the age of mass information, our responsibility to use it correctly becomes greater. Forming opinions even on issues that you have no authority to form opinions on is now an imperative. We learned the capital of Ukraine in one week, and our googling of Kiev might prove useful in the future. To deal with a quickly changing world, we need to deal with all information, not just data that we are comfortable with, as effectively as possible.

The Better Angels of Our Nature

The Better Angels of Our Nature: Why Violence Has Declined, by Steven Pinker, is definitely the most thought-provoking book I’ve read this year. Then again, it’s still January, so we’ll see.

better-angels-of-our-nature

First, the question of why violence has declined presupposes that it has declined, a shocking idea to many. From the preface:

“This book is about what may be the most important thing that has ever happened in human history. Believe it or not—and I know that most people do not—violence has declined over long stretches of time, and today we may be living in the most peaceable era of our species’ existence…. It is an unmistakable development, visible on scales from millennia to years, from the waging of wars to the spanking of children.”

There are numerous reviews of this book already in existence, both professional and nonprofessional. So, I’ll focus on what I found to be the most interesting part.

The Violence Delusion

“When I surveyed perceptions of violence in an Internet questionnaire, people guessed that 20th-century England was about 14 percent more violent than 14th-century England. In fact it was 95 percent less violent.” (61)

The first chapters of the book show that violence has actually declined. But it is also worth pointing out why many people (myself included) would have thought the opposite. Pinker does discuss these reasons, but the discussion is sporadic, accompanying each individual section rather than being presented all at once with a big-picture view. I present a summary of these scattered points here:

  • Memory pacification: “A woman donning a cross seldom reflects that this instrument of torture was a common punishment in the ancient world; nor does a person who speaks of a whipping boy ponder the old practice of flogging an innocent child in place of a misbehaving prince” (1). Pinker also notes that when witches are mentioned, the thoughts that come to mind are of fairy tales and fantasy, not of drowning trials, hanging, or burning at the stake. We react to the Colosseum with awe at the architecture and the glory of the to-the-death fights, not with horror at the endorsed murders that took place (imagine if Auschwitz were toured in the same manner as the Colosseum). Overall, this makes it more difficult to remember how cruel the past was.
  • Publicization of relatively small events: We have a 24-hour press that competes to put out news stories and keep your attention. Thus, every time a homicide or multiple homicide occurs, we are reminded of the dangerous violence in our current society. Newtown was tragic, yes. Was it a very significant event relative to other things that go on in the 21st-century United States? Yes. But that is the point—we live in an age where 28 deaths is considered a national tragedy. An isolated event involving 28 deaths, common in medieval times given the feuding states, would hardly affect a medieval peasant’s perception of how violent their country was.
  • Change in mentality: Pinker notes that, as horrendous as we would view it today, torture was not seen as wrong in the Middle Ages. “[T]he sporadic, clandestine, and universally decried eruptions of torture in recent times cannot be equated with the centuries of institutionalized sadism in medieval Europe. Torture in the Middle Ages was not hidden, denied, or euphemized. It was not just a tactic by which brutal regimes intimidated their political enemies or moderate regimes extracted information from suspected terrorists. It did not erupt from a frenzied crowd stirred up in hatred against a dehumanized enemy. No, torture was woven into the fabric of public life. It was a form of punishment that was cultivated and celebrated, an outlet for artistic and technological creativity. Many of the instruments of torture were beautifully crafted and ornamented. They were designed to inflict not just physical pain, as would a beating, but visceral horrors, such as penetrating sensitive orifices, violating the bodily envelope, displaying the victim in humiliating postures, or putting them in positions where their own flagging stamina would increase their pain and lead to disfigurement or death. Torturers were the era’s foremost experts in anatomy and physiology, using their knowledge to maximize agony, avoid nerve damage that might deaden the pain, and prolong consciousness for as long as possible before death” (130). Pinker then goes on to point out that while we today would condemn an act of torture because torture is inhumane, medieval criticisms of torture focused on the wrongful targets of torture—the act of torture itself was fine, it just mattered whom it was being used against. It’s difficult to perceive your time as violent when atrocious actions like these are not viewed as violent.
  • Rise in awareness: “Well into the 1970s marital rape was not a crime in any state, and the legal system underweighted the interests of women in other rapes. Legal scholars who have studied jury proceedings have discovered that jurors must be disabused of the folk theory that women can be negligently liable for their own rapes…” (395). Stats from the U.S. Bureau of Justice Statistics show that the annual rate of rape from 1973 to 2008 had fallen by 80%. Pinker notes, “In fact, the decline may be even greater than that, because women have almost certainly been more willing to report being raped in recent years, when rape has been recognized as a serious crime, than they were in earlier years, when rape was often hidden and trivialized” (402). Thus a decline by a factor of five in reported cases could and probably does mean an even greater decline in actual cases. On the flipside, since awareness of rape is up so much, people generally perceive it as a greater threat today than it was decades ago.
  • More available data towards the present time. “Remember Tuchman’s ‘private wars’ of the 14th century, the ones that knights fought with furious gusto and a single strategy, namely killing as many of another knight’s peasants as possible? Many of these massacres were never dubbed The War of Such-and-Such and immortalized in the history books. An undercounting of conflicts below the military horizon could, in theory, throw off the body count for the period as a whole” (199). Basically, an availability bias.
  • Population and proportionality: Using a chart compiled by Matthew White, Pinker lists the 21 events in human history with the highest human-caused death tolls. World War II tops the list with 55 million, the 16th-century French Wars of Religion are at the bottom with 3 million, and the other 19 events fall in between. Some of the events I had never heard of, such as the An Lushan Revolt, which apparently happened in the 8th century and caused a death toll of 36 million (from White’s data). Surprisingly to many, 14 of the 21 worst human-caused events in history happened prior to the 20th century, and that is based on absolute numbers. When the death tolls are adjusted for the world population of the time, only one 20th-century event makes the top 10 (World War II, in 9th position); a rough sketch of this adjustment appears after this list.
  • Political sentiments: The 21st century started with 9/11 and Iraq. But these are almost trivial compared to the violence and wars of the past. In the section “The Long Peace,” Pinker notes, “Zero is the number of developed countries that have expanded their territory since the late 1940s by conquering another country. No more Poland getting wiped off the map, or Britain adding India to its empire or Austria helping itself to the odd Balkan nation…. [T]wo entire categories of war—the imperial war to acquire colonies, and the colonial war to keep them—no longer exist” (251).
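
Here is the rough sketch of the population adjustment mentioned in the list above. The death tolls are the figures cited there (via Matthew White); the world-population numbers are my own rough assumptions, so the output is illustrative only, but it shows why an ancient catastrophe can outrank World War II in relative terms.

    # Rough sketch: death tolls as a share of the estimated world population at the time.
    # Death tolls are the figures cited above (via Matthew White); the world
    # population estimates are rough assumptions of mine, for illustration only.
    events = {
        "World War II (20th century)":    (55_000_000, 2_300_000_000),
        "An Lushan Revolt (8th century)": (36_000_000, 250_000_000),
    }

    for name, (deaths, world_population) in events.items():
        print(f"{name}: {deaths / world_population:.1%} of the world's population")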

With all this in mind, the fog can be cast aside. This brief summary is not from the book itself, but from the website: “Tribal warfare was nine times as deadly as war and genocide in the 20th century. The murder rate of Medieval Europe was more than thirty times what it is today. Slavery, sadistic punishments, and frivolous executions were unexceptionable features of life for millennia, then suddenly were targeted for abolition.  Wars between developed countries have vanished, and even in the developing world, wars kill a fraction of the people they did a few decades ago. Rape, battering, hate crimes, deadly riots, child abuse, cruelty to animals—all substantially down.”

Overall Comments

I thought the discussions of the Enlightenment and Counter-Enlightenment were interesting. Pinker presents the Enlightenment as one of the principal motivators behind the humanitarian reforms of the 18th and 19th centuries. It was an age when people began seriously questioning whether practices like torture, witch-burning, slavery, sexism, racism, and homophobia were actually justified, and rules started to be edited or written anew (the Declaration of Independence’s assertion that all men are created equal was one giant leap at the time). On the other hand, Counter-Enlightenment values generally countered (for lack of a better word) the Enlightenment ones. Perhaps something that will cause/is causing/has caused much uproar is Pinker’s link between the Counter-Enlightenment and some of the deadly experiences of the 20th century, like the World Wars, Nazism, and Communism. I feel like these are generally associated by the public with Enlightenment values, but that’s another topic. As for the modern day:

“Reason appears to have fallen on hard times. Popular culture is plumbing new depths of dumbth, and American political discourse has become a race to the bottom. We are living in an era of scientific creationism, New Age flimflam, 9/11 conspiracy theories, psychic hotlines, and resurgent religious fundamentalism.” (642)

Pinker makes sure to address the issue of violence from multiple angles. It’s commonly believed, for instance, that a nation’s economy has a significant impact on its violence rates. However, this often seems to be correlation rather than causation. Poor countries in unstable regions are politically… unstable, and politically unstable regions tend to have higher rates of violence. On the other hand, if the economy (say, GDP or per capita GDP) were a strong predictor of violence rates, we should expect US violence rates to cycle up and down in response to expansions and recessions, which they do not. Nor, for instance, was Britain’s rise in personal wealth resulting from the Industrial Revolution the reason for its decrease in violence—the decline of violence in Britain had already begun centuries before, yet up until the Industrial Revolution the average real wage was flat.

In all, The Better Angels of Our Nature was an extraordinary read. Even though we face tough issues in our time, there are many ancient atrocities that we no longer have to worry about on an institutional scale: “…abduction into sexual slavery, divinely commanded genocide, lethal circuses and tournaments, punishment on the cross, rack, wheel, stake, or strappado for holding unpopular beliefs, decapitation for not bearing a son, disembowelment for having dated a royal, pistol duels to defend their honor…” (30). Those wanting to do away with the decadent present and instead live in the idyllic, peaceful past might be surprised, for if their dreams were to become reality, they would face unspeakable rates of violence and death.

The Signal and the Noise, and Other Readings

The Signal and the Noise

Since last year’s presidential election, everyone has heard of the legendary Nate Silver, who predicted the outcomes of all 50 states correctly. Given that he also correctly predicted 49 out of 50 states in the 2008 election, this repeat feat seemed like clairvoyance, not coincidence. So the question is, what did Silver do right that so many polls and pundits did wrong?

Statistics.

The Signal and the Noise (2012) is basically a popular applied-statistics book, with more history, philosophy, and psychology than formulas. The first half of the book illustrates failures of prediction in areas including the 2007–8 financial crisis, elections, sports, and natural disasters; the second half explains how to predict the correct way, using Bayesian probability. Overall it does an excellent job of explaining the concepts without going into mathematical detail (which is probably a plus for most people; even for a math person like me, I know where to look up the details).
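
The Bayesian half of the book boils down to repeatedly applying Bayes’ theorem. Here is a minimal sketch with made-up numbers, not anything from Silver’s actual model: start with a prior probability that a candidate wins a state, then update it when a favorable poll arrives, given assumed chances of seeing that poll under each outcome.

    # Minimal Bayes' theorem update with made-up numbers (not Silver's model).
    prior_win = 0.50       # prior probability the candidate wins the state
    p_poll_if_win = 0.80   # assumed chance of this favorable poll if they win
    p_poll_if_lose = 0.30  # assumed chance of the same poll if they lose

    evidence = prior_win * p_poll_if_win + (1 - prior_win) * p_poll_if_lose
    posterior_win = prior_win * p_poll_if_win / evidence
    print(f"posterior probability of winning: {posterior_win:.2f}")  # about 0.73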

Sidenote: While I was reading the chess section, my mind literally blanked for about 10 seconds upon seeing the following:

signal-and-the-noise-chess-error

My chess intuition immediately told me that something was wrong: there is no way this position could have occurred “after Kasparov’s 3rd move.” Since Kasparov was white, the diagram should reflect three white moves, but clearly only two are shown: the Knight on f3 (from g1) and the Pawn on b3 (from b2). Yet this book was written by Nate Silver, so he couldn’t have gotten something so simple wrong. Once I realized it must have been a mistake, I looked up the game and found that at this point of the game, the g2 pawn should be on g3. I thought it was an interesting mind lapse.

Breaking the Spell

Breaking_The_Spell

This book argues that scientific analysis should be applied to religion. The title refers to the taboo against rational discussion of religion; to “break the spell” is to break that taboo. In addition, it discusses theories of how religion arose; ironically, such theories are themselves evolutionary theories, as they concern how modern religion has evolved over time from ancient spiritual beliefs (e.g. which specific doctrines maximize a belief system’s chances of survival).

Reading this means I have now read at least one book from each of the four “horsemen”: Dawkins, Dennett, Harris, and Hitchens. Of the four, Dennett is by far the least provocative. While the other three outright apply logical analysis to religion, in this book Dennett carefully argues that one should be allowed to analyze religion just as one can analyze any other phenomenon. This book should be nowhere near as controversial as The God Delusion or The End of Faith.

Overall the book makes good points but is quite slow, makes overly cautious caveats, and has a very formal tone. I think if someone like Dawkins had written this, it would be much more readable. I wouldn’t really recommend this to anyone who doesn’t have a lot of interest in philosophy.

CEO Material

CEO_Material

The main competitive advantage of this book over the typical leadership book is that it quotes very often from 100+ real CEOs. Overall these first-hand experiences supplement the author’s main points quite well. However, presumably for the sake of privacy, the quotations are not attributed to their speakers, so it is sometimes difficult to tell how any particular passage applies to a given situation. For example, do I want to listen to the advice of a food-company CEO on a particular issue and apply it to running a tech company? Perhaps the overall message is similar, but clearly the details matter. Some say that context is everything, and without the context of who said it, each quote has much less power.

Most of the points seemed like common sense, although that is to be expected—the system is efficient enough that if the most effective behavior for a CEO were radically different from what we already do, then we would have adapted to that already (hopefully). Even so, there are still some interesting points made with real justifications, though again it would be helpful if we knew who said each quote, even for a few of them. In all, Benton did make points that changed the way I look at things, so it was worth reading.

The Blind Watchmaker

Blind_Watchmaker

While The Selfish Gene focuses on how genes propagate themselves and how they dynamically compete over time (evolutionary game theory), The Blind Watchmaker covers an entirely different issue: How did complexity arise?

Some of its answers, written at an earlier time (1986), seem somewhat outdated now, ironically more so than The Selfish Gene, which was written even earlier, in 1976. This is probably because The Selfish Gene was more of a “here’s the progress we made in the last decade” book when it was written, while The Blind Watchmaker is more along the lines of “here’s why this work from 1802 is nonsense,” a counter-argument that doesn’t particularly need to invoke the most up-to-date findings.

But anyway, we don’t judge books by how outdated they seem 30 years later, so let’s move on to the content. Due to its premise, the book is more philosophical than The Selfish Gene, which is more scientific and hardly addresses the conflict between evolution and religion at all. While The Blind Watchmaker still has a formidable amount of science, it addresses some philosophical questions as well and confronts the conflict head-on. I would recommend it to those looking to question philosophical beliefs, whether others’ or their own.

Mortality

Mortality_Christopher_Hitchens

Of the books in this post, Mortality is the answer choice that doesn’t belong with the others. While the other four are strictly nonfiction works that try to explain or teach something, Mortality comes off more as a dramatic story, the story of coming to terms with terminal illness. Hitchens opens with the stark statement, “I have more than once in my life woken up feeling like death.” As usual, Christopher Hitchens’ signature writing style and tone are apparent.

“What do I hope for? If not a cure, then a remission. And what do I want back? In the most beautiful apposition of two of the simplest words in our language: the freedom of speech.”

“It’s probably a merciful thing that pain is impossible to describe from memory.”

“The politicized sponsors of this pseudoscientific nonsense should be ashamed to live, let alone die. If you want to take part in the ‘war’ against cancer, and other terrible maladies, too, then join the battle against their lethal stupidity.”

“The man who prays is the one who thinks that god has arranged matters all wrong, but who also thinks that he can instruct god how to put them right.”

“I have been taunting the Reaper into taking a free scythe in my direction and have now succumbed to something so predictable and banal that it bores even me.”

“Myself, I love the imagery of struggle. I sometimes wish I were suffering in a good cause, or risking my life for the good of others, instead of just being a gravely endangered patient.”

“To the dumb question ‘Why me?’ the cosmos barely bothers to return the reply: why not?”