Are Science and Religion Compatible?

cosmos-titlecard

With the current reboot of Cosmos, the age-old question of whether science and religion are compatible has presented itself in the mainstream once again.

If we were to answer this very methodically, we would start by questioning the semantics of “science,” “religion,” and “compatible.” The conventional definitions are broad enough that, given particular arrangements of definitions, the answer can be made yes or no without much disagreement.

Suppose I frame the question as, “Can someone who considers themselves to be religious also believe in science?” The answer is a factual yes. But what if I frame the question as, “Can someone who asserts a literal interpretation of the Bible believe in a 13.8 billion year old universe with our origins in natural evolution?” The answer is a logical no.

This kind of disparity shows that if the two parties are not careful, debating whether science and religion are compatible can devolve into a useless argument over semantics.

Religion and Science as Methods: Asserting vs Searching

Orlando-Ferguson-flat-earth-map

This 1893 illustration by Orlando Ferguson, called “Map of the Square and Stationary Earth,” posits the Earth as, well, not completely flat, but at least much flatter than a globe. Interestingly, at the bottom of the diagram, several Bible verses are pointed out as “scripture that condemns the globe theory.” They include, as stated in the illustration:

And his hands were steady until the going down of the sun.—Ex. 17:12

The world also shall be stable that it be not moved.—1 Chron. 16:30

The four corners of the earth.—Isaiah 11:12

It is he that sitteth upon the circle of the earth.—Isaiah 40:12

He that spread forth the earth.—Isaiah 52:5

…and several more, none of which seem to explicitly condemn the sphere theory either.

Contrast Ferguson with Eratosthenes, a Greek mathematician of the 3rd century BC, who used a systematic method of measurement, shadow lengths, and the tools of geometry to estimate the circumference of the earth circa 240 BC:

circumference_eratosthenes

Using this process, Eratosthenes estimated the circumference of the earth to be the equivalent of 39,690 km, which is amazingly close to the actual circumference of 40,075 km. It becomes even more amazing when you consider that the known world was not very large in Eratosthenes’ time.

Now what was the point of this comparison, other than to show that one guy was really smart and the other was an ignoramus? Answer: there is a clear difference in methodology. Ferguson presupposed that the Bible must be true, and built his map attempting to follow the Bible, e.g., literally with four corners. (“But clearly ‘four corners of the earth’ is a metaphor!” How do you know that?) Eratosthenes set up an experiment to find the angle theta, as seen from the center of the earth, between Alexandria and Syene, and then used this value in an equation to find the circumference of the earth.

That is, religion asserts truth, whereas science searches for truth. This is where the fundamental disagreement arises.

Ferguson asserted that the Bible is true, and thus any “evidence” must be valid if it affirms the Bible and invalid if it contradicts the Bible. (See confirmation bias.)

Eratosthenes’ experiment does not assert or presuppose beforehand that the earth is a sphere. If the earth had been flat, Eratosthenes would have measured the angle theta to be zero, and deduced that the world was indeed flat. Instead, he measured an angle of 7.2 degrees, indicating a curvature of the earth, from which he then calculated the circumference. One method uses circular reasoning (“because it says so in the Bible”), whereas the other uses a legitimate process.
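Eratosthenes’ arithmetic is simple enough to sketch in a few lines of Python. The 5,000-stadia distance and the stadion-to-km conversion below are assumptions on my part; ancient sources disagree on both, which is why modern reconstructions of his result vary.

```python
# Eratosthenes' method: at noon the shadow at Alexandria made a
# 7.2-degree angle while the sun was directly overhead at Syene, so the
# arc between the cities spans 7.2 degrees of the earth's circumference.
ANGLE_DEG = 7.2
DISTANCE_STADIA = 5000  # assumed Alexandria-Syene distance

# 7.2 degrees is 1/50 of a full circle, so the circumference is
# 50 times the distance between the two cities.
circumference_stadia = (360 / ANGLE_DEG) * DISTANCE_STADIA

# Converting to km requires guessing the length of a stadion;
# 157.5 m is one common modern estimate.
STADION_KM = 0.1575
circumference_km = circumference_stadia * STADION_KM

print(circumference_stadia)  # 250000.0
print(circumference_km)      # about 39,375 km, near the true 40,075 km
```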

Only a very small portion of people still believe in the flat earth model. So, enter the geocentric model (see the Galileo affair). This is apparently still a common belief: in a 2014 poll, one in four Americans said that the sun goes around the earth.

I don’t have a problem with these beliefs in themselves. Ignorance is a fair excuse for them (which is not to excuse ignorance itself): if I were not educated and did not have the tools of science at my disposal, and I just used my natural intuition about the shape of the earth, I would probably say it is flat. There is nothing immediately obvious to suggest otherwise; after all, nearly every civilization in antiquity independently came up with the idea that the earth is flat. However, if I am presented with all the overwhelming evidence that the earth is round, and I still reject all of it and assert the earth is flat because I believe some book must be true because it says it’s true, that is a much worse offense.

In the paragraph above, you can replace the flat-earth/round-earth phrasing with creationism/evolution, and the same argument would hold.

cosmos-dna

But this can lead to various useless discussions of semantics. What about a religious Christian, for instance, who doesn’t take the Bible literally and can accept scientific facts that are contradictory to literal interpretation? Is this person really “religious”? What about a person (not necessarily religious) who accepts all the relatively older facts that science has shown over time, such as gravity, round earth, and evolution, but refuses to believe the latest advancements in neuroscience? Is this person really “scientific”? (What about a Scotsman who puts sugar in his porridge? Is this person a true “Scotsman”?)

To resolve some of the ambiguity, let’s look at scientists, a category which has a relatively clear definition.

What About “Religious Scientists”?

“But there are scientists who believe in God!” Yes, there are! In fact, a whopping 7% of the National Academy of Sciences believes in a god; the other 93% are atheists and agnostics. The figure is not as extreme for scientists in general (Pew):

Scientists-and-Belief-1

And for specific affiliation:

Scientists-and-Belief-2

Going from the general public to scientists, atheists increase representation by a factor of 8, agnostics by a factor of 5, and Jews by a factor of 4, while evangelical Protestants decrease representation by a factor of 7. What does this mean? Assuming the data is accurate, it implies that (1) atheists/agnostics/Jews were more likely to become scientists in the first place, and/or (2) somewhere along the process of becoming a scientist, some Christians de-converted. Since changing one’s religion is relatively rare (and de-conversion doesn’t explain the Jewish case), the data must be explained mostly by (1): certain groups are more prone to becoming scientists to begin with, i.e., a selection effect.
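For clarity, the “factor” figures above are just ratios of shares: a group’s percentage among scientists divided by its percentage in the general public. A quick sketch, with made-up placeholder percentages rather than the actual Pew numbers:

```python
def representation_factor(pct_of_scientists, pct_of_public):
    """Ratio of a group's share among scientists to its share of the public.

    A factor above 1 means the group is overrepresented among scientists;
    below 1, underrepresented.
    """
    return pct_of_scientists / pct_of_public

# Hypothetical group that is 2% of the public but 16% of scientists:
print(representation_factor(16, 2))  # 8.0, overrepresented by a factor of 8
```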

Explaining why Christianity is negatively correlated with science is still an interesting question that deserves an entire post of its own. However, if I had to give a single answer, it would be the social stigma against science (a stigma largely perpetuated by Christians).

Another interesting question is why the percentage of Jewish scientists is 4 times higher than the Jewish percentage in the general population. Even though being a Christian decreases your chances of being a scientist, being Jewish increases them significantly. If I had to give an answer here, it would be that Judaism is far more open-minded than Christianity in America, because most American Jews consider being Jewish more a matter of ancestry and culture than of religion (source):

jew-overview-2

So at least part of the jump in Jewish representation from the general population to scientists can plausibly be explained by the same selection effect behind the jump in atheist representation.

Declarations of Compatibility

Many religious denominations today have declared that scientific concepts like evolution are not in conflict with their faiths. But it’s one thing to declare something and another to show it. In the history of religion’s acceptance of scientific ideas, the enormous delay is more telling than the final admission of wrongness, which could come centuries later. The Catholic Church did not apologize for the Galileo affair of the early 1600s until 1992, under Pope John Paul II. Even as late as 1990, Cardinal Ratzinger (the future Pope Benedict XVI) gave a speech using this quote from Paul Feyerabend (source):

The Church at the time of Galileo kept much more closely to reason than did Galileo himself, and she took into consideration the ethical and social consequences of Galileo’s teaching too. Her verdict against Galileo was rational and just, and the revision of this verdict can be justified only on the grounds of what is politically opportune.

An even more recent fact took a century and a half to be admitted, and even then only in part, by official Catholic doctrine: evolution. While On the Origin of Species was published in 1859, it was not until 1996 that John Paul II officially accepted evolution. Even so, his statement of acceptance included the following caveat (source):

Theories of evolution which, because of the philosophies which inspire them, regard the spirit either as emerging from the forces of living matter, or as a simple epiphenomenon of that matter, are incompatible with the truth about man.

It’s hard to take seriously an organization that refuses an idea so fervently only to attempt to vindicate it centuries later. In the United States, 46% of the population still believe in creationism (source), while a further 32% believe in theistic evolution, with God as an agent in evolution (which rather defeats the purpose of evolution).

Of course, the slowness of religion to adopt ideas is not confined to scientific truths. What were some justifications for slavery during the US Civil War? Let’s hear some not-so-well-known quotes from Jefferson Davis, President of the Confederacy (source; I’m listing several to make it clear I’m not just taking one of them out of context):

“If slavery be a sin, it is not yours. It does not rest on your action for its origin, on your consent for its existence. It is a common law right to property in the service of man; its origin was Divine decree.” ~Davis

“African slavery, as it exists in the United States, is a moral, a social, and a political blessing.” ~Davis

“My own convictions as to negro slavery are strong. It has its evils and abuses…We recognize the negro as God and God’s Book and God’s Laws, in nature, tell us to recognize him – our inferior, fitted expressly for servitude…You cannot transform the negro into anything one-tenth as useful or as good as what slavery enables them to be.” ~Davis

“It [slavery] was established by decree of Almighty God…it is sanctioned in the Bible, in both Testaments, from Genesis to Revelation…it has existed in all ages, has been found among the people of the highest civilization, and in nations of the highest proficiency in the arts…Let the gentleman go to Revelation to learn the decree of God – let him go to the Bible…I said that slavery was sanctioned in the Bible, authorized, regulated, and recognized from Genesis to Revelation…Slavery existed then in the earliest ages, and among the chosen people of God; and in Revelation we are told that it shall exist till the end of time shall come. You find it in the Old and New Testaments – in the prophecies, psalms, and the epistles of Paul; you find it recognized, sanctioned everywhere.” ~Davis

Of course, we can find many, many more recent examples in views expressed on women, interracial marriage, and homosexuality. But this article is on religion and science, so let’s get back on topic.

The point of bringing up these social examples is to demonstrate that religion is not “compatible” in the sense that it has supported homosexuality all along (which it obviously hasn’t). Rather, 100 years in the future, when homosexuality is regarded like having green eyes is today, religious advocates will claim that the fact that religion ended up accepting homosexuality is evidence of its compatibility with homosexuality.

To make it explicit for the science case: in no way was evolution compatible with religion when Darwin was around. Only after a century and a half, after revision of doctrine and the turning of some literal truths into metaphors, was it officially declared that the two can be logically held together. That is, when two contradictory ideas are held together, one has to budge (barring doublethink). In the case of evolution, it was religion that budged. The same goes for heliocentric theory: in the 1600s, heliocentrism and religion were not compatible. Only after religion changed into something else were the two compatible.

This again raises the question of what we mean by “compatible.” Say Bob and Joe are in a room, and each time this happens, Bob cannot stand Joe and beats him up. We take Bob out and put him under an anger management program, but directed only at Joe. That is, now when Bob and Joe are in the same room, Bob is nice to Joe, and in fact they become friends. However, when you put anyone else in the same room with Bob, Bob will unfailingly beat that person up, until you train Bob to be nice to that particular person.

If I ask, “Are Bob and Joe compatible,” the answer might be yes, but only after the psychological treatment. However, if I ask, “Is Bob compatible with having a new person added to his room,” the answer is no. I think this is analogous to religion and science. Religion was at first incompatible with heliocentrism, but after a gruelingly long time, now it is not. Religion was incompatible with evolution, but after a long time, now it is not. However, religion is incompatible with science as a method, the process of adding new people to the room. Religion is compatible with particular areas of scientific results, after rejecting them for as long as possible. To be actually compatible with adding new people to the room, Bob needs a session where he learns that beating up anyone is wrong, not just certain people. Whenever religion learns a lesson, whether it’s that heliocentrism is right or that evolution is right (or that slavery is wrong), it never applies that lesson to anything else. (“Oh, I understand that it’s wrong to hate on interracial couples now; let’s hate on gays!”)

To the question, “Are science and religion compatible,” my answer is a qualified no.

The Signal and the Noise, and Other Readings

The Signal and the Noise

Since last year’s presidential election, everyone has heard of the legendary Nate Silver, who correctly predicted the outcomes of all 50 states. Given that he also correctly predicted 49 of 50 states in the 2008 election, this repeat feat seemed like clairvoyance, not coincidence. So the question is, what did Silver do right that so many polls and pundits did wrong?

Statistics.

The Signal and the Noise (2012) is basically a popular applied-statistics book, with more history, philosophy, and psychology than formulas. The first half of the book illustrates failures of prediction, including the 2007/8 financial crisis, elections, sports, and natural disasters; the second half explains the correct way to predict, using Bayesian probability. Overall it does an excellent job of explaining the concepts without going into mathematical detail (which is probably a plus for most people; as a math person, I at least know where to look up the details).
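The Bayesian updating the second half of the book advocates fits in a few lines. This toy sketch is mine, not Silver’s, and the probabilities are invented for illustration: start with a prior, weigh how likely the evidence would be under each hypothesis, and get a posterior, which becomes the prior for the next piece of evidence.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Hypothesis: "candidate A wins the state." Start at 50-50, then see a
# poll lead we'd expect 70% of the time if A is truly ahead but only
# 30% of the time if not.
posterior = bayes_update(0.5, 0.7, 0.3)
print(posterior)  # 0.7

# Evidence compounds: each posterior becomes the next prior.
posterior = bayes_update(posterior, 0.7, 0.3)
print(round(posterior, 3))  # 0.845
```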

Sidenote: While I was reading the chess section, my mind literally blanked for about 10 seconds upon seeing the following:

signal-and-the-noise-chess-error

My chess intuition immediately told me that something was wrong: there is no way this position could have occurred “after Kasparov’s 3rd move.” Since Kasparov was White, the White side should show three moves, but clearly there are only two: the knight on f3 (from g1) and the pawn on b3 (from b2). Yet this book was written by Nate Silver, so surely he couldn’t have gotten something so simple wrong. Once I realized it must be a mistake, I looked up the game and found that at this point of the game, the g2 pawn should be on g3. I thought it was an interesting mind lapse.

Breaking the Spell

Breaking_The_Spell

This book argues that scientific analysis should be applied to religion. The title refers to the taboo against rational discussion of religion: to “break the spell” is to break that taboo. In addition, it discusses theories of how religion arose; ironically, such theories are themselves evolutionary theories, as they concern how modern religion has evolved over time from ancient spiritual beliefs (e.g., which specific doctrines maximize a belief system’s chances of survival).

Reading this means I have now read at least one book from each of the four “horsemen”: Dawkins, Dennett, Harris, and Hitchens. Of the four, Dennett is by far the least provocative. While the other three outright apply logical analysis to religion, in this book Dennett carefully argues that one should be allowed to analyze religion just as one can any other phenomenon. This book should be nowhere near as controversial as The God Delusion or The End of Faith.

Overall the book makes good points but is quite slow, with overly cautious caveats and a very formal tone. I think if someone like Dawkins had written it, it would be much more readable. I wouldn’t recommend it to anyone without a strong interest in philosophy.

CEO Material

CEO_Material

The main competitive advantage of this book over the typical leadership book is that it quotes liberally from 100+ real CEOs. Overall these first-hand experiences supplement the author’s main points quite well. However, presumably for the sake of privacy, the quotations are not attributed, so it is sometimes difficult to tell how a particular passage applies to a given situation. For example, do I want to take the advice of a food-company CEO on a particular issue and apply it to running a tech company? Perhaps the overall message is similar, but clearly the details matter. Some say that context is everything, and without the context of who said it, each quote has much less power.

Most of the points seemed like common sense, although that is to be expected: the system is efficient enough that if the most effective behavior for a CEO were radically different from what CEOs already do, they would (hopefully) have adapted already. Even so, there are still some interesting points made with real justifications, though again it would help to know who said each quote, even for just a few of them. In all, Benton did make points that changed the way I look at things, so the book was worth reading.

The Blind Watchmaker

Blind_Watchmaker

While The Selfish Gene focuses on how genes propagate themselves and how they dynamically compete over time (evolutionary game theory), The Blind Watchmaker covers an entirely different issue: How did complexity arise?

Some of its answers, written at an earlier time (1986), seem somewhat outdated now, ironically more so than those of The Selfish Gene, which was written even earlier, in 1976. This is probably because The Selfish Gene was more of a “here’s the progress we made in the last decade” book when it was written, while The Blind Watchmaker is more along the lines of “here’s why this work from 1802 is nonsense,” and that counter-argument doesn’t particularly need to invoke the most up-to-date findings.

But anyway, we don’t judge books by how outdated they seem 30 years later, so let’s move on to the content. Due to its premise, The Blind Watchmaker is more philosophical than The Selfish Gene, which is more strictly scientific and hardly addresses the conflict between evolution and religion at all. While The Blind Watchmaker still contains a formidable amount of science, it addresses some philosophical questions as well and confronts the conflict head-on. I would recommend it to those looking to question philosophical beliefs, whether others’ or their own.

Mortality

Mortality_Christopher_Hitchens

Of the books in this post, Mortality is the answer choice that doesn’t belong with the others. While the other four are strictly nonfiction works that try to explain or teach something, Mortality comes off more as a dramatic story: the story of coming to terms with terminal illness. Hitchens opens with the stark statement, “I have more than once in my life woken up feeling like death.” As usual, Christopher Hitchens’ signature writing style and tone are apparent.

“What do I hope for? If not a cure, then a remission. And what do I want back? In the most beautiful apposition of two of the simplest words in our language: the freedom of speech.”

“It’s probably a merciful thing that pain is impossible to describe from memory.”

“The politicized sponsors of this pseudoscientific nonsense should be ashamed to live, let alone die. If you want to take part in the ‘war’ against cancer, and other terrible maladies, too, then join the battle against their lethal stupidity.”

“The man who prays is the one who thinks that god has arranged matters all wrong, but who also thinks that he can instruct god how to put them right.”

“I have been taunting the Reaper into taking a free scythe in my direction and have now succumbed to something so predictable and banal that it bores even me.”

“Myself, I love the imagery of struggle. I sometimes wish I were suffering in a good cause, or risking my life for the good of others, instead of just being a gravely endangered patient.”

“To the dumb question ‘Why me?’ the cosmos barely bothers to return the reply: why not?”

Why Are College Students Not Choosing Math/Science?

microscope

From the Wall Street Journal in 2011:

Although the number of college graduates increased about 29% between 2001 and 2009, the number graduating with engineering degrees only increased 19%, according to the most recent statistics from the U.S. Dept. of Education. The number with computer and information-sciences degrees decreased 14%.

After coming up with the topic for this post, I found this article from 2011 with a similar title, citing the same WSJ story. It argued that the high school teaching environment did not adequately prepare students for rigorous classes in college.

In addition, the article includes the argument that in the math and sciences, answers are plain right or wrong, unlike in the humanities and social sciences.

I can agree with these two points, but I want to add a few more from the perspective of 2013. I am also going to narrow the STEM group down to just math and science. The main reason is that in the past few years, the number of CS majors has actually increased rapidly. At Cornell, engineering classes can be massive, and there does not seem to be a shortage of engineers. Walk into a non-introductory, non-engineering-oriented math class, however, and you can often count the students on your fingers. So even though STEM as a whole is in a non-optimal situation, engineering and technology (especially computer science) seem to be doing fine. So the question remains.

Why Is America Leaving Math and Science Behind?

I mean this especially with regards to theoretical aspects of math and science, including academia and research.

In this situation, money is probably a big factor. The salary of a post-grad scientist (from one article, $37,000 to $45,000) is pitiful compared to that in industry (which can have a median early-career salary of up to $95,000, depending on the subject, according to the same article). Essentially, there is a lack of a tangible goal.

There are other factors besides money. Modern math and science can be quite intimidating. All major results that could be “easily” discovered have already been discovered. In modern theoretical physics, for instance, the only questions that remain are in the very large or the very small—there is little left to discover of “tabletop” physics, the physics that operates at our scale. Most remaining tasks are not problems in physics, but puzzles in engineering.

Modern mathematics is very similar. While there are many open questions across many fields, the important ones are highly abstract. Even stating a problem takes a tremendous amount of explanation; that is, it takes a long time to convey to someone what exactly you are trying to figure out. The math and science taught in high school are tremendously unhelpful in preparing someone to actually figure out new math and science, so it is difficult for an entering college student to adjust their view of what math and science are.

Even the reasons for going to college have changed. More than ever, students list their top reason for going to college as getting better job prospects rather than for personal or intellectual growth.

In addition, society seems more focused than before on immediate gain rather than long-term investment. Academia’s contribution to society, especially in math and science, is often not felt until decades or even centuries after something is invented. Einstein’s theories of relativity had no practical application when he formulated them, but our gadgets now use relativity all the time. Classical Greece knew about prime numbers, but prime numbers were not useful until the modern age required data encryption. Even a prolific academic might receive very little recognition in their own lifetime.

However, with the rise of online social networks over the last several years, you can now see what your friends are up to and what they are accomplishing in real time. This should have at least some psychological effect, pushing people toward careers where real, meaningful progress can be tracked in real time. Doing something that will only possibly have an impact decades later seems the same as doing nothing.

Considering the sentiment of the last few paragraphs, it might sound like I am talking about the decline of humanities and liberal arts majors. Indeed, while the number of math and science majors is increasing (though not as fast as in engineering/technology), the theoretical sides of math and science almost seem closer in spirit to the humanities and liberal arts than to the rest of STEM. The point is not immediate application of knowledge, but contributing to the overall human pool of knowledge and making it available to future generations.

Is this just a consequence of the decline of education, or of the fall of academia in general? STEM is not really education in the traditional sense; it is more like technical training.

In all, the decline of interest in theoretical math/science is closely correlated with the decline of interest in the humanities/liberal arts. Our culture is fundamentally changing into one that values practicality far more than discovery. (For instance, when is NASA going to land a human on Mars? 2037. JFK might have had a different opinion.) Overall this is a good change, mainly in the sense of re-adjusting the educational demographics of the workforce to keep America relevant in the global economy. But we should still place some value on theory and discovery.

Additional resources:

  • National Science Foundation statistics – [link]
  • National Center for Education Statistics – [link]
  • Pew social trends – [link]

Survival of the Selfish Gene

After reading The God Delusion, I decided to study some of Richard Dawkins’ earlier works. For this post, I read The Selfish Gene (and among the books on my queue are The Blind Watchmaker and The Greatest Show on Earth).

the-selfish-gene

Published in 1976, The Selfish Gene explores the phenomena at play in the behavior of replicators, namely genes and memes. I was expecting to see lots of biological arguments, and while there are many, I was shocked at the main tool used in the book: game theory.

Of course, once you think about it, it makes perfect sense that game theory is extremely important when talking about genes and how they spread from one generation to the next. And by game theory, I do not mean board games or video games, but economic game theory, applied to biology in what is now known as evolutionary game theory. In fact, this book would be an excellent read for people interested in mathematics or economics, in addition to the obvious audience of those interested in biology. Dawkins uses concepts like Nash equilibria (though the term is never stated explicitly; consider the date of the book) and the Prisoner’s Dilemma, to name a couple, to explain many biological behaviors found in various animals, including humans. This kind of game-theoretic analysis followed largely from the work of John Maynard Smith.
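As a taste of the kind of analysis Maynard Smith pioneered and Dawkins popularizes, here is a minimal hawk-dove game in Python. The payoff values are illustrative rather than Dawkins’ exact numbers; the point is that at the evolutionarily stable mix of strategies, hawks and doves do equally well, so neither can invade the other.

```python
V = 50   # value of the contested resource
C = 100  # cost of serious injury when two hawks fight

# Average payoffs to the row strategy against the column strategy.
# Two hawks each win half the time and get injured half the time.
payoff = {
    ("hawk", "hawk"): (V - C) / 2,
    ("hawk", "dove"): V,
    ("dove", "hawk"): 0,
    ("dove", "dove"): V / 2,
}

def expected_payoff(strategy, p_hawk):
    """Expected payoff in a population where a fraction p_hawk plays hawk."""
    return (p_hawk * payoff[(strategy, "hawk")]
            + (1 - p_hawk) * payoff[(strategy, "dove")])

# For this matrix the stable mix is p_hawk = V / C; at that point the
# two strategies earn the same, which is what makes the mix stable.
p_star = V / C
print(expected_payoff("hawk", p_star))  # 12.5
print(expected_payoff("dove", p_star))  # 12.5
```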

In addition to having studied a bit of game theory, I have also studied dynamical systems, though from the perspective of pure math rather than biology. Even so, the concepts in the book were very familiar. I do not think The Selfish Gene is controversial from an academic standpoint. The now 40-year-old ideas are still relevant today, and they are really not that difficult to understand given a sufficient mathematical and scientific background.

Instead, the controversy around the book seems to come solely from the title itself, and perhaps the attached stigma to writing anything about evolution, which seems to be more of an issue today than it was in 1976. Dawkins notes this years later in the preface to the second edition:

This is paradoxical, but not in the obvious way. It is not one of those books that was reviled as revolutionary when published, then steadily won converts until it ended up so orthodox that we now wonder what the fuss was about. Quite the contrary. From the outset the reviews were gratifyingly favourable and it was not seen, initially, as a controversial book. Its reputation for contentiousness took years to grow until, by now, it is widely regarded as a work of radical extremism.

I do find this amusing. It seems to have less to do with the theory of evolution itself than with the unfortunate anti-intellectual sector of the US. (Of course, Dawkins is from the UK, but I am talking about American opinion of these kinds of books.)

In current society it seems like a fad to wear one’s ignorance on one’s sleeve, as if boastfully declaring, “My ignorance is just as good as your knowledge.” Of course I am not advocating that we should go the opposite direction and be ashamed for not learning, but we should be able to come together and agree that ignorance is not a virtue, especially not in the most scientifically advanced country in the world. I am not really sure how the United States is supposed to recover from this, other than that we become more reasonable over time. And that will take education, not ignorance.

The title of the book is misleading only if one does not understand what the word “selfish” is describing. The “selfish gene” is not a gene that causes selfishness in individuals (an ambiguous notion in itself); rather, “selfish” describes the gene itself: genes propagate themselves in a manner that appears selfish. The individual is merely a “survival machine” for the gene. There is a critical difference between the two notions.

The selfish gene is merely a gene that, for practical reasons, has a higher chance of being passed on. It does not really contradict any current notion of evolution, and in fact, at the time of publication, it became the new and improved theory of evolution that is now the textbook standard. In any case, the message is that evolution works not by the survival of the fittest individuals, but by the survival of the fittest, or most selfish, genes.

When we look at the selfish gene, there are situations (as demonstrated in the book) where intrinsically selfish behavior appears on the outside as altruism. Mutual back-scratching benefits both individuals and, moreover, benefits the genes for it, thus making those genes more likely to spread. So while the behavior of back-scratching seems altruistic, it may be nothing more than concealed selfishness. This idea can be extrapolated to many phenomena. Often people put on acts and fake displays of kindness only for the selfish benefit of “seeming” nice. Or they are so “humble” that they announce their humbleness everywhere they please and make you feel bad for not being as humble as they are. The list goes on. However, I will not comment too much on this, as it falls under cultural rather than strictly genetic behavior, although the two are related.

The controversy around this book also seems to stem from perceived personal offense. Included in The Selfish Gene is an interesting quote from Simpson regarding historical developments in explaining how the current species on Earth came to be:

Is there a meaning to life? What are we for? What is man? After posing the last of these questions, the eminent zoologist G. G. Simpson put it thus: ‘The point I want to make now is that all attempts to answer that question before 1859 are worthless and that we will be better off if we ignore them completely.’

While this statement is perfectly true when it comes to understanding biology, I can see how religious people might take offense. To declare that all mythological ideas in this area before Darwin’s On the Origin of Species are worthless is a bold claim, even when it is correct.

Regarding the actual content of the book, I have already mentioned that Dawkins makes extensive use of game theory. Some of the more technical chapters are dense with numbers, which can make them difficult to follow at reading speed unless one is versed in mental arithmetic. Still, with some deliberate thought, any reader should be able to get through them.

The Selfish Gene is a remarkable book, giving clear explanations of basic biology and evolutionary game theory for the layman. It is a shame that such educational material is viewed as controversial. I wish I could succinctly summarize the fascinating interplay of evolutionary game theory in a single post, but it would be better to leave it to you to pick up this book and think about it for yourself. If you do not like evolution, however, you have been warned.

For Science: Neil deGrasse Tyson’s “Death by Black Hole”

Death By Black Hole

Death by Black Hole is an epic read. What makes it stand out from the average science essay collection is Neil deGrasse Tyson’s unwavering expertise, combined with his remarkably down-to-Earth explanations not only of how things happen, but also of how we discovered how things happen.

For instance, everyone today knows light travels at a constant, finite speed, and we actually encounter it, for example as latency on the Internet. But as far as our intuition goes, light moves infinitely fast, i.e., instantaneously. In fact, I still remember Bill Nye the Science Guy trying to outrun a beam of light on his show. After many tries, he never succeeded.

Tyson reveals that even Galileo, in 1638, thought light was instantaneous, after his lantern experiment failed to yield a measurable delay. It was Ole Rømer who first saw, and correctly interpreted, evidence that light is not instantaneous. From “Speed Limits”:

Years of observations had shown that, for Io, the average duration of one orbit—an easily timed interval from the moon’s disappearance behind Jupiter, through its re-emergence, to the beginning of its next disappearance—was just about forty-two and a half hours. What Rømer discovered was that when Earth was closest to Jupiter, Io disappeared about eleven minutes earlier than expected, and when Earth was farthest from Jupiter, Io disappeared about eleven minutes later.

Rømer reasoned that Io’s orbital behavior was not likely to be influenced by the position of Earth relative to Jupiter, and so surely the speed of light was to blame for any unexpected variations. The twenty-two-minute range must correspond to the time needed for light to travel across the diameter of Earth’s orbit. From that assumption, Rømer derived a speed of light of about 130,000 miles a second. That’s within 30 percent of the correct answer—not bad for a first-ever estimate…. (p. 120)

That someone deduced the speed of light with 1600s technology is remarkable.
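Rømer’s arithmetic is simple enough to redo ourselves. A minimal sketch, using modern values for the Earth-Sun distance (these figures are my own assumptions, not the numbers Rømer had available):

```python
# Redo Romer's reasoning with modern numbers (my assumptions, not the
# figures available in the 1670s).
AU_MILES = 92.96e6             # mean Earth-Sun distance in miles
ORBIT_DIAMETER = 2 * AU_MILES  # extra distance light crosses between extremes
DELAY_SECONDS = 22 * 60        # total shift in Io's eclipse timings

speed_estimate = ORBIT_DIAMETER / DELAY_SECONDS  # miles per second
TRUE_SPEED = 186_282                             # modern value, mi/s

print(f"Estimated speed of light: {speed_estimate:,.0f} mi/s")
print(f"Error vs. modern value: {abs(speed_estimate - TRUE_SPEED) / TRUE_SPEED:.0%}")
```

With modern distances the same 22-minute delay gives roughly 141,000 mi/s, about 24% low; the 130,000 figure Tyson quotes for Rømer presumably reflects the somewhat smaller orbit estimates of the era.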

In addition, Tyson enlightens us with the exciting information we all want to know. Antimatter, for instance, annihilates on contact with normal matter, releasing tremendous amounts of energy. In Dan Brown’s Angels and Demons, a tiny vial of antimatter explodes with the violence of a nuclear bomb. But what if a sun made of antimatter collided with our own Sun? How big would the blast be? According to Tyson in “Antimatter Matters,” the explosion would be frighteningly large:

If a single antistar annihilated with a single ordinary star, then the conversion of matter to gamma-ray energy would be swift and total. Two stars with masses similar to that of the Sun (each with about 10^57 particles) would be so luminous that the colliding system would temporarily outproduce all the energy of all the stars of a hundred million galaxies. (p. 106)
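Tyson’s claim can be sanity-checked with a back-of-the-envelope calculation. A sketch using round numbers (the solar mass, solar luminosity, and stars-per-galaxy figures are order-of-magnitude assumptions of mine, not values from the book):

```python
# Back-of-the-envelope check of the star/antistar claim. All figures
# are rough, order-of-magnitude assumptions.
SOLAR_MASS = 1.99e30        # kg
C = 3.0e8                   # speed of light, m/s
SOLAR_LUMINOSITY = 3.8e26   # watts
STARS_PER_GALAXY = 1e11     # rough average
GALAXIES = 1e8              # "a hundred million galaxies"

# Total rest-mass energy if two solar masses annihilate completely (E = mc^2)
energy = 2 * SOLAR_MASS * C**2  # joules

# Combined luminosity of all the stars in a hundred million galaxies
background = GALAXIES * STARS_PER_GALAXY * SOLAR_LUMINOSITY  # watts

# The annihilation releases as much energy as that entire background emits
# in about a minute and a half, so if it plays out over mere seconds, it
# temporarily outshines them all, consistent with Tyson's claim.
print(f"Annihilation energy: {energy:.2e} J")
print(f"Equivalent to {energy / background:.0f} s of 10^8 galaxies' output")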

While this anthology is composed of distinct essays divided into categories, it is still quite possible to read it like a normal book from start to finish if you are a science enthusiast.

However, given the sheer variety of topics, there are wide jumps and some overlap of subject material between essays that might alienate some readers. This was not much of an issue for me, but I did find the lack of an overall thesis somewhat strange, and it forced me to read the book in a different manner than usual. For someone seeking a popular astrophysics book that was conceived as a book from the start, I would highly recommend Michio Kaku’s Physics of the Impossible, which is more coherent and packs more punch than Death by Black Hole.

This is not to say that Death by Black Hole is without merit. It is one of the few books to explain not just the contents of scientific discoveries, but also the discovery process itself, which can oftentimes be more fascinating to learn about than the results. Neil deGrasse Tyson is one of the finest communicators of science in our time, and I always find his talks on YouTube fascinating. As an essay collection on science, Death by Black Hole is unmatched.

The Legacy of 2012: The Fall of Superstition

This is similar to my Legacy of 2009 post. I didn’t write one for 2010 or 2011, but the theme was similar. In 2010, social media really exploded, and by 2011 its dominance was all but set in stone. But as this was happening, another explosion was occurring: the smartphone. In early 2012, smartphones passed 50% saturation in the United States, which statistically puts sales near their peak (though perhaps the 2012 holiday season will provide a final spike).

Source:

smartphone-additions-2012

The global smartphone market is rising rapidly as well. More people than ever have the sum total of human knowledge within arm’s reach, which leads to the year’s real legacy:

2012: The Fall of Superstition

On December 21, 2012, the world was supposed to come to an end, or so thought roughly 10% of the world’s population. But on that day, nothing happened. The universe continued as normal. Perhaps a massively failed doomsday can help awaken the world from superstition.

2012_Poster

At 12%, the United States, the most scientifically and technologically sophisticated country in the world, was shockingly above average in 2012 doomsday belief. Yet this may not be too surprising: among developed countries, the US has long been by far the most religious.

But this may change in several more years. According to a 2012 Pew Research survey, non-religion is growing quickly in the United States, gaining roughly 5 percentage points in 5 years.

rise of no religion pew graph

More importantly, with the strong correlation between non-religion and younger age, the growth of non-religion is poised to accelerate in the upcoming years. In one of my previous posts, I predicted that secularism will be one of the next sociopolitical movements, following the previous Civil Rights and feminist movements, and also the current LGBT movement.

Along with a shock to superstition came several great advancements in science. The Curiosity rover landed with high precision in a daring and suicidal-looking sky crane maneuver.

MSL_Curiosity

Another significant achievement is the confirmation of the Higgs boson, whose existence in turn confirms the Standard Model, one of the most intricate scientific theories to date. Other great scientific accomplishments of 2012 include:

With all the remarkable advancements, 2012 was not without disappointments. In Italy, six scientists were sentenced to prison for failing to predict an earthquake. Wake up, Italy: you’re not in the 1600s anymore.

In 2012 the expiring Kyoto Protocol was extended to 2020, but did global carbon dioxide emissions actually fall? Nope. They went up, with 2012 emissions roughly 58% above 1990 levels, in huge part due to China’s industrial growth combined with its disregard for the environment.

co2-emissions

Meanwhile, the United States is doing okay, with its 2012 CO2 output at its lowest level in 20 years. However, making a meaningful dent in climate change will take a true global green movement, which unfortunately is at least a decade away.

Conclusion

Despite being a year filled with progress, 2012 had its setbacks. Religious extremists violently demonstrated the fundamental tenets of their “religion of peace” in response to a satirical film, and also attempted to assassinate a 15-year-old girl who just wanted education for everyone.

Elsewhere in the Middle East, the Israel-Hamas conflict set the world on edge for eight days. And though it paled in comparison to many other issues around the world, the Sandy Hook shooting shook and saddened America, and should push gun control higher on the national agenda.

Overall, though, 2012 was a good year. With extraordinary scientific advancements (above) and social advancements (though not yet achieved worldwide, as in this case or this one), as well as the reelection of President Obama, the year ends with a world that is smarter, more aware, and more progressive than ever before, and more capable than ever of dealing with religious extremism, war, and environmental destruction.

Are We in a Simulation? A Scientific Test

According to a recent article, scientists are planning a test to determine whether our universe is a computer simulation. This is pretty relevant to my blog as I have discussed this idea a number of times before [1] [2] [3] [4].

Soft Watch at the Moment of First Explosion

Of course, the must-read paper on this subject is philosopher Nick Bostrom’s article, “Are You Living in a Computer Simulation?” Given a couple of premises, the implication is that we are almost certainly living in a computer simulation. Not only that, but the argument posits that our simulators are themselves extremely likely to be in a simulation, and their simulators likely are too, and so on.
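The core of Bostrom’s argument rests on a simple counting step: if simulated observers vastly outnumber unsimulated ones, then by indifference you should expect to be one of the simulated. A toy sketch of just that step (the population figures are arbitrary illustrations of mine, not numbers from the paper):

```python
# Toy version of the counting step behind Bostrom's simulation argument.
# The population figures are arbitrary illustrative assumptions.
def prob_simulated(real_minds: int, sims_per_history: int) -> float:
    """By indifference: P(you are simulated) = simulated minds / all minds."""
    simulated = real_minds * sims_per_history
    return simulated / (real_minds + simulated)

# If each "real" history is simulated even a thousand times over, the
# odds of being in the single real history are vanishingly small.
print(prob_simulated(real_minds=10**10, sims_per_history=1000))
```

Here the probability works out to just over 99.9%, and notice that the number of real minds cancels out entirely; only the ratio of simulated to real histories matters.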

So how will scientists test for signs of a simulation?

“Currently, computer simulations are decades away from creating even a primitive working model of the universe. In fact, scientists are able to accurately model only a 100 trillionth of a metre, with work to create a model of a full human being still out of reach.”

spiral_galaxy

Even so, there are limitations beyond technical ones to consider. If a test finds no evidence that we are in a simulation, that does not rule out the possibility; indeed, a sufficiently well-designed simulation would be very difficult, if not outright impossible, for its inhabitants to distinguish from “reality.”

Conversely, suppose a test did find “evidence” that we are in a simulation. How would we judge this evidence? How can we know which way the evidence is supposed to point? After all, even if we find “glitches,” they could turn out to be part of a larger set of natural laws.

Richard Feynman once offered a similar thought: suppose we are observing a chess game but are not told the rules. After looking at various snapshots of a game, we can piece together some of the rules, and eventually we will learn that a Bishop must stay on the same color when it moves. But one snapshot later, we find that the only Bishop in the game is now on a different-colored square. Without looking at many more games, there would be no way of knowing that there is a rule allowing a Pawn to promote into another piece, such as a Bishop, and that the old Bishop was captured. Without this knowledge, we might have concluded that the color-changing Bishop was a glitch.

Now back to the article.

“By testing the behaviour of cosmic rays on underlying ‘lattice’ frameworks governing rules of physics that could exist in future models of the universe, the researchers could find patterns that could point to a simulation.”

Many disciplines would have to come together here to prove something fundamentally “wrong” with our universe. It would be the junction point of computer science, physics, philosophy, mathematics, neuroscience, and astronomy.

The plan given in the article is a noble one, but I do not expect it to yield any important experimental data soon. Rather, it is the tip of an immense iceberg that will be explored not in years or decades, but in millennia to come.