Kindles and Current Reading List

My Kindle Paperwhite arrived today, and after using it for only an hour, I wonder how I ever got by without one. It would have made the last seven years much more conducive to reading. At home, I have bookshelves full of books that I end up not even reading, partly because I don’t have a good, inexpensive way of transporting books from Texas to New York. And the physical books I do have in NY can get heavy. For practicality’s sake, I should have used the library system more. But there’s a certain irreplaceable feeling in having your own books that you can read at your own pace. Anyway, I think a Kindle solves all my reading problems, and perhaps I can reassign my old books to other uses.


My past reading list is at this link. Currently on the summer plate:

  • Capital in the Twenty-First Century – Thomas Piketty
  • The Price of Inequality – Joseph Stiglitz
  • An Appetite for Wonder – Richard Dawkins
  • The Big Short – Michael Lewis
  • Drift – Rachel Maddow
  • The Virtual Executive – Debra Benton
  • Nudge – Richard Thaler and Cass Sunstein
  • Predictably Irrational – Dan Ariely

How Movies Have Conditioned Us to Hate Science and the Future

According to film, science and technology solve nothing. One of two things occurs: (1) the exact same social problems persist in the future even with significantly advanced technology, or (2) social problems become even worse than they are today.

I am writing this out of concern for the future of American education, with particular interest in math and science. There are many voices in the STEM discussion; I just hope to contribute by fleshing out the relationship between public sentiment toward science and Hollywood’s portrayal of science.

1. The Future Sucks


I have not read the books, but The Hunger Games is quite dystopian: a society where young people are randomly selected and forced into a grandiose battle to the death as entertainment for the upper classes. The stadium, though, is an extraordinary technological feat: the environment can be changed at will, fires can be triggered anywhere, and cameras are hidden in every location. Of course, those with advanced technology are bad. Those with poor technology are good.


Elysium makes the technological divide even more blatant. The rich bad guys live luxurious lives aboard a utopian, ultra-technologically advanced ship with all-powerful healing chambers, leaving the rest of humanity, i.e., the good guys, to rot away on a dystopian Earth.


With the Terminator franchise, the message is clear: artificial intelligence is super evil! Never let the machines have power, or else they will kill you.




And that.


And that.


Also that. And many, many more. Every time, technological advances lead to a terrible world devoid of any current notion of morality.

2. Scientists Are Evil Murderers


The premise of Alien is massively disheartening. The off-camera scientists want to study an alien creature at all costs, disregarding all morality: they let a killer alien parasite aboard to massacre (almost) everyone. Of course, a backstabbing android was in on the conspiracy from the start.


Yes, Prometheus is part of the Alien franchise, but it is so insulting to scientists that it deserves its own rant. The scientists in this movie are so stupid that no one who sees it would ever want to become one. From Cracked:

“Instead of a worthy follow-up to the best sci-fi action movie ever, we got an attempt at a stand-alone plot that wouldn’t have even happened if the characters weren’t stupid enough to pet alien snakes, get lost in tunnels that they themselves had mapped, and take their helmets off on an alien planet most likely so full of dangerous microbes that they’d be shitting their intestines out within the hour. Seriously, they’re like the dumbest scientists ever.”


Regarding The Last Days on Mars:

“Another Prometheus basically. In the way that the world’s most prominent scientists are trusted to be the first to search for life on Mars, then they turn out to be a bunch of emotion driven morons making the most ridiculous and rash nonsensical decisions they could make time and time again. I really don’t see why the people making these types of movies feel the need to have these people constantly being petty emotion driven morons. Things can go wrong even when the people are making the right decisions.”

The “emotion driven moron” depiction of scientists is superbly ironic. Are they trying to criticize scientists in general, i.e. criticizing rationality and intelligence, and supporting emotion and ignorance? Or are they trying to criticize emotions and idiocy, i.e. supporting scientists?


Dammit scientists, stop sciencing!


Chemistry = monsters!


Seriously, stop it, scientists.


We give up.

3. Zombie Apocalypse, or Any Man-Made Apocalypse


The Umbrella Corporation makes us really hate science. When not creating zombie viruses, it does… whatever the heck it does, making other viruses and figuring out how to murder people. Good job, Resident Evil.


While the outbreak in 28 Days Later subverts the typical trope in that it was caused by animal rights activists, the blame still falls on the scientists for keeping those caged, infected animals at a research lab in the first place.


I don’t remember World War Z too well, but I remember the scientist was practically useless and accidentally killed himself in a hilariously undignified fashion.

Either science will cause the apocalypse, or given the apocalypse, it is old-fashioned values that triumph over science.

4. Nature/Magic/Tradition/Spirituality/Irrationality/Emotion vs Science


Avatar is basically the ultimate nature-versus-technology film ever made, and of course, nature trumps technology easily. In addition, nature is good and technology is bad. You could argue that the message of this movie, or any of the ones above, is a good one: technology is not automatically good, and we should not take technological superiority as an excuse to exploit others. But the message of “science is not necessarily good,” hammered into our brains again and again and again, eventually translates to “science is evil.” In addition, these types of movies always depict science as being in conflict with something like nature or emotion, when in reality, science tries to serve both.


A man with some emotion (good) vs a society where emotion is forbidden (evil). The premise assumes that advancements in science somehow lead automatically to their use for totalitarian control.


A man with good conscience (good) vs a cold rational police force (evil).


The answer is always love.


An ancient traditional religion (Jedi, The Force, lightsaber resembling a sword) triumphs over technology (Death Star, droids, and laser guns). And yes, this happens a long time ago, but it pragmatically fits into our analysis of sentiments of the future.


Even in an age of interstellar space exploration, people are still driven by revenge, anger, self-interest, massive-scale conspiracy, and the pursuit of personal power. (The original TV series, on the other hand, was quite optimistic. Such negative “human” traits were mostly absent, and when they did appear, it was usually because the crew was observing a less advanced civilization that still had them.)

As a caveat, I’d like to point out that I think most of the movies above are individually great. But if you combine all the anti-technology, anti-future sentiments, you get an extremely negative, if not socially dangerous, depiction of the future.

Poll Results on Technological Optimism

Because scientific progress builds cumulatively over time, much of anti-science sentiment is tied to anti-future sentiment. According to one poll, 48% of Americans think that America’s best days are in the past (Rasmussen, 2014). Another poll reports that 30% of Americans believe future technological changes will make people’s lives mostly worse (Pew, 2014). From the site’s own findings:

  • “66% think it would be a change for the worse if prospective parents could alter the DNA of their children to produce smarter, healthier, or more athletic offspring.
  • 65% think it would be a change for the worse if lifelike robots become the primary caregivers for the elderly and people in poor health.
  • 63% think it would be a change for the worse if personal and commercial drones are given permission to fly through most U.S. airspace.
  • 53% of Americans think it would be a change for the worse if most people wear implants or other devices that constantly show them information about the world around them. Women are especially wary of a future in which these devices are widespread.”

These percentages are affected by many factors. For instance, wealthier people are generally more optimistic about the future of technology: 52% of those with an income of $30,000 or less think technology will be for the better, but 67% of those with an income of $75,000 or more do.


According to Gallup, there is also a significant partisan gap in optimism, with Democrats significantly more optimistic: 74% of Republicans have positive views of America 5 years in the past, whereas 75% of Democrats have positive views of America 5 years in the future.

This post was inspired by Neal Stephenson’s argument that science fiction is fixated on nihilism and apocalyptic scenarios and that sci-fi should dream more optimistically. From the Smithsonian Mag website: “He fears that no one will be inspired to build the next great space vessel or find a way to completely end dependence on fossil fuels when our stories about the future promise a shattered world.” These are legitimate fears. If we as a society abandon science now, what kind of Dark Ages will we slip back into?

College and Smartphones

Photo from Newegg.

Last December I obtained a Samsung Galaxy S3, my first ever smartphone. Yes, I’m only about 6 years late to the smartphone party. Before this, I had been using a Motorola Razr flip phone for years and didn’t really think a smartphone was necessary. But after just three months, it is already difficult to imagine not having one.

Smartphones in College Life

According to various reports I found, somewhere between 50% and 70% of college students have a smartphone. But at a school like Cornell, whose students come from families that are more affluent than average, it is reasonable to assume the percentage is much higher. In fact, almost everyone I know here has a smartphone.

According to one source, about 27% of all college students had a smartphone in 2009. Another source claims the figure in 2008 was 10%. Yet at Cornell, the figure was already 33% by 2008. It would not be unreasonable to estimate that Cornell’s smartphone adoption runs about a year (and then some) ahead of the average trend, and I would bet that between 80% and 95% of students at Cornell currently have a smartphone.
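The extrapolation above can be written down as a quick back-of-the-envelope calculation. The data points (10% national adoption in 2008, 27% in 2009, and a roughly one-year Cornell lead) come from the sources mentioned; the logistic-growth assumption is purely my own illustrative choice, not anything from those reports.

```python
# Back-of-the-envelope sketch of the adoption extrapolation above.
# The logistic (S-curve) model is an assumption for illustration only.
import math

def logistic(t, midpoint, rate, cap=1.0):
    """S-shaped adoption curve: estimated fraction with smartphones at time t."""
    return cap / (1.0 + math.exp(-rate * (t - midpoint)))

# Fit rate and midpoint to the two national data points:
# 10% in 2008 and 27% in 2009, using logit(p) = ln(p / (1 - p)).
rate = math.log(0.27 / 0.73) - math.log(0.10 / 0.90)   # growth per year
midpoint = 2008 - math.log(0.10 / 0.90) / rate          # year of 50% adoption

# Assume Cornell runs about a year ahead of the national curve.
lead = 1.0
cornell_2013 = logistic(2013 + lead, midpoint, rate)
print(f"estimated Cornell adoption in 2013: {cornell_2013:.0%}")
```

Under this crude model, the 2013 Cornell estimate lands near saturation, somewhat above the 80–95% guess in the text, which mostly shows how sensitive such extrapolations are to the choice of growth curve.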

The social implications of having a smartphone here are significant. To have the Internet at your fingertips is to have all the knowledge you need on demand about activities, people, or just random information in a conversation. It is also to check email, respond to messages, or to share videos. Since the social norm is to have one, and the students expect other students to have them, most things are done with a smartphone in mind. At Cornell, to not have a smartphone is to be technologically behind. The Red Queen analogy comes to mind: “Now, here, you see, it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!”

The “Superhuman” Extension

I resisted getting a smartphone for a long time because I thought (1) having a laptop was sufficient, and (2) not having a smartphone brought more peace of mind. However, the utility gained from having one far outweighs the silence of not having one. I can check my email at any time, look up anything I want, and a few weeks ago in NYC for an interview, I used it as a GPS.

There is an interesting article by John D. Sutter of CNN Tech from last September, titled “How Smartphones Make Us Superhuman.” It obviously makes the case for them. An excerpt:

In addition to enabling us to video events on a second’s notice, potentially altering the course of global politics, these high-tech human “appendages” increasingly have become tools for fighting corruption, buying stuff, bolstering memory, promoting politics, improving education and giving people around the world more access to health care.

While I don’t use my smartphone for any of the political purposes Sutter might admire, as a college student I find it invaluable to be able to access information from anywhere. In classes, I still use my laptop, since typing is faster once it is set up. But on the go, and on many other occasions, using a laptop is impractical. It may sound strange, but without the Internet, I would feel disconnected from the world.

In fact, last May my laptop broke down and would not even reach the boot menu. While it was being repaired (for about a month), I had no immediate Internet connection for the first time in years, and it felt as if something was deeply missing. I had to go to the school libraries multiple times a day, often at odd hours. (Though I did manage to write a 23-page math paper, saved across various emails and flash drives.) Not having a computer was certainly survivable, but hugely inconvenient.

Similarly, it is obvious that anyone can survive without a smartphone. However, the productivity, convenience, time-efficiency, and near-omniscience it offers are clearly worth it for any student living in 2013.

2010s: The Decade of Solutions

I wrote my first word-based post of 2010 just a few moments ago. And now some mysterious force compels me to write another one, this time specifically on 2010 and the decade it begins.


First, reflections on 2009. In a post near the end of last year titled Reflections on 2009, I saw how I had, at least in my own view, become more creative. I realized things for what they were, and I was able to look at the big picture. But now I seek a much deeper task: to reflect on the entire 2000s decade, and then preview the next.

I actually do remember December 31, 1999. I was eight years old (born December 28, 1991) and of course had a disjointed, childish view of the world. But I remember that day, talking with a friend named Bobby about the year 2000. We were watching Pokemon, I believe. We came to the conclusion that it was amazing to be able to live in two different millennia. All I remember from the general populace was pure joy and excitement. (An eight-year-old had no idea what Y2K was.) Even if the year system was arbitrary, it was still exhilarating, at least in our childish minds, to be born in one millennium and to live our lives in the next.

2000–2009 was a remarkable decade. Before it, I knew what a computer was, but I think I had touched one twice at most before 2000. Yet I cannot even begin to estimate how many times I touched one in the 00s. Probably a couple thousand times.

I’m no tech expert, but I think not many people would disagree if I said the 00s were the decade of information technology. (See my post on The Legacy of 2009 for outside quotes on this.) Computers shrank, and became exponentially faster. Blogging rose to the forefront. Web 2.0 in 2004 was the “official” start of the enhanced Internet that we see today. Facebook launched in 2004, and by the end of the decade it contained 350 million users worldwide—a sizable chunk of the human population. YouTube rose to prominence this decade. Micro-blogging, e.g. Twitter, appeared. So many things happened this decade on the web that it revolutionized the world. It created a truly global society, and it changed how we think.

For myself, I probably can’t say anything of meaning. I mean, a lot of things happen between the ages 8 and 18. Nonetheless, this decade was incredible.

But the next decade, the 2010s, will contain even greater human achievements, because at this point the growth of digital technology will only continue to accelerate.

Take the last decade, for example. Web 2.0 and Facebook both arrived around 2004, while YouTube, Twitter, and the Nintendo Wii came around 2006. They have grown dramatically in the last few years and are already, after just a few years, embedded in our daily vocabulary. Of course, Google has also been a key innovator throughout the decade.

The 2010s will see digital technologies increase in both scale and pace. This blog might be completely outdated in a few years, and if it is, then we will know that humanity is advancing fast. I have no doubt that the 10s will be even more record-breaking in technology.

So far, so good. But if we turn away from technology, we find some pressing issues that the world has not dealt with. (Yes, I just ended a sentence with a preposition, but perhaps you can forgive it given the content of this paragraph.) Conflicts in the Middle East are not going to end anytime soon. The potential for global nuclear annihilation still exists. Poverty and hunger still rage throughout the world. Diseases still ravage poorer countries, and can ravage wealthier ones. Environmental consequences will sooner or later be felt, and when that happens, I’m afraid it will be too late.

I don’t pretend to have any foolproof solutions to these problems. But I will say, it would be a shame if we destroy ourselves out of greed, arrogance, or war. Future species millions of years from now will be perplexed by our concurrent ingenuity and stupidity: we had the capacity to sequence the entire human genome, only to have that genome obliterated by our own futile quarrels.

These problems are by no means new. People have been warning about them for years, in some cases decades. For most of our history, we let them slip by. In the 00s, we made symbolic gestures toward solving them, but we are not really doing anything. On paper and on television we support the green movement, yet we still endlessly consume trees and fossil fuels.

It would indeed be a huge shame if the wealth of technological achievements made in the last decade, or century, were destroyed by human apathy. But I have a message for everyone: if the human race is to act, the 2010s is the decade in which to do it. From the accomplishments of the previous decade, we now have an instantaneous, interactive global communications network. This is a tool we never had before. And we must use it.

We must augment the advances in technology with applications to our real-world problems. Scientists and engineers will need to work extra hard. Politicians must be courageous enough to make the necessary changes. We will need not only to see the problems, but to understand them, and to understand what we can do about them. We have had many decades of problems. Let this be the decade of solutions.

The Legacy of 2009

2009 has been a remarkable year in every aspect. Struggles were fought and issues were disputed, but in the end, new heights were reached. What follows are various statements from around the web on this fateful year, with a focus on technology.

Doug Gross [CNN]:

  • This [2009] was the year that online social media exploded.
  • It was a big year for technology: Twitter and Facebook’s popularity exploded, while new smartphones, e-readers and a host of other gadgets cropped up to compete for our plugged-in affection.

Mark Leibovich [NYTimes]:

  • You could Tweet all the highlights of 2009 and still have time for dithering.
  • But if ever there were a year to put buzzwords before a death panel, this would be it, before the aporkalypse comes.
  • Whatever, it was a year when a lot of people acted stupidly.
  • If this year were a state dinner, even the Salahis wouldn’t Salahi it.

John D. Sutter [CNN]:

  • Engineers didn’t make huge improvements to technology in 2009. The year’s big tech names — Twitter, Facebook, Google, Apple, Amazon — all existed before January. Instead, this is the year technology changed us.
  • We could have done any of these things in 2008. But we embraced in unprecedented numbers a digital-centered life in 2009.
  • By the end of 2009, having a basic cell phone wasn’t good enough anymore.
  • Facebook now has more than 350 million users — that’s more people than live in the United States and is more than double the 150 million people who were on Facebook at the start of the year.
  • In 2009, it’s no longer enough to search for information that was current 30 minutes or an hour ago. Now, Internet junkies look for their news, Tweets and links to be updated in “real-time,” just as they are on Twitter.

David Von Drehle [Time]:

  • [2009] A year that dawned to the chime of change soon got bogged down in intractable troubles.
  • Struggle abroad and struggle at home: surely those were defining glimpses of this Moment in our history.

Alex Altman [Time]:

  • We were warned. But when the worst recession in seven decades smacked us in the face, all the gruesome auguries did little to dull the pain. As unemployment soared to 10.2% — the highest rate since 1983 — spendthrifts became tightwads, a new age of austerity dawned, and the era of easy money lurched to a close.

Pete Cashmore [CNN]:

  • The “real-time Web” is booming. From Twitter to Facebook to new search engines that discover information posted just seconds ago, it seems the 2010 Web will be fueled by our desire for instant gratification.
  • We’re seeing the ongoing voluntary erosion of privacy through public sharing on Facebook and Twitter . . . [link]
  • One factor that’s dramatically different at the end of this decade versus the beginning: Ubiquitous connectivity.
  • The network itself has become faster and virtually omnipotent.

For more, including the politics and news aspects, check out the following features:

Last, but not least, we honor