One Second Left

I really enjoyed Edge of Tomorrow (an 8 on my movie list), but one plot detail really bugged me. If you haven’t seen it yet, I’m not going to spoil anything directly.

[Image: countdown timer]

(Image link—pretty funny.)

It’s part of a larger category of tropes that shows up in most action movies. This particular example doesn’t happen in Edge of Tomorrow, but it illustrates the category: whenever there is a countdown timer on something really, really bad (typically an explosion), the protagonist saves the day with one second left until destruction, whether the timer was originally set to five minutes or five hours. Every action movie has several of these “just in time” moments. And yes, I understand: this is what makes the movies suspenseful.

That really annoys me.

Did the aliens annoy me? Nope. Time travel loops? Nope. But impeccable luck and timing? Yes.

Is there any deeper meaning behind this? People have said that I over-criticize movie meanings, but I think this does have some harmful effects. The “protagonist always gets the girl” cliché is the worst in terms of social damage for obvious reasons, but “one second left” has its own issues. It distorts our views of luck and chance, thereby affecting our risk judgment, and it turns the extremely improbable into the probable.
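To put a rough number on this (a back-of-envelope sketch under my own assumption, not anything from the movies): if the moment the hero finishes were uniformly distributed over the timer’s length, a one-second-left save on a five-minute timer would happen about 0.3% of the time, and seeing it in twenty movies in a row would be astronomically unlikely.

```python
# Back-of-envelope sketch (my assumption): the hero's finishing moment
# is uniformly distributed over a 5-minute (300-second) countdown.
timer_seconds = 300

# Probability of finishing within the final second of one timer:
p_last_second = 1 / timer_seconds
print(f"One movie:     {p_last_second:.4f}")  # 0.0033, about 0.3%

# Probability that 20 independent movies all cut it that close:
p_all_twenty = p_last_second ** 20
print(f"Twenty movies: {p_all_twenty:.2e}")
```

Under that toy model, twenty last-second saves in a row is on the order of 10⁻⁵⁰, which is the sense in which the trope turns the extremely improbable into the routine.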

A bigger issue still is that the “protagonist wins” cliché, which is in 99% of movies, may warp our sense of justice. There is a known cognitive bias called the just-world bias, where we falsely expect justice to be served (we unconsciously believe in karma), and movies can really take advantage of this. How do you explain why the good side was able to defuse the bomb at the last second? Easy, the good side deserved it. (How might this translate into real life? We feel that we deserve something great, so instead of trying for it, we wait for the universe to give it to us.)

Of course, I still enjoy action movies and TV that use “one second left.” But it just gets difficult to keep up suspension of disbelief when the most absurd chance events happen over and over again.

Kindles and Current Reading List

My Kindle Paperwhite arrived today, and after using it for only an hour, I wonder how I managed to get by without one. It would have made the last seven years much more conducive to reading. At home, I have bookshelves full of books, and I end up not even reading many of them, partly because I don’t have a good, inexpensive way of transporting books from Texas to New York. And the physical books I do have in NY can get heavy. For practicality’s sake, I should have used the library system more, but there’s a certain irreplaceable feeling to having your own books that you can read at your own pace. Anyway, I think a Kindle solves all my reading problems, and perhaps I can reassign my old books to other uses.

[Image: Family Guy book hat]

The past reading list is at this link. And currently on the summer plate:

  • Capital in the Twenty-First Century – Thomas Piketty
  • The Price of Inequality – Joseph Stiglitz
  • An Appetite for Wonder – Richard Dawkins
  • The Big Short – Michael Lewis
  • Drift – Rachel Maddow
  • The Virtual Executive – Debra Benton
  • Nudge – Richard Thaler and Cass Sunstein
  • Predictably Irrational – Dan Ariely

The Hypercritical Condition?

[Image: liberalism-hegemony]

Michael Roth, president of Wesleyan University, recently wrote a piece in The New York Times titled “Young Minds in Critical Condition.”

It happens every semester. A student triumphantly points out that Jean-Jacques Rousseau is undermining himself when he claims “the man who reflects is a depraved animal,” or that Ralph Waldo Emerson’s call for self-reliance is in effect a call for reliance on Emerson himself. Trying not to sound too weary, I ask the student to imagine that the authors had already considered these issues.

Instead of trying to find mistakes in the texts, I suggest we take the point of view that our authors created these apparent “contradictions” in order to get readers like us to ponder more interesting questions. How do we think about inequality and learning, for example, or how can we stand on our own feet while being open to inspiration from the world around us? Yes, there’s a certain satisfaction in being critical of our authors, but isn’t it more interesting to put ourselves in a frame of mind to find inspiration in them?

Being a student in the sciences, I don’t experience this kind of humanities phenomenon directly. But this ultra-critical mindset pervades everyday life, at least at an elite university. Students engage in this “intellectual one-upmanship” all the time without even realizing it. Try using Thomas Jefferson in a pro-freedom argument and you get the response that TJ owned slaves, thereby invalidating whatever moral or legal progress he allegedly made; therefore, the takeaway point is that the liberal notion of freedom was built on detestable foundations.

Also from Roth:

Liberal education in America has long been characterized by the intertwining of two traditions: of critical inquiry in pursuit of truth and exuberant performance in pursuit of excellence. In the last half-century, though, emphasis on inquiry has become dominant, and it has often been reduced to the ability to expose error and undermine belief. The inquirer has taken the guise of the sophisticated (often ironic) spectator, rather than the messy participant in continuing experiments or even the reverent beholder of great cultural achievements.

Even for my own blog posts, I sometimes get critical comments which, instead of saying something substantive, completely miss the main point and belittle some small detail that I had usually already considered and addressed elsewhere in the article. One is powerless to defend against such criticisms, as preemptively adding ample caveats is no deterrent. It just changes the criticism from “The author does not consider X…” to “The author dismisses X…” followed by a pro-X argument, where X is a counterargument the author has already considered.

Not that critical comments are bad—they’re quite useful. Constructive criticism is a hundred times more helpful than praise. Perhaps the issue is a self-fulfilling prophecy of blogging: since people don’t expect complex arguments with caveats, they assume that everything you say is absolute, even when that is clearly false. And it is not just in academia or blogging. Go to the comments page of any remotely controversial news story (I really enjoy reading CNN comments), and you can effortlessly predict which arguments and counterarguments are used.

Hilariously, one of the comments perfectly demonstrates the point of the article.

From user “reaylward”:

“Critical” in this context means close or analytical, not disparaging or condemnatory. Thus, a critical reading of a text means a close or analytical reading of the text, not a disparaging or condemnatory reading. The “historical critical method” of interpreting the Christian Bible, for example, means a close or analytical reading of the text, not a disparaging or condemnatory reading. “Critical thinking” doesn’t mean “exposing error”, it means thinking analytically. I think they need a dictionary at Wesleyan. And I mean that in the critical sense.

And a response by “Austin Uhler”:

Your comment is an example of the type of thinking that the author is discouraging. While you are correct about the strict meaning of “critical” in this context, your uncharitable reading means you are missing the author’s point: it is becoming more common for students to take critical thinking down negative, dismissive and unproductive paths.

This is probably the best comment-response pair I have ever seen for a NYT article.

Is the hypercritical condition a legacy of postmodernism? Is it simply a byproduct of the Internet? Are we becoming more cynical? I don’t know.

Being hypercritical is certainly a better problem to have than being uncritical. I appreciated Roth’s article nonetheless, for addressing the overly critical crowd.

Most Writers Are Writers

[Image: drawing within a drawing]

In a disproportionate number of works of fiction, the protagonist turns out to be a writer. The explanation is simple: the end products are created by writers, who put themselves into their works. And since you write what you know, a writer tends to write about writing. In other words, it’s selection bias.

According to TVTropes, this phenomenon is called “Most Writers Are Writers,” and writing about writers has several advantages: it provides realistic excuses for (un)realistic diction, investigative skills, journalistic connections, short work weeks, and whatever arbitrary research knowledge your characters need to have. The same goes for screenwriters writing about the film industry, and so forth.

TVTropes also contains the following addition:

“A consequence of this is that there is a disproportionate number of works involving the difficulties associated with getting a job after college when you have an English major, even if it’s a good economy, as all the writers were English majors, and virtually none of them could find a job after college, even in a good economy.”

Of course, Most Writers Are Writers does not entail that all protagonists are writers, only that a disproportionately large number of them are. According to the Bureau of Labor Statistics, there were 129,100 writers and authors in 2012, or 0.04% of the US population. A different site estimates the low end at 250,000, or 0.08% of the population. In either case, a much larger percentage of books clearly feature writers as main characters.
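The percentages above check out with a quick computation (assuming a 2012 US population of roughly 314 million, per Census estimates):

```python
# Verify the writer-percentage figures quoted above.
# Assumption: 2012 US population of roughly 314 million (Census estimate).
us_population = 314_000_000

bls_count = 129_100   # BLS count of writers and authors, 2012
alt_count = 250_000   # the alternative site's low-end estimate

print(f"BLS estimate: {bls_count / us_population:.2%}")  # 0.04%
print(f"Alt estimate: {alt_count / us_population:.2%}")  # 0.08%
```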

The downside of Most Writers Are Writers is that other professions are underrepresented. To be sure, there are books and screenplays about nearly anything imaginable, but those subfields are much smaller and, because of this selection bias, often much less accurate.

For instance, scientific terms and concepts are misused all the time in science fiction, a subfield where authors are supposed to have a higher-than-average understanding of science in the first place. We excuse sci-fi authors for making technical mistakes because they’re writers, not scientists (exceptions exist). On the other hand, when is the last time you recall a blatant mistake about the writing process or a book deal? Never, because anyone writing about these is knowledgeable about them.

But that concerns writers’ interests and their knowledge of science. The Most Writers Are Writers trope is different in that it concerns the characters themselves: it’s very rare for a main character to actually be a scientist. And when they are, they’re often beyond terrible at their job, which is itself a bias, but that’s for a different time…

Cultural Values

[Image: Plato]

After looking over some posts from this blog, I realize I almost never post anything having to do with being Asian American. Out of 384 posts so far, only one directly relates to this topic, and even that was only in response to another article.

This post will explore my experience as an Asian American and also why I never talk about being one.

Pride in American Culture

There is perhaps one other post that touches on my views of being Asian American, though again, the topic was only used as an example in a larger context. That post, “Pride in Things Out of Your Control,” criticized being proud of something that is based on luck. One of the most relevant examples I came up with was my cultural/national identity:

The key difference is that nationality is something I could theoretically change. Had I the inclination, I could feasibly move to some other country than the US. Yet no matter how much I might want to be of some other race, I can’t revoke being Chinese. Thus I cannot be proud of being Chinese in race, but I can be proud of being American in nationality.

This is something I still stand by, and it is the reason I almost never talk about being Asian. From the same article:

I happen to be Chinese, but I have never felt proud of being Chinese, simply because I had no choice whatsoever in being born Chinese. In fact, I would far more strongly identify as “American” rather than “Chinese,” since there are some things I actually can make decisions between, e.g. Eastern vs. Western philosophy, cultural values, and freedom of speech; and in each case I agree more with the American side.

What exactly are these differences in philosophy and cultural values?

Liberalism and Freedom

As much as we like to joke about the shortcomings of the American political system, the US government is a blessing compared to the Chinese government.

The freedoms we take for granted in America are nonexistent in many areas of the world, China included. Here we can slander the government, mock politicians, and even negatively portray the president. Try doing that in China. Actually, don’t.

We have not just freedom of speech, but also freedom of thought and information. Facebook, Twitter, and YouTube are inaccessible in China, largely because the government doesn’t want its citizens learning information from people of other cultures, and such platforms would be too difficult to censor. For instance, the government surely wouldn’t want people knowing about the Tiananmen Square massacre (even though most people have probably heard of it but aren’t sure whether it is true).

[Image: Tiananmen Square tank]

In addition, we have corporate media, which, whatever its faults, is far better than state media. While our news agencies can go over the top sometimes, at least they deliver shocking news when it exists. State media, on the other hand, is very unreliable and fond of covering things up. I recall a train derailment that provoked a lot of controversy when the government said nothing about it for a long time. There’s also the Beijing smog incident, where the central media understated the extent of the problem and Beijing residents had to resort to the US embassy’s particulate readings to get a sense of how bad the pollution was.

Now, enough of the government. Even within the US, there are many cultural differences between Asian Americans and Americans in general.

Creativity and Individualism

The most relevant difference for me is that American culture puts so much emphasis on the individual, and this I strongly agree with. In the post, “A Chinese Kid’s Response to ‘Chinese Parenting,’” I talked about how there were a lot of forced ritual activities, but I failed to emphasize in that post how the activities were all staple Asian activities that did not even remotely try to set one apart. Play the piano? Yes, I’m sure that will set you apart from all other Asian kids. Go to Chinese school? Study for the SAT? The whole system was really formulaic and focused as much as possible on conforming. (I ended up quitting the first two and not even starting the third. Instead, I learned chess, played the trumpet, figured out how to code, read novels, and started a blog.)

Sure, a conforming society might be good if the sole aim is to keep order, as in a police state. But for society to advance, for technology to be revolutionized, for literature to be written, for art and music to be made—these all require creative feats by the individual. This is yet another reason I cannot stand Chinese culture: there is almost no promotion of creativity.

[Image: Voyager]

The Rebel

Closely related to individualism, the rebel archetype, about the worst thing possible in Chinese culture, is cherished in American culture (and Western culture in general).

The Master said, ‘In serving your father and mother you ought to dissuade them from doing wrong in the gentlest way. If you see your advice being ignored, you should not become disobedient but should remain reverent. You should not complain even if in so doing you wear yourself out.’

—Analects of Confucius

Disobedience, in the eyes of any one who has read history, is man’s original virtue. It is through disobedience that progress has been made, through disobedience and through rebellion.

—Oscar Wilde

[Image: Oscar Wilde]

Education vs Learning

There is a well-known stereotype of Asians placing heavy emphasis on education. However, at least early on, the point of this emphasis is almost solely grades and test scores, not actually learning. I wrote earlier in the year about how, even in college, there is an insane amount of GPA-centrism.

Here is an excerpt from the Chinese parenting post which summarizes my view on grades (written regarding high school):

Not that I cared less about education; in fact, it was quite the opposite. I became learning-focused instead of grade-focused. In class, I would be the one asking bizarre questions about material that seemed only remotely connected to the curriculum, but I never asked such a cringe-inducing question as “What percent of the grade is this assignment?” or “Is this for a grade?” or “Is this going to be on the test?” or, my favorite one yet, “Is there extra credit?”—and by the way, I’ve heard these countless times in high school from my Asian peers.

A Mark Twain quote on this topic:

I have never let my schooling interfere with my education.

—Mark Twain

[Image: Mark Twain]

Conclusion

In summary, the reason I rarely ever talk about being Asian American is that I identify culturally as American, and I don’t find Asian cultural values worth preserving. Yeah, that sounds pretty harsh, but that’s what I have to say.

Utopia vs Dystopia: A Matter of Semantics?

After witnessing the dystopian societies of 1984, Brave New World, and The Hunger Games, I wondered to myself, what would a Utopia really be? What differentiates a Utopia from a Dystopia? Is there always a fine line?

If you have learned of a Utopia as a perfect society, you might naively think that a Dystopia would be the opposite, or a failed society.

Yet this could not be further from the truth. The societies of 1984, Brave New World, and The Hunger Games are stable, successful, self-sustaining worlds, yet they are considered to be Dystopias. None of the three societies are failures. They merely contain different moral systems and social classes than what we are used to today. Yet they are considered repulsive and to be avoided at all costs.

1984

In 1984, the world is run by three superpowers locked in constant warfare. This way, since each individual power is always at war, each government can maintain permanent martial law and rule with an iron fist. Any dissent is dealt with ruthlessly, as seen in the plot. The system works. It is, I daresay, perfect.

In Brave New World, the government does not rule with an iron fist, but rather, by providing so many distractions and recreations to the common people (analogous to TV or drugs in our world) that the average person is too amused to worry about any oppression by the government. There is a propagandized doctrine of happiness, that there are no problems as long as everyone is happy. The work is done by genetically engineered stupid people (the Epsilons) that serve as slaves to the other castes. Indeed, the way it runs, this society can be thought of as perfect as well.

The only major difference in the presentation of the Dystopia in The Hunger Games is that it tells an overly dramatic story of a rebel working through an elaborate system (the games themselves) to rebel. It is also the only one so far that offers the rebels any hope. In 1984 and Brave New World, by contrast, the government wins in the end.

In this respect, the government in The Hunger Games is nowhere near as successful as those in 1984 and Brave New World. Despite running the games for 74 years, it faces decadence and imperfection, which didn’t seem to affect the other two Dystopias. So in a way, the society in The Hunger Games is not a true Dystopia: it does not have lasting power, so it is not perfect. In 1984, the government could turn people against each other, and in Brave New World, everyone is happy, so no one has reason to rebel. In The Hunger Games, however, people are unhappy, and these unhappy people unite, posing a real threat to the government.

So the society in The Hunger Games is more akin to a short-lived Middle Eastern or South American state undergoing rapid regime changes, as a large and significant amount of discontent exists. By contrast, the societies in 1984 and Brave New World are more like the former Soviet Union or the current United States: the people are either squashed in rebellion or too mesmerized to rebel.

Where does a Utopia fit into all of this? A Utopia is supposed to be perfect, but how are the societies of 1984 and Brave New World not perfect? Sure, in 1984, the main character is tortured, but you could argue that if he had just listened to the government and done what it asked, he would not have been hurt at all. Indeed, once he is brainwashed at the end, the society seems perfect to him.

And if you are a thinking human being in Brave New World, there is little reason you would want anything else from society. You are provided with all the joy you could possibly want. Sure, the lower class Epsilons are treated unfairly, but they are made dumb biologically. They might not have a consciousness as we have. They are basically machines.

You could say that in a true Utopia, everyone would be treated fairly. But how can a society actually function if this were the case? There has to be someone or a group of people in charge. Even in Plato’s Republic, containing the first proposal of a utopian society, there are social classes with clearly defined rulers.

And even with powerful and rational people at the top, this does not create a Utopia. In Watchmen, set during the Cold War, the titular superheroes try to save humanity, but the smartest and most rational of them concludes, to most people’s shock, that the only way to save humanity from nuclear destruction is to initiate a massive attack on the whole world, unifying the United States and the USSR against a common enemy. While this character is considered the main antagonist because he killed millions of people, he is, viewed from a purely rational perspective, the hero of humanity. And from this perspective, he took steps toward creating a Utopia, not a Dystopia.

Since these moral issues are so subjective, the line between a Utopia and a Dystopia, and the definition of perfect, are subjective as well, as all the examples above show. Is the distinction between a Utopia and a Dystopia, then, any more than a matter of semantics? What are your thoughts?

Sleep Deprivation Is Totally Not a New Phenomenon

“If they had not been overcome with drowsiness they would have performed something. The millions are awake enough for physical labor; but only one in a million is awake enough for effective intellectual exertion, only one in a hundred millions to a poetic or divine life. To be awake is to be alive. I have never yet met a man who was quite awake. How could I have looked him in the face?”

—Henry David Thoreau, in Walden (1854)

Thoreau’s figures have an uncannily prophetic ring in today’s crazy (and sleepless) world. Maybe it is an exaggeration that only one in a million is awake enough for effective intellectual activity; after all, we all know a handful of awake and alert people. But it is not an exaggeration by much. How many people like Thoreau do you know? Probably not many, if any.

In the surrounding section, Thoreau blames sleep deprivation on clocks, schedules, and “factory bells.” O, how simple their lives were, you might say. For in today’s society, we must also contend with the TV, the Internet, and the iPhone. (And my econ professor.)

I argued over a year ago that several things can make us resist sleep. They fall into broad categories:

  • Chemistry: Consuming caffeine or other sleep-altering substances
  • Danger: Being threatened or in some state of physical danger
  • Interaction: A two-way interaction with other people or with a computer
  • Ambition: Wanting to achieve something and sacrificing sleep in order to do it

But only a few of these things can sleep-deprive us day after day, month after month. And as the categories suggest, these things (besides Danger, which is short-term) nearly all come from conscious habit. Since habits are difficult to break, this presents a big problem for sleep-deprived people. Oh well. You can always just live with it.

(I’m super sleep-deprived right now, which may explain the incoherency of this post.)