Standing Up in a Crowded Theater, Studying for Tests, and Other Game-Theoretic Dilemmas

Everyone is sitting down in a crowded theater, comfortably seated and with a good view. All is well until one person decides his view is not good enough, so he stands up to get a clearer view. This ruins other people’s views, so they stand up as well. A while later, everyone is standing up but has the same view as before, leaving each person strictly worse off than when everyone was sitting.

This particular example is typically avoided since the social norm in a theater is to sit. In fact, in numerous examples of this game, there are either direct (laws) or indirect (social norms) methods of control to prevent such disasters from happening. Here are two for illustration:

  • Crime. If one person stole a little, this person would be in a better position and society would not be harmed by much. However, if everyone did this, society would collapse. The criminalization of theft prevents this problem (for the most part). This concept applies to many types of crimes.
  • Environmentalism. If one person polluted more, there would be virtually no change to the environment. However, if everyone did so, the environment would feel the full effects. (This still isn’t quite resolved, but in most developed countries it is well on its way.)

From a game-theoretic perspective, however, each individual taking the selfish path is making a rational decision. The problem is that the system may not discourage the selfish activity sufficiently.

Someone who doesn’t recycle may (justifiably) argue that they do in fact care about the environment, but that the impact of their not recycling is negligible. While this is true, if everyone thought like this, then we would all be standing up in the theater. The main point of this post is to go over some less commonly cited situations.
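The theater dilemma has the structure of a multiplayer Prisoner’s Dilemma, and the logic can be sketched in a few lines of Python. The payoff numbers here are purely illustrative (my own, not from any source); what matters is their ordering:

```python
# Hypothetical payoffs for the theater game: each entry is the payoff to a
# player choosing the row action while everyone else chooses the column action.
# The specific numbers are illustrative; only their ordering matters.
payoffs = {
    ("sit",   "sit"):   3,  # comfortable seat, good view
    ("stand", "sit"):   4,  # better view than everyone else
    ("sit",   "stand"): 0,  # view blocked by the crowd
    ("stand", "stand"): 1,  # same view as before, but tired legs
}

def best_response(others):
    """Return the action maximizing a player's payoff, given what others do."""
    return max(("sit", "stand"), key=lambda a: payoffs[(a, others)])

# Standing is the best response no matter what others do (a dominant strategy)...
assert best_response("sit") == "stand"
assert best_response("stand") == "stand"
# ...yet the all-standing outcome is strictly worse than all-sitting.
assert payoffs[("stand", "stand")] < payoffs[("sit", "sit")]
```

That is the whole dilemma: standing is individually rational in every case, yet everyone standing is worse than everyone sitting.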

Studying for Tests

I would argue that studying for a test falls into the category of standing up in a theater. In both high school and college, I have observed or heard of people studying hours upon hours for tests and often barely remembering any of the material after a semester. A test should measure how well you understand something, not how well you can memorize and cram facts into your brain for the next day.

People who know me from high school and college know I don’t study much (if at all, depending on the class) for tests. Perhaps some see this as a sign of not caring, but I would argue that I care about the knowledge just as much as, if not more than, people who study far more hours. In the cases where I do study, I go for the “why” rather than the “what,” and I study to load the concepts into long-term memory rather than the details into short-term memory. If you do need the details at a later time, cram them in then, when they are relevant and when you have the big-picture understanding.

Let’s pretend that studying for tests were not allowed. Then what would a test measure? Would it measure how much attention someone paid in lecture? How well they comprehended the main points? What part of the homework they didn’t copy from someone else?

In fact, everyone’s grades would still be similar. In classes where grades are curved, if everyone does “worse” on a test the same way, then the grades will be unaffected (though there may be some shifting around). The tests would just become more genuine.

So it may seem like I have something against studying for tests. But what part of studying for tests, specifically, do I have an issue with? Well, as mentioned before, I think that if everyone studies for tests, the scores become more a measure of who studied the most and who could cram in material most efficiently, instead of who actually understood the content. But even if this problem were somehow irrelevant—let’s say an irrefutable study comes out tomorrow saying that cramming ability is just as relevant for the real world as understanding—I would still have an issue with studying, namely the time spent. Suppose someone is taking 4 classes and studies 4 hours for each midterm and 8 hours for each final. That’s 48 hours spent studying in a semester. Multiply that by 8 semesters to get 16 days spent on studying. These 16 days are the difference between sitting down and standing up.
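The back-of-the-envelope arithmetic above can be checked in a few lines (the class counts and hours per exam are the post’s assumptions, not data):

```python
# Study-time estimate, using the assumptions stated in the text.
classes = 4
midterm_hours = 4   # hours of studying per midterm (one midterm per class assumed)
final_hours = 8     # hours of studying per final
semesters = 8

hours_per_semester = classes * (midterm_hours + final_hours)
total_hours = hours_per_semester * semesters
total_days = total_hours / 24

print(hours_per_semester)  # 48 hours per semester
print(total_days)          # 16.0 days over a college career
```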

Preparing for Colleges/Job Interviews

Sure, the informative power of some of the tests I’ve mentioned above may arguably be above zero. For example, maybe it’s reasonable for a dedicated premed student to cram before a bio test because the details do matter, though the question remains of whether such a student will remember anything years later. But there’s still one very important test taken all around the country that really has no arguable intellectual merit: the SAT.

This test is probably the biggest insult to intelligence when taken seriously. I try my hardest to resist cringing whenever I hear smart people talking about their SAT scores. From the CollegeBoard site:

The SAT and SAT Subject Tests are a suite of tools designed to assess your academic readiness for college. These exams provide a path to opportunities, financial support and scholarships, in a way that’s fair to all students. The SAT and SAT Subject Tests keep pace with what colleges are looking for today, measuring the skills required for success in the 21st century.

Yes, I’m sure it’s very “fair to all students.”

sat-scores-by-wealth

And I’m sure that by “keep[ing] pace with what colleges are looking for today, measuring the skills required for success in the 21st century,” what CollegeBoard means is that the skills required for success in today’s world are… wealth, certain racial backgrounds, and access to prep courses.

Anyways, I guess my point is that if nobody studied for the SAT, nobody took prep courses, and no one cared so much, then:

  • Students wouldn’t be wasting their time studying for it.
  • Many families would save time and money on SAT prep by not having to do it.
  • As a result, less privileged students would stand a better chance, and thus the test would be more fair.

Of course, while this may sound good, it is easier said than done. To not study would be shooting yourself in the foot, or in this case, sitting down in the theater in which everyone is standing. It would be like one country reducing its greenhouse emissions while other countries do not.

(Personally, I refused to study for the SAT, though at the time I had to give off the impression that I was studying for it to appease my Asian parents. If you really want the story, it’s in the latter part of this post.)

I would go further and say that preparing for job interviews in some ways fits this type of game. On this subject, however, I have very little experience, as my only important interviews were of a type that is very difficult to prepare for, i.e., math puzzles. Answering such questions did not hinge on knowing certain advanced equations, but instead on using simple tools that almost everyone knows, in unusual ways.

In addition, I understand that an interview not only judges the answers to the questions, but also the interviewee’s character. If it is evident that someone prepared a lot for an interview, that fact in itself would be considered in the interviewer’s assessment. However, I think that in a world in which no one prepared for interviews, both sides would benefit as the interviewee would save time and stress while the interviewer gets a more genuine view of the interviewee, not a carefully constructed outer shell.

And as a preemptive defense against the claim that studying or preparing is simply a result of competition: I have nothing against capitalism or competition. If anything, freeing up students’ time from studying for tests would let them compete in other areas and take additional classes or learn new skills (I picked up programming while pretending to study for the SAT). I see the time wasted as an inefficiency. The point of not studying is to have more time, and hence be more productive.

Sitting down in a standing theater is a difficult decision. But if everyone sat down, we might all live in a better place.

Is the Virtual World Really An Escape from Reality? (Part 2)

On September 17th, Blizzard announced that they would be removing the auction houses in Diablo 3. For gamers, this may seem like a very strange move. It is very rare that a company will remove a significant feature of a game, especially when there is no stated replacement plan.

Real World Finances

But from a sociological perspective, this is a very interesting move that signifies a reaction to the merging of the virtual and real worlds. It seems like the warnings from Jesse Schell in 2010 are manifesting. Last year, Diablo 3 launched with two widespread auction houses, allowing players to trade their virtual items. The gold auction house used in-game currency, while the real money auction house used… real money. Real US dollars. And other worldwide currencies.

The Diablo 3 Real Money Auction House. The $250 max buyout is the limit.

As I said in part 1, the virtual world used to be an escape from reality:

One of the strongest effects of these games was to cause players to disregard socioeconomic stratification that existed in the real world. In the virtual worlds of RPG’s, everyone starts equal and has the same opportunities.

From an extensive CNN report on gaming:

A professor: “…people do not feel they have the freedom and kind of their own power to change their own social roles and their own identities. But in cyberspace, people do not remember… your wealth.”

However, Facebook (among others, though Facebook arguably had the largest effect) changed this with microtransactions that allowed players with more wealth in real life, or more willingness to use the wealth, to translate it to in-game wealth. Schell’s talk has a lot more on how Facebook changed gaming.

But despite the influence of Facebook, many gamers stayed on non-FB games. It took Diablo 3 to affect socioeconomics within a game on a large scale. To some degree, those who were wealthier in real life were wealthier in the game. And to some degree, it was impossible to progress unless one was already wealthy.

In one sense, Blizzard’s removal of the auction houses signifies a break from the trend of an ever more tangled web of the real and the imaginary.

An Efficiency Problem

Of course, we cannot discount Blizzard’s stated reasons for removing the auction houses:

When we initially designed and implemented the auction houses, the driving goal was to provide a convenient and secure system for trades. But as we’ve mentioned on different occasions, it became increasingly clear that despite the benefits of the AH system and the fact that many players around the world use it, it ultimately undermines Diablo’s core game play: kill monsters to get cool loot.

Indeed, the problem was that there was too much trading and the system became too efficient. I actually wrote a lengthy post about this on the Diablo 3 forums last year, called “Why the Auction House is the Main Problem,” which was also mathematically oriented. This article was highly rated and was spread around the interwebs.

Basically, the problem was that the increased market efficiency from the auction houses allowed the average player to obtain much better items than they otherwise would, thereby short-circuiting the actual game.

Although it seems fairly obvious now as to what happened, the sentiment at the time was that the real money auction house was causing the main problems, but that the gold auction house was fine. Before my thread, I don’t recall anyone making a coherent argument against the efficiency of the gold auction house.

The Future of Gaming

Thus it is not all that surprising that Blizzard is removing both auction houses. And even considering Blizzard’s official reason, it is interesting that the economic system in the game has so many analogs in real life.

A vision of the future virtual world, from part 1:

It will not be a place where we can set aside our real world and escape our problems for a few hours. It will not be a place where we have fun or meet people we would never see otherwise and talk about the little things in life without worrying about our financial position.

Instead, it will be an extension of the real world and everything in it. Those who are wealthier in the real world will have more options in the virtual world, and those who are poorer will remain poor. Ultimately, if virtual reality does not return to its roots as an escape from reality, people will end up escaping the virtual world as well.

So given the recent news, perhaps we are not quite as firmly on that road as we were last year—a wrench has been thrown in the works. But in the end, the real and virtual worlds are still on a collision course. We should definitely be prepared.

Thinking, Fast and Slow; and Other Summer Readings

This summer’s reading list was a bit unusual, and the following books all have something in common:

  • Thinking, Fast and Slow, by Daniel Kahneman
  • The Stuff of Thought, by Steven Pinker
  • When Genius Failed, by Roger Lowenstein
  • Moonwalking with Einstein, by Joshua Foer
  • Against the Gods, by Peter Bernstein
  • Coolidge, by Amity Shlaes

Positive expectancy to whoever can state it first in the comments below (also, the people who would be able to state this know what I mean).

Thinking, Fast and Slow

thinking-fast-and-slow

Very good book, recommended for anyone. It presents the existence of two modes of thinking: one fast and intuitive, the other slow and methodical. It then goes through many cognitive biases that can interfere with rational decision-making.

The Stuff of Thought

See this post.

When Genius Failed

when-genius-failed

A fascinating tale of how a company went from riches to rags, based on miscalculated risk. I think it is worth reading even for a non-finance fan.

Moonwalking With Einstein

moonwalking-with-einstein

A book on memory. I actually read this one in the spirit of the book: I would go through some pages on the subway and then, without using a bookmark, remember the page I was on. This probably doesn’t sound impressive, but without bookmarks I am terrible at remembering how far into a book I am. The experiment worked out pretty well: I often remembered exactly the sentence on which I left off. Recommended for those interested in remembering things.

Against the Gods

against-the-gods

This was my least favorite of the list, though perhaps that has to do with my prior knowledge of mathematical history. It felt too much like a history textbook most of the time, and when it attempted to do math, the explanations were very rudimentary. I think one is better off reading the Wikipedia page on the history of math.

Coolidge

coolidge

It was a surprisingly interesting book, at least for the first half or so. Learning about Calvin’s early struggles was awesome, but once it got to real politics, the book again read much like a history textbook.

There are a few more books that I am going through (by Pinker, Harris, and Dennett), and I will post about these once I am done.

Survival of the Selfish Gene

After reading The God Delusion, I decided to study some of Richard Dawkins’ earlier works. For this post, I read The Selfish Gene (and among the books on my queue are The Blind Watchmaker and The Greatest Show on Earth).

the-selfish-gene

Published in 1976, The Selfish Gene explores the phenomena at play regarding the behavior of replicators, namely genes and memes. I was expecting to see lots of biological arguments, and while there are many, I was shocked at what I found was the main tool used in the book: game theory.

Of course, once you think about it, it makes perfect sense that game theory is extremely important when talking about genes and how they spread from one generation to the next. And by game theory, I do not mean board games or video games, but economic game theory, applied to biology in what is now known as evolutionary game theory. In fact, this book would be an excellent read for people interested in mathematics or economics, in addition to the obvious group of those interested in biology. Dawkins uses concepts like Nash equilibria, though the term is not explicitly stated (consider the date of the book), and the Prisoner’s Dilemma, just for a couple examples, to explain many biological behaviors found in various animals, including humans. This kind of game-theoretic analysis followed largely from the work of John Maynard Smith.
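The kind of evolutionary game theory Dawkins draws on can be made concrete with Maynard Smith’s hawk-dove game, which the book discusses. Here is a minimal sketch in Python, with illustrative parameter values of my own choosing (not the book’s), that finds the evolutionarily stable mix of strategies via a replicator-style update:

```python
# Hawk-dove game: V is the value of the contested resource, C the cost of a
# fight. With C > V, the evolutionarily stable mix has a hawk fraction of V/C.
# These parameter values are illustrative.
V, C = 2.0, 4.0

def fitness(hawk_fraction):
    """Expected payoffs to a hawk and a dove when a fraction p of the population plays hawk."""
    p = hawk_fraction
    hawk = p * (V - C) / 2 + (1 - p) * V   # costly fights vs hawks, easy wins vs doves
    dove = p * 0 + (1 - p) * V / 2         # retreats vs hawks, shares vs doves
    return hawk, dove

# Replicator-style dynamics: the hawk fraction grows when hawks do better
# than the population average, and shrinks when they do worse.
p = 0.1
for _ in range(1000):
    hawk, dove = fitness(p)
    avg = p * hawk + (1 - p) * dove
    p = p * hawk / avg

print(round(p, 3))  # converges to V/C = 0.5, the evolutionarily stable mix
```

At the stable mix, hawks and doves earn equal payoffs, so neither pure strategy can invade the population. That is the evolutionarily-stable-strategy idea the book builds on.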

In addition to having studied a bit of game theory, I have also studied dynamical systems, though from the perspective of pure math and not biology. Even so, the concepts in the book were very familiar. I do not think The Selfish Gene is controversial from an academic standpoint. Its ideas, now nearly 40 years old, are still relevant today, and they are really not that difficult to understand, given a sufficient mathematical and scientific background.

Instead, the controversy around the book seems to come solely from the title itself, and perhaps from the stigma attached to writing anything about evolution, which seems to be more of an issue today than it was in 1976. Dawkins notes this years later in the preface to the second edition:

This is paradoxical, but not in the obvious way. It is not one of those books that was reviled as revolutionary when published, then steadily won converts until it ended up so orthodox that we now wonder what the fuss was about. Quite the contrary. From the outset the reviews were gratifyingly favourable and it was not seen, initially, as a controversial book. Its reputation for contentiousness took years to grow until, by now, it is widely regarded as a work of radical extremism.

I do find this amusing. It seems to have to do not with the theory of evolution itself, but with the unfortunate anti-intellectual sector of the US. (Of course, Dawkins is from the UK, but I am talking about American opinion of these kinds of books.)

In current society it seems like a fad to wear one’s ignorance on one’s sleeve, as if boastfully declaring, “My ignorance is just as good as your knowledge.” Of course I am not advocating that we should go the opposite direction and be ashamed for not learning, but we should be able to come together and agree that ignorance is not a virtue, especially not in the most scientifically advanced country in the world. I am not really sure how the United States is supposed to recover from this, other than that we become more reasonable over time. And that will take education, not ignorance.

The title of the book is misleading only if one does not understand what the word “selfish” is describing. The “selfish gene” is not so much a gene that causes selfishness in individuals (an ambiguous notion in itself); rather, “selfish” describes the word “gene” directly: genes propagate themselves in a manner that appears selfish. The individual is merely a “survival machine” for the gene. There is a critical difference between the two notions.

The selfish gene is merely a gene that, for practical reasons, has a higher chance of being passed on. It does not really contradict any current notion of evolution, and in fact, at the time of publication, it became the new and improved theory of evolution that is now the textbook standard. In any case, the message is that evolution works not by the survival of the fittest individuals, but by the survival of the fittest, or most selfish, genes.

When we look at the selfish gene, there are situations (as demonstrated in the book) where the intrinsically selfish thought appears on the outside as altruistic. Mutual back-scratching benefits both individuals, and moreover, benefits the gene for it, thus making the gene more likely to spread. So while the behavior of back-scratching seems altruistic, it may be nothing more than concealed selfishness. This idea can be extrapolated to many phenomena. Often people put on acts and fake displays of kindness only for the selfish benefit of “seeming” nice. Or they are so “humble” that they announce their humbleness everywhere they please and make you feel bad for not being as humble as they are. The list goes on. However, I will not comment too much on this as this goes under cultural behavior and not strictly genetic behavior, although they are related.

The controversy around this book also seems to stem from perceived personal offense. Included in The Selfish Gene is an interesting quote from Simpson regarding historical developments in explaining how the current species on Earth came to be:

Is there a meaning to life? What are we for? What is man? After posing the last of these questions, the eminent zoologist G. G. Simpson put it thus: ‘The point I want to make now is that all attempts to answer that question before 1859 are worthless and that we will be better off if we ignore them completely.’

While this statement is perfectly true when trying to understand biology, I can see how religious people might take offense. To declare that all mythological ideas in this area before Darwin’s The Origin of Species are worthless is a bold claim, even when it is correct.

Regarding the actual content of the book, I have already mentioned that Dawkins makes extensive use of game theory. There are many numbers in some of the more technical chapters, making the book possibly difficult to read in real-time unless the reader is versed in mental mathematics. Though, with some deliberate thought on these chapters, any reader should be able to get through them.

The Selfish Gene is a remarkable book, giving clear explanations of basic biology and evolutionary game theory for the layman. It is a shame that such educational material is viewed as controversial. I wish I could succinctly summarize the fascinating interplay of evolutionary game theory in a single post, but it would be better to leave it to you to pick up this book and think about it for yourself. If you do not like evolution, however, you have been warned.

Is the Virtual World Really An Escape from Reality?

Or is it on a collision course with reality?

Google Glass

The Role-Creating World

One of the most popular and successful genres of gaming is the role-playing game (RPG). In an RPG, the player is a character in a usually fantasy world, and is able to develop skills and abilities within that world to progress as a character. In the virtual world, one could grow more powerful or more wise, and take on more difficult obstacles.

Traditionally, these role-playing games—and in fact, all commercial video games—were played as an escape from reality. One could escape the loud, busy, modern world and live instead in a quiet, simple, and perhaps peaceful world.

WoW Screenshot 4
Screenshot from the game World of Warcraft.

One of the strongest effects of these games was to cause players to disregard socioeconomic stratification that existed in the real world. In the virtual worlds of RPG’s, everyone starts equal and has the same opportunities.

From an extensive CNN report on gaming:

A professor: “…people do not feel they have the freedom and kind of their own power to change their own social roles and their own identities. But in cyberspace, people do not remember… your wealth.”

From a gamer interviewee, in the same report about the RPG known as Maple Story:

“It’s a game where you can make people grow and develop within a certain line of work.  …you get a feeling that you are improving.”

The anonymity of online gaming meant that players could ignore social and economic barriers in real life, and feel accomplished by themselves.

The Facebook Conundrum

The face of gaming was forever changed by Facebook. Instead of playing with anonymous players from all around the country, and even all around the world, players of Facebook games play with their real-life friends.

Screenshot from Farmville. Courtesy of Wikipedia Commons.

Moreover, many Facebook games have microtransactions, where players can pay real money to gaming companies in exchange for virtual goods or virtual currencies. In “older” style RPG’s, on the other hand, all currencies are in-game only and there is no legal exchange between virtual money and real money.

These are two big factors:

  • The veil of anonymity has lifted; and,
  • Real money is now able to affect your character’s position in the virtual world.

It doesn’t take a genius to see where this is headed: into socioeconomic stratification in the virtual world, which was supposed to be the one place where players could escape from real world problems.

That is, in classic RPG’s, more successful players could attribute their victories to skill, knowledge, and effort. But in microtransaction-based games, a player’s success could be attributed to simply being wealthier in the real world.

Diablo 3 and Marxism

Even in these microtransaction-based games on Facebook, the microtransactions can be thought of in terms of a state-controlled economy. Almost always, the company itself determines the prices of all virtual goods or currencies, and the company itself is the seller of goods. Zynga and Nexon are two examples of this.

Activision Blizzard took the idea of microtransactions one step further, and created a capitalist economy, where the players themselves sell goods to each other, while the company obtains a 15% tax on each virtual good sold.

Screenshot of the Real Money Auction House in Diablo 3. The $250 buyout is the max limit.

In the classic microtransaction models where every player who buys a particular item pays the same amount, no player feels ripped off or feels that the system is unfair.

But in the Real Money Auction House model, one player might buy a near identical good for half the price that another player paid, perhaps because the first player had carefully studied the market and compared options more carefully. The second player ends up feeling ripped off.

In this free market virtual economy, the stratification arising from unregulated capitalism has taken effect. Again, one doesn’t need to read Karl Marx to see what is going on in this virtual economy. The rich are getting richer by buying goods cheap and then reselling them for higher values, while the poor find it very difficult to start off. The poor have essentially turned into a working class. The Diablo 3 economy is very much akin to that of Industrial Revolution Britain.

The Future of the Virtual World

The virtual world began as an escape from reality, then transformed into a mirror of current reality, and then mutated again into a replay of human economic history.

If it continues down this path, then the virtual world of the future is not going to be the virtual world we saw in our dreams.

What we imagined virtual reality to be.

It will not be a place where we can set aside our real world and escape our problems for a few hours. It will not be a place where we have fun or meet people we would never see otherwise and talk about the little things in life without worrying about our financial position.

Instead, it will be an extension of the real world and everything in it. Those who are wealthier in the real world will have more options in the virtual world, and those who are poorer will remain poor. Ultimately, if virtual reality does not return to its roots as an escape from reality, people will end up escaping the virtual world as well.

Black Friday

Does anyone else find it weird that one day people appreciate everything they have down to the little things, and the next day they try to buy as much as they can?

—James Feng

This was the first year that I bought anything on Black Friday. As I mentioned in the previous post, I made an online purchase of the PC game World of Warcraft: Wrath of the Lich King for $10 (down from $39.99). With a month of college left, I do not plan to play this expansion anytime soon; however, I do plan to try leveling from 70 to 85 at some point within the next year.

James’s quote captures the contradiction between the amiability of Thanksgiving and the materialism of Black Friday. Humans are usually consistent creatures. We don’t honestly believe one thing and then suddenly believe the opposite. So either we are not truly thankful on turkey day or we are not truly greedy on shopping day. I think it’s a bit of both.

Life at Cornell

Because of the experiment I mentioned last post, I haven’t been posting much, so with this post I’d like to return to my normal posting schedule. Well, a “schedule” never really existed, so what I mean, then, is a more frequent schedule. Until my next experiment…

Question Mark
Still looking for ideas for my next experiment...

Anyway, on to life outside of WoW in the last 20 days. I’ve been doing okay in my classes overall. Here are my courses in order from easiest to most difficult:

  • CS 1610 (Computing in the Arts): We still have not had a prelim or received any grades yet. The content is pretty straightforward.
  • SOC 1101 (Intro to Sociology): I’m at an A- right now, but we just had the second prelim yesterday. I felt I didn’t do as well on it as on the first prelim, but that seems to be the general consensus, so with the curve, it may be similar.
  • HIST 2500 (Technology in Society): We don’t have prelims, but instead, essays. We have three such essays that each count for 25%, and the other 25% is participation. I received an A on the first essay, but admittedly, I pulled an all-nighter for it, and the grade was very hard earned. In contrast, I do barely any work or studying for Sociology.
  • ENGL 1170 (Short Stories): This class has a lot of reading and a lot of writing. By the end of this semester I’ll probably have written more in this class than in all my other classes combined, then doubled. Plus, all the writing is in the form of literary analysis, which is not exactly my favorite style. I think I have a B in it right now, and I doubt I will be able to raise it by very much.
  • MATH 2230 (Theoretical Linear Algebra and Multivariable Calculus): This is by far my hardest class. The class median score on the first prelim was a 47, which I happened to get. It curved up to a B. Not bad, but it is so different from high school, where I was used to A+’s in math without doing any work. Plus, I used to be able to understand the concepts without doing the homework, and now, in college, I am starting to not understand the concepts even though I am doing the homework. My old theory: Math is easy. New theory: Math is tough.

I should probably mention some other aspects of Cornell as well. The weather has recently turned cold. For example, it is, at the time of this post, 40° F, and according to the Weather Channel, this will drop to 33° F later tonight.

Snowman
Just one degree lower...

I hear that in Austin, the daytime temperatures are still reaching the 80s. Lucky! 😛

Moving on… One thing I love about Cornell is the libraries. My favorite ones so far are the Uris Library and the music library (in Lincoln Hall). Uris has an old-fashioned appearance, and for some reason, that makes my productivity increase dramatically (though the most important aspect is likely the quietness). On the other hand, the PCL at the University of Texas looks new and modern, and for some reason, I never had much productivity in it.

The music library at Cornell is quite modern as well (and despite the name, it is actually quieter than, say, the Olin library). What makes it modern is, well, one day, I heard this mechanical sound and saw, with my own eyes, one of the bookshelves moving! It was like a scene from a Harry Potter movie…

Andrew Dickson White Library
The Andrew Dickson White Library within Uris Library. It's not the one with moving bookshelves, but still...

I’ve probably spent more time in libraries in this semester so far at Cornell than during all of high school combined. I also find them very good for creative work.

Moving on again… Band! I will just have to say here once again that the BRMB (Big Red Marching Band) is amazing! It’s so much better than high school marching band. On October 8/9 (which was during the middle of my experiment), we traveled to Boston for the Cornell–Harvard game! Neither team was that great (I’m from Austin, so I am qualified to judge football competency), and we somehow managed to let Harvard catch two of their own punts. Seriously? (Harvard won 31–17.)

There are many things I would say about the trip, which was very interesting and eventful, but I am forbidden from saying anything about the bus ride. (What happens on Bus 5 stays on Bus 5.) I stayed, as did the majority of the trumpet section, with a couple (both in number and in marital relation) of Cornell band alumni on Friday night before the game. It was a fun night.

Wow, I’ve written nearly 800 words so far. It’s about time I get to the second, and what I originally intended as the main, subject of this post:

The Principles of Scientific Management

The what of what? Actually, most people in my audience have heard of this work before, as they have likely taken AP US History or a related history course at some point. When the course gets to economic progress in the early twentieth century, the textbook mentions Henry Ford and Frederick Winslow Taylor, the latter being the namesake of “Taylorism.”

A refresher: Taylorism, or scientific management, is an economic theory that focuses above all on efficiency. It is concerned with maximizing productivity. That’s about all that’s mentioned in APUSH. (Here are Wiki links for Frederick Taylor and scientific management if you are interested.)

Frederick Taylor
Frederick Taylor (1856-1915)

In our HIST 2500 class, “Technology in Society,” we just read Taylor’s work that founded this theory: a treatise called The Principles of Scientific Management (1911). Around that time, labor and employers were generally not on friendly terms with each other. Remember all those labor strikes and unions you had to memorize for APUSH? Yeah…

Taylor was an engineer who proposed a solution, scientific management, to deal with this social issue. His goal was to resolve the management–labor conflict with a system that would be beneficial to both employers and workers. Scientific management, he argued, would enable workers to be much more efficient, and thereby more productive. This would allow a smaller number of specialized workers to produce much more than a larger number of normal workers, which would in turn allow the employer to raise wages and still increase profit.

We are not talking about minor improvements here. Taylor didn’t argue that 10–20% increases in productivity would solve the labor issue. His analysis in the book shows that in many industries the daily productivity of one worker could be doubled, and in some cases tripled or more. This means that not only were the employers gaining more revenue, but the workers were also earning higher wages. And, as Taylor implies, this increase in production would also lower the prices of manufactured goods, which helps the common people: they have more money and can buy cheaper goods. It’s a win-win-win situation.

So how exactly does this increase in productivity occur? The idea is to make every part of every task as efficient as possible. For shoveling, a group of scientists carefully analyzed which type of person was best suited for the job. They also figured out the optimum load on the shovel (21 pounds—any more or less in one scoop would reduce overall efficiency), which type of shovel should be used for different materials, and even what material the bottom of the bin being shoveled from should be made of. They figured out how many rest breaks the workers should take, how long each should last, and when they should be scheduled. And they analyzed each motion in shoveling to figure out which ones were necessary and which were useless, which movements were faster and which slower, and how to shovel so as to move the greatest amount of material in the least amount of time.

Stopwatch
The single most important tool in scientific management.

My crazy idea is to apply the theory of scientific management to other things. Oh wait, that’s already been done. Often with unremarkable consequences.

What I really should do is apply some degree of scientific management to my life, that is, have a schedule. At college I am going pretty much without one. Then again, NOT playing WoW probably does far more to increase my productivity than whatever I could apply from scientific management. Plus, applying scientific management requires at least two people, so if I were to try it, someone would need to be my “manager.” Interesting, but no thanks.