College and Smartphones

Photo from Newegg.

Last December I obtained a Samsung Galaxy S3, my first ever smartphone. Yes, I’m only about 6 years late to the smartphone party. Before this, I had been using a Motorola Razr flip phone for years and didn’t really think a smartphone was necessary. But after just three months, it is already difficult to imagine not having one.

Smartphones in College Life

According to various reports I found, somewhere between 50% and 70% of college students have a smartphone. But at a school like Cornell, whose students come from families that are more affluent than average, it is reasonable to assume the percentage is much higher. In fact, almost everyone I know here has a smartphone.

According to one source, about 27% of all college students had a smartphone in 2009. Another source puts the figure at 10% for 2008. Yet at Cornell, the figure was already 33% by 2008. It does not seem unreasonable to estimate that Cornell’s smartphone adoption runs a bit more than a year ahead of the national trend, and I would bet that between 80% and 95% of Cornell students currently have one.

The social implications of having a smartphone here are significant. To have the Internet at your fingertips is to have all the knowledge you need on demand about activities, people, or just random information in a conversation. It is also to check email, respond to messages, or to share videos. Since the social norm is to have one, and the students expect other students to have them, most things are done with a smartphone in mind. At Cornell, to not have a smartphone is to be technologically behind. The Red Queen analogy comes to mind: “Now, here, you see, it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!”

The “Superhuman” Extension

I resisted getting a smartphone for a long time because I thought (1) having a laptop was sufficient, and (2) not having a smartphone brought more peace of mind. However, the utility gained from having one far outweighs the silence of not having one. I can check my email at any time, look up anything I want, and, a few weeks ago while in NYC for an interview, I even used it as a GPS.

There is an interesting article by John D. Sutter of CNN Tech from last September, titled “How Smartphones Make Us Superhuman.” It obviously makes the case in their favor. An excerpt:

In addition to enabling us to video events on a second’s notice, potentially altering the course of global politics, these high-tech human “appendages” increasingly have become tools for fighting corruption, buying stuff, bolstering memory, promoting politics, improving education and giving people around the world more access to health care.

While I don’t use the smartphone for any political reasons as Sutter might admire, I find it invaluable as a college student to be able to access information from anywhere. In classes, I still use my laptop, as typing is faster once it is set up. But on the go, and on many other occasions, using a laptop is impossible. It may sound strange, but without the Internet, I would feel disconnected from the world.

In fact, last May my laptop broke down and would not even reach the boot menu. While it was being repaired (for about a month), I had no immediate Internet connection for the first time in years, and it felt as though something was deeply missing. I had to go to the school libraries multiple times a day, often at odd hours. (Though I did manage to write a 23-page math paper, saved piecemeal across various emails and flash drives.) Not having a computer was certainly survivable, but hugely inconvenient.

Similarly, anyone can obviously survive without a smartphone. However, the productivity, convenience, time-efficiency, and near-omniscience it offers are clearly worth it for any student living in 2013.

Is the Virtual World Really An Escape from Reality?

Or are they on a collision course?

Google Glass

The Role-Creating World

One of the most popular and successful genres of gaming is the role-playing game (RPG). In an RPG, the player takes on a character in a (usually fantasy) world and develops skills and abilities within that world to progress. In the virtual world, one can grow more powerful or wiser, and take on more difficult obstacles.

Traditionally, these role-playing games—and in fact, all commercial video games—were played as an escape from reality. One could escape the loud, busy, modern world and live instead in a quiet, simple, and perhaps peaceful world.

WoW Screenshot 4
Screenshot from the game World of Warcraft.

One of the strongest effects of these games was to let players disregard the socioeconomic stratification of the real world. In the virtual worlds of RPGs, everyone starts equal and has the same opportunities.

From an extensive CNN report on gaming:

A professor: “…people do not feel they have the freedom and kind of their own power to change their own social roles and their own identities. But in cyberspace, people do not remember… your wealth.”

From a gamer interviewee, in the same report about the RPG known as Maple Story:

“It’s a game where you can make people grow and develop within a certain line of work. …you get a feeling that you are improving.”

The anonymity of online gaming meant that players could ignore social and economic barriers in real life, and feel accomplished by themselves.

The Facebook Conundrum

The face of gaming was forever changed by Facebook. Instead of playing with anonymous players from all around the country, and even all around the world, players of Facebook games play with their real-life friends.

Screenshot from Farmville. Courtesy of Wikipedia Commons.

Moreover, many Facebook games have microtransactions, in which players pay real money to gaming companies in exchange for virtual goods or virtual currencies. In “older” style RPGs, on the other hand, all currencies are in-game only, and there is no legal exchange between virtual money and real money.

These are two big factors:

  • The veil of anonymity has lifted; and,
  • Real money is now able to affect your character’s position in the virtual world.

It doesn’t take a genius to see where this is headed: into socioeconomic stratification in the virtual world, which was supposed to be the one place where players could escape from real world problems.

That is, in classic RPGs, more successful players could attribute their victories to skill, knowledge, and effort. But in microtransaction-based games, a player’s success could be attributed simply to being wealthier in the real world.

Diablo 3 and Marxism

Even in these microtransaction-based Facebook games, the economy can be thought of as state-controlled. Almost always, the company determines the prices of all virtual goods and currencies, and the company itself is the sole seller. Zynga and Nexon are two examples.

Activision Blizzard took the idea of microtransactions one step further and created a capitalist economy, in which players sell goods to each other while the company takes a 15% cut of each sale.
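As a rough sketch of how such a cut works (the 15% figure comes from the paragraph above; the helper function and single-percentage-fee model are my own simplification, not Blizzard’s actual fee schedule):

```python
TRANSACTION_FEE = 0.15  # the company's 15% cut, as described above


def seller_proceeds(sale_price: float) -> float:
    """Amount the selling player keeps after the company's cut.

    Simplified model: a single percentage fee, with no listing or flat fees.
    """
    return sale_price * (1 - TRANSACTION_FEE)


# A $250 sale nets the seller $212.50; the company keeps $37.50.
print(seller_proceeds(250.0))
```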

Screenshot of the Real Money Auction House in Diablo 3. The $250 buyout is the max limit.

In the classic microtransaction models where every player who buys a particular item pays the same amount, no player feels ripped off or feels that the system is unfair.

But in the Real Money Auction House model, one player might buy a nearly identical good for half the price another player paid, perhaps because the first player studied the market and compared options more carefully. The second player ends up feeling ripped off.

In this free-market virtual economy, the stratification that arises from unregulated capitalism has taken effect. Again, one doesn’t need to read Karl Marx to see what is going on. The rich get richer by buying goods cheap and reselling them at higher prices, while the poor find it very difficult to get started; they have essentially become a working class. The Diablo 3 economy is very much akin to that of Industrial Revolution Britain.

The Future of the Virtual World

The virtual world began as an escape from reality, then transformed into a mirror of present reality, and then mutated again into a replay of human history.

If it continues down this path, then the virtual world of the future is not going to be the virtual world we saw in our dreams.

What we imagined virtual reality to be.

It will not be a place where we can set aside the real world and escape our problems for a few hours. It will not be a place where we have fun, meet people we would never otherwise see, and talk about the little things in life without worrying about our financial position.

Instead, it will be an extension of the real world and everything in it. Those who are wealthier in the real world will have more options in the virtual world, and those who are poorer will remain poor. Ultimately, if virtual reality does not return to its roots as an escape from reality, people will end up escaping the virtual world as well.

Curiouser and Curiouser

The wait is almost over. Check out NASA’s official page on the Curiosity rover.

The rover is currently closing in on Mars, with a few hours before landing. The image below links to NASA’s “Where is Curiosity” page with real-time locations.

The landing itself will be very intricate. Many different processes will have to work perfectly for a successful landing. Let’s hope all goes well.

Edit: It successfully landed! Great job NASA! Here is the first picture, in which Curiosity overlooks its own shadow:

The Map of Facebook Connections

[giant map]

This map was created recently by Paul Butler, an intern at Facebook. Roughly, the lines on the map represent Facebook connections between different cities.

I think we would best learn from this map if we compare it to two others. The first is the famous Earth at night picture:

Wow, they look pretty similar, you might say, focusing first on the bright hubs of North America and Europe. But there are three major exceptions: China, Russia, and the Middle East. (There are other noticeable holes, like Bangladesh and Vietnam.) Asia looks pretty dim on the Facebook map. Sure, India, South Korea, Japan, and Southeast Asia are lit up, but you can see a giant hole, devoid of light, a pit where, in many of those places, Facebook is banned.

In my previous post I mentioned that it was no surprise Mark Zuckerberg was named TIME’s Person of the Year for 2010. But looking at the map above, we can easily see that Facebook has not yet reached as large a user base as it could. Speaking of users, where do they reside? Here is a map of population density around the world:

Again, there is quite a close overall match with the Facebook map. And yet again, there is a disparity in China, Russia, the Middle East, and Africa.

We can also see graphically that the percentage of people who use Facebook in North America is much higher than in the rest of the world. Compare, for example, the eastern half of the United States with the entirety of India. Though India has over three times the population of the United States, the Facebook connections in the eastern United States alone vastly outshine India’s.

As seen from the map, Australia’s eastern coast and New Zealand also have a disproportionately high percentage of Facebook users.

Before I finish, I’d like to show just one more image: the Facebook map zoomed into the United States:

Damn, that’s home, for me and most likely for the vast majority of my readers. YOU are on that diagram. You probably have a Facebook account, and you are connected to this virtual map. It is not a physical map; that belongs to an older age. With Zuckerberg officially recognized, named above other world leaders, it is an appropriate time to say that this year, 2010, is the year we can officially look back and say we have exited an old phase of society. A new one, THAT one, in the picture above, has replaced it.

No Surprise that Zuckerberg is TIME’s Person of the Year 2010

Just take a look at the numbers:

  • 2004 – 1 million
  • 2005 – 5.5 million
  • 2006 – 12 million
  • 2007 – 50 million
  • 2008 – 150 million
  • 2009 – 350 million
  • 2010 – over 550 million, approaching 600 million

These are the numbers of Facebook users at the end of each year.
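The year-over-year growth implied by these figures is easy to compute; here is a quick sketch in Python, using the numbers from the list above (taking 550 million for 2010):

```python
# Facebook users (in millions) at the end of each year, per the list above
users = {2004: 1, 2005: 5.5, 2006: 12, 2007: 50,
         2008: 150, 2009: 350, 2010: 550}

years = sorted(users)
for prev, curr in zip(years, years[1:]):
    multiple = users[curr] / users[prev]
    print(f"{prev} -> {curr}: {multiple:.1f}x growth")
```

The 2008-to-2009 jump comes out to about 2.3x, the more-than-doubling that made 2009 arguably the year of social networking.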

It wasn’t any one year of growth in particular that made Facebook’s founder Mark Zuckerberg the Person of the Year 2010. If not for the political and economic concerns and recession in the previous years, Zuckerberg might have received the title sooner. (Last year, for example, the Person of the Year was Ben Bernanke.)

Perhaps there’s something magical about the number 500 million, which Facebook passed in July 2010. But if anything, 2009 was the year of social networking. In 2009, the more-than-doubling jump from 150 million to 350 million meant that the number of Facebook users surpassed the population of the United States.

When I compiled the Legacy of 2009 post last year, the only coherent trends I could find were tech trends, specifically those involving social networking. Some quotes, all from 2009:

  • Doug Gross: “This [2009] was the year that online social media exploded.”
  • John D. Sutter: “Engineers didn’t make huge improvements to technology in 2009. The year’s big tech names — Twitter, Facebook, Google, Apple, Amazon — all existed before January. Instead, this is the year technology changed us.” (emphasis added)
  • Sutter, again: “We could have done any of these things in 2008. But we embraced in unprecedented numbers a digital-centered life in 2009.”
  • Pete Cashmore: “One factor that’s dramatically different at the end of this decade versus the beginning: Ubiquitous connectivity.”

It seems that Sutter’s point about technology changing us strikes an even stronger chord in 2010 than in 2009. If 2009 was the beginning of a new society of mass social networks, then 2010 was the year in which we began to really surround our lives with them.

TIME this year is honoring not only a person but a technology. And not just one technology, but many. Cyberspace in 2010 is very different from what it was in 2000. In between came the rise of blogging (and later, micro-blogging), Web 2.0, mass file-sharing, YouTube, and of course social networking sites. In the last 10 years, the only other Person of the Year connected to technology was Bill Gates, who shared the 2005 title with his wife and the U2 singer Bono. They were recognized, however, not for technology but for philanthropy. (Not that philanthropy is unimportant.)

It is about time that TIME looked around and noticed, “Oh, society has changed!” By naming Mark Zuckerberg as the Person of the Year, TIME has honored not only one person in one year, but also, through him, the vastly consequential online technologies of the decade.

Embedded Powerpoints!

Props to Microsoft for this feature. I’m thinking of ideas for a PowerPoint post already. 🙂

Edit: Nice! If you modify the PowerPoint, it will automatically update on the blog and you will not need to repost it!

Edit 2: This feature isn’t too polished yet; for one thing, it takes forever to load sometimes.

Edit 3 (11/29/10): Aww, it stopped working! 😦