Tuesday, December 18, 2007

What I'm Playing: Super Mario Galaxy

A couple of weeks ago I started playing Super Mario Galaxy. Make no mistake about it: this is a truly amazing, top-notch game. For anyone who has played Mario games before, it is like meeting an old friend -- all spruced up and full of new stories to tell. For those new to the series, the game demonstrates over and over why Super Mario World, Mario 64, and others are considered some of the greatest video games in history.

The Mario series is not about story line. Yes, there is a story -- which is almost always the same -- Princess Peach is kidnapped by Bowser and Mario has to rescue her. The story is just an excuse for a collection of challenges, usually focused around a launch area: the map in Super Mario World, Peach's castle in Super Mario 64, Delfino Plaza in Super Mario Sunshine... You could say that the Mario video games are formulaic -- in the best sense of the word.

And Super Mario Galaxy is no exception. You explore different galaxies collecting stars for achieving certain feats and you use a floating observatory as your launch pad (literally). The story is the same: Bowser kidnaps Peach; Mario must rescue her and save the universe in the process. What could be wrong?

So I was all the more shocked -- and disappointed -- when I started playing the game for the first time. Let me repeat: this is an amazing game with some of the best game play ever. But the first 30 minutes of the game are some of the most disappointing, confusing, and annoying moments I have experienced in a long time.

The disappointment is all the more intense because the game -- even from the trailers -- is fantastic. How could they have messed it up so badly?

This is the exact opposite of the sense of wonder I described before when a game has a driving vision from start to finish. In Mario Galaxy it looks like the development team completed a fantastic game and then handed it over to a team of amateurs to tack on the opening sequence.

For the first 30 minutes of the game you are told the story, given a chance to practice the game play, and transitioned into the game proper. In Mario 64 DS, this takes about 5 minutes. (In the original Super Mario 64, even less, since there is no predefined practice.) Mario receives a letter from Peach, arrives at the castle -- which will be his jumping-off point -- finds the door locked, and must chase a rabbit who has stolen the key. The rabbit is an obvious ploy to force you to practice using the controls. But it is also very effective, quite short, and brings you directly into the game. Once you have the key, off you go.

In Super Mario Galaxy you also start with a letter. You are also given control to run down a path, collecting stars and talking to a host of toadstool characters (most of whom have nothing interesting to say). Then there is a longish cut scene (which you cannot control) of Peach being kidnapped and Mario being knocked unconscious. Fade to black. Next thing you know, you are being woken up by a star who changes into a rabbit (with two of his brethren) and tells you to catch them. Wait a minute! I thought I'd already gone through the training, running down the hill to the castle?

But since there is no other way to advance, you chase the rabbits. Once you catch them, they tell you another story about star fragments and send you to talk to a fairy for more of the story. Again there are cut scenes you can't control and a story that is only barely related to the princess's disappearance, and you end up in a third and final location: the star observatory. Some more explanation, then off to your first planet to find some fragments.

By this time -- when you actually enter the game itself -- you are so confused that you are afraid yet another new character will stop you and force you to perform more training. In fact, for the next 15 to 20 minutes, this apprehension clouds the game play. But eventually you realize you have reached the game proper and start to enjoy what is really a masterful piece of craftsmanship and an exhilarating gaming experience.

What went wrong here? Well, just about everything. The beginning of the story is told in still frames with text -- but not the stylized frames of, say, Zelda's Wind Waker or Phantom Hourglass, where the frames themselves tell a story. (And in Phantom Hourglass they become part of the story itself!) Here, the frames are dull and nondescript. Then the game begins -- or so you think, since you gain control. But there is only one way to go (downhill to the castle) and far too many toadstools repeating instructions to you.

Then comes the cut scene. Despite dramatic camera angles and smoke effects trying to mask it, it is hard not to notice how dated the graphics of this section are. The objects are rudimentary (in 3D terms) and blocky, the textures are simple... It looks like N64-quality graphics rather than something from two generations later.

I am not claiming graphical superiority is necessary. I think the opening of Phantom Hourglass is spectacular, on far more limited hardware. But that opening is designed to exploit and celebrate what the Nintendo DS can do; the opening of Mario Galaxy seems satisfied with making do. This sloppiness is all the more galling because, once you get into the game itself, the graphics are bright and seamless -- a perfect match of game and hardware. The opening and the game proper stand in stark contrast to one another.

Finally, the opening overall is far too long, and tells a confusing, disjointed story that disrupts rather than justifies the game play. All I can say to other players who are starting the game is "hang in there". Try to ignore the disappointment of the opening and enjoy a brilliant game once you get through it. And Nintendo, please try not to do that again. Thank you.

Monday, December 17, 2007

Web 2.0 and the Lack of Process

I was at a meeting a few weeks ago to establish our plans for the upcoming year. In my part of the company, knowledge management (KM) efforts are divided into three logical categories: people, process, and technology.

Now, this categorization is a relatively innocuous way to manage the projects. However, I always balk at it a little because -- although these are clearly the three key aspects of KM -- you need all three working together for any one project to succeed. Separating projects into people projects, process projects, and technology projects is artificial and may reinforce false assumptions about the balance of emphasis. Still, the three categories are ultimately a handy way to divvy up responsibility. Besides, since the team I am on is small and works well together, it all comes out in the wash.

I only mention this because it led me to an interesting discovery.

While I was pondering what to say about our current efforts with Web 2.0 technologies, it occurred to me why this topic creates so many problems for business today. Web 2.0 is all about people (the wisdom of crowds, etc.). It is also about technology (the "stuff" that makes Web 2.0 so interesting). But there is no process in Web 2.0.

By that I mean the technology itself makes no assumptions about how or why it should be used. What is the usage model for Twitter? Who should blog? What should you use a wiki for? The answer -- if you bother to ask -- is usually "whatever you want!"

There are plenty of people out there willing to give their opinions (including myself, it appears). But at their core, most of the interesting Web 2.0 technologies provide capabilities whose potential increases as the number of users increases, but few if any limitations -- or even guidance -- on their use.

For example, a wiki is simply a web site anyone can edit. Why? Oh, there are many potential uses. But there are no restrictions. The result is that many wikis (most, I suspect) become trash cans of unrelated, outdated, and inappropriate content.

A wiki becomes interesting once it has a purpose. By that I mean someone decides what the wiki should be used for, defines a structure, and decides on a process to achieve that structure. Wikipedia, the classic example of a successful wiki, is also a prime example of the amount of work needed to make a wiki a success: a clear statement of purpose, a well-defined process for contributing, and mechanisms for handling exceptions. None of this is inherent in the wiki itself. It must be defined and agreed upon by the owners and maintainers of the wiki, which is no small feat. The consequence is that many wikis are created hastily, without the necessary process, and end in failure or abandonment.

Compare this to earlier technologies -- email, for instance. Process is designed into the very core of most of these older technologies. You have an email client. You choose whom to send email to. They can read it, reply to it, forward it, save it, or delete it. That's all. The technology embodies the processes previously defined for physical mail.

Even more recent technologies have process built into their design and nomenclature:
  • In instant messaging, you send "messages" to individuals, who can choose whom they will receive messages from. Once a message arrives, a conversation starts and you can reply or close the dialog. Period.
  • IRC is divided into "channels" that users can open and participate in. Each channel implies a separate topic.
  • Even the web site -- the very essence of Web 1.0 -- which on the surface would not seem to dictate a usage model, is laden with implicit assumptions about usage and structure. The URL itself defines a host, a directory (hierarchically structured), and a page. Ownership is implied by whoever owns or manages the hosting server. A logical structure is ascribed to the information by the hierarchy of directories. And finally, the content itself is chunked into "pages". (See the sketch just below this list.)
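
To make that concrete, here is a minimal sketch -- in Python, using a made-up address, since none appears above -- of just how much implicit structure a single URL carries:

    from urllib.parse import urlparse

    # A purely hypothetical URL, for illustration only
    url = "http://www.example.com/products/widgets/overview.html"
    parsed = urlparse(url)

    host = parsed.netloc                       # ownership: whoever runs the server
    parts = parsed.path.strip("/").split("/")
    directories = parts[:-1]                   # the hierarchical structure
    page = parts[-1]                           # content chunked into "pages"

    print(host)         # www.example.com
    print(directories)  # ['products', 'widgets']
    print(page)         # overview.html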

But a wiki has no implicit structure, no directories (beyond the automated "recent changes" and the alphabetical list of titles), and no owner (if you follow the original concept that anyone can edit). And wikis are not alone in this laissez-faire approach. Blogs place no structure on their contents except chronology. Tags allow users to apply structure if they wish, but tags are optional and under the individual user's control. (There is no common vocabulary.) And collectively, blogs do nothing to help readers sort through the massive collection of information qualitatively. Which blogs deserve attention? It is entirely up to the reader to decide.
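
To put the contrast in concrete terms, here is a minimal sketch (in Python, with interfaces I invented purely for illustration) of the difference: email's operations encode a process step by step, while a wiki exposes a single open-ended operation and leaves the process to its users:

    class EmailClient:
        # Each operation mirrors a step in the familiar paper-mail
        # process; the process is baked into the tool itself.
        def send(self, to, message): ...
        def reply(self, message): ...
        def forward(self, message, to): ...
        def save(self, message): ...
        def delete(self, message): ...

    class Wiki:
        # One open-ended operation and no built-in process: who edits
        # what, when, and why is left entirely to the users.
        def edit(self, page, new_content): ...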

This is the complete antithesis of today's corporate intranet, where "quality" and "consistency" rule and millions of dollars are spent each year to make sure only the right information is posted in the appropriate location, by the right people, using the right procedures. An entire industry -- content management systems (CMS) -- has developed to make this possible.

So if Web 2.0 is so completely lacking in structure or process, why is corporate America so interested in it? The answer is that it has proven to succeed exactly where corporate intranets have failed.

Within the corporation, getting people to communicate with each other and share ideas (outside of the set patterns of regular meetings and organizational structure) is like pulling teeth. They don't have the time, don't know how, etc. But set them loose on the internet and they will willingly comment on anything from favorite sports to the detailed pros and cons of a specific model and brand of VCR, free of charge. Similarly, getting anything posted onto a corporate web site can take days or weeks as it passes through the approval and formatting processes. Updating a wiki entry is a matter of minutes.

Nobody complains about the ease of use of Wikipedia (except those who claim it is too easy to add false information). The same cannot be said for any corporate application I can think of.

So web 2.0 appears to resolve two of the key problems of corporate applications: acceptance and adoption. But volume of use and acceptability of the software are not measures of business value. Although they are antidotes to the most common complaints about corporate IT, they don't in and of themselves solve the problem of effectively managing corporate knowledge.

So, IT is interested, but they are afraid.

They are afraid of what will happen when you set loose an undisciplined technology inside the firewall. How do they support it (when in many cases it isn't commercial code)? How do they control it (when they don't know what it should or should not be used for)?

They are afraid, and well they should be, because history has taught them that technology for technology's sake can become a monster. And as much as everyone would like to think business applications could take on the viral characteristics of Web 2.0, it is not likely to happen. It won't happen because the audience (the corporate employee base) is too small; because that audience, in general, doesn't have the time or the desire to experiment; and because, even if a valid business case develops, three times out of four there will be an existing application the Web 2.0 technology must compete with. Corporations do not like competing technical solutions because they cause confusion, cost money, and complicate what the company wants to keep as simple step-by-step procedures.

That doesn't mean Web 2.0 doesn't have a place within the corporate firewall. It just means it doesn't have a predefined place within the business world, and it will take some intelligence and deep thinking to map it to the appropriate processes.

Wednesday, December 12, 2007

Why I Don't Twitter

I don't Twitter. I can't Twitter. Why? Because I am Twitter-challenged.

The problem is I am a writer. Or, rather, it is the way I write. My blog entries take days, sometimes weeks, to complete as I worry each sentence and paragraph into formation. But that's OK, because they are not time-dependent. Twitter is too fast for me.

I wish I could Twitter because there is something new and potentially transformative about this technology.

At first glance, Twitter looks like a cross between instant messaging (IM) and blogging. So it is difficult for newcomers to see where the innovation comes in. Although the technology itself is not revolutionary, Twitter is interesting because its usage model -- or potential usage models -- is innovative.

Instant messaging is like stopping by someone's office for a quick chat: fast, interactive, intimate. Unlike email, which is much more like sending a letter and waiting for a response (or not), the presence information and immediacy of IM give you the interactivity of real conversations. Blogging, on the other hand, is like posting a note on your office door. People may read it as they walk by, and anyone who comes to see you will see it. They may even scribble a response (i.e., a comment) on it. This use of blogs is far more public, yet it remains a personal style of interaction.

Twitter is just blogging in shorter, more frequent bursts. This speed and brevity (one line at a time) is what gives it its similarity to IM. But the broadcast mechanism (not targeting a specific individual, plus the asynchronicity between writing and reading) is what makes it blog-like.

As I say, the technology itself is not innovative. You could use a regular blog for this purpose if you wanted to. But it is not the individual twitterer that is transformative.

Whereas IM is like a one-on-one conversation and blogging is like posting notes, Twitter is like the office watercooler, the coffee machine, the cafeteria table: wherever groups gather to chat and exchange the trivia of the day. What makes Twitter interesting is the collection of twitters around various topics, events, or pre-existing social groups. Twitter supports these communities (called "blocks") to some extent, and there are new hacks that make it usable on an event-by-event basis (such as eventtracker).

The ability to define the realm of twitters you follow, and to respond to those twitters, creates a virtual meeting place with a type of interactivity and ease of use that email, teleconferencing systems, and virtual worlds cannot match. Yes, there is significant confusion and cacophony. There is far more noise than signal. But quite frankly, this is exactly what makes face-to-face meetings both enthralling and vital to the knitting of a social fabric among geographically dispersed individuals.

You don't choose your lunch companions because they will teach you something new. You choose them because they are fun to talk to, comfortable to be with, or just familiar. The trivia and tiny fragments of information shared in the grousing, joking, and storytelling -- fragments that might be remembered and prove critical later on -- are a serendipity that is hard to reproduce in a less casual, less chaotic environment.

It is this chaos of random facts that Twitter reproduces, and it is what makes it fascinating. In the public realm, the volume of twitters is almost intoxicating. As a mechanism for maintaining the connectivity of random personal interactions, Twitter looks like a very attractive tool for geographically dispersed organizations.

Unfortunately, despite all its potential, there are two possibly fatal flaws in Twitter.
  • As I said before, I can't Twitter. It is the same personality deficit that makes me hang back and not talk much during parties or at large group luncheons. I listen, I observe, but I tend to be much quieter than I am in one-on-one interactions (which is why I can IM but not Twitter). If I am not alone in this affliction, Twitter may only be useful to a certain personality type -- significantly reducing its potential usefulness as a group collaboration tool.
  • Despite its simplicity and ease of use, the twitters themselves have a techie feel that makes them look as cryptic as an IRC channel. The use of special characters and Twitter-specific references (to the posting application) puts off novices and occasional users. For example (picked at random):

@snbeach thanks! @Digimom ah, no, we had them remove the clothes before they delivered it.

Twitter is still in its infancy. The technical and presentation issues might be addressed as the service evolves. However, the social hurdles will be harder -- if not impossible -- to overcome. But daydreaming just a little: if someone could mix the intimacy of IM with the social context of Twitter (and the ease of use of both), they might just come up with the next killer app...