Tuesday, December 18, 2007

What I'm Playing: Super Mario Galaxy

A couple of weeks ago I started playing Super Mario Galaxy. Make no mistake about it: this is a truly amazing, top-notch game. For anyone who has played Mario games before, it is like meeting an old friend -- all spruced up and full of new stories to tell. For those new to the series, the game demonstrates over and over why Super Mario World, Mario 64, and others are considered some of the greatest video games in history.

The Mario series is not about story line. Yes, there is a story -- which is almost always the same -- Princess Peach is kidnapped by Bowser and Mario has to rescue her. The story is just an excuse for a collection of challenges, usually focused around a launch area: the map in Super Mario World, Peach's castle in Super Mario 64, Delfino Plaza in Super Mario Sunshine... You could say that the Mario video games are formulaic -- in the best sense of the word.

And Super Mario Galaxy is no exception. You explore different galaxies collecting stars for achieving certain feats and you use a floating observatory as your launch pad (literally). The story is the same: Bowser kidnaps Peach; Mario must rescue her and save the universe in the process. What could be wrong?

So I was all the more shocked -- and disappointed -- when I started playing the game for the first time. Let me repeat: this is an amazing game with some of the best game play ever. But the first 30 minutes of the game are some of the most disappointing, confused, and annoying moments I have experienced in a long time.

The disappointment is all the more intense because the game -- even from the trailers -- is fantastic. How could they have messed it up so badly?

This is the exact opposite of the sense of wonder I described before when a game has a driving vision from start to finish. In Mario Galaxy it looks like the development team completed a fantastic game and then handed it over to a team of amateurs to tack on the opening sequence.

For the first 30 minutes of the game you are told the story, given a chance to practice the game play, and transitioned into the game proper. In Mario 64 DS, this takes about 5 minutes. (In the original Super Mario 64, even less, since there is no predefined practice.) Mario receives a letter from Peach, arrives at the castle -- which will be his jumping-off point -- finds the door locked, and must chase a rabbit who has stolen the key. The rabbit is an obvious ploy to force you to practice using the controls. But it is also very effective, quite short, and brings you directly into the game. Once you have the key, off you go.

In Super Mario Galaxy you also start with a letter. You are also given control to run down a path collecting stars and talking to a host of toadstool characters (most of whom have nothing interesting to say). Then, there is a longish cut scene (which you cannot control) of Peach being kidnapped and Mario being knocked unconscious. Fade to black. Next thing you know you are being woken up by a star who changes into a rabbit (with two of his brethren) and tells you to catch them. Wait a minute! I thought I'd already gone through the training running down the hill to the castle?

But since there is no other way to advance, you chase the rabbits. Once you catch them, they tell you another story about star fragments and send you to talk to a fairy for more of the story. Again, there are cut scenes you can't control and a story only barely related to the princess's disappearance, and you end up in a third and final location! (The star observatory.) Some more explanation, then off to your first planet to find some fragments.

By this time -- when you actually enter the game itself -- you are so confused that you are afraid yet another new character will stop you and force you through more training. In fact, for the next 15 to 20 minutes, this apprehension clouds the game play. But eventually you realize you have reached the game proper and start to enjoy what is really a masterful piece of craftsmanship and an exhilarating gaming experience.

What went wrong here? Well, just about everything. The beginning of the story is told in still frames with text -- but not the stylized frames of, say, Zelda's Wind Waker or Phantom Hourglass where the frames themselves tell a story. (And in Phantom Hourglass become part of the story themselves!) In this case, dull nondescript frames. Then the game begins -- or so you think since you gain control. But there is only one way to go (downhill to the castle) and far too many toadstools repeating instructions to you.

Then comes the cut scene. Despite dramatic camera angles and smoke effects trying to mask it, it is hard not to notice how dated the graphics of this section are. The objects are rudimentary (in 3D terms) and blocky, the textures are simple... It looks more like N64-quality graphics than something from two generations later.

I am not claiming graphic superiority is necessary. I think the opening of Phantom Hourglass is spectacular, on far more limited hardware. But that opening is designed to exploit and celebrate what the Nintendo DS can do. The opening of Mario Galaxy seems to be satisfied with making do. This sloppiness is even more galling since once you get into the game itself, the graphics are bright and seamless -- a perfect match of game and hardware. The opening and the game itself stand in stark contrast to one another.

Finally, the opening overall is far too long, and tells a confusing, disjointed story that disrupts rather than justifies the game play. All I can say to other players who are starting the game is "hang in there". Try to ignore the disappointment of the opening and enjoy a brilliant game once you get through it. And Nintendo, please try not to do that again. Thank you.

Monday, December 17, 2007

Web 2.0 and the Lack of Process

I was at a meeting a few weeks ago to establish our plans for the upcoming year. In my part of the company, KM efforts are divided into three logical categories: people, process, and technology.

Now, this categorization is a relatively innocuous way to manage the projects. However, I always balk at it a little, because -- although it is clear that these are the three key aspects to KM -- you need all three to work together for any one project to succeed. Separating projects into people projects, process projects, and technology projects is artificial and may reinforce false assumptions about the balance of emphasis. However, the three categories are ultimately a handy way to divvy up responsibility. Besides, since the team I am in is small and works well together, it all comes out in the wash.

I only mention this because it led me to an interesting discovery.

While pondering what to say about our current efforts with Web 2.0 technologies, it occurred to me why this topic creates so many problems for business today. Web 2.0 is all about people (the wisdom of crowds, etc.). It is also about technology (the "stuff" that makes web 2.0 so interesting). But there is no process in web 2.0.

By that I mean that the technology itself makes no assumption about how or why the technology would be used. What is the usage model for Twitter? Who should blog? What should you use a wiki for? The answer -- if you bother to ask -- is usually "whatever you want!"

There are plenty of people out there willing to give their opinions (including myself, it appears). But at their core, most of the interesting web 2.0 technologies provide capabilities whose potential increases as the number of users increases, but few if any limitations or even guidance on their use.

For example, a wiki is simply a web site anyone can edit. Why? Oh, there are many potential uses. But there are no restrictions. The result is that many wikis (most, I suspect) become trash cans of unrelated, out-dated, and inappropriate content.

A wiki becomes interesting once it has a purpose. By that I mean someone decides what the wiki should be used for, defines a structure, and decides on a process to achieve that structure. Wikipedia, the classic example of a successful wiki, is also a prime example of the amount of work needed to make a wiki a success: a clear statement of purpose, a well-defined process for contributing, and mechanisms for handling exceptions. None of this is inherent in the wiki itself. It must be defined and agreed upon by the owners and maintainers of the wiki, which is no small feat. The consequence is that many wikis are created hastily without the necessary process, resulting in failure or abandonment.

Compare this to earlier technologies. Email, for instance. Process is designed into the very core of most of these older technologies. You have an email client. You choose who to send email to. They can read it, reply to it, forward it, save it, and delete it. That's all. The technology embodies the processes previously defined for physical mail.

Even more recent technologies have process built into their design and nomenclature:
  • In instant messaging, you send "messages" to individuals, who can choose whom they will or will not receive messages from. Once a message arrives, a conversation starts and you can reply or close the dialog. Period.
  • IRC is divided into "channels" that users can open and participate in. Each channel implies a separate topic.
  • Even the web site -- the very essence of web 1.0 -- which on the surface would not seem to dictate a usage model, is laden with implicit assumptions about usage and structure. The URL itself defines a host, a directory (which is hierarchically structured), and a page. Ownership is implied by whoever owns or manages the hosting server. A logical structure is ascribed to the information by the hierarchy of directories. And finally, the content itself is chunked into "pages". (A small sketch of this implied structure follows the list.)
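
To make that implicit structure concrete, here is a minimal sketch in Python (standard library only; the URL is invented purely for illustration) of how an address decomposes into the host, directory hierarchy, and page described above:

# A minimal sketch of the structure a URL implies. The URL is hypothetical.
from urllib.parse import urlparse

url = "http://www.example.com/products/widgets/overview.html"
parts = urlparse(url)

host = parts.netloc                      # ownership: whoever runs the server
path_pieces = parts.path.strip("/").split("/")
directories = path_pieces[:-1]           # the hierarchical "logical structure"
page = path_pieces[-1]                   # the content chunked into a "page"

print(host)         # www.example.com
print(directories)  # ['products', 'widgets']
print(page)         # overview.html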

But a wiki has no implicit structure, no directories (beyond the automated "recent changes" and alphabetical list of titles), and no owner (if you follow the original concept that anyone can edit). And wikis are not alone in this laissez-faire approach. Blogs place no structure on their contents except chronology. Tags allow users to apply structure if they wish, but tags are optional and under the individual user's control. (There is no common vocabulary.) And collectively, blogs do nothing to help readers sort through the massive collection of information qualitatively. Which blogs deserve attention? It is entirely up to the reader to decide.

This is the complete antithesis of today's corporate intranet, where "quality" and "consistency" rule and millions of dollars are spent each year to make sure only the right information is posted in the appropriate location by the right people using the right procedures. An entire industry -- CMS -- has developed to make this possible.

So if web 2.0 is so completely lacking any structure or process, why is corporate America so interested in it? The answer is that it has proven to succeed exactly where corporate intranets have failed.

Within the corporation, getting people to communicate with each other and share ideas (outside of the set patterns of regular meetings and organizational structure) is like pulling teeth. They don't have the time, don't know how, etc. But set them loose on the internet and they will willingly comment on anything from favorite sports to the detailed pros and cons of a specific model and brand of VCR, free of charge. Similarly, getting anything posted onto a corporate web site can take days or weeks as it passes through the approval and formatting processes. Updating a wiki entry is a matter of minutes.

Nobody complains about the ease of use of Wikipedia (except those who claim it is too easy to add false information). The same cannot be said for any corporate application I can think of.

So web 2.0 appears to resolve two of the key problems of corporate applications: acceptance and adoption. But volume of use and acceptability of the software are not measures of business value. Although they are antidotes to the most common complaints about corporate IT, they don't in and of themselves solve the problem of effectively managing corporate knowledge.

So, IT is interested, but they are afraid.

They are afraid of what will happen when you set loose an undisciplined technology inside the firewall. How do they support it (when in many cases it isn't commercial code)? How do they control it (when they don't know what it should or should not be used for)?

They are afraid, and well they should be, because history has taught them that technology for technology's sake can become a monster. And as much as everyone would like to think business applications could take on the viral characteristics of web 2.0, it is not likely to happen. It won't happen because the audience (the corporate employee base) is too small; the audience, in general, doesn't have the time or the inclination to experiment; and even if a valid business case develops, three times out of four there will be an existing application that the web 2.0 technology will compete with. Corporations do not like competing technical solutions because they cause confusion, cost money, and complicate what the company wants to maintain as simple step-by-step procedures.

That doesn't mean web 2.0 doesn't have a place within the corporate firewall. It just means it doesn't have a predefined place within the business world and it will take some intelligence and deep thinking to map it to the appropriate processes.

Wednesday, December 12, 2007

Why I Don't Twitter

I don't Twitter. I can't Twitter. Why? Because I am Twitter-challenged.

The problem is I am a writer. Or, rather, it is the way I write. My blog entries take days, sometimes weeks, to complete as I worry each sentence and paragraph into formation. But that's OK, because they are not time-dependent. Twitter is too fast for me.

I wish I could Twitter because there is something new and potentially transformative about this technology.

At first glance, Twitter looks like a cross between instant messaging (IM) and blogging. So it is difficult for newcomers to see where the innovation comes in. Although the technology itself is not revolutionary, Twitter is interesting because its usage model -- or potential usage models -- are innovative.

Instant messaging is like stopping by someone's office for a quick chat: fast, interactive, intimate. Unlike email, which is much more like sending a letter and waiting for a response (or not), the presence information and immediacy of IM gives you the interactivity of real conversations. Blogging, on the other hand, is like posting a note on your office door. People may read it as they walk by, or anyone who comes to see you will see it. They may even scribble a response (i.e. comment) on it. This usage of blogs is far more public and yet still a personal style of interaction.

Twitter is just blogging in shorter, more frequent bursts. This speed and shortness (one line at a time) is what gives it its similarity to IM. But the broadcast mechanism (not targeting a specific individual and asynchronicity between writing and reading) is what makes it blog-like.

As I say, the technology itself is not innovative. You could use a regular blog for this purpose if you wanted to. But it is not the individual twitterer that is transformative.

Whereas IM is like a one-on-one conversation and blogging is like posting notes, Twitter is like the office watercooler, the coffee machine, the cafeteria table -- wherever groups gather to chat and exchange the trivia of the day. What makes Twitter interesting is the collection of twitters around various topics, events, or pre-existing social groups. Twitter supports these communities (which it calls "blocks") to some extent, and there are new hacks that make it usable on an event-by-event basis (such as eventtracker).

The ability to define the realm of twitters you follow and respond to those twitters creates a virtual meeting place with the type of interactivity and ease of use that email, teleconferencing systems, and virtual worlds cannot match. Yes, there is significant confusion and cacophony. There is far more noise than signal. But quite frankly, this is exactly what makes face-to-face meetings both enthralling and vital to the knitting of a social fabric among geographically dispersed individuals.

You don't choose your lunch companions because they will teach you something new. You choose them because they are fun to talk to, comfortable to be with, or just familiar. The trivia and tiny fragments of information shared in the grousing, joking, and storytelling -- fragments that might end up being remembered and critical later on -- are the serendipity that is hard to reproduce in a more formal, less chaotic environment.

It is this chaos of random facts that Twitter reproduces and that makes it fascinating. In the public realm, the volume of twitters is almost intoxicating. As a mechanism to maintain the connectivity of random personal interactions, Twitter looks like a very attractive tool for organizations that are spread around geographically.

Unfortunately, despite all its potential, there are two possibly fatal flaws in Twitter.
  • As I said before, I can't Twitter. It is the same personality deficit which makes me hang back and not talk a lot during parties or at large group luncheons. I listen, I observe, but I tend to be much quieter than I am in one-on-one interactions (which is why I can IM but not Twitter). If I am not alone in this affliction, Twitter may only be useful to a certain personality type -- significantly reducing its potential usefulness as a group collaboration tool.
  • Despite its simplicity and ease of use, the twitters themselves have a techie feel that makes them look as cryptic as an IRC channel. The use of special characters and Twitter-specific references (to the posting application) puts off novices and occasional users. For example (picked at random):

@snbeach thanks! @Digimom ah, no, we had them remove the clothes before they delivered it.

Twitter is still in its infancy. The technical/presentation issues might be addressed as the service evolves. However, the social hurdles will be harder if not impossible to overcome. But daydreaming just a little bit, if someone could mix the intimacy of IM with the social context of twitter (and the ease of use of both), they might just come up with the next new killer app...

Thursday, November 22, 2007

Searching & Finding

It doesn't make any sense to look for something that isn't hiding. Why do the things people search for need to be hidden? Do they search for things because they're hidden, or are things hidden because they're searching?
-- Miyuki Miyabe


This intriguing assertion is posited by a character in Miyabe's novel Brave Story.

Two things particularly interested me about the statement when I first read it: it sounds right, but at the same time it feels wrong.

Agreed. Things are not lost until we try looking for them. If loose change falls out of my pocket into the sofa, but I don't notice, is it lost? I don't think so. I didn't notice its going, so it is not missed.

A quick check of Webster's turns up nine definitions of lost, some of which apply to loose change, some of which don't. And one which is contradictory: "2 a: no longer possessed b: no longer known". Well, it is no longer in my possession, but I barely knew it to start with, so there is not much "loss".

In fact, things are only lost once we are looking for them and they are not where we expect to find them. For example, when I come home I tend to deposit what I am carrying somewhere in the house: my watch, phone, and wallet. They belong in a drawer in the kitchen, but I often drop them on a bookshelf at the top of the stairs when passing. When I go to find them again, if they aren't in the drawer, I'll go look on the shelf. They can hardly be said to be "lost", since I know where they will be within a limited set of possibilities.

However, I tend to be both lazy and forgetful. So I often drop them somewhere else in my haste, and do not remember where. Then when I go to look for them, they are truly "lost" because they are outside the scope of where they ought or could be expected to be.

But the words used in the quote are "hiding" and "hidden", not "lost".

You could say the things I misplace are "hidden", since in looking I have difficulty finding them, even if I am practically staring right at them. On my desk in particular, they are hidden like a tree in a forest -- one of an innumerable collection of things that make up the clutter I call my work area.

So, yes, they are "hidden" even if they are not hidden from view. Because, like being lost, they are outside the realm of their expected locations and I have trouble finding them against an unfamiliar backdrop, even if that backdrop is something I deal with day after day. I have frequently had the experience of looking for a book on the bookshelf, not finding it on the shelf I expect, then searching all of the bookcases in the house to no avail. Then, on a second go round, I find it on a shelf I had already searched. I simply didn't know how to "see" it in its new and unexpected location.

So in a sense, things are not hidden until you search for them and fail to find them where expected. And it is also true you search for things because they are hidden, in that you can hardly be said to be searching when you look within the scope of expected locations.

To make this discussion real, think of it in terms of something we use every day: search.

  • On the simplest level, things (web pages, information) are "hidden" because there is just too much information on the web for us to know where it all is -- or even might be -- and search engines help us "find" that information. Is it "lost"? Often not, because in many cases we do not know whether the information exists at all. We are searching in the hope that something will show up. So it is "hidden" but not "lost". This is the simplest view of search: search as discovery.
  • Sometimes we are searching for things we know exist -- sites we have visited before or information we have been told to search for. In this case, we are looking but the information is neither hidden nor lost, since we know it is out there somewhere on the web and search engines help us find it -- as expected. This is the second view of search: search as locating. Just like the wallet and watch I expect to find on the bookshelf.
  • Sometimes, whether we know it exists or not, we look for information using the wrong words. We might misspell a name ("Dwayne Allmann") or use synonyms for the words in the content we are looking for ("fix" instead of "patch"). The consequence is a failure to find the item, in which case it is truly "lost". Internet search engines do a lot to try to save us from this dilemma; they recommend correct spellings and support stemming, synonyms, and fuzzy logic to broaden the results (a toy sketch of this broadening appears after this list). However, even these techniques may not solve the problem and we must try again and again to define a search that matches our requirements. This is search as hunting.
  • Finally, even if you construct the proper search -- you look in all the appropriate places -- you may fail to "see" the item you want. Most internet searches produce hundreds or thousands of results. The search engines do their best to prioritize the results (called relevancy) so the most likely are at the top of the list. In other words, where you expect to find them. However, if the items you need are not at the top, you need to do a second search: searching through the search results. This can be extremely frustrating -- just like searching through your entire house for keys, wallets, glasses, or whatever -- because you cannot find the items you need. You do not recognize the title, the abbreviated description, or the location/URL as meaningful. At this point, the information is truly "hidden" from you because you cannot distinguish it from the forest of other results, just as I cannot find a book on my bookshelf if it is outside the bounds of where I expect to find it. This is search as loss. In fact, the desired results may never be found. (Many people give up before even looking at the 2nd or 3rd page of search results. The field of possibility is so vast it discourages exploration.)
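
By way of illustration only (this is a toy, not how any real search engine works), here is a minimal Python sketch of the broadening described in the "search as hunting" bullet: fuzzy matching to rescue a misspelled name, plus a small synonym table to catch alternate wording. The tiny index and synonym list are invented for the example.

# A toy sketch of query broadening; not any real search engine's algorithm.
import difflib

index = {
    "duane allman": ["Allman Brothers biography"],
    "patch": ["Security patch download page"],
}
synonyms = {"fix": ["patch", "repair"]}

def broaden(query):
    """Expand a query with synonyms and close spellings of indexed terms."""
    terms = {query}
    terms.update(synonyms.get(query, []))             # synonym expansion
    terms.update(difflib.get_close_matches(
        query, index, n=2, cutoff=0.7))               # fuzzy spelling correction
    return terms

def search(query):
    results = []
    for term in broaden(query):
        results.extend(index.get(term, []))
    return results

print(search("dwayne allmann"))  # ['Allman Brothers biography'] despite the misspelling
print(search("fix"))             # ['Security patch download page'] via the synonym "patch"

Real engines do this at vastly greater scale, with stemming and relevancy ranking layered on top, but the principle is the same: widen the field of expected locations so the item stops being "lost".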

So, given this situation, why is the quotation both right and wrong? It is the first sentence of Miyabe's quote that holds the key: It doesn't make any sense to look for something that isn't hiding.

There could well be an issue of translation here, because although "hidden" and "hiding" are different forms of the same verb, they have significantly different connotations in English. "Hidden" is passive; it implies something that cannot be seen or found. "Hiding" is active; it implies the object is deliberately taking action to hide itself. And to say we aren't looking for something unless it is "hiding" would eliminate most inanimate objects from the equation.

It is true: it doesn't make sense to look for something that isn't hidden -- even if it is in plain view. And it explains the frustration and despair many users feel when they need to look for information, since they often have no clue how large the field of possibility is before they start. They assume it is hidden.

One of the stories I like to tell is the experience I had when interviewing consultants to determine how they looked for knowledge about previous projects. The overwhelming response was, in order:

  1. Ask someone in the office.
  2. Call someone they think might know.
  3. Send email to people they think may know someone who may know.
  4. If all else fails, look online.

These were experienced, tech-savvy engineers; they knew how to construct search queries; they had a good sense of what information should be available within the corporation. But assuming the information was hidden, their #1 preference was to look for a guide within their community of peers.

This tendency has been repeated time and again across the diverse audiences I have supported. Why do things people search for need to be hidden? Because they are searching. Because they are outside the realm of known possibility. Because their need exceeds the bounds of personal knowledge.

Part of the work of making things findable is bounding the field of possibility for the searcher. If you can make it clear that their search is bounded within a scope of likely candidates (rather than everything on the intranet, for example), you can encourage them to search earlier and have more faith in the results.

One of the keys within knowledge management, or the design of any information space, is establishing confidence in your audience that your structures form a clear and reliable scope of possibility for the classification of information you support. In other words, your systems are the shelves where they are likely to find their keys and wallet. This means turning searching into finding and the hidden into the found.

Thursday, November 15, 2007

What I'm Playing: Zack & Wiki

I recently started playing a new game for the Wii called Zack & Wiki: Quest for Barbaros' Treasure. It is a puzzle game, built around a crazy story about rabbit pirates, what looks like a flying monkey, and collecting the parts of a pirate's skeleton while searching for the legendary pirate ship. I'm having a great time playing it, my family enjoys watching the game, and I'd recommend it to almost anyone.

Funny thing is, I can't figure out why.

Yes, Zack & Wiki is a lot of fun. It's charming, witty, and cute (without being overpowering), as well as challenging. But it also has a host of video game no-no's, each of which would be sufficient to kill any other game. But for some reason, they just don't detract from this one.
  • The game is cute. Undeniably cute. From its stubby young hero Zack to its teddy bear/stuffed lion villains. Even death is cute in this game. (Be prepared to die... a lot!) They show you a baseball card-like profile each time a new character is introduced. Their profile even lists a favorite food for each. This is meaningless fun. But, quite frankly, it is fun. Even for adults. (Perhaps especially for adults.)
  • The plot is pablum and the opening sequence is far, far too long. For the first 10 minutes there is nothing for the player to do but endlessly press A to scroll the text while the characters make various incoherent squeaks and squawks. This alone could kill a game. But, even so, the plot is just crazy enough, and the dialog sarcastic enough to keep you going.
  • Your sidekick, Wiki, is a golden monkey who flies using his helicopter tail, constantly says "Ding!" to get your attention, and changes into a bell when you shake the Wii remote. Say what?!?
  • For the first few stages, there is far too much hand holding. Wiki is constantly interrupting the game to explain the obvious, giving the impression that this is a game just for kids. However, it doesn't take long -- a couple of stages later -- to find that you need to know this stuff to solve the later puzzles. I wish I had paid a little more attention rather than pooh-poohing the helpful advice.
  • The puzzles take the form of a sort of interactive Rube Goldberg device where you have to string a series of tangentially related objects and events together to reach your goal. There are many ways to get the objects and events in the wrong order (this is where death comes in). Some key relationships are arbitrary (or even counterintuitive, to trap you), which in other games would be infuriating. But in Zack & Wiki, working the puzzles out and turning failures into success requires enough real-world logic that each completed stage gives a real sense of achievement.

So, why does it work here, when the same flaws would have stopped so many other games in their tracks?

... (silence) ...

If you are waiting for me to answer the question, you will be disappointed. Because I really don't know. But I suspect it has something to do with vision and quality.

All of these "flaws" are used consistently within the game. The opening story line is totally in keeping with the graphic design and mechanics of the game play. Similarly the humor is ever present, from the opening sequence, through the tutorial, and into the missions. (Zack's favorite food is candy bars. So whenever you pause or start a new mission, you find him munching on a candy bar. Again, meaningless, but keeping you inside the story.)

There is no manual or step-by-step guide you can follow to achieve this sort of fluidity and seamlessness to a game. It requires someone having a clear vision of what the game is about and helping the development team share that vision and bring it to life. We've seen it before in other games: Mario, Zelda, Katamari, Shadow of the Colossus... Even for games I don't necessarily enjoy playing myself, I can appreciate the intense focus on a vision; games like Gran Turismo or Final Fantasy.

In some cases this sense of clarity evolves over time, iteratively, as with Madden Football or Tony Hawk. With other games it emerges full-blown like a new planet suddenly appearing in the universe, as with Katamari and -- now -- Zack & Wiki.

It is inexplicable. Or rather, it is undefinable. Like art. Because it is art. I am not trying to get into the debate about whether video games themselves can be art (that is a separate discussion), but there is an art to creating exquisite video games just as there can be art in any activity: baking a cake, building a wall, writing a letter... There is art that is the object and there is art that is the intensity, clarity, and pure focus in the doing of something, in the creation. Video games like Zack & Wiki come from the intensity of the doing. And although we may not be able to name or describe how it is done, we can admire it and be grateful for the gifts that it generates for us as gamers.

Sunday, November 4, 2007

Autobots and Art

While watching TV a couple of weeks ago, I noticed they were doing a review of the Transformers movie, which had just come out on video. I didn't see the actual review, but it started me thinking about the movie (which I saw in the theaters over the summer) and whether it should be recommended. My thinking went along these lines:
  • The movie was "ok". Not good, not bad, but ok.
  • The acting was ok.
  • The plot was ok (for your run-of-the-mill "outcast human must battle to save the world with assistance from aliens/animals/robots/whatever" story)
  • The final fight scene was ok.
  • The filming was ok...
  • It was nothing exceptional, but nothing horrendous.
  • I could imagine someone wanting to rent it, but can't see why anyone would need to own it.

Why? Here is where my thinking ranged beyond mere daydreaming. Transformers are alien robots that disguise themselves as cars, trucks, planes, etc. Pretty much the sole attraction of the original cartoon and spinoff toys is the transformation from robot to car (and back again). Converting this cartoon idea to live action has several potential pitfalls, and the transformations would clearly be one of them. So, obviously, a lot of money was spent on CGI to make this happen.

There are many things wrong with the movie, from the so-so plot to the hammy acting, but that hasn't stopped many a summer action movie from being enjoyable. (Think of innumerable James Bond films.) Unfortunately, it is the basic operation of the transformers themselves where this movie lets you down.

The transformations on screen can best be described as a windstorm in a junk yard. Thousands of metallic bits whirling about. I am sure there was someone on set who could have explained exactly where every piece of the automobile was in each frame -- to prove how "realistic" the transformation is. However, despite any technical veracity, the resulting footage has no emotional truth to it. The animation has the effect of masking the transformation (sort of like Superman's phone booth) rather than making it believable.

By the final fight scene, there is so much whirling, clanking metal it is impossible to tell who's fighting who and which piece of disembodied metal you ought to be rooting for.

So why am I classifying this post as poetry? There is an analogy here to art in general, and certainly poetry as one of the arts.

There have been a number of times in the past when someone has explained to me why a particular work of art deserves attention and admiration. You can stand me in front of one of Chuck Close's larger-than-life portraits and explain how remarkable his technique is, and I can admire that technique on an intellectual level. But I just have to turn to look at one of Franz Kline's seemingly crude black & white paintings to realize how big the difference is between respecting the effort and being enthralled by the result. You feel a Kline painting, you think about a Close portrait.

Similarly, I've had innumerable people explain to me why John Ashbery's Self-Portrait in a Convex Mirror is great poetry. So much so that at this point I have come to despise his work (with two notable exceptions). As gymnastic as Ashbery's intellectual efforts are, what the technique does is mask the author's lack of any real feeling for the subject.

Ditto Pound's Cantos. If anything, what I get from reading the Cantos -- instead of awe at his erudition -- is a feeling for how much Pound hates and despises his readers. He literally flings literary and historical minutia in your face in place of any real revelation of emotion.

However, Chuck Close is not a good example -- because in his case, the technique is the primary focus; it is not being used to hide anything. The problem with the Transformers Movie and other works like it is not just an issue of form over function, surface vs. substance, artifice vs. art. The problem occurs specifically when the presentation is used to dazzle and distract the audience from noticing gaps in the argument, whether emotional or intellectual.

Perhaps I am so adamant about this issue because I have been taken in by this trick before. In college I fell for the French surrealists and their antics, several of whom turned out in the end to be more poseurs than probers of the subconscious (with Andre Breton leading the parade). In the case of American poetry, it is often tone or subject matter that is used to stir up the reader in place of any real depth. Diane Wakoski's brash style is very attractive, but in several cases is used to cover an empty shell. (Her long poem, Greed, shows both the good and the bad of this reliance on tone.) The violence of Ai's first book, Cruelty, is captivating. But by the third book, Sin, you begin to doubt the veracity -- real or imaginary -- of the feeling. There is only so much imagistic punishment a reader can take before they begin to fight back and say "wait a minute, not everything can be so black..."

As a counterexample to Ai, Eleanor Lerman's first two books had that same harsh (yet funny) edge to them. It was 26 years before she published her third book, The Mystery of Meteors. Again you feel the stark, uncompromising view of life. But you also see a woman growing and facing different, sometimes more subtle trials. Here is a true voice bound tightly to emotion, each carrying the other on. They enhance each other and make you feel as if, through the medium of her books, you are seeing a life lived through poetry.

Perhaps the biggest trap is the use of childhood and illness as fodder for poems. When all else fails, talk about your childhood traumas. I know I sound flippant saying this, but it is a common trap. I've fallen for it; probably every poet has at some point in time. Several poets have made entire careers of it.

It's not that you can't use your childhood as subject material -- everyone does -- but you can't use it as a crutch, as an automatic attention-grabber, in place of what poetry deserves. As an example, look at the poetry of Len Roberts. Roberts is a poet I like quite a bit. But he is a good example, since he has both fallen for the temptation and recovered from it.

Roberts' first couple of books were promising. He is a poet of the past, writing autobiographical poems of a tense childhood. And there is a lot of power in those early books.

There is nothing wrong with childhood as a topic. However it is very hard to view your own childhood objectively, and the events that shaped you carry -- for you as author -- immense emotional weight. Roberts' initial poems are brimming with that sense of overpowering emotional conflict.

By his third and fourth books, Roberts starts to move into the present tense. But even when poems start in present tense, they act as triggers for the past:


Pushing the yellow Cougar out of the snow,
its tires spinning muddy slush onto
my good pants, I remember all the men
back on Olmstead Street coming out
at dawn when someone's car was stuck....

-- "Pushing cars out of the snow"

Unfortunately, by now, the past seems more abstracted, more theatrical, more habitual, and its use as an emotional trigger wears thin. The past becomes an easy out, an appealing way to bring a poem to a close with that sense of suspended tension -- an unresolved drama -- as if the lights went out in a theater half way through the last act.

...and I remembered those five a.m.'s when
my father rose to shovel the entire block, his father coming out quickly to join him
[...]
moving further and further
apart until they reached the ends of the walk and then,
without one word, without even a wave of the hand, entered their separate doors.

--"The Block"

The problem is that in many cases the drama is not really there. It is imagined, instinctive -- ghostly like pain in a missing leg. And there are only so many times you can leave your audience hanging that way.

Fortunately, Roberts' later poems recover and show less of a tendency to fall for the easy ending, the cliched past tense. He does the hard work that is needed and his poems show it.

As counterpoint, another poet, Sharon Olds, tends in the other direction. Her poems are hewn from the pain, suffering, and terminal illnesses of what seem like an endless collection of family members and other relatives. At first you are taken aback by Olds' frank portrayal of disease and dysfunction. But it doesn't take you long to start doubting everything -- her reactions, the situation itself. There is just too much dissolution and too much hardened angst to believe it would be written this way -- whether real or imagined.

I am not saying Sharon Olds' family hasn't suffered the many mishaps she describes. (I have no idea whether her poems are autobiographical, fictional, or a combination of both. It is not my job as a reader to know that.) I am also not refuting any emotional response she personally may have had to the events described. But as art, the poems rely far too heavily on the reader's visceral reaction to serious illness and drama (and the narrators' reaction to it) and do far too little to weave it into an artistic vision. The poems are blunt, harsh. But their bluntness is the bluntness of a dull weapon, not of raw beauty.

In the end, poetry, like any art, is hard work. No matter how easy the end result appears. Avoiding the quick fix, the emotion-laden set piece, is part of the artist's job, so as not to play fast and loose with the audience's "willing suspension of disbelief".

Thursday, October 25, 2007

Pocket Review: Madden 08 for DS

For a lark, we bought the original Madden 05 (the first DS version of Madden) to play between my sons and me. We have Madden on the consoles and play that when we have it out. But for some quick pick-up-and-play, Madden 05 fit the bill. It is certainly not "next gen"; more like playing with crudely drawn robots who can only move in four directions (up, down, left, right). But it can still be a lot of fun.

Anyway, the game has been through several iterations since that original release and one of my sons mentioned not having certain new players available in the roster, so I suggested trying the latest version, Madden 08 for DS. After an hour of play, his assessment was that the newest game "tries to do too much."

He explained that in an effort to make the game playable, they put up control indicators, messages, and touch-screen options all over the place before the play. It is not that these markers interfere with the play but, in my son's words, "it doesn't feel like football."

So, I expect they will play it to use the latest players. But I also expect they will switch off with Madden 05 when they want some basic robot head-smashing fun.

Monday, October 22, 2007

The Threat of Social Software, Part 3

(continued from Part 2)

So, if web 2.0 is not going to transform how business is done and it’s not going to infiltrate the corporate intranet without being modified, permuted, tamed, and subdued… what, if any, is the threat of social software? The real threat is that it makes the corporate intranet irrelevant.

The threat is not what it might potentially do inside the firewall but what it is doing, as we speak, outside the castle walls. Of those millions and millions of people flocking to MySpace, Flickr, Digg, Twitter, and other sites, a significant number are members of the white and blue collar classes in the United States and elsewhere.

Granted, much of the time spent on these sites is personal. But some amount – a growing percentage I suspect – is professional. Because, unlike corporations that make a clear distinction between work and personal life (inside vs. outside the firewall, work time vs. personal time), individual employees tend to move fluidly between one and the other, often combining them. We have personal conversations in the office; we discuss work at home; we tend to mingle with people with similar interests…

In the past this intermingling of personal and business life had been restricted by the limits of one’s own community: the town you live in, the people you talk to, the clubs you frequent. There were opportunities to exchange mail or phone calls and possibly meet once or twice a year at professional conferences, but not much else.

Now there is a generation that was raised in a virtual world and has lived much of their personal life online. When they reach the workplace, they expect that the easy exchange of communication and emotion extends into the corporation. In most cases it does not. And those corporations that are adopting web 2.0 technologies are doing so specifically with the intent of maintaining the security, privacy, and control that corporate intranets are designed for. In other words, a mini web 2.0 inside the firewall.

But if I wanted to talk to people in my profession, would I contact the 3-5 information architects I have found (largely by accident) in my own company, or the hundreds or possibly thousands who are members of IAI or the SIGIA-L email list? Or I might search the blogs of the many talented and highly-visible IA’s that are readily available on the web.

Even following web 1.0 logic, if I have a technical issue with programming or managing an application (as ubiquitous as Microsoft Word or as specific as MySQL database maintenance), will I find the answer faster searching the corporate intranet or the vast knowledge accessible through Google and Yahoo? Experience teaches me the latter.

The fact is that there is far more professional and technical information publicly available and willingly shared outside the corporation than inside. This is the transformative power of web 2.0. And it is all accessible within approximately the same framework I use for my social life. This fact is emphasized by the recent developments in applications like FaceBook and LinkedIn that are helping people manage their social and business lives within a single environment outside the firewall.

So, for what we can call the MySpace Generation*, corporate efforts to restrict access and interoperability between inside and outside the firewall are purely a provincial attitude that is easily ignored in favor of the better solution. Why not keep your bookmarks in del.icio.us? Corporate security types can cry foul that internal URLs are exposed externally, but there is little they can do to stop it and it makes no sense to the individual to have to maintain two separate sets of bookmarks inside and outside. If I am on LinkedIn, what benefit is there to me maintaining a second profile on a weak copy of the technology inside the firewall? Which will gain me more exposure and contacts?

By trying to maintain the old policies of secrecy and separation, corporations are forcing modern workers to choose between managing their professional lives inside or outside the firewall, and in most cases the clearly more effective choice -- and the one they are familiar with -- is outside. They will do what they see as necessary to meet corporate requirements, but they see no reason not to share (and many, many reasons to share) their personal/professional knowledge outside the firewall with their preferred community of virtual friends and professional peers.

After 15 years of downsizing, outsourcing, buyouts, booms, and busts, employees are indebted to their employers for their current situation. But no one with any intelligence or sense of history is going to assume the company is acting in their own personal best interest. Employees who effectively utilize external knowledge and contacts will prove far more successful both within their current company and with any future employer.

What we are seeing is a democratization of knowledge management as web 2.0 technologies evolve from purely social to social and professional. The communities that individuals belong to (and gain strength from) are far more extensive, less restrictive, easier to use, and in many cases far more personally productive outside the firewall than in.

The individuals themselves are recognizing that the knowledge they possess -- or have access to -- is a key source of power, prestige, and employability. (As demonstrated by the proliferation of blogs on business topics by individuals.) As lifetime employment vanishes as a concept, employees see knowledge and experience as part of their own professional personalities and one of the key leveraging points they possess.

This shift away from knowledge as the sole property of the corporation to knowledge as a professional tool owned by the individual will force corporations to rethink how they "manage" the combined intellectual capital of the company, its employees, partners, customers, former employees, and even competitors.

This is not a conflict. It is a realignment of responsibility that complements both sides. It is an old saw, but still true that knowledge increases when it is shared. And those companies that realize this and effectively support and utilize this cooperative knowledge environment are the companies that will "win".

(continued in Part 4)


* I am not fond of the term the Naked Generation that Caroline McCarthy coined, because it tends to focus on the more extreme edges of the current generation. However, it does have the advantage of capturing one of the key attributes of the times – transparency. Living your life, both private and public, out in the open on the internet is one of the identifying characteristics of the generation and one of the distinguishing marks of web 2.0 technology as well (not surprisingly).

Friday, October 19, 2007

More About Casual Gaming

Shortly after posting my thoughts about casual gaming, I came across two essays on the topic at Gamasutra (thanks to a reference from Kotaku). However, where I discussed the casual gamer, these essays discuss the games themselves -- Ian Bogost discussing the meaning of casual games and Juan Gril the role of innovation in casual games. Both essays make great reading, especially Ian's comments about time commitment.

The question that comes to mind, though, is whether we are actually talking about one or two distinct audiences here. Ian classifies casual games (and consequently, casual gamers) as follows:

Casual games typically offer short gameplay sessions, come at a lower cost than hardcore games, and allow play on more ordinary devices like personal computers and mobile phones.

So, is there a difference between the "internet" casual gamer, typified by people who play Solitaire and Bejeweled, and the "console" casual gamer that Nintendo is pursuing?

Let's, for discussion's sake, assume they are different markets and see what is the same and what distinguishes them. I believe the overall impulse is the same for both audiences. Ian quotes a white paper from the IGDA that characterizes casual gamers as "gamers who play games for enjoyment and relaxation." (What I had described as recreational gamers.)

He also presents a very useful matrix for understanding the motivation behind casual games, examining time, money, and control and the casual gamer's assumptions (or limits) for each in terms of complexity and investment. He argues that "casual" is a misleading term because it implies a limit on the investment the player will make, whereas the common business model (try, then buy) depends on users willing to invest time beyond a simple demo or mini game.

He suggests that "informal" is a better adjective than "casual", since informal allows for variants such as indifferent, spontaneous, or fleeting play, while still allowing play to be repetitive (permitting more investment of time, and therefore a viable market).

Bogost explains this much better than I can so I encourage anyone interested to read his essay. He also goes into more detail concerning "fleeting" games and their application to the news game genre (an area of particular focus for him). What I want to explore here is the implications for the console market.

Ian explains why "respecting" a casual gamer's time commitment should not necessarily mean short. (And, by extension, why providing only minigames is an unnecessarily limited strategy for game developers.) It may be better to say that the time commitment is variable rather than short. I may choose to play for five minutes today, but for an hour tomorrow. Again, this applies to informal gaming on both the internet and consoles.

The two areas of Bogost's matrix that do cause trouble for the nouveau console gamer are money and control. According to the matrix, informal gamers want games that are easily accessible, low cost, and/or run on existing equipment (such as a PC). This is definitely not true of the market Nintendo is pursuing.

Although the Wii is the cheapest of the current-generation consoles, $249 plus $50 per game is not cheap by any measure. It is a considerable investment for informal gaming. The innovative controller might explain part of the allure (hence the claims that it is a "fad"), but certainly not the level of sales the Wii has sustained for practically a full year since its release.

The difference in commitment (and in audiences) might be compared to the difference between someone at a dinner party suggesting a game of cards versus someone bringing a board game with them. Most people have playing cards, you can decide on the spur of the moment. But owning and bringing a board game shows serious intent to have fun.

Another difference, which I mentioned previously, is scope. Internet informal gamers tend to play games to pass the time: Solitaire, Bejeweled, puzzles... individual games. Informal console gamers would prefer to play as a group. Even if the game itself is single player, they will play together -- encouraging, advising, kibitzing, and even playfully joshing whoever has the controller.

This expectation that they will be able to enjoy the gaming together may be part of what helps the console informal gamer overcome any resistance to the steep initial investment. And it is certainly an aspect Nintendo emphasizes in almost all of its marketing. One implication of this theory -- if it is correct -- is that there needs to be a steady flow of innovative games if this market is to keep flourishing. Hardcore gamers can sate themselves on multiple releases of similar shoot-and-kill games, but there are only so many versions of Monopoly or Clue you can play before your friends get tired.

Ultimately, I think we are talking about two separate audiences. Or at least two ends of a single spectrum: the informal internet gamer and the expanded audience of informal console gamers Nintendo is pursuing. The industry wants to treat them as one at the moment; possibly because they don't understand either! But it will be interesting to see if by sheer volume alone, the two audiences differentiate and force a rethinking of the simplistic hardcore/casual two-way split many companies in the video game industry are pursuing.

Wednesday, October 17, 2007

Three Types of Knowledge Workers

I have come to believe that there are essentially three types of knowledge workers. They are:

  • Knowledge Generators -- these are your primary sources of new knowledge: the people who know, the experts, practitioners, talkers, explorers... They answer questions, proffer theories, discuss ideas, and find solutions for others.
  • Knowledge Consumers -- these are the people who use the system to find information but have little to offer themselves. They ask questions, they search the repositories, and listen intently.
  • Knowledge Brokers -- these are the people who do not generate significant knowledge themselves, but are well-versed in finding information. The classic example of a knowledge broker is the secretary; a good secretary is a storehouse of knowledge about how to get things done. Brokers are the ones who know where to look.

Of course, there are many more distinctions you can make if you delve deeper. There are the classic nicknames for users of bulletin boards and distribution lists: lurkers, newbies, trolls, sock puppets, etc. You can also distinguish users by their profession or job. But in general, there are only three types that matter.

The reason the type of user is significant is that in knowledge management programs we tend to treat all users alike, as if their knowledge management needs were the same.

Part of this is just the basic constraints of running a corporate program; the old 80/20 rule -- where you provide 20% of the functionality to meet the needs of 80% of the audience. But part of it is an assumption about the continuity of people's knowledge needs and tactics. However, experience teaches us the exact opposite (as in the case of the bulletin board members mentioned above). Based on personality, expectations, needs, and experience, people participate (or not) in unique ways, but we tend to provide a single set of tools for them to use. In most cases this works because the technology is just a shell and the majority of the users find a way to manipulate the tools to their needs.

But recently I've run into several cases where the basic goals of the KM infrastructure have been called into question, based on differing views of the primary users' goals. This conflict has never become explicit, but it runs as an undercurrent through discussions with management about KM's future direction.

Why is this happening? It happens because management and those running the KM program have differing views of the role of the users. It is not the role of any individual that matters (since we all tend to play different roles at different times), but the overall mixture, and that mixture can influence a company's KM strategy.

If you assume your employees are generating knowledge (and so have a mix of all three types), then KM will focus on making sure that knowledge is shared. Forums, communities, mentoring, and making the tacit explicit are all key activities within the KM environment.

If, on the other hand, you assume your employees are purely consumers with very little innovative or creative production, there is little need for peer-to-peer sharing and KM strategies will focus on documenting and distributing best practices, standard procedures, policies, etc.

Product development is an example of the former, where there is often a clear focus on sharing knowledge among peers and fostering innovation. The outsourcing of support is an example of the latter, where the assumption is that the knowledge pre-exists and anyone -- even someone for whom English is a second language -- can be taught to give the right answers. Here documenting the "right" answers is the primary focus.

Which brings me to my point. Similar to the outsourcing craze of the late 90's, managers today are focusing more and more on "operational efficiency" and quarter-by-quarter metrics. Since you can only squeeze so much efficiency out of an existing process, there is a lot of interest in "commoditizing" what have traditionally been seen as skilled or experience-based processes. In the consulting field, this manifests itself in the effort to focus on a limited set of core services that can be defined, documented, sold, and delivered by consultants both experienced and new. The idea is that you dramatically speed up the evolution from custom solutions to standardized offerings before the market commoditizes the price.

You can't blame management. And there may well be opportunities for commoditization here. However, problems occur when the managerial will outstrips the organizational capacity. In their enthusiasm to achieve their goals, they expect KM to follow suit and switch from supporting an organization of creative professionals to pumping "approved" content to delivery engineers. They also insist KM focus solely on those strategic services, to the exclusion of all other knowledge.

But strategy does not equal reality. What happens in the field often does not match the suppositions of headquarters. And unfortunately -- or fortunately, depending on your point of view -- knowledge management has to deal both with goals and realities if it is to succeed.

The outcome is a schism between what managers believe and what KM needs to do to support the organization as a whole. If it is a large corporation, you will also see it in conflicting requirements placed on KM from different portions of management. (Where management closest to the field demands support for what is and those at a global level demand support for what is desired.)

It sounds presumptuous, but I believe it is simply a matter of fact that knowledge develops much more slowly and perseveres much longer than almost any corporate strategy. The fact that a particular service is dropped from the catalog doesn't mean the knowledge garnered from it is not applicable to other services, or that there are no customers still being supported who require it. (Or employees who can gain from discussing it.)

Also, even when supporting a "commodity" model, there is a significant gap between what can be documented and what happens "on the street". The classic example of a community of practice, described by Wenger and more recently by John Seely Brown in The Social Life of Information, is the case of Xerox technicians who were bombarded with printed information (i.e. "answers") but struggled to solve customers' problems until they were connected through a community so they could exchange tricks of the trade learned through experience. Almost an exact replica of today's managers pushing "best practices" to the exclusion of other KM activities.

This difference between the ideal and day-to-day reality gets expressed in thousands of assumptions about what is important in KM: insistence that "best practices" get prominence over peer-submitted materials in the interface; requests that contributions be reviewed and "qualified" before being shared; demands that communities and forums be aligned to the organizational structure and that other communities be dropped.

In the worst cases, as in the last example above, the difference of opinion results in demands that KM "support the business" and focus solely on efforts to support the strategy of the day to the exclusion of the needs of the employees. (This suppression of "non-strategic" knowledge also will have a critically negative impact on people's desire to contribute, damaging any knowledge sharing culture you are trying to establish.)

What can be done about this? Initially (a number of years ago), I was adamant that the interests of KM be given autonomy and that the knowledge architecture should support the actual topics that employees require or affiliate around. Communities should be aligned around "hot spots" within the corporation or where there was a clear strategic need to bring people together. It is also important that the communities be persistent, so they should be named using industry standard terminology -- not the corporate organizational name du jour.

Later, I despaired of any resolution. It was hard to see how conflicts could be avoided and the synergy between business operations and KM was almost totally dependent on the character of the manager -- the more open and visionary they were, the more synergy was possible; the more strictly operational their focus, the more conflicts would occur.

Now, I believe the tension between KM and corporate operations cannot be resolved -- it is a natural consequence of differing goals and points of view. However, I have learned (thanks largely to the opportunity of watching a few very smart and dedicated individuals working these issues) that it is possible to use this tension to your advantage.

KM must stick to KM -- actually managing knowledge -- not falling for the lure of "strategic alignment". By laying the proper foundation of technical support for collaboration, goals and incentives for individuals, and KM policies and procedures that align with business processes rather than specific, short-term business targets, you establish a demonstrable, long-term KM infrastructure. Then, when managers tell you to align with the business plans, rather than changing the infrastructure, you can challenge them to have their organizations use the infrastructure properly.

For example, rather than creating a new community around a specific version of a product, challenge them to set and meet a goal for participation in the existing community for that technology area. Or set and meet goals for contributing to existing communities and repositories. When they say push "approved" materials, challenge them to increase usage (downloads?) of that material in their organization, or to increase awareness of it (training?). By pushing goals and metrics rather than brute-force control of the source materials, you not only get the managers to help push the goals of KM, you also give them a view of reality -- they see what is really being used and how much, or how little, their organization is participating.

There is no panacea. The broader goals of KM will always be hard to measure against short-term operational targets. However, by providing a stable set of core KM strategies and support, you can help management measure its own success within that environment, without distorting the overall flow of knowledge within the corporation.

Thursday, October 11, 2007

The Fallacy of Casual Gaming

The latest fad in video games is “casual gaming”. Nintendo is to blame – the resounding success of the Nintendo Wii has demolished the gaming industry's preconceptions about who their audience is. Unfortunately, this comeuppance has not given them an equal insight into the real truth about gaming.

Instead, they have replaced one misconception with another. Now, rather than believing their only audience is males between the ages of 16 and 25 interested in brutal, complex, and immersive games, they think they need to cater to an additional audience of “casual” gamers. This translates for many people in the industry into a series of short, light-weight “mini-games” connected by the slimmest ghost of a story.

Sorry. They got it wrong once again. “Casual gamer” does not mean brain-dead. The problem with this line of thinking is that it confuses the audience's initial response to the video game console with their gaming interests.

The fact is that a significant portion – the significant portion – of society is afraid of video games. Watching someone play a game on a traditional video game console (xBox, Playstation, or even Nintendo's own N64 or Gamecube) is like watching someone possessed by an alien spirit. They clutch the controller with both hands, twitching and swerving as if in another dimension, while brightly colored (and often heavily armed) objects hurtle towards them on the screen, just missing by inches. How do they do that?

I know how they do it. I've done it myself. But for the uninitiated, it is both disconcerting and off-putting. The standard PlayStation controller has 10 face buttons, two thumb sticks, and four shoulder buttons. Given that the thumb sticks are overloaded with two additional button actions (if you push down on them), that gives a total of 18 distinct controls. Not for the uninitiated.

The Wii remote, on the other hand – despite having 12 buttons on it – is essentially a stick you wave around with one large button (A) you push. Can't get much simpler – and more approachable – than that!

The true genius of the Wii is in its design: you can ignore the other 11 buttons until you need them... And then there is that connector on the bottom. The connector lets you add functionality as needed, such as the nunchuck. The Wii remote with nunchuck attached is at least as complex as the PlayStation or xBox controller, because although it only has a total of 17 distinct buttons or dials, it has two separate components whose position, rotation, and/or speed can be used as controls.

So what the Wii has done (and the DS to some extent) is overcome the initial aversion to the game console as a device. But once you get people to pick it up, what do they want to play? My experience – with friends and relatives, which is hardly scientific – is that once they are willing to play, they play pretty much anything as long as they are not being chased and you do not offend them.

Once you get them on the machine, they will stay on it and enjoy themselves, as long as they are not put into a threatening position early on. This tends to mean no FPS (first person shooters). After they become adept at the controls, they might try it. But nothing will drive your audience away faster than making them feel helpless while they are still learning.

It is significant that Nintendo shipped Wii Sports with the console in the US. These are not mini-games; these are fully-fledged and recognizable games – again making the new audience feel comfortable and reinforcing the physical nature of the game play.

And the other top quality games on the console are not mini-games: Super Paper Mario, Zelda (which involves chasing but brings the user along slowly, which is the genius of all the Zelda games), and a series of lesser but equally enjoyable games: Wing Island, Mario Strikers, Excite Truck...

So, there is plenty of room for innovation. Plenty of room to wow, welcome, and enthrall new users of any age. The opportunity exists, but the industry can't seem to see past the initial simplicity. They have read "casual gamer" as fickle, feeble, and short on attention. Nothing is going to disappoint and turn off these new users faster -- once they have got past the initial resistance to the console itself -- than finding nothing but a bunch of simplistic, unchallenging titles like Boogie, Carnival Games, etc.

So, if a "casual gamer" is not a twitchy 16-year-old and not a frumpy housewife with ADD, who are they? Generalizing about an audience is a dangerous thing and I have absolutely no scientific or statistical knowledge to go on. But since it is clear something has changed in the marketplace, it is probably worthwhile to figure out what it is. And we can start by determining what it isn't.

Let's go out on a limb and say casual gamers are not hardcore gamers. That means gaming is not their primary passion -- it is a pastime. Gaming is for them a recreational activity. They may play video games for two or three hours at a time, but once or twice a week at most -- not every day. They may only play once or twice a month.

Which means that games that require you to memorize a dizzying array of button combos are probably out. (Sorry, Madden NFL.) Games with simple controls, where control reminders are shown on the screen (a la Zelda), or where the player can easily practice and freshen up before getting embroiled in a new battle, will be favored. It sounds silly, but the ability to save easily is also essential if any storyline episode lasts more than 30 minutes. (Save locations like the blocks in Paper Mario or Marvel Ultimate Alliance may well be too far apart for gamers who don't have the time. Or the player gives up, simply not knowing how much further they need to go.)

Next, they are not all 16-25 year old boys. Which means that the subject material of games needs to expand to include more than just those of interest to teenage boys. Lots of shooting, crashing, and punching will upset a significant portion of the audience under 12 and over 35. That doesn't mean competition, conflict, racing, and fighting are out; but the violent edge that infects the majority of "hardcore" games is a problem for the expanded audience. So less "intense" variations are likely to appeal more -- that is, less intense in presentation (no death, blood, screaming, etc.), not necessarily in game play.

We can also assume, since video gaming is a recreational activity, that they have other hobbies. Given a free half hour, they may be more likely to pick up a book, do a crossword, or work on a model airplane than play a video game. These are your competitors, not other video games.

Finally, you will find that as with many recreational activities (sports, board games, even crafts), casual gamers have more fun when they play together rather than separately. Online play has its benefits, but playing with strangers can be off-putting, and it is much easier (more comfortable, less stressful) to get your friends together and play, just as you would to watch a movie or play touch football.

Local multiplayer is key to supporting the expanded audience of casual gamers. Wii Sports is fun, but it is ten times better playing with a friend or family member. Even single player games significantly expand their playability for casual gamers if they include a local multiplayer mode. (Not some weak minigame. Something that can be played for at least 30 minutes without getting bored.)

Of course, because of the unprecedented sales of the Wii console, many many games are being released for it, as publishers try to cash in. Such as the ubiquitous Madden NFL, a title that is anything but user friendly for the casual gamer.* Even with the Wii remote, you have to memorize a dizzying array of button presses, waves and shoves. The same goes for flashy but weak games based on movie franchises (usually with little or no Wii-specific functionality beyond their comparable Playstation or xBox releases).

Nintendo's success is ultimately the cause of the confusion -- and disappointment -- these "filler" games create as everyone and their brother tries to sell to this new audience. On the up side, Nintendo continues its push to develop high-quality games for the real "casual gamer" and selected other publishers are beginning to catch on. Zack & Wiki is one upcoming game that appears to understand its audience and looks terrific to boot.

So let's hear it for diversity and hope the other publishers eventually catch on.

*Footnote: I read somewhere that the latest Madden NFL includes two control schemes, regular and simplified. I haven't tried this new edition, but alternate play modes and control schemes may be one way for multiplatform games to retain their essence while meeting the needs of new players on the Wii.

Wednesday, October 3, 2007

Measuring KM

Just one day after I posted about the dangers of ROI, someone suggested that we should look at KM as a set of services with deliverables, service levels, etc.

My initial reaction was similar to my reaction to requests for ROI. It is easy to see where this is leading: decomposing KM into its constituent "service" parts so the business groups can pick and choose the ones they want -- the ones that they see as directly contributing to their own bottom line and leave the fuzzier and more complex goals of KM behind.

KM is not just a menu of services. If it were, there would be very little purpose in separating it from the rest of business operations. On the other hand, I understand the desire to get a handle on exactly what is the business benefit of a program such as KM.

The analogy that came to mind at the time was to the difficulties involved in funding public education. Individual citizens do not get to choose whether they fund one service over another. There is a need to educate our young (reinforced by federal laws) that cannot be broken down into individual services.

At the same time, my own analogy betrays me, because I don't believe the US educational system is well run or even remotely efficient in achieving its goals.

So how do you retain the proper perspective on KM's long-term goals and still make the program accountable? I don't claim to have the answer, but I know several things you shouldn't do:
  • Don't "put KM in business terms" and measure it against short-term business objectives such as increased sales, shorter time to market, etc. In other words, ROI. Although KM practices influence many of these objectives, KM's contribution is only part of the story and cannot be accurately measured or, more importantly, can't be "adjusted" as management likes to do to increase the return.
  • Don't measure KM value as "savings" against the potential cost if KM was not involved. This is extremely appealing (along the lines of "now it only takes five minutes to find a sample proposal to copy versus the day and a half it would take to write it from scratch"). However, the result is often a return so astronomical (200%, 300%, 400%) that no rational manager would believe it -- a rough sketch of the arithmetic follows this list. And in fact they shouldn't believe it. Because although the benefits are real, the alternative is false: without a viable KM environment, employees will find a workaround -- such as stealing from an inappropriate sample or leaving out sections (usually those needed to mitigate risk) to save time.
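
To see why these "savings" numbers balloon, here is a rough back-of-the-envelope sketch in Python. Every figure in it is invented purely for illustration; it is not data from any real program.

    # All numbers are hypothetical -- invented purely to illustrate the arithmetic.
    consultants = 500            # people reusing material
    reuses_per_year = 12         # say, one reused proposal per person per month
    hours_saved_per_reuse = 11   # ~1.5 days writing from scratch vs. 5 minutes finding a sample
    loaded_hourly_rate = 100     # dollars per hour

    annual_savings = consultants * reuses_per_year * hours_saved_per_reuse * loaded_hourly_rate
    km_program_cost = 1_500_000  # annual cost of the KM program, also invented

    roi = (annual_savings - km_program_cost) / km_program_cost
    print(f"claimed savings: ${annual_savings:,.0f}")   # claimed savings: $6,600,000
    print(f"claimed ROI: {roi:.0%}")                    # claimed ROI: 340%

The arithmetic is internally consistent, which is exactly the problem: the number is only as good as the counterfactual, and the counterfactual (everyone writing everything from scratch, forever) never actually happens.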


The fact is that there is a natural tension between business operations and complementary initiatives such as KM. (Others that come to mind are R&D, usability, sustainability, etc.) The larger goals of KM cannot be measured in standard operational terms, whether that be ROI, chargeability, time-to-market, etc. But that doesn't mean they can't be measured. It just means you have to be careful to measure them on their own terms.

For example, from a business perspective the role of communities of practice (CoPs) is to increase the flow of "deep" knowledge among practitioners in fields important to the business. Two ways to measure the impact of a community initiative would be to measure the level of participation in the CoPs or the level of interaction between members.

But you do have to be careful. CoPs are communities of practitioners -- a mutually beneficial network of peers. As soon as you talk about CoPs and convince management of their benefit, someone will try to strictly "align" the communities with the business objectives, either by changing the community's objective or by controlling (and only measuring) the individuals they believe belong in the community due to job role or organization.

You can encourage people to participate in communities, even measure and reward that participation, but you cannot require or control it. If you require participation, it is no longer mutually beneficial. It is then performed out of compulsion, and the culture and attitude will no longer be one of sharing for mutual benefit but of complying (and consequently doing as little as possible to meet the requirement).

Achieving balance between maintaining the autonomy of KM initiatives and retaining enough connection with the business organization to garner support is a tricky business. One way to make the connection real is to measure based on KM objectives but report along organizational lines: for example, what percentage of the people in organization X are participating in communities versus organization Y. This, plus the participation rates for specific communities, allows you to turn the tables: it gives managers a clear metric (especially if you set a target goal) that they can work towards. But rather than "turning the dial" of KM funding or services, they can work to encourage the employees in their organizations to participate, raising the metrics for their organization or for the communities they consider critical to their mission.
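
As a concrete illustration of measuring on KM terms but reporting along organizational lines, here is a minimal sketch in Python. The organization names, headcounts, and participation figures are all hypothetical; a real report would pull them from the community platform's membership and activity data.

    # Hypothetical data -- invented for illustration only.
    headcount = {"Org X": 120, "Org Y": 200}

    # Employees in each organization who actively participated in at least one
    # community of practice this quarter (posted, answered, presented, etc.).
    active_participants = {"Org X": 54, "Org Y": 30}

    # The metric is a KM metric (participation); the report is cut by organization.
    for org, total in headcount.items():
        count = active_participants[org]
        print(f"{org}: {count}/{total} employees active in communities ({count / total:.0%})")
    # Org X: 54/120 employees active in communities (45%)
    # Org Y: 30/200 employees active in communities (15%)

A report like this gives each manager a target to work toward -- raise their organization's participation -- without handing them a dial that reshapes the communities themselves.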

Wednesday, September 26, 2007

What I'm Playing: Namco Museum DS


This week I am playing Namco Museum for the DS - a thoroughly nondescript title that came out recently and hides a very pleasant surprise.

Namco Museum didn't receive much fanfare on its release. Not surprising. It is yet another in the long list of collections of old 8-bit video games redone and repackaged for more recent systems. There must be hundreds of them out by now. (I have another Namco Museum title for the Gameboy SP.)

This time around, the DS incarnation of Namco Museum is a good title. It avoids the pitfalls of many previous 8-bit rehashes. It has a better-than-average selection of games (not just 15 variations on one famous game) including both famous and cult favorites. And, more importantly, the emulation is excellent. I started with Galaga, an old favorite, and the controls, sound, and graphics are spot on. Ditto the other games I've tried so far. They even include a "Library" so you can listen to the soundtracks without the distraction of space aliens shooting at you.

The only drawback to the Museum is the screen size. Most of these games were originally designed for vertical presentation on arcade machines. On the DS, Namco has chosen not to try using both screens (with the resultant issue of what to do with the gap between them) and shows the game on the top screen by default. But the DS screens are oriented horizontally.

So at first, the game is stretched across the screen. This makes it visible, but hard to play since the relationship of horizontal to vertical movement is distorted. To make up for this, the game lets you choose a variety of different screen layouts: bottom screen instead of top screen, unstretched graphics, as well as several rotated versions.

I switched to the original aspect ratio, which eliminates the stretching but uses only about half of the screen. The emulation is perfect. The problem now is... it's just way too small! It's like watching a video game played on Mars through a telescope. Next, I tried rotating it so the game is shown with the top of the screen to the left. Now, this is definitely the best video setup! It is glorious 8-bit color graphics at their best! Mind you, I have to hold the DS on its side and play with both hands on the left side of the screen. Not as uncomfortable as it sounds, but it is awkward.

The fact is the DS screen is simply too small and in the wrong orientation to play these old games well. Namco has done an admirable job of making them as playable as possible -- great emulation, flexible screen layouts -- but there just isn't any option that is completely satisfying.

What is interesting is that I ended up using different layouts for different games. For games I was familiar with from my youth (Galaga, Pac-Man, Galaxian), I put up with the discomfort of holding the DS sideways so I can play them rotated full screen. Dig-Dug II, which has a color background (rather than dots on black), I don't mind playing on the small default layout (although I still switch to the correct aspect ratio). And for Xevious, which I had never played before, I actually found that rotating the screen but not rotating the DS resulted in a very entertaining side-scrolling shooter that was easier to control (rather than the bottom-up layout it is supposed to have).

It's still fun, but ultimately, even with the best emulation, these games play better on a bigger screen.

But the old games aren't why I bought this title. What I bought it for was Pac-Man Vs.

If you have never played Pac-Man Vs, which was originally sold as a bonus disc with Pac-Man World 2 on the Nintendo Gamecube, you have missed out on some serious fun. Pac-Man Vs was created as a technology demo by Nintendo to show off the connectivity between the Gamecube and the GameBoy Advance. In the game, one player plays Pac-Man just like in a regular game of Pac-Man (except in 3D). The trick is that the other players play the ghosts! The ghosts only see a small area of the game board near them, while Pac-Man sees the entire board (giving him or her a necessary advantage over the three opponents). Pac-Man gains points for eating pellets and ghosts; the ghosts gain points for catching Pac-Man. Once Pac-Man is caught, a different player is picked at random to become Pac-Man, and the game repeats until someone reaches a predetermined score.

A very simple game mechanic (rumor has it the game was created in two weeks), but the result is fast-paced chaos, with plenty of opportunity for strategizing (quickly) or just crazy running around, with lots of yelling and shouting. For all its simplicity, Pac-Man Vs is one of the best multiplayer games around.

The reason it did not get attention originally was because you needed 2-4 people, a Gamecube, a GameBoy Advance, and a GBA link cable to play the game. But now with the Namco Museum, all you need is the game and a few friends with DSes. And, believe me, it is well worth it!

Unlike the 8-bit classics, Pac-Man Vs was designed for a TV screen -- horizontal orientation -- so on the DS it can fill the upper screen. And since the game narrows the focus (zooming in on the characters), it still looks fantastic on the smaller device. If anything, this incarnation of Pac-Man Vs is better than the original because there are no cables to get mixed up when passing the GameBoy Advance back and forth. It is fun with 2 people; it is outstanding with 3 or 4.

At $20, the Namco Museum for DS is a steal for Pac-Man Vs alone, one of the best portable multiplayer games around. If you have 2 or 3 friends with DSes, consider it a must-have. Oh, and you get some nifty old 8-bit games thrown in as a bonus. Frankly, I don't know why Bandai Namco doesn't advertise it that way...

Tuesday, September 25, 2007

ROI: the Sad Case for KM

More and more frequently I hear calls for proof of the "ROI" (Return on Investment) of knowledge management. I hear it within my own company; I hear it from KM practitioners in other companies; I even hear KM consultants espousing the importance and benefits of calculating ROI to demonstrate knowledge management's contribution to the business bottom line.

This concerns me. Not because I don't believe KM has value -- it obviously does! -- but because ROI is a specific type of business measurement that overemphasizes the direct-to-bottom-line component of KM while completely ignoring (and discrediting) the rest.

KM certainly contributes indirectly to the bottom line, as it contributes to many other aspects of the company's fiscal and intellectual diversity and health. But that is not its primary goal. This call for ROI is part of a larger tendency within corporations today to "align" KM with business operations. By that I mean making KM a tool used by business management to ensure the optimal and efficient exercise of business processes.

Now, I have no objections to KM supporting business processes. Clearly, that is the primary use of knowledge and the company wants to encourage anything that contributes to the bottom line. But that is not all that KM is about. KM also significantly contributes to the breadth of knowledge, experience, and expertise of its employees. It contributes to the resilience and responsiveness of the company to changes in the business environment by strengthening its core intellectual capabilities. It impacts business processes both directly and indirectly. And it establishes a culture and channels for distributing business intelligence at lightning speed.

The problem is the measuring. Managers don't measure things for intellectual stimulation. They measure them so they can make changes and confirm the results. Managers also tend to think high-level. If ROI is what you are measuring, then that is the goal (not a goal, the goal). That is not a slam against managers, it is just an attribute of their job: to think clearly and succinctly and not get bogged down in details.

The results, if you are not careful, can be both dramatic and unfortunate. The analogy that comes to mind is college. If you see the goal of college being to get a job (your ROI), then there really is no need for English, history, languages, or even science -- depending upon your target profession. However, if you see the goal of college as expanding your knowledge and broadening your character, not only will it have a strong indirect impact on your employability, but your opportunities will be far more flexible and adaptive to the business environment when you graduate. Business opportunities fluctuate on a cyclic basis. At one point, there was a strong need for engineers. But if you went to school specifically for that career, the market was pretty much saturated 4-5 years later when you graduated. Ditto MBAs and other focused degrees. I pity the poor Cobol programmer trying to break into the web era. Or Algol, PL/1, Pascal...

So just as the goal of college is to teach capabilities, not specific skills, the goal of KM is to facilitate knowledge development and transfer, not solely to apply knowledge to the product pipeline.

Another problem with ROI and similar types of business measurement is that they start to infiltrate your thinking. In a recent discussion among KM professionals about assessing the value and success of communities of practice, several members of the list argued that you had to measure a CoP against the business objectives that motivated it and calculate success from those goals. (In other words, did the company get what it wanted out of the community.) Again, companies don't sponsor communities for altruistic reasons, but people participate in those communities for personal and professional reasons, and it is the participants who ultimately determine whether the community succeeds or fails. I've seen a number of cases where companies tout the success of their CoP programs while at the same time complaining how hard it is to get members involved. Say what!?!

The success or "return" of a KM program is the cumulative benefit -- both short- and long-term -- to the company and its employees. This is a very hard concept for line-of-business managers to grasp. They understand it when they feel its absence -- the recent rebirth of KM within American companies runs a parallel course to the enthusiasm for the business fads of downsizing, rightsizing, and outsourcing in the late 80's and 90's. Many companies followed the trend only to find that the intelligence of the corporation had left with its employees. The need for knowledge management became apparent.

But there hasn't been a major corporate gaffe on the scale of downsizing for several years, and management tends to have a short memory. The current business fad has shifted from business process re-engineering to supply chain optimization and process refinement -- squeezing the last penny out of the business pipeline. And KM is beginning to feel this squeeze. It is hard to tell what the outcome will be.

But for the time being I believe it is the responsibility of KM professionals to avoid the rush to ROI and make sure both the direct and indirect "returns" of KM are recognized and re-established as objectives.