Thursday, July 31, 2008

Preface to a Month of Poems

This is a shaggy dog tale if there ever was one and I'm not sure how interesting it will be to others, but I find it a curious example of how the mind works. At least, my mind.

I am about to start a new post where I will read and comment on a poem every day for at least a month. The process will be:

  • Each day I will read a poem by a different poet
  • I will then write a short comment about the poem, the poet, or some random thought the reading of it instigates.
  • At the end of the month I will either stop, or if I still find it interesting, I'll keep going.
Since this is an experiment and I don't want to bloat this site with lots of short -- possibly boring -- entries, the entire month of poems will be a single post that I will edit each day.

Why do this? Well, that's what I find curious. It all started when I was cleaning my office and looking at my shelf (actually, shelves) of Nintendo DS games, several of which are still in the wrapper. I want to play them, I just don't have a lot of time. That's when it occurred to me that I could encourage myself to play them by creating an exercise: a month of video games.

The original idea was to play a different video game a day for an entire month. That would get me through most of my DS collection, including both games I've played before and those I haven't. I could then use the excuse of commenting on them here in my blog to complete the exercise.

Unfortunately, there was an immediate problem with this plan. Video games, even the simplest ones, take time to get used to. Quite frankly, even if I played for an hour a day, there are a number of games where I would not get sufficiently involved in or comfortable with their controls to do any more than frustrate myself.

So that wasn't going to work. The next step was to think of similar things that I don't spend enough time with. The obvious answer was my collection of poetry books. I had recently rearranged the bookshelves and -- for lack of any better scheme, and as a change from my previous organization by school or genre -- I sorted the books alphabetically by author.

So the next plan was to read a book a day for a month, reading from one end of the shelf to the other. Since I have far too many books to read them all (that would be more like a year of poetry), I decided to limit it to a different poet each day.

But I still have the problem of time. Poetry books, like video games, take a while to get involved with. To be fair to a book of poems you need to familiarize yourself with the poet's voice (or voices), their style, what you could refer to as the ontology of their poetic world... But unlike video games, poetry books are made up of individual poems that are -- in most cases -- intended to stand on their own. They do not require learning a control scheme, a background story, or any other prerequisites.

Which led me to my final refinement: reading a single poem by a different poet each day for a month. I have no idea if this will result in any useful revelations for either myself or the readers of my blog. That is why it is an experiment. But succeed or fail, it should be interesting to find out what happens.


Thursday, July 17, 2008

The KM Core Sample

One of my favorite diagrams of the past year or so is what I call the "KM Core Sample". The Core Sample is not really an architectural diagram, since it shows no process or function that can be implemented. But I have found the diagram to be extremely useful in explaining why knowledge management is such a complex topic and where various KM methodologies "fit" within the strata of the knowledge universe.


The Core Sample is -- like its namesake -- a snapshot of a point in time. It captures the various levels of "knowledge" and where they reside. The diagram also illustrates the rationalization and codification of knowledge as it rises through the layers.

That last statement might sound like the description of a process: the codification of knowledge. But what I like about the diagram is that it shows that different types of knowledge reside in all levels at any given time.

This is because the process of codifying or standardizing knowledge into actionable procedures and practices actually changes the knowledge. It cleanses, sanitizes, and simplifies it. The stray tidbits, the ugly but necessary workarounds, the secret tricks of the trade... all of the untidy clutter that makes up true expertise in a field -- all of this is stripped away to achieve a linear, documentable process.

But back to the diagram. Let's take a quick look at the various strata of the core sample:

  • Starting at the bottom, at the very core, are people. This is where true knowledge exists: in other words, what people know. And the most accurate way of sharing that knowledge is talking to the people who possess it: asking questions, telling stories, cracking jokes.
  • The next layer up is where that personal communication is expanded to allow people to "talk" to others they do not know or cannot meet in person. Email distribution lists, forums, and other discussion technology reside in this layer. (Note that blogs are also in this layer.)
  • The next layer up represents "knowledge capture". Here the knowledge is instantiated in documents of some kind: sample documents, lessons learned, case studies, white papers. These all represent mechanisms used to selectively capture and sort knowledge in such a way that it can be reused by people who may never come in contact with the original author. The obvious limitation is that only a small portion of what any individual knows about their profession is captured in any of these documents. This is offset by trying to capture the most important or influential pieces of wisdom.
  • Finally, in the top layer the captured knowledge and learnings are further refined into a defined set of templates, guidelines, and standard processes. In some sense, you might say that in this final layer the actual "knowledge" has been removed and is replaced by step-by-step procedures to ensure a consistent and reliable execution of desired behavior. To achieve this goal, a significant amount of sorting, sifting, and selection is required to winnow down all possible options or alternatives to a limited set of recommended or required processes and deliverables.

What I like about the core sample diagram is that it helps you discuss the scope and effects of different approaches to knowledge management. Collaboration strategies focus on the tacit knowledge layer. Methods like knowledge harvesting, lessons learned, and storytelling focus on the best practices layer. And ITIL, Six Sigma, ISO 9001, and other standardization methodologies focus on establishing institutionalized knowledge.


Monday, July 14, 2008

Implementing Web 2.0 Inside the Castle Walls

All the buzz about Web 2.0 and Enterprise 2.0 is exciting and good for theoretical discussions and all, but how do you actually go about doing something about it?

In response to one of my previous posts zyxo commented that Enterprise 2.0 is not just Web 2.0 inside the firewall. True. It is certainly not just implementing technology. But it is also more than just thinking differently. It is acting differently and managing knowledge differently. And that change is impossible using the traditional business applications that are built on old assumptions about security, ownership, and usage. So at some point you must tactically bring social software into the mix.

As I mentioned before, process is extremely important when you bring social software in-house. It is the process that needs to change or adapt for web 2.0 to have any impact on the business. (It won't do any good to switch from SharePoint to wikis if no one knows it's there or no one can access it due to security restrictions.)

On a more tactical level, you need to understand what usage you expect and what you don't, so you can manage the technology and its content. You need to identify success criteria so you can tell whether you are succeeding in solving a problem or not. At the same time, you don't want to apply so much control that you squelch the inherent viral nature of the technology, which requires users trying and learning for themselves.

More importantly, you are operating in a microcosm -- the scope of your company employees -- rather than the entire web universe. This significantly reduces the elemental power of many web 2.0 technologies and in some cases may make them totally ineffectual.

There are five ways of making the shift to web 2.0 technologies inside the firewall. (Actually, I have only seen three or four "in the wild", but there are additional options you might want to consider.) Needless to say, the most common option is not necessarily the most effective:

  • Build it and they will come -- this is the process-less option. Set up a web 2.0 technology inside the firewall and let people use it as they will. This is quite common with blogs, bookmarking, and wikis. The problem is, as mentioned before, there is no way of telling if these technologies are succeeding at solving a business problem. A more inherent problem with this approach is that if your users are already using the same technology outside the firewall to manage their links, their friends, or whatever, why would they use an internal version and then have to maintain both? And if they aren't using the technology outside, what would drive them to use it inside? There is no impetus for new users to use the service and a disincentive for existing ones. The service tends to sit idle or be used by only a few enthusiasts.
  • Replicate what succeeds on the web -- otherwise known as "Wikipedia inside the firewall". If it worked outside, it should work inside as well, right? Well, not quite. Many times the first thing a company does with a wiki inside the firewall is try to create an internal wikipedia. Why? What information do they expect to collect here that isn't readily available outside the firewall already? And do you have the enthusiasm for maintaining business-related content that the maintainers of Wikipedia have? Ditto blogs. Follow the external model where anyone can have a blog. Many get started, few stay alive. Why? Because, quite frankly, there are usually several other, well established, channels for sharing information within a corporation and the blogs create an alternative, competing signal.
  • Define a process and pilot -- This is the traditional business approach: define what the technology should be used for, who should use it, and run a pilot to test it. The only problem here is that most web 2.0 technologies are dependent on a critical mass of users to be effective. Five people editing a wiki or three people blogging is not necessarily going to tell you much about the potential of these technologies. Also, because these technologies often offer new usage models (rather than computerizing existing processes) it is easy to miscalculate what processes will actually benefit from their application.
  • Establish a service and solicit trial cases -- This is a combination of #1 and #3. I have never seen this done, but it seems like a reasonable approach. Have IT establish an internal service, then ask the business groups to propose pilots (i.e. processes to apply the service to). This will have a better chance of exposing innovative applications of the technologies to business cases and would require the declaration of the business process that it is being applied to.
  • Extend existing services/processes using web 2.0 technology -- I have not personally seen this in use elsewhere (except where we are doing it ourselves) but most of the current success stories of web 2.0 inside the firewall -- such as IBM's Fringe -- involve extending or integrating existing services or applications with web 2.0 technology. Fringe adds tagging and rating of people that is integrated into an existing white pages application, as I understand it. This is possibly the most likely approach to succeed because the existing application provides an inherent process, an established audience and user base, and linkage to familiarize users with the new capabilities.

Linking web 2.0 technologies to existing systems has another benefit -- it justifies their existence. For example, social tagging inside the firewall vs. social tagging outside has little to recommend it, and a number of drawbacks. A smaller audience, less flexibility to grow and extend features, simply less exposure and name recognition than public services... On the other hand, tie that tagging to how the corporate intranet search works (automated favorites, improved relevance, best bets, etc.), and users will start to see the direct impact of their use of the internal service as well as having the service in front of them on a regular basis when they search.
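As a sketch of what that tie-in might look like, the toy scoring function below boosts intranet search results by how often employees have tagged them. The function, weights, and documents are all invented for illustration; this is not how any particular search engine works:

```python
import math

def boosted_score(base_relevance, tag_count, weight=0.25):
    """Blend the search engine's relevance score with a log-damped
    social-tagging boost: a few tags help, mass tagging can't dominate."""
    return base_relevance * (1 + weight * math.log1p(tag_count))

# (page, engine relevance, number of times employees tagged it)
results = [
    ("travel-policy.doc", 0.62, 40),
    ("old-travel-memo.doc", 0.70, 0),
]
ranked = sorted(results, key=lambda r: boosted_score(r[1], r[2]), reverse=True)
```

Here the heavily tagged policy document outranks the memo with the higher raw relevance score, which is exactly the kind of visible feedback that shows users their tagging matters.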

Wednesday, July 9, 2008

Understanding Technology Adoption From the Customer's Perspective

Much has been written about the adoption of technology from an industry perspective. Clay Christensen in The Innovator's Dilemma, Geoffrey Moore in Crossing the Chasm, and Malcolm Gladwell in The Tipping Point all articulate models for the adoption (or lack thereof) of technologies based on their position in the product lifecycle.

However, as interesting as these models are, they provide little solace to the individual customer who is trying to decide whether to purchase and rollout a specific technology for his or her own business. All of the preceding authors discuss technology at a macro level: its adoption by the market in terms of volume of customers. But for each customer, there is a second, more important adoption that occurs after the purchase: the rollout and, hopefully, successful integration of the technology into their specific business processes.

The problem is that no matter how "successful" a product is in the market, there is no guarantee it will actually prove to be effective when applied to a specific business situation. SAP may be the poster child for this syndrome, where several large-scale implementations are rumored to have proven unusable in the end and ultimately had to be abandoned.

So, what does determine if a technology can successfully be incorporated into an existing business environment? The answer is not related to the technology's current marketing position or "disruptiveness" -- although that will impact the outcome. The real attributes that influence the success or failure of technology rollout in an individual business are all related to the business itself: its culture, its environment, and its history.

Traditional Technology Rollout Plans

Any corporate technology plan worth its salt includes an adoption chart showing the expected rollout over time. These charts fall into two categories: the "s" curve and the stairstep.



The "s" curve shows a slow but steady adoption shifting to a steep climb flattening out at a plateau, usually marked by 100% of the target audience. This model follows the "chasm" or "tipping point" theory where at some time enough early adopters are using the technology that word of mouth takes effect and rollout becomes self-realizing. Adoption ramps up until success is achieved. (Here is an example.)

The stairstep is a more phased approach and assumes adoption based on the ability to train users. The steps in the chart are usually based on incrementally adding divisions or projects as the technology is rolled out progressively through the corporation.
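The two shapes can be sketched numerically. In this rough illustration the logistic midpoint, growth rate, and training phases are arbitrary placeholders, not data from any real rollout:

```python
import math

def s_curve(week, midpoint=26, rate=0.3):
    """Logistic adoption: slow start, steep climb, plateau at 100%."""
    return 100 / (1 + math.exp(-rate * (week - midpoint)))

def stairstep(week, phases=((0, 10), (12, 35), (24, 70), (36, 100))):
    """Phased adoption: jumps as each division is trained and onboarded.
    Each (start_week, target_percent) pair is one rollout phase."""
    level = 0
    for start, target in phases:
        if week >= start:
            level = target
    return level

for week in (0, 13, 26, 39, 52):
    print(f"week {week:2d}: s-curve {s_curve(week):5.1f}%  stairstep {stairstep(week):3d}%")
```

Both models conveniently end at 100%, which is precisely the assumption the next section takes issue with.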

Neither chart takes into consideration that employees may choose not to use the new technology or may actively resist using it. And as much as we would like to think it doesn't happen, these are the real reasons technologies fail. There may be technical problems. There may be bugs and system failures. But ultimately what determines any technology's success or failure is whether the target employees agree to use it or not.

Understanding the Technology Adoption Curves

While adoption is traditionally seen as a single curve, there are actually three equally important variables that need to be considered:

  • Adoption rate
  • Resistance rate
  • Misuse and abuse

Therefore, the real adoption might look something like the following diagram.



Adoption is the number of employees actively using the technology. Resistance is the opposite of adoption; it is the number of employees who refuse to use the technology or actively complain about it to their friends and colleagues. Misuse is the number of users who are using the technology, but in ways it was not intended (and usually for activities that should not be encouraged).

Real adoption rates are more erratic and event driven than the theoretical s-curve or stairstep. There is usually a series of "bumps" with each announcement or management memo concerning the new product. However, usage then drops off after the bump. Why? Because unless the users see a direct impact on their own jobs, there is no incentive for them to keep using the new technology (beyond management dictate). So with each memo more people will try it; some will stick with it, but others will stop.
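The bump-and-drop-off pattern described above can be modeled as a toy simulation. All the parameters here -- announcement weeks, bump size, stick rate, decay -- are invented for illustration:

```python
def simulate_adoption(weeks=30, announcements=(1, 8, 16),
                      bump=20, stick_rate=0.4, decay=0.85):
    """Each announcement adds a bump of new triers; a fraction stick
    permanently, the rest lapse geometrically week over week."""
    committed, transient = 0.0, 0.0
    history = []
    for week in range(weeks):
        if week in announcements:
            committed += bump * stick_rate        # users who stay
            transient += bump * (1 - stick_rate)  # users who try, then lapse
        transient *= decay                        # drop-off after each bump
        history.append(committed + transient)
    return history

usage = simulate_adoption()
```

The resulting series spikes at each announcement, sags afterward, and settles well below 100%: closer to the erratic shape real rollouts produce than either the s-curve or the stairstep.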

Resistance is difficult to measure, but has a real and significant impact on adoption. If users find the technology objectionable, too hard to use, or simply burdensome, they will avoid it, work around it, or use it grudgingly (and often badly). Resistance will tend to exaggerate the spikes and can often lead to a drop off of usage over time.

Misuse is the hardest to account for, but is again a serious problem. The classic example of misuse is email: many users in large corporations use email as an archiving tool -- emailing themselves documents as a way of saving them (rather than leaving them on their PC and risk their loss). The result is quick saturation of the mail storage system with little or no way to sort out the "misused" mails from real business correspondence.

Understanding and Accounting for Resistance

It would seem that resistance to a technology is solely a reaction to the usability or applicability of the technology to the function it performs, but that is not the case. People can reject technical solutions for a number of reasons. Yes, if it is difficult to use or hard to understand, resistance will be higher. But it also depends on whether there is already a solution in place.

Replacing an existing tool can be more difficult than instituting a new one. Even if the existing processes are outdated or overly complex, employees can be resistant to replacing the known with something new. And it doesn't have to be one technology for another. There can be resistance to implementing a technical solution even for a manual process, if employees see the manual process as "working". In other words, unless the employees themselves see a problem, they are not likely to appreciate or accept the solution.

This is particularly problematic when replacing multiple point solutions with a single corporate-wide technology. Each of the existing tools will have advocates who will adamantly argue the merits of their own solution over the proposed replacement. And, quite frankly, in many cases their arguments are not entirely baseless. Each division may have instituted a point solution tuned towards their needs and a corporate-wide solution is likely to result in some loss of functionality. Even if the overall outcome is better for the company, these separate divisions will see it as a step backwards for their own purposes.

So resistance is actually the result of a number of factors:

  • Corporate culture: how accepting the organization is of change and technology in particular
  • Current environment: whether there is an existing solution (or solutions) in place that is being replaced
  • History: whether past rollouts have gone well or badly will heavily influence the receptiveness to further change

Clearly, when the technology you are implementing provides a unique and obvious advantage to the business and to those who must use the technology, then resistance will be low. But that combination of variables is rare. In most cases it is useful to take resistance into consideration when planning your rollout to lessen its impact.

Usually resistance can be overcome with sufficient management support. The implicit threat of losing one's job for not following through on a management dictate can help drive adoption. But at the same time, it will foster additional resistance as well. So if you take this approach, you better be sure you have the necessary management support -- and not just verbal support -- to address any complaints that arise about overly aggressive deadlines, time lost to training, missing or faulty features, etc.

Getting sufficient management attention for an extended period of time is not always possible, so the other option is to try to avoid resistance by not exerting too much pressure for adoption. In other words, using the carrot instead of the stick. Obviously, the tradeoff with this technique (i.e. not demanding strict adoption or applying management pressure) is that adoption will be significantly slower. On the plus side, done well, the adoption will be slow but steady, as there will be less resistance. But if there are strong advocates for alternate solutions, even this approach is likely to fail.

With either approach, there is likely to be at least some resistance, and the best policy is to preemptively counteract it. How? Identify the most likely sources of resistance, preferably before rollout or as early as possible during it: alternative solutions, processes that will be affected by the change, and so on. Then identify the primary advocates for the alternatives or the most reputable critics of the change. Finally, approach these people personally. Explain the plans for rollout, the rationale, and ask them what are the most significant issues they foresee in adoption.

The goal is to persuade these key individuals that the plans are taking their concerns into consideration. To succeed, you may need to actually change the rollout plans or modify the technology somewhat (which is why doing this before rollout begins is preferable) because the goal is to convince them that their concerns are being taken seriously. And the only way to do that is to take them seriously.

Note, I did not say find the loudest or the harshest critics. The key is to find the most respected, dedicated, and sincere advocates. Loud critics can make your life a pain, but they can be overcome -- or at least counteracted -- by reasonable, respected people. You want to find the gurus, the experts people turn to for help. These are the people you want to convince.

Note that I also did not say convince them that the planned rollout is the best option. Be realistic. You will not be able to convince everyone that your plans are the right solution. The goal is to get them to recognize that it is at least well thought out and that their alternatives have been considered, even if rejected. This way, they may not turn around and advocate for you; but at least they will not argue against it and are likely to stand as a voice of reason during any confusion that arises during rollout.

Understanding and Accounting for Misuse

Misuse is different than resistance. Whereas resistance results in a downturn in adoption, misuse will give the impression that adoption is progressing well, because it involves active use. The problem is that the use runs counter to the original intent and may well interfere with the ultimate business goal.

Misuse can be very hard to identify and sometimes even harder to stop once begun. Like resistance, the key is to try and predict where it will occur and then (if it is serious enough) design around it, rather than trying to clean up after it becomes epidemic. But unlike resistance, where you can often guess where opposition will come from, it is difficult to predict in advance all the possible misuses of a system.

Take SharePoint, for example. SharePoint is a very useful tool in some ways -- it mixes the best of automated web site design, document management, and Windows-based security. But it doesn't do any of them in any great depth. It provides the easy creation of sites and subsites, libraries and lists.

But if you allow users to readily create these repositories (which can be a very efficient way to manage artifacts -- especially for smaller projects or teams -- without requiring a "librarian") you are also making those individuals responsible for the appropriate use and maintenance of those sites. Unfortunately, as eager as people are to create repositories, it is very hard to get them to do proper maintenance and delete them or clean them up periodically.

So two possible misuses of SharePoint are creating too many sites and not removing "dead" sites when their usefulness is over. (This is above and beyond the usual misuse issues such as storing and sharing inappropriate materials: copyrighted music, videos, etc.)

The use of disk quotas gives the appearance of alleviating these problems, since it stops runaway inflation of individual sites. But it doesn't actually stop the misuse. People can just create more sites if they can't increase the volume of the ones they already have. Also, disk quotas do not address the problem of undeleted "dead" sites. Restricting the number of sites any one user can create is another deterrent to creating too many sites, but involves an arbitrary limit (how many sites is "too many"?) and can result in animosity from your user population.

One alternative, if you suspect this lack of cleanup will be prevalent, would be to institute a policy of deleting sites that become inactive for a set period of time. Note that to make this practical, you will need to enhance the application itself to identify and automate this procedure.

Users will still complain that the content in inactive sites (for example, sites with no one accessing them for more than 60 days) is still needed. But unless SharePoint is also your archiving tool (a really bad idea, by the way), storing old content offline can be easily addressed with alternative, less expensive solutions.
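A cleanup policy like that could be automated against site usage data. Here is a minimal sketch, assuming you can get a last-access date per site; the records and helper below are hypothetical illustrations, not a real SharePoint API:

```python
from datetime import date, timedelta

INACTIVITY_LIMIT = timedelta(days=60)

def flag_inactive_sites(sites, today):
    """Return names of sites whose last access is older than the
    inactivity limit, so they can be archived offline and removed."""
    return [name for name, last_access in sites
            if today - last_access > INACTIVITY_LIMIT]

# (site name, last recorded access) -- invented sample data
sites = [
    ("project-alpha", date(2008, 7, 1)),
    ("old-team-site", date(2008, 3, 15)),
]
stale = flag_inactive_sites(sites, today=date(2008, 7, 9))
```

In practice you would pair the flagging step with a notification to the site owner and a grace period before anything is actually deleted.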

The key is to predict what forms of misuse are most likely to occur based on the nature of the business, proclivities of the users, and any gaps or open capabilities in the technologies and processes being rolled out. This may require some imaginative thinking. More importantly, once the danger areas are identified, there may need to be changes or additions to the technologies themselves to ensure the desired processes are followed and negative alternative uses are avoided.

Note that you don't want to eliminate all alternatives, since users are likely to discover creative and effective business uses for the technology that were never planned. But this is another reason why it is a good idea to monitor the rollout periodically (every 6 months or so) to see what sort of uses are developing. This allows you to catch both misuses you hadn't thought of but need to account for as well as creative new uses that you may want to acknowledge and promote throughout the user community.

Monday, June 30, 2008

What I'm Reading: Mark Strand

A surprising thing happened to me when I went to the bookstore last week. I found two books of poems that I liked.

Now, this wouldn't seem to be such a surprise -- I like modern poetry. However, in most visits to the bookstore they either stock books I've already read or books I already decided not to read. For example, I love Robert Bly's work, but I have more of his books than the bookstore does. Same goes for Charles Simic. On the other hand, I find the work of May Sarton and Mary Oliver boring and pretentious. (Ditto Stanley Kunitz, Donald Hall, etc.) And as shocking and titillating as Charles Bukowski can be, his poems are pretty shallow. After 2 or 3, the persona starts to grate on me. So I have no need to read or own any of his 20+ volumes that every store seems to make available.

But last week was an exception. I found two books of interest. One is not so surprising: James Tate's The Ghost Soldiers. I've been a fan of Tate's work for a long time, starting with his first book The Lost Pilot. However, I went through a period (or more correctly, he went through a period) that put me off his writing. Starting around Riven Doggeries he published a number of volumes that seemed more interested in poking fun at language (and by extension, the people who use such idioms) than illuminating the small actions and inconsistencies that make up our lives.

Not that every poem has to be instructive or informative. (Frank O'Hara has brilliantly proven that.) But at some point poetry -- serious or not -- has to have some touch points with the readers' lives if it is going to have any lasting impact. And Tate's work of the late 70's and 80's seemed to lose that connection.

But Tate's recent books seem to have brought him back from whatever jag he was on. His work is still irreverent (if not more so) and almost frightening in its ability to switch between the glaringly realistic and clownishly absurd within a single sentence. So finding a new volume of his poems was a pleasure.

The second book was more of a surprise. I read Mark Strand's books many years ago and despite my friends' fascination with his work -- and my own best efforts to like it -- I was put off. In fact, rather than growing on me, his work became more painful and annoying over time. To the point where I haven't read any of his work, except a stray poem here or there, for thirty years.

So I don't know what came over me at the bookstore but I picked up Strand's latest book, Man and Camel, and started leafing through it. Rather than flipping through a couple of pages, grunting disapprovingly and putting it back, as I expected to do, I found myself attracted to the poems I read. Why? They were recognizably Mark Strand poems with his spare, objective writing style. But something was different. Something held my attention, was speaking to me like his previous work never had.

Maybe it was just the one or two poems. Maybe I was in an overly receptive mood and tomorrow I would wake up and recognize the poems for the pretensions they ultimately were. Whatever. I was intrigued enough to take a chance and buy the book.

And a good thing I did. Despite whatever reservations I had, the book turns out to be one of the best books I have read this year.

But how did this happen? What makes this book different than the rest of Strand's works I read before? Did I misjudge the earlier ones?

Unlike Tate, where there was a clear change in style and content, Mark Strand's writing doesn't appear to have changed. Either there was a change in my perception of his work or something more subtle was going on. So when I got Man and Camel home, I not only read it but pulled out his older books and started looking through them to find out what had happened.

It turns out my tastes haven't changed, at least that much. I still have difficulty reading Strand's earlier work, like Reasons for Moving and Darker. At the same time, my suspicions are correct: that earlier writing and his recent book are very, very similar. Which baffled me further.

At first I suspected it was something specific but minute, like a change in verb tense or a switch from second to first person. Because the new poems at least seem more personal:

On a warm night in June
I went to the lake, got on all fours,
and drank like an animal. Two horses
came up beside me to drink as well.
This is amazing, I thought, but who will believe me?
from "Two Horses"

But looking back at his earlier poems, many of them are in the first person as well, like this poem from Reasons for Moving:



A man has been standing
in front of my house
for days. I peek at him
from the living room
window and at night,
unable to sleep,
I shine my flashlight
down on the lawn.
He is always there.
from "The Tunnel"

These poems demonstrate the consistency of Strand's style and tone over time -- a sense that you were reading the diary of a visitor from a strange but parallel universe. But at the same time, these poems hint at the difference.

In Man and Camel, Strand the narrator is not so definitive, not quite so self-assured as before. In "The Tunnel", as in the majority of Strand's earlier work, the actions are absolute, unequivocal, as if the narrator controlled his (or her) own destiny, as bizarre as that might be. Later in the poem he says:

I weep like a schoolgirl
and make obscene gestures
through the window. I
write large suicide notes
and place them so he
can read them easily.
I destroy the living
room furniture to prove
I own nothing of value.

In Man and Camel, the actions are not so definitive, not so much like some magical incantation. But at the same time, they seem more realistic and more humane. Again, from "Two Horses":

The horses eyed me from time to time, snorting
and nodding. I felt the need to respond, so I snorted, too,
but haltingly, as though not really wanting to be heard.
The horses must have sensed that I was holding back.
They moved slightly away...

"From time to time", "as though", "moved slightly away". The language is approximate, like human perception is. And the reaction is equally based on assumption rather than fact; the narrator "felt the need" and the horses "must have sensed".

Now, not all of Strand's new poems are as equivocal as "Two Horses". Many still carry the absolute statements familiar from his early work. But the overwhelming feeling is that there is human frailty involved, even if it is simply the uncertainty of the narrator's own perception. This may be a small point, a tiny point, but it makes a world of difference in the poems themselves.

The absolutism of Strand's early work is what affords the poems their power, a sort of magical aura based on the incantations of the narrator. And it is that power that my friends saw and appreciated. The problem is that if you have any doubt in the narrator's authenticity -- if you don't accept the absolute statement -- the spell is broken and the poem fails and fails badly. It becomes unbelievable. My problem was that I didn't accept many of the narrator's absolutes.

The change I see in Strand's latest work as represented by Man and Camel is the narrator's acceptance of his own fallibility. This not only makes the narrator seem more human and more believable, it makes him more empathetic and powerful as a consequence. The narrator is not me, the reader. Strand's poems still take place in a world apart from the reader. But now the narrator could be the reader, if the reader inhabited that world. And that makes all the difference.

Saturday, June 21, 2008

The Web Litmus Test

[Editorial Note: At first I was doubtful about posting a concept I developed ten years ago. However, just the other day someone called me looking for a designer/developer. When I suggested doing an architectural design for the site content first, he said "we have all the content. What my boss wants is to make sure the site is flashy and cool." I guess we haven't made that much progress in ten years....]

Web design is a tricky business. There are so many conflicting requirements to consider, as well as rapidly shifting expectations on the part of the users as the web grows and evolves.

On the positive side, there is no shortage of guidelines and recommendations for designing web sites to make them usable and functional. However, despite this guidance, there are still sites that are simply "unusable" at a higher level. Sites that aggravate, annoy, insult, confuse, or simply bore their users. Why?

The fact is that most usability guidelines operate at a rather low, micro level dealing with specific interface artifacts and interactions: the placement of buttons, the arrangement of forms, the structure and consistency of the navigation, etc. These attributes certainly impact the usability of web sites and shouldn't be ignored. But often when a web site fails it fails on a much larger, dramatic scale. It fails because it doesn't offer what the user wants.

It is not possible to provide a simple set of design rules that guarantee a successful web site. There is just too much variation in the intent and purpose of sites to cover all circumstances. But there are a few basic measures -- what I call the web litmus test -- that can fairly consistently tell if a web site design will fail or not. Passing the test does not guarantee success; it only means your site has a chance of succeeding. But fail the test and your site is toast.

As I say, the web litmus test can't be used to design sites -- there is much more skill and experience required to design the site right from the beginning -- and that is where the art and science of information architecture comes in. But the test can be a very quick and useful reality check that anyone can perform for designs before they get implemented or for existing sites planning a redesign.

Seven Characteristics of Human Behavior that Affect Web Design

The problem is often not the design but the site itself -- what the site is doing or trying to achieve. It is not failure to implement, it is a failure of intent. The seeds of failure are planted early and concern the basic impulses that drive the creation of the site from the very beginning.

There are two separate sets of goals that control any web site: the goals of the visitors -- or audience -- and the goals of the owner -- or sponsor -- of the site. Those driving impulses are different for every site, but fall into seven basic categories.


For the visitor, there are only four possible goals:

  • Help me find something
  • Help me do something
  • Help me fix something
  • Once I have satisfied all three of the above, entertain me!

As I said, the specifics of what the visitor wants to find, do, or fix are different for each site they visit. (I wouldn't try to buy a vacuum cleaner from www.bmw.com, but figuring out why my current vacuum is making so much noise is a likely goal for a visitor to www.kirby.com.) With the exception of people simply "channel surfing" the web, the goals of all of your visitors fall into one of these four categories.

From the other perspective, the web site owner has only three basic intentions:

  • Let me tell you something
  • Let me sell you something
  • Let me impress you!

These three impulses apply to all web sites, even non-commercial ones. (For non-commercial sites, "sell" can be interpreted figuratively as an attempt to persuade visitors to take some action: sign a petition, join an organization, etc.)
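Treated as data, the two goal sets above make a quick coverage check possible. A minimal sketch follows; the feature names and goal tags are hypothetical examples for illustration, not drawn from any real site:

```python
# The four visitor goals and three owner impulses described above.
# ("entertain" only counts once the first three are satisfied.)
VISITOR_GOALS = {"find", "do", "fix", "entertain"}
OWNER_GOALS = {"tell", "sell", "impress"}

def uncovered_visitor_goals(planned_features):
    """Given planned features tagged with the visitor goal each serves,
    return the visitor goals the plan leaves unaddressed."""
    covered = {goal for _feature, goal in planned_features}
    return VISITOR_GOALS - covered

# A hypothetical site plan: strong on finding and doing,
# but with nothing to help a visitor fix a problem.
plan = [
    ("product catalog", "find"),
    ("online checkout", "do"),
    ("splashy intro animation", "entertain"),
]
```

Running `uncovered_visitor_goals(plan)` against this hypothetical plan flags "fix" as the gap: exactly the kind of blind spot that comes from designing around the owner's impulses instead of the visitors' goals.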

Achieving Alignment

It would seem, at first glance, that aligning the needs of the owners and the audience should be simple: you want to find something and we want to sell something! Unfortunately, in practice the priorities and order of importance are often askew.

Site owners often focus on their last impulse first: let me impress you. At this point, it is fairly well accepted that elaborate flash intros to web sites are more annoying than effective. However, they are still very prevalent.

Similarly, the days of commercial internet sites proudly displaying a photo and message from the CEO as a home page are pretty much over. However, many corporate intranets are still littered with web sites that prominently display a photograph of the manager, an org chart, and a list of "news" stories and other managerial announcements. How does this help their employees find, do, or fix anything?

But the real problem is that the web site needs to address all of the visitors' possible needs, not just the one or two that match the owners' goals. Even if you can get past the sponsor's desire to turn impressiveness into a requirement, there is still too often a narrow focus on what the company wants to achieve and not what the users expect.

Note that addressing the needs of the visitors is not the same as solving them. If you are a manufacturer, you don't have to sell online. But you can expect at least some of your site's visitors will be looking to buy your goods, so you better tell them where they can buy your items rather than leaving them to vainly search your site and give up in frustration.


How to Use the Web Litmus Test

So, how do you apply the web litmus test? It is simple. Try this 15-minute experiment:

  1. Pick a site on the internet. Any site. (If you have a commercial site on the internet I would suggest not starting with that one. It is hard to be objective the first time.)
  2. Take 2 minutes to make a list of the things that site's visitors would want to find, do, or fix.
  3. Spend 3 minutes trying to perform each activity from the web site's home page.
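The bookkeeping for the steps above can be sketched as a short worksheet script. This is only an illustration; the site name, task descriptions, and timings in it are hypothetical, not part of the original test:

```python
from dataclasses import dataclass, field

TIME_LIMIT_MINUTES = 3.0  # per-task limit from step 3


@dataclass
class Task:
    goal: str            # "find", "do", or "fix"
    description: str
    completed: bool
    minutes_spent: float


@dataclass
class LitmusTest:
    site: str
    tasks: list = field(default_factory=list)

    def record(self, goal, description, completed, minutes_spent):
        """Log one attempted task and its outcome."""
        self.tasks.append(Task(goal, description, completed, minutes_spent))

    def passed(self):
        # Every goal category tested must have at least one task
        # completed within the 3-minute limit.
        by_goal = {}
        for t in self.tasks:
            ok = t.completed and t.minutes_spent <= TIME_LIMIT_MINUTES
            by_goal[t.goal] = by_goal.get(t.goal, False) or ok
        return bool(by_goal) and all(by_goal.values())
```

For example, recording a "find" task completed in a minute and a half alongside a "fix" task abandoned at the three-minute mark yields a failing test: one goal category was left uncovered, and that is enough.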

The key points to note here are that the web litmus test is in no way a complete analysis of a web site. Its goal is to test the site's main features against the visitors' main goals and nothing more. So don't try to go into too much detail.

Keep it short. One to three specific tasks for each of the visitor goals are more than enough. And if you can't complete a task in 3 minutes, you are already spending more time than the majority of visitors would before giving up in disgust.

Again, it is not a test of the entire site. It doesn't matter whether a specific function exists on the site, but whether someone can find it in an acceptable amount of time. This is why you should always start at the home page (as visitors are likely to do).

But the best way to understand the web litmus test is to see it in action, so let's try a few examples...

The Web Litmus Test in Action

[Continued from The Web Litmus Test]

The best way to understand the web litmus test is to see it in action, so let's try three examples.


Example #1

Let's start with an example of a site that is known to be good: Amazon.

Amazon is a retail site, so the things visitors want to find are items to buy. Which means they want to find the item, find out more about it, and then buy it. Most of the finding is finding items for sale and most of the doing is buying. (You might want to add comparing items as an activity, but you don't really need to go into that much detail.)

Fixing is equally easy. Most visitors will be trying to fix a purchase gone wrong -- not arrived, wrong item, broken item, need to cancel, need to return, etc.

Starting at the Amazon home page, you will see that finding and doing are well covered. The top of the page includes a set of tabs for browsing goods by department and a search box (which can also be filtered by department). Having found an item, the description includes many different pieces of information to help the visitor assess its suitability (such as the vendor description, customer reviews, etc.) They even include information on items that other visitors purchased after viewing this item as a form of comparison.

Fixing is also easy on Amazon. It is not the primary activity for the site, so it is less prominent. But to make up for that Amazon provides at least three ways to reach it. Both "My Account" and "Help" on the top control bar provide information on returns, cancellations, and other changes. The most common fixes ("where's my stuff?", "shipping & returns", and "need help?") are also provided on the footer of every page. Barring all else, the help pages have a "Contact us" button.

Note that Amazon provides many, many more functions to support these and other activities. The goal of the litmus test is not to test everything; just the most obvious. And in this case it easily passes the test.

Example #2

On the whole, large retail sites will all pass the web litmus test. As well they should considering the money spent on them. If nothing else, they tend to learn and copy from each other as functions and capabilities prove successful. But smaller sites and non-retail sites are a different story. As our second example, let's look at a site that misses the mark.

Just to set the record straight, I did not pre-select this example. I thought about sites that have interesting usage models and randomly chose the United Nations.

So, why would people visit the United Nations site? What do they want to find, do, or fix? The quick list I came up with was the following:

  • Find: find out what the UN does, find out who is in the UN
  • Do: visit the UN building in New York, participate in one of their programs
  • Fix: Contact the UN delegation for your country

(There would be a separate list of activities for people who are in the UN. But my suspicion is that they have a separate intranet for those individuals. So it would not be fair to assess their public website on those functions.)

So let's look for the primary tasks. After selecting a language, there is a home page that covers a number of topics: "development goals", "news centre", "about".... One's first reaction is that the majority of these items are telling -- what the UN wants you to know -- not focused on the visitors' goals. But "about the United Nations" certainly sounds like it might cover our two finding goals. And sure enough, the first three headings on the left are "background information", "main bodies", and "main documents" (including the charter and other documents).

> > >

Unfortunately, the first link under "background information", (labeled Basic Facts About the UN) is an advertisement for a hardcopy book you must purchase. So, trying to find out what the UN is about starting from the home page and following the most meaningful links takes three clicks and leads to an offer to sell you a book. (Note the following links under "background information" do lead to meaningful information, but it is not clear that any but the most persistent visitors will find it.)

Our second finding goal is a little more successful, since the "About..." page also includes a link labeled Member States on the right-hand side. The resulting page is slightly odd since it is labeled "useful tools and documents" and does not mention the member states until halfway down the page (in reference to a press release). However, there is a link close to the top of the left-hand menu that provides a list of UN members. So it takes four clicks, but you do find the information.

> > > >

The site doesn't do nearly as well in the doing. There don't seem to be any links on the first few pages that would help you visit the UN. (In fact, I had to visit the site twice to find the appropriate information.) However, if you are persistent and visit the About the United Nations page and scroll down (it is not visible at first) you will find information on tours of several UN buildings, including the headquarters in New York.

When it comes to participating in UN activities, things get even worse. I did finally find a site that purports to help you "get involved" called UN Works. But by then I was far beyond the three minute limit, on my second visit, and found little more than ways for me to donate money.

Finally, as one might expect from the difficulties just doing, fixing is even more challenging. Although I was able to find a list of member states (during the finding task) the list is static and contains no links! There is no way to find out more or to contact the individual delegations. However, if you back up a page and ignore the body of the text, lower down on the left-hand navigation menu is an option for Permanent Missions > New York > Home Pages. This leads you to a form that lets you select a member state and get redirected to that mission's home page. If you select the United States, their home page does have an option to "contact us".

So it is possible to complete the fixing task, but it takes an unreasonable amount of detective work and far exceeds the 3 minute limit. What is worse, if you give up in frustration (which all but the most ardent visitors are likely to do) and look for a way out, some pages have a Comment link in the left-hand menu. However, rather than offering a way to give feedback to the UN, the first two links on the comment page focus on feedback for the website only. The last link is for "comments" to a generic email account. But even that is tempered with the warning that "we may not be able to reply individually to all e-mail".

So, in essence, the site fails all aspects of the web litmus test. For sites with so many problems, the test isn't really necessary for uncovering issues. However, it can still be useful if you are trying to fix such a site. Since the number of individual problems can quickly overwhelm any repair work, you can use the test to stay focused on the primary themes and not get distracted. (Being able to see the forest for the trees, so to speak.)

Example #3

The preceding examples demonstrate the extremes. But most sites fall somewhere in the middle. Particularly for small to medium size business sites, the web litmus test can help you quickly identify gaps and dead ends.

As an example, let's look at a business site that is focused on products but not necessarily commerce. For this example, the appliance manufacturer Maytag.

Maytag makes home appliances. I suspect their primary income is generated from third-party sales -- through department stores such as Best Buy and Sears or through local appliance stores. I would further surmise that, like other manufacturers, they do not want to undercut their existing vendor relationships by competing with online sales. However, their website is very ambiguous about this.

But I am getting ahead of myself. Let's get back to the test.

  • What would visitors to the Maytag site want to find? What appliances and models are available, as well as more detailed information about the products.
  • What would they want to do? Buy an appliance or compare models.
  • What would they want to fix? Repair their Maytag appliances or get replacement parts or manuals for said equipment.

Now that we have identified the key activities, let's see how the site fares.

The Maytag website is very handsome. The home page has the obligatory flash animation and focuses on telling you about their latest products. However, this can be forgiven because the site overall is well structured with consistent presentation and navigation, making it easy to move around.

The primary navigation gives you three tabs: Products, Accessories, and Support, which fairly closely map to the visitors' needs to find, do, and fix. However, as attractive as the site is, its looks are deceiving.

Finding Maytag's current products is well supported. The Products tab categorizes the products by location (kitchen and laundry) and then by type. Within each type you can further filter the results by various features (such as color, size, etc.). Selecting a specific model then lets you see details concerning its features, available colors, and so on.

Doing is a little more confused. Maytag's catalog of appliances looks strikingly like the product catalog from a retail site such as Amazon. They even have the shopping cart and "My Account" in the top left corner. But wait... when you click on a specific model, there is no "Add to Cart" button. So, can I buy an appliance here or not?

Even more confusing, if you click on the Accessories tab, the accessories do have "Add to Cart" buttons. So, do they sell appliances or not?


If you look closely, the controls in the top right contain not only a shopping cart and account, but also a link to a "store locator". This is a common interface element for sites that sell primarily through brick and mortar retail stores. So the site -- as attractive as it is -- confuses the user by presenting contradictory interfaces. Is it online or retail sales?

This often happens because the site's sponsor is well aware they can't sell online but forgets that visitors may not know. There are any number of simple solutions to this dilemma once it has been identified. For example, you can add a "Find a Store" button on the appliance page right where the "Add to Cart" button appears for accessories. This clearly tells the visitor they can buy the item but need to do so through a retail store.

Finally, Maytag has tried to support fixing as well. And in general they are successful. But again, the missing features that the sponsor takes for granted are not apparent to the user and can cause confusion. The Support tab provides access to online copies of the manuals. (Which are also available from the individual product's details page -- well done!) It also provides a FAQ and "Service & Parts".

However, Service & Parts is really only "Service". Despite the name of the link, the page only lets you schedule a service call or find a qualified repairman. There is no way to order parts. So the site only gets 50-70% for addressing our presumed visitors' expected fixing actions.

Again, the solution is simple once the problem is identified. Adding a single sentence on the Service page stating that parts can be ordered through local service franchises (if that is the case) would be sufficient. It is not necessary to do everything the visitor needs, just provide a way to get it done.