Bulletin of the American Society for Information Science and Technology
Vol. 31, No. 2, December/January 2005


Special Section: E-Government

Constructing a State Web Portal Through Design Alternatives, Measurement and Iterative Refinement
by Tom Brinck

Tom Brinck is co-author of the book Usability for the Web: Designing Websites That Work and president of Diamond Bullet Design (diamondbullet.com) in Ann Arbor, Michigan. Tom is an adjunct lecturer at the University of Michigan's School of Information, where he teaches user interface design. He can be reached at tom@diamondbullet.com.

In the year 2000, the developers of a large-state portal approached us with a need to redesign the information architecture of their state website. The portal provided access to thousands of state agencies. Users consistently expressed frustration in finding relevant information.

Through a series of user interviews and user tests with successive prototypes of the new architecture, we dramatically improved the success rate and time for finding information through the state site. Our approach was based on systematically exploring the design space and was metric-based, allowing us to show formal quantitative evidence of site improvement and make well-grounded estimates of financial benefit for the state. Our approach was fundamentally user-centered, gathering feedback from many people throughout the process to achieve effective tradeoffs in the design.

Background Research

To inform our redesign, we began by interviewing users, creating user profiles, analyzing “competitive” sites, conducting card sorting and analyzing the common tasks users would perform.

Interviews. Our project was done in collaboration with the state government and the company that was developing and supporting the site. Our initial step was to develop an understanding of how people interact with their state and what they wanted from their state website. We sent usability specialists to visit several cities and interview citizens.

When people were asked what activities they do with the state government, a common response was that they couldn’t think of anything. In some sense, it’s reassuring to know that citizens are not regularly worrying about their government. However, there are, for example, two items that you would expect to be an issue for most adults: driver’s licenses and taxes. This lack of awareness is suggestive of the general issue we had to address: people are generally not familiar with the roles government agencies play and do not have much knowledge of the structure of their state government.

User Profiles. These interviews informed most of our user profiles, identifying who the audience for the site would be and what their expectations were. Primary target users are citizens, businesses, out-of-state visitors, government users and education users (students, teachers and parents). The site had primary business goals associated with each user type. A central goal was to enable more effective participation by citizens in government and to make interaction with the government convenient. The site itself is self-funded through pay services for businesses and through cost savings from encouraging people to conduct government transactions online rather than through staffed offices.

Competitive Analysis and Design Alternatives. We identified several other state websites that were considered the best at the time and analyzed the architectures they used and their styles of navigation. This helped reveal the types of information highlighted on the home pages, the labels used and the categories chosen.

Through this process we identified five primary paradigms used or considered for presenting the top-level categories of the site:

  • Role: categories corresponding to the different audiences, such as “Citizens.”
  • Topic: conceptual groupings, such as “Facts & History” or “Taxes.”
  • Agency: government divisions and agencies, such as “Legislative” or “Transportation.”
  • Task: grouping by what someone intends to do on the site, such as activities involved in “Visiting the State.”
  • Lifecycle: grouping by stages of life, such as “Primary & Secondary Education” and “Marriage Licenses” (this approach proves insufficient to logically categorize all topics).

The old website for this state organized information largely by agency, which proved to be the most difficult organization scheme for users: people were very unfamiliar with state agencies, both with their names and with their divisions of responsibility.

As with any site design, identifying all the options for a state home page quickly led to debates within the design team over which kinds of items were appropriate for the home page. Is a home page primarily intended to route users to stable topics, or should regularly updated news dominate the page? Should the governor’s photo appear on the home page? Is this inappropriate political self-promotion for the governor, or does it serve a useful civic function? These alternatives involve tradeoffs. There are no clearly right or wrong answers, but these are the types of discussions that will arise.

Exposing these design alternatives is a crucial step in the development of a site architecture. Most people are familiar with the process of exploring the design space for page layout in what we call the inverted pyramid process. In this approach for page layout, the designers begin with a large number of small, quickly rendered thumbnails to explore possible design layouts. Then they proceed to a moderate number of larger, relatively quick mockups that explore alternatives representing the best thumbnails. Then the designers make a small number of detailed prototypes of the best mockups, followed by iterative refinement of the best design.

We advocate the same approach for information architecture, where top-level architectures are explored as alternatives, then a smaller number are explored with more levels spelled out in detail, followed by optimization of the preferred architecture. The exploration of alternatives in the design space avoids premature commitment to a non-optimal navigation paradigm.

Card Sorting. The state Web portal had no significant content of its own but was primarily designed to provide links to various resources of the state that resided on other sites. Thus, we had a very clear idea of what the final resources were. We wrote down a large number of these on cards and asked prospective users to sort these cards into categories and suggest labels for their categories. This provided an additional starting point for how the users would organize the categories of the site.
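Card-sort results can be aggregated systematically by counting how often participants place each pair of cards in the same group; pairs grouped together most often are candidates for sharing a category. Below is a minimal sketch of that tally in Python; the cards and groupings shown are hypothetical, not data from this study.

    from collections import Counter
    from itertools import combinations

    # Each participant's sort is a list of groups; each group is a set of
    # card labels. These cards and groupings are hypothetical examples.
    sorts = [
        [{"Renew driver's license", "Vehicle registration"},
         {"Hunting licenses", "State parks"}],
        [{"Renew driver's license"},
         {"Vehicle registration", "Hunting licenses", "State parks"}],
    ]

    # Count how often each pair of cards lands in the same group.
    together = Counter()
    for participant in sorts:
        for group in participant:
            for a, b in combinations(sorted(group), 2):
                together[(a, b)] += 1

    # Frequently co-grouped pairs suggest categories for the architecture.
    for (a, b), count in together.most_common():
        print(f"{a} / {b}: grouped together by {count} of {len(sorts)} participants")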

Task Analysis. We identified several common tasks people would want to do on the site. These tasks were inferred from the user interviews, hit logs and typical searches people used. We used these tasks to check the architectures we proposed to ensure that the path to accomplishing each task was both direct and clearly labeled. We chose 10 tasks as benchmark tasks that we evaluated in user testing:

  1. You are interested in renewing a {State} driver’s license online.
  2. How do nurses get licensed in {the State}?
  3. To assist in traveling, you want to find a map of {State} highways.
  4. What four-year colleges are located in {the State}?
  5. What is the state bird of {the State}?
  6. You are interested in registering as a voter in {the State}.
  7. Who are the U.S. representatives and senators from {the State}?
  8. How would someone start a business in {the State}?
  9. What are the current road conditions on {State} highways?
  10. You are interested in getting a hunting license.

Refining the Architecture Through User Testing

We tested our proposed architectures through six rounds of user testing with 10 to 13 users in each round. An initial round tested users on the old version of the website. This helped establish baseline measures of how well the old site was doing. The next round compared four broad architectural approaches, as spelled out above: role, topic, agency and task architectures. We did not test the lifecycle approach, as it did not make sense for representing the full spectrum of information. After testing these broad alternatives, we identified that the topic and task architectures were most successful, so we developed an architecture that integrated the two and progressively refined the resulting architecture through subsequent tests.

Each user test was conducted using a traditional think-aloud method. Users were asked to perform each task (or answer each question) while speaking aloud whatever came to their mind as they did it. An observer timed them and took notes on the problems people had. At the end of each task, and at the end of the entire sequence, users filled out response sheets to rate their impressions of the site.
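To make the link between these observations and the metrics reported below concrete, the following sketch computes a completion rate and average task time from per-task records. The record format and values are hypothetical, not our actual test instrumentation.

    from statistics import mean

    # Hypothetical observations from one round of testing: (benchmark task
    # number, completed?, time in seconds). Values are illustrative only.
    observations = [
        (1, True, 48), (1, False, 180), (1, True, 55),
        (2, True, 62), (2, True, 41), (2, False, 200),
    ]

    completed = [obs for obs in observations if obs[1]]
    print(f"Task completion rate: {len(completed) / len(observations):.0%}")
    print(f"Average time per completed task: {mean(t for _, _, t in completed):.0f} s")

    # Breaking the same numbers down per task flags which tasks need diagnosis.
    for task in sorted({obs[0] for obs in observations}):
        task_obs = [obs for obs in observations if obs[0] == task]
        times = [t for _, done, t in task_obs if done]
        avg = f"{mean(times):.0f} s" if times else "n/a"
        print(f"Task {task}: {len(times) / len(task_obs):.0%} completed, mean time {avg}")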

The following is the architecture for the old site compared to the final new architecture we designed. We’ve expanded the state government section of each to highlight some differences.

Top-level architecture for the OLD portal

Search
Visitor’s Guide
{State} History
Government
  • Legislative
  • Elected Officials
  • State Agencies
      • [Alphabetical List of Agencies]
  • Judicial
  • Local Government
  • State Committees
Professional
Business & Commerce
Education
Kids Net
{e-Gov company} Services
Gov Technology (external)

Top-level architecture for the NEW portal

Living in {State}
Learning in {State}
Operating a Business in {State}
Working in {State}
Recreation & Travel in {State}
Government
  • Governor
  • Elected Officials
  • State Agencies, Boards & Commissions [includes scope notes – a brief description of each link]
      • Quick reference listing
          • [alphabetical listing]
      • Detailed listing
          • [alphabetical listing & descriptions]
  • State Associations
  • [etc.]
{State} Facts & History

The new architecture is structured around tasks such as living, learning, operating a business and working. It also has elements that are purely topical, such as facts and history, and retains the government agency view because that appears to be the most efficient organization for state employees.

What might appear to be subtle terminology changes were made throughout the site. For instance, “professional” becomes “working in the state.” This type of change has a dramatic effect on people’s ability to recognize the right category. In addition, we liberally use scope notes throughout the site, which are brief descriptions below each link that explain the scope of the information that is found through that link.
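As a concrete illustration, a scope note can be modeled as a short description attached to each navigation link. This is a minimal sketch assuming a simple link structure; the entries are hypothetical, though the labels follow the site's task-oriented wording.

    from dataclasses import dataclass

    @dataclass
    class NavLink:
        label: str       # category name shown as the link text
        url: str
        scope_note: str  # brief description of what the link covers

    # Hypothetical entries illustrating the label-plus-scope-note pattern.
    links = [
        NavLink("Working in {State}", "/working",
                "Job listings, professional licensing and workplace rights."),
        NavLink("Recreation & Travel in {State}", "/travel",
                "State parks, maps, road conditions and visitor information."),
    ]

    for link in links:
        print(link.label)
        print(f"    {link.scope_note}")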

User Testing Results. Two measurements we made in user testing proved particularly useful. The task completion rate is the percentage of tasks users were able to successfully complete. Through user testing and continual improvement of the architecture, we were able to improve the task completion rate from 72% on the old site to 95% on the new site, as illustrated in Figure 1.  

Figure 1. Task completion rate for the old and new sites.

The other measure was task time: how long it took people to complete the tasks. On the old site, the average time per task was 132 seconds; on the new site, it was 50 seconds, a radical improvement in speed. Figure 2 compares the time for each task, showing that our redesign not only improved performance on almost all the benchmark tasks but also decreased the variability in time across tasks.

Figure 2. Average time in seconds to complete each task for the old and new sites.

One interesting aspect of this improvement is that, for these benchmark tasks, our new architecture requires on average one extra click in the optimal path to the appropriate information. We buried information one level deeper on average, yet still cut the time to find information by more than half!

Though this seems counterintuitive at first, the new categories and labels are much clearer, and items are located where people expect them, so users spend less time lost on the site and less time scanning lists of links for one that seems to fit. This demonstrates the limitation of the traditional “3-click rule,” which recommends making every screen reachable within three clicks. Instead of minimizing clicks, we recommend organizing the site around clarity and the speed with which information can be found.
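A rough back-of-the-envelope model makes the point: total find time is approximately the number of pages on the optimal path multiplied by the average decision time per page. The path lengths below are assumptions used only to back-solve per-page times from our measured 132- and 50-second averages.

    # Rough model: find time ≈ pages on the optimal path × average decision
    # time per page. Path lengths are assumed; per-page times are back-solved
    # from the measured 132 s and 50 s task averages.
    old_pages, old_avg_s = 3, 132
    new_pages, new_avg_s = 4, 50

    print(f"Old site: {old_pages} pages at ~{old_avg_s / old_pages:.0f} s each")
    print(f"New site: {new_pages} pages at ~{new_avg_s / new_pages:.1f} s each")
    # An extra click is cheap when clear labels cut per-page decision time
    # from ~44 s to ~13 s; minimizing clicks alone optimizes the wrong term.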

This process of iterative refinement produces a final design that works spectacularly well, and being part of the final round of user tests is a very satisfying experience. As an anecdote, in the last round of testing the state portal, I tested a retired librarian who had barely used computers or the Web. She was enthusiastic about learning this skill but timid, concerned she wouldn’t know how to use the site. After reading the first task, she said she wasn’t sure where to begin. I encouraged her to just give it a try. So she clicked, clicked, clicked, and went straight to the correct information without mistakes, demonstrating the effectiveness of the labeling. She completed the remaining tasks quite successfully as well. We were pleased to see in the testing that the site was highly accessible to people of quite varying backgrounds.

Business Case

This approach of measuring user performance helps us gauge the quality of our improvements through successive rounds of refinement of the architecture. It keeps us focused on the problems users are having, and breaking measurements down by task helps us identify which tasks are creating serious problems so we can focus on diagnosing them. Finally, the metrics serve a very useful financial purpose: demonstrating the payoff of investing in the site redesign.

For a state government site, an improved website brings several benefits: greater convenience and time savings for citizens and business owners, better marketing to tourists, cost savings in government administration and revenue generation from the website itself.

As an estimate, this state website earned an average service fee of about $2 per month from each individual who regularly transacted business on the site, such as paying processing fees for construction permits. Conservatively, about 100,000 users would make a paid transaction each month, but on the old site 28% of them would fail to complete their transaction. With our redesign, we reduced the failure rate to about 5%, enabling 23,000 additional successful transactions per month and generating an additional $552,000 in site revenue per year. While this is only an estimate, it’s clear that investing in this improvement readily pays for itself in direct dollar terms, and that decreasing the failure rate further is likely to continue providing value.
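The arithmetic behind this estimate can be reproduced directly from the figures above:

    # Reproducing the revenue estimate from the figures in the text.
    monthly_fee = 2.00        # average service fee per transacting user, $/month
    monthly_users = 100_000   # users attempting a paid transaction each month
    old_failure, new_failure = 0.28, 0.05

    recovered = round(monthly_users * (old_failure - new_failure))
    annual_revenue = recovered * monthly_fee * 12
    print(f"Additional successful transactions per month: {recovered:,}")  # 23,000
    print(f"Additional site revenue per year: ${annual_revenue:,.0f}")     # $552,000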

There are other important benefits which aren’t as directly linked to revenue for the state. One is the time saved by citizens who use the site. We estimate that by reducing the time to complete common tasks, the redesign saves the citizens of the state about 14,800 hours per year, contributing to overall productivity in the state.

Ideas for the Future of State Portal Development

In any project of this scale, you’ll come away with ideas for improving your work the next time. These are some ideas I’ve had after completing this state portal redesign.

In the competitive analysis, I’d suggest not just gathering information from a few states, but systematically analyzing the architecture of every state, as well as the approaches of other governments, such as the U.S. federal government, cities and counties, and some foreign governments. While it will take some extra time, there is a tremendous amount to be learned from this expanded study that can save refinement steps later in the process.

I’ve been fairly pleased with the design of FirstGov.gov, the U.S. Web portal, and would recommend using their site as a benchmark for comparison. They use multiple navigation paradigms on their home page, including role, topic, task and agency schemes all on one page. They also provide more options on their home page, and my experience since the design of this state portal is that it’s worth having more options in general, as long as they are presented in a well-organized fashion.

Finally, as a citizen, it’s my hope that government agencies will coordinate to provide more consistency and standards in their navigation, as this is both possible in principle and extremely useful in helping citizens find the information they need. For instance, I would like every government site in the United States to follow a convention in the header or footer that would allow someone to easily identify the government units both encompassing and contained in the one they are looking at. A state, for example, would have a list of links like this: “USA > Michigan > Michigan Counties > Michigan Cities.” Establishing such standards is not an easy process. Web sprawl and inconsistency are a problem for any large organization (consider the Microsoft website, for example) and an even larger challenge for state and federal sites, but this remains a worthy goal.
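To show how lightweight such a convention could be, here is a minimal sketch that renders the proposed trail. The separator and the hierarchy data are assumptions; the article proposes only the convention itself.

    def government_trail(units):
        """Render a header/footer trail of government units, broadest first."""
        return " > ".join(units)

    print(government_trail(["USA", "Michigan", "Michigan Counties", "Michigan Cities"]))
    # Output: USA > Michigan > Michigan Counties > Michigan Cities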

Acknowledgements

I’d like to thank Diana Persell, Debra Luling, Melinda Morris-Black, Jason Withrow, Alfred Speredelozzi, Stephen Markel, Rod Lowe, Juliane Morian and Bryce Erwin for their significant contributions to this work. I’d especially like to honor the business case contributions of Stephen Markel, former president of Diamond Bullet Design. Stephen passed away at the age of 37 on September 18, 2004, due to complications of a ski accident. He was a champion of a quantitatively driven approach to usability and accessibility and a motivator and guidepost for his entire team.

For Further Reading

Withrow, J., Brinck, T., & Speredelozzi, A. (2000). Comparative usability evaluation for an e-government portal (Diamond Bullet Technical Report U1-00-2). Retrieved October 19, 2004, from www.diamondbullet.com/egovportal.pdf. This white paper provides a more detailed description of this state portal project.

Brinck, T., Ha, S., Pritula, N., Lock, K., Speredelozzi, A., & Monan, M. (2003). Making an iMpact: Redesigning a business school website around performance metrics. In DUX 2003: Designing User Experiences, San Francisco, CA, June 2003. Describes a similar process with the redesign of the Michigan Business School website.

Bias, R., & Mayhew, D. (1994). Cost-justifying usability. San Francisco: Morgan Kaufmann. This book collects papers making the business case for usability. A forthcoming second edition with up-to-date case studies in Web development will be available soon.

Brinck, T., Gergle, D. & Wood, S. (2001). Usability for the Web: Designing Web sites that work. San Francisco: Morgan Kaufmann. This book describes usability methods that can be applied throughout the Web design process.

