Agile software development has been gaining momentum steadily over the last few years. However, as more companies and developers adopt agile methods, unease about the lack of attention paid to usability has also surfaced. Agile teams strive to produce high-quality software at minimum cost. They are often good at producing useful software, but usability sometimes takes a back seat.
ActiveStory is a tool for designing user interfaces and performing usability tests on them in a manner that is in line with agile principles. Our tool allows designers to sketch user interfaces, add interactions, and finally export the design to the Internet via a built-in, web-based Wizard of Oz system.
ActiveStory has two major components: the design tool and the Wizard of Oz system.
The first component of ActiveStory is the design tool, which allows designers to quickly draw user interfaces and link them together. The pen-and-paper metaphor is strictly maintained: there are literally two drawing modes, pen and eraser. This decision was based on the results of our web survey, which suggested that a tool needs to be fast, easy to use, and free of features that slow down design. Figure 1 shows a screenshot of the design tool.
A flipchart metaphor is used to manage the many interfaces in a design. A flipchart is a physical pad of paper, positioned vertically, whose pages can be drawn on while their order is maintained. Many drawing applications use this same metaphor, so we decided not to stray from it. ActiveStory's flipchart has one extra feature not found in its physical equivalent: the ability to copy the current page and place the copy either before or after the page currently being viewed. Figure 2 details the flipchart controls.
Finally, once all of the page designs are created, interactions can be added. In ActiveStory, an interaction is simply a region on a page that, when clicked, causes the system to load another page. For example, suppose a designer draws two pages. The first is the layout of an online catalogue, complete with a title, sketched image, description, and price. The second is a drawing of the shopping cart. To add an interaction to the catalogue page, the designer simply draws an "Add to Cart" button, selects the boundary of the button with the activate tool, and finally selects the destination page. This procedure sounds complicated but is in reality quite simple. Figure 3 shows the full series of actions required to perform this task.
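Conceptually, an interaction can be modelled as a rectangular region paired with a destination page, with clicks resolved by a simple hit test. The following sketch illustrates this idea; the class and function names are hypothetical and not part of ActiveStory's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Interaction:
    """A clickable rectangular region on a page that links to another page."""
    x: int
    y: int
    width: int
    height: int
    destination_page: int  # index of the page to load when clicked

def find_destination(interactions, click_x, click_y):
    """Return the destination page for a click, or None if no region was hit."""
    for item in interactions:
        if (item.x <= click_x <= item.x + item.width and
                item.y <= click_y <= item.y + item.height):
            return item.destination_page
    return None

# An "Add to Cart" button region on the catalogue page (page 0)
# that links to the shopping cart page (page 1).
links = [Interaction(x=200, y=300, width=120, height=40, destination_page=1)]
print(find_destination(links, 250, 320))  # click inside the button -> 1
print(find_destination(links, 10, 10))    # click outside any region -> None
```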
Once the design is complete and all interactions have been added, the designer is ready to export the project to the web. The Internet is the perfect medium for delivering a system to multiple, distributed individuals, and it is not hard to imagine the benefits a distributed Wizard of Oz usability testing application could bring. ActiveStory includes a built-in web server to make exporting fast and convenient. To export the current design, the designer simply clicks the export button on the lower right-hand side of the main application window, then enters the task she/he wishes the participants to attempt into the text box provided in the export dialog. This task is the designer's way of controlling which features the usability test evaluates.
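ActiveStory's actual server is not described in detail here, but the general idea of serving an exported start page can be sketched with Python's standard library. The handler class, page content, and task text below are assumptions for illustration only; the test binds to an ephemeral port, whereas ActiveStory's example address uses port 8080.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical exported start page: it describes the test and states the
# designer-supplied task, as the text above explains.
START_PAGE = """<html><body>
<h1>Usability Test</h1>
<p>Task: add an item to the shopping cart and check out.</p>
<a href="/page/0">Run Usability Test</a>
</body></html>"""

class ExportHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/start.html":
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(START_PAGE.encode("utf-8"))
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep the demonstration quiet

# Bind to an ephemeral port for this demonstration.
server = HTTPServer(("127.0.0.1", 0), ExportHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/start.html"
print(urllib.request.urlopen(url).read().decode("utf-8"))
server.shutdown()
```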
Once the design has been successfully deployed to the Internet, the designer need only email the participants the web address of the usability test.
ActiveStory's second component, the Wizard of Oz system, allows designers to create wireframe prototypes and run usability tests on them without requiring test participants to be co-located with the designer. As a result, fewer resources are consumed when performing simple usability evaluations. Before a Wizard of Oz test can be administered, participants must be recruited. As we have already hinted, recruitment for a usability test with ActiveStory is much easier than for a traditional test because participants do not need to be brought on site. All a participant needs to start testing a project is the web address (e.g., http://www.company-domain:8080/start.html). The start page describes the testing process, the task the participant is to accomplish, and basic directions for providing feedback to the designer. When the participant decides to begin, she/he simply clicks the "Run Usability Test" link on the start page and the wireframe application is launched. The participant then attempts to complete the task using the wireframe application. At any time, the participant may leave a comment for the designer by right-clicking on the interface and entering text in the box provided. ActiveStory attaches the comment to the page and remembers its coordinates so that the comment can later be redisplayed in its proper context on the page. While the test is under way, the ActiveStory server is constantly collecting usage data. Specifically, ActiveStory stores all of the test participants' mouse activity as well as page timeframes (how long a participant viewed a page before moving on to another).
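The two kinds of data just described, mouse activity and page timeframes, can be pictured as a simple per-participant session log. The sketch below is a minimal illustration of that idea; the class name, storage layout, and timestamps are assumptions, not ActiveStory's actual data format.

```python
import time
from collections import defaultdict

class SessionLog:
    """Records mouse positions and page-view durations for one participant.

    A hypothetical sketch of the kind of data the ActiveStory server
    collects; the real storage format is not described in the text.
    """
    def __init__(self):
        self.mouse_trail = defaultdict(list)   # page -> [(t, x, y), ...]
        self.page_times = defaultdict(float)   # page -> total seconds viewed
        self._current_page = None
        self._entered_at = None

    def enter_page(self, page, now=None):
        now = time.monotonic() if now is None else now
        if self._current_page is not None:
            # Close out the timeframe for the page we are leaving.
            self.page_times[self._current_page] += now - self._entered_at
        self._current_page, self._entered_at = page, now

    def record_mouse(self, x, y, now=None):
        now = time.monotonic() if now is None else now
        self.mouse_trail[self._current_page].append((now, x, y))

log = SessionLog()
log.enter_page("catalogue", now=0.0)
log.record_mouse(250, 320, now=1.5)
log.enter_page("cart", now=4.0)       # 4 seconds spent on the catalogue page
log.enter_page("checkout", now=34.0)  # 30 seconds spent on the cart page
print(log.page_times["catalogue"], log.page_times["cart"])  # 4.0 30.0
```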
Once all the participants have used the application, the designer can begin analyzing the design based on the feedback provided and the data collected by the ActiveStory server. As previously discussed, ActiveStory collects three major categories of data: mouse data, page time data, and comments. We will now discuss how ActiveStory helps a designer interpret each category. First, consider mouse data. ActiveStory gives designers a glimpse of how a participant used the mouse while completing the required task in the wireframe application. For each interface and participant, ActiveStory generates an image showing where the participant's mouse was located at any given time. Figure 4 shows a snapshot of a mouse trail generated by ActiveStory. This information can be useful for discovering usability problems such as labels that appear to be functional: if every participant's mouse trail consistently lingers over an element that is not a button or link, there may be a problem.
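That kind of check, did multiple participants' mouse trails pass over a non-interactive element, amounts to a point-in-rectangle test over each trail. The following sketch shows the idea with hypothetical sampled trails; the function name and data are illustrations, not ActiveStory's API.

```python
def participants_hovering(trails, region):
    """Return the IDs of participants whose mouse trail enters a region.

    trails: {participant_id: [(x, y), ...]} sampled mouse positions
    region: (x, y, width, height) of a non-interactive element, e.g. a label
    """
    rx, ry, rw, rh = region
    return {pid for pid, points in trails.items()
            if any(rx <= x <= rx + rw and ry <= y <= ry + rh
                   for x, y in points)}

# Hypothetical trails from three participants on one page.
trails = {
    1: [(10, 10), (120, 60), (125, 62)],
    2: [(118, 58), (300, 200)],
    3: [(5, 5), (400, 400)],
}
label_region = (100, 50, 40, 20)  # a label that is not a button or link
hovered = participants_hovering(trails, label_region)
print(sorted(hovered))  # [1, 2] -- two of three participants mouse over it
```

If most or all participants' trails enter such a region, the element may look clickable when it is not.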
ActiveStory also allows designers to see how long users spend on specific pages. For instance, suppose a web site prototype has five pages: a home page, a contact page, a products page, a shopping cart page, and a checkout page. If users are consistently spending 30 seconds on the shopping cart page, where no real action is required, that may be a sign that the continue-shopping button is badly placed. ActiveStory presents the page time data in tabular form, with one table for each page in the application under test. The first line of each table gives the average timeframe, and each following line gives the timeframe for one visit to the page (along with a unique number identifying the participant). Figure 5 shows an example of such a table. Finally, ActiveStory allows designers to see comments in the context in which they were submitted. When a participant submits a comment during the test, she/he does so by right-clicking on the interface and entering text in the text box provided. For example, if a participant is confused by the text of a title, she/he can right-click on the title and enter a comment explaining the problem. When the designer analyses the data, the comment is superimposed over the title, so the designer knows the comment has something to do with the title (even if the comment does not specifically mention it). It is important to note that while a comment is being entered, the page timer is stopped so that the recorded time spent on that page is not artificially inflated.
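The per-page table described above, an average line followed by one line per visit, could be assembled from raw visit records along the following lines. The visit durations and participant numbers here are made-up illustrative data, not results from an actual test.

```python
from statistics import mean

# Per-visit viewing times in seconds, keyed by page, as
# (participant_id, seconds) pairs. Hypothetical numbers for illustration.
visits = {
    "shopping cart": [(1, 31.2), (2, 28.9), (3, 30.4)],
    "checkout": [(1, 12.0), (2, 15.5)],
}

# First line of each table: the average timeframe for the page.
averages = {page: mean(s for _, s in rows) for page, rows in visits.items()}

for page, rows in visits.items():
    print(f"{page}: average {averages[page]:.1f}s")
    # Following lines: one timeframe per visit, tagged with the
    # unique participant number.
    for pid, seconds in rows:
        print(f"  participant {pid}: {seconds:.1f}s")
```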