Sample Order Tool
The Sample Order Tool allows users to order sample and test products. During a summer internship at Cimpress, I conducted usability testing to help the developers working on the tool identify its usability issues.
Usability Testing Report
High Fidelity Wireframes
I began by meeting with the product manager to better understand the tool, its users, and its challenges. Once I understood it better, I combined my own assessment of the tool with those conversations to create test goals, both high-level goals and specific feature goals.
I used the test goals to determine the ideal participants and create a moderator’s guide. Based on the test goals, we were looking for 5-7 participants who were:
A mix of users who use the Sample Order Tool regularly and users who rarely/have never used it
A mix of beginners (no computer science or technical experience) and advanced users (coding experience)
A mix of age and gender
I recruited 7 participants for the usability testing. The participants were all internal employees who actively used the Sample Order Tool, or could theoretically use it, in their current role.
During the testing, participants were asked to place test orders and customize products. We tested several scenarios, including searching for a product and ordering a blank product. The one-hour usability sessions were conducted in a usability lab, where the UX Team and developers could observe the testing through a two-way mirror. The sessions were also recorded using Camtasia Studio.
Many aspects of the site worked well. Participants particularly liked the micro-interactions and the order-placement flow. The tool is a huge improvement over previous options, and participants loved the customization flow.
Issues around the language of messages, as well as small bugs and inconveniences, hindered the overall usability of the application.
After the usability testing, I met with the team of developers who created the Sample Order Tool to rank the issues and insights by severity. During this meeting, we focused on the number of participants who encountered each issue and the degree to which it prevented users from accomplishing the task. I used this prioritization to focus the redesign on certain pages and aspects of the design: the Choose Products page, the Shipping page, and a confirmation email.
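The ranking described above could be sketched as a simple frequency-times-impact score. This is a hypothetical model with illustrative issue names and weights; the actual prioritization was done collaboratively in the meeting, not with a formula.

```python
def severity_score(participants_affected, blocked_task, total_participants=7):
    """Combine how often an issue occurred with how badly it blocked the task.

    blocked_task: 1 if the issue prevented task completion, else 0.
    The 3x weight for blocking issues is an illustrative assumption.
    """
    frequency = participants_affected / total_participants
    return frequency * (3 if blocked_task else 1)

# Illustrative issues: (description, participants affected, blocked the task?)
issues = [
    ("Could not order a blank product", 5, 1),
    ("Unclear when products were saved", 4, 0),
    ("Quantity field hard to find", 3, 0),
]

# Highest-severity issues first.
ranked = sorted(issues, key=lambda i: severity_score(i[1], i[2]), reverse=True)
```

A scheme like this keeps blocking issues above frequent-but-minor inconveniences, which matches how the team weighed them.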
During the redesign, I added a “No Design” option because participants struggled to figure out how to order blank products. There were also several smaller issues that did not prevent orders from being placed, but were still frustrating for participants. Several participants could not figure out how and when to save each product when ordering more than one. To address these issues, I redesigned the flow of the Choose Products page and added a “cart view” that mirrors the traditional e-commerce mental model. This lets participants see what is currently in their cart and know exactly which product they are customizing.
“Quantity always gets lost over here”
Several Iterations of the Choose Products Page
Final Choose Products Page
The cart was moved from below the product information into a right-hand panel, so it was always visible to users.
Quantity was placed just below the product name and image to make it easy for users to find.
The “No Design” option was added to clarify the design options, since users did not know they could leave a product blank by not uploading an image.
Final Version of the Choose Products Page
I added the ability to save multiple shipping addresses, a feature the tool did not originally have. The “State/Province” field in the address form did not apply to every user, because not every country uses states or provinces in their addresses. To fix this, I added custom address forms for the 5 countries with the most users, so the forms were specific to the mental models of the users. Ideally, every country would have an individual address form, but that was not technically feasible. Lastly, I tweaked the styling and location of some of the form fields that participants forgot to fill out because of inconsistent styling.
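The country-specific forms described above could be modeled as a lookup from country code to field list, with a generic fallback for every other country. This is a hypothetical sketch; the country codes and field names are illustrative, not the tool’s actual configuration.

```python
# Generic fallback form for countries without a custom layout.
DEFAULT_FIELDS = ["name", "street", "city", "state_province", "postal_code"]

# Per-country forms matching local addressing conventions (illustrative).
COUNTRY_FIELDS = {
    "US": ["name", "street", "city", "state", "zip_code"],
    "GB": ["name", "street", "city", "postcode"],     # no state/province field
    "DE": ["name", "street", "postal_code", "city"],  # postal code before city
    "NL": ["name", "street", "postal_code", "city"],
    "IE": ["name", "street", "city", "county", "eircode"],
}

def address_fields(country_code):
    """Return the form fields for a country, falling back to the generic form."""
    return COUNTRY_FIELDS.get(country_code, DEFAULT_FIELDS)
```

Putting the country input first, as in the final design, is what makes this lookup possible: the rest of the form can be rendered only after the country is known.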
Final Shipping Page
The country input field was brought to the top, which allowed the rest of the form to dynamically change what information it needed based on the user's country.
The ability to set multiple addresses and specify which address was the default was added in this redesign.
By putting the shipping address before the delivery date, the Sample Order Tool could show which dates were available for each specific address, rather than estimating.
Final Version of the Shipping Page
I designed an email confirmation to be sent to users after they placed an order. Participants were nervous that they would lose the order number after they placed their order, so the email contained standard billing and order information, as well as the order number. It also contained a “View Order” button that takes users to a tool where they can track the progress of their order and see whether it has shipped.
"I’m a little afraid right now because this seems like we’re in this fragile state where if I don’t copy this ID and if I don’t save it somewhere it’ll be gone forever”
Final Confirmation Email
During testing, users were most concerned with keeping track of their order number, so it was placed at the top of the confirmation email.
Many users were unaware there was an existing platform they could use to track their orders, so a link to this platform was included.
Because users were often ordering the same product with different attributes, quantities, and designs, just listing the product name or SKU would not have been helpful to users. Instead, the confirmation email includes all this specific attribute information and an image of the product with their customized design.
Final Version of the Confirmation Email
After creating annotated wireframes, I handed them off to the developers, who implemented the designs. Ideally, the tool would go through another round of testing to validate these changes. During the original testing, participants also filled out a System Usability Scale survey, and the tool scored an 86. The survey should be administered again during the next round of testing to measure how the usability has changed.
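For reference, a System Usability Scale score like the 86 reported above comes from ten 1–5 Likert responses run through the standard SUS scoring formula. A minimal sketch of that calculation:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5 responses.

    Odd-numbered items are positively worded and contribute (response - 1);
    even-numbered items are negatively worded and contribute (5 - response).
    The sum of contributions is scaled by 2.5 to give a 0-100 score.
    """
    assert len(responses) == 10, "SUS has exactly ten items"
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5
```

A score of 86 is well above the commonly cited average of 68, which is consistent with the largely positive reactions participants had to the tool.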