Sample Order Tool
UX Research Internship — Cimpress Technology
During a summer internship at Cimpress, I worked on improving a tool that allows users, primarily merchants, to order sample and test products. I conducted usability tests and used the findings to redesign a portion of the tool to provide a simpler and more intuitive experience.
First, I met with the product manager to better understand the tool, its users, and its challenges. I also did my own analysis of the tool while going through common tasks and familiarizing myself with the interface. I used the information I'd learned from the product manager and my exploration to write both high-level and feature-specific test goals.
I used the test goals to determine the ideal participants and wrote a moderator’s guide. We were looking for 5-7 participants who were:
A mix of users who use the Sample Order Tool regularly and users who rarely or never used it
A mix of beginners (no computer science or technical experience) and advanced users (experience with coding)
A mix of age and gender
I recruited 7 participants for the testing. The participants were all internal employees who either actively used the Sample Order Tool or theoretically could use it in their current role.
During the testing, participants were asked to place test orders and customize products. We tested several scenarios, including searching for a product and ordering a blank product. The one-hour usability sessions were conducted in a usability lab, where the UX team and developers could observe the testing through a two-way mirror. The sessions were also recorded using Camtasia Studio.
Many aspects of the site worked well. Participants particularly liked the micro-interactions and the process of placing an order. The tool is a huge improvement over previous options, and participants loved the customization flow.
Issues around the language of messages, as well as small bugs and inconveniences, hindered the overall usability of the application.
After the usability testing, I ran a prioritization exercise with the development team to rank the issues and insights by severity. During this meeting, we focused on the number of participants who made each mistake and the degree to which each issue prevented the users from accomplishing the task. I used this to focus on certain pages and aspects of the design when updating the interface. During the redesign, I chose to focus on the Choose Products page, the Shipping page, and a confirmation email.
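The prioritization heuristic described above can be sketched in a few lines. This is an illustrative sketch, not the actual exercise we ran: the issue names are drawn from the findings in this case study, but the participant counts and blockage weights are hypothetical.

```python
# Illustrative sketch of the severity ranking: each issue is scored by how
# many participants hit it and how badly it blocked task completion.
# Counts and weights here are hypothetical, not the real session data.

issues = [
    # (issue, participants affected out of 7, blockage: 1=annoyance .. 3=blocker)
    ("Quantity box missed when adding a product", 6, 3),
    ("Blank-product option unclear", 4, 2),
    ("Confusing error message wording", 3, 1),
]

# Severity = participants affected x degree of blockage; highest first.
ranked = sorted(issues, key=lambda item: item[1] * item[2], reverse=True)

for name, affected, blockage in ranked:
    print(f"{name}: severity {affected * blockage}")
```

A simple product of the two factors was enough to surface the pages worth redesigning; a real exercise might weight the factors differently or add business impact as a third dimension.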
Overall, the participants struggled with finding specific expected options and actions on the page.
Participants consistently missed the quantity box when adding a product and had to go back to enter a quantity after being unable to add the product to their order.
Participants didn't realize that they could skip uploading an image if they wanted to order a blank product.
Participants could not figure out how and when to save products if they were ordering multiple products at the same time.
“Quantity always gets lost over here”
— Participant 5
Final Choose Products Page
Quantity was placed just below the product name and image to make it easy for users to find.
The cart was moved from below the product information into a right-hand panel, so it matched a more traditional e-commerce mental model and was always visible to users when they scrolled.
A No Design option was added to make it clear that users could order a blank product.
The original shipping form worked for only a small set of users following a specific flow. Users who wanted to deviate from that flow could not easily be accommodated by the form.
The input fields were based on the structure of addresses in the United States and some of the required fields did not apply to users with international addresses.
Some power users also had multiple addresses they would frequently ship to, but the system could only remember one address so they had to type in the alternate addresses every time.
The delivery date was an estimate that wasn't based on the user's address, so it was often inaccurate.
Final Shipping Page
The country input field was brought to the top, which allowed the rest of the form to dynamically change what information it needed based on the user's country. I added custom address forms for the five countries with the most users, so the forms matched those users' address structures. Ideally, every country would have an individual address form, but that was not technically feasible.
The ability to save multiple addresses and specify one of them as the default was added. Users with multiple addresses could pick between saved addresses during each order.
By putting the shipping address before the delivery date, the Sample Order Tool could show which dates were available for each specific address, rather than estimating.
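The country-driven form logic above can be sketched as a country-keyed registry of form fields with a generic fallback. This is a minimal sketch assuming a field-list representation; the field names and country choices are illustrative, not the actual Cimpress implementation.

```python
# Illustrative sketch: once the user picks a country, look up a custom
# address form layout, falling back to a generic international form.
# Field names and the set of custom countries are assumptions.

GENERIC_FORM = ["address_line_1", "address_line_2", "city", "region", "postal_code"]

# Hypothetical custom layouts for the highest-traffic countries.
COUNTRY_FORMS = {
    "US": ["street_address", "city", "state", "zip_code"],
    "GB": ["street_address", "town", "postcode"],
    "NL": ["street_address", "postal_code", "city"],
}

def address_fields(country_code: str) -> list[str]:
    """Return the form fields to render for the selected country."""
    return COUNTRY_FORMS.get(country_code, GENERIC_FORM)
```

Putting the country selector first means the lookup can run before any other field is rendered, which is what lets the rest of the form adapt instead of forcing a US-shaped address on everyone.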
During the testing, most participants expressed concern about keeping track of their order. Several frequent users showed me spreadsheets and lists where they kept order confirmation numbers because they were nervous that they would lose the order number after placing their order.
I designed a confirmation email to be sent after users place an order to alleviate these concerns.
"I’m a little afraid right now because this seems like we’re in this fragile state where if I don’t copy this ID and if I don’t save it somewhere it’ll be gone forever”
— Participant 6
Final Confirmation Email
During testing, users were most concerned with keeping track of their order number, so it was placed at the top of the confirmation email.
Many users were unaware there was an existing platform they could use to track their orders, so a link to this platform was included.
Because users were often ordering the same product with different attributes, quantities, and designs, just listing the product name or SKU would not have been helpful to users. Instead, the confirmation email includes all this specific attribute information and an image of the product with their customized design.
At the end of the summer, I handed off annotated wireframes to the development team, who then implemented the designs. Ideally, the tool would go through a second round of testing to validate that the changes solved the issues discovered. Participants scored the tool on the System Usability Scale during the initial testing, so having users score the tool again would provide a helpful quantitative comparison to understand whether the changes were beneficial.
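For readers unfamiliar with it, the System Usability Scale mentioned above has a standard scoring rule (Brooke, 1996): odd-numbered items contribute their response minus one, even-numbered items contribute five minus their response, and the sum is scaled by 2.5 to give a 0–100 score. A minimal implementation:

```python
# Standard System Usability Scale (SUS) scoring. Input is the ten 1-5
# Likert responses in questionnaire order (item 1 first).

def sus_score(responses: list[int]) -> float:
    """Compute a 0-100 SUS score from ten 1-5 Likert responses."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for item_number, response in enumerate(responses, start=1):
        if item_number % 2 == 1:          # odd items: positively worded
            total += response - 1
        else:                             # even items: negatively worded
            total += 5 - response
    return total * 2.5

# Example: all-neutral responses (3 on every item) score 50.0
print(sus_score([3] * 10))
```

Averaging the per-participant scores before and after the redesign would give the quantitative comparison described above.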