Wiggle Checkout
What if we could improve the conversion rate by addressing customer doubts?

We were tasked with iterating on the Wiggle Checkout, one of the runaway successes of the eCommerce giant's online touchpoints with customers.
- Prototyped in code with Twitter Bootstrap
- Conducted a UX audit of the current experience with users to discover pain points
- Researched multi-national nuances to the checkout experience
- Worked with multiple stakeholders to create buy-in
- Ran multiple user tests on a variety of devices on location
- Launched September 2017
- Delivered 3 times the value put into the original business case, with a conversion improvement of 2.5%.
Goal
As a team, we identified two key objectives:
- Improve the total checkout completion rate by 2.4%
- Improve the rate of customers proceeding past the log-in page
We didn't have much time, so we cracked on, making sure we started in the right way.
Approach
One thing we were certain we wanted to do was test early assumptions with users: not in focus groups, but by creating prototypes early and getting real feedback on them.
To ensure that everyone was aligned on what we wanted to achieve, we ran a number of internal workshops to understand what each stakeholder wanted to see in the new design. We ran competitor reviews, dug into the problems with the existing checkout and put together key principles to ensure the project was delivered in a timely, efficient manner.
We knew that we'd release the final product in stages. This was partly to reduce risk, but also because Wiggle is a firm believer in data: we wanted to compare conversion rate improvements (or reductions) incrementally.

Prototype early
We prototyped an experience based on what we knew. It covered the whole journey, from logging in or registering all the way through to confirmation. We knew we wanted a responsive checkout, and achieving that within our existing tools would have taken too long and been hard to test and demo with the rest of the business, so instead we built the prototype with Bootstrap.
This meant that I, a UX designer, and a UI developer could build quickly, test with users and understand what we needed to do in production. Working within Adobe Photoshop and uploading to Marvel wouldn't have given us the results we wanted. It's about picking the right tool for the job!
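To give a sense of what that looked like in practice, here is a minimal sketch of the kind of markup a Bootstrap prototype like this might use, assuming Bootstrap 3 (current at the time). The grid and form classes are standard Bootstrap; the specific fields, copy and layout are illustrative assumptions, not the production checkout or the exact screens we tested.

```html
<!-- Illustrative sketch only: a combined login/register view on Bootstrap 3.
     The two columns stack on small screens, matching the responsive goal. -->
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Checkout prototype: sign in</title>
  <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
</head>
<body>
  <div class="container">
    <div class="row">
      <!-- Existing customers -->
      <div class="col-sm-6">
        <h2>Sign in</h2>
        <form>
          <div class="form-group">
            <label for="email">Email address</label>
            <input type="email" class="form-control" id="email">
          </div>
          <div class="form-group">
            <label for="password">Password</label>
            <input type="password" class="form-control" id="password">
          </div>
          <button type="submit" class="btn btn-primary btn-block">Sign in and checkout</button>
          <p class="text-center"><a href="#">Forgotten your password?</a></p>
        </form>
      </div>
      <!-- New customers: register offered in the same view -->
      <div class="col-sm-6">
        <h2>New to Wiggle?</h2>
        <p>Create an account to track your order and check out faster next time.</p>
        <a href="#" class="btn btn-default btn-block">Create an account</a>
      </div>
    </div>
  </div>
</body>
</html>
```

Because the prototype was plain HTML and CSS, changes like moving a link or reordering a form could be made in minutes and put straight back in front of users.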

We had some assumptions in the design to test out. Pages such as the login / register screen were vital, as we knew that existing customers had issues logging in.
To gain confidence, we launched a login page A/B test with two key ideas. Ultimately, the above design didn't perform as well as the winning design, and that's the incredible thing about designing this way: we discover and learn as we go, not right at the end.
We changed the hierarchy of specific parts of the experience; here, we found users wanted to see the estimated delivery date rather than the delivery method.
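As an illustration of that hierarchy change, a delivery step in the prototype might lead each option with the estimated date and demote the method to supporting text. Again, this is a hedged sketch in plain Bootstrap 3 markup, not the production code; the dates, prices and option names are invented for illustration.

```html
<!-- Illustrative sketch only: delivery options with the estimated date leading,
     and the delivery method shown as secondary, muted text. -->
<div class="form-group">
  <h3>When would you like your order?</h3>

  <div class="radio">
    <label>
      <input type="radio" name="delivery" value="standard" checked>
      <strong>Estimated delivery: Thursday 21 September</strong><br>
      <span class="text-muted">Standard delivery (Free)</span>
    </label>
  </div>

  <div class="radio">
    <label>
      <input type="radio" name="delivery" value="express">
      <strong>Estimated delivery: Tuesday 19 September</strong><br>
      <span class="text-muted">Express delivery (£4.99)</span>
    </label>
  </div>
</div>
```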
User testing
We ran a number of user testing sessions, not just on the prototype, but also on the existing site, to understand where we needed to apply effort.
We tested the existing checkout with 10 participants, recruited to fit a target audience of:
- Mix of male and female
- Desktop, tablet and smartphone
- Users from the UK, France, Germany, the Netherlands and the USA

We rapidly prototyped the desired experience and iterated upon that once we had feedback and more business knowledge.

To collect data in the right way, we used Test and Learn cards to identify what we wanted answers to. We used the following format, with some sample data to illustrate:
- Hypothesis: Customers are overwhelmed, don't want to spend ages learning a complicated checkout, and would benefit from a simpler checkout when purchasing items from Wiggle
- Test: Test a new design of the checkout
- Metric: Customer satisfaction, time spent on task and success rate.
- Criteria: Customers describe the new checkout design as simple and understand the process of buying this way in enough detail. Returning customers should also spend less than 20 minutes on the task.

We had an opportunity to go to a cycling event in the New Forest to meet customers and get their feedback on our prototype. We took laptops, tablets and mobile phones for customers to try the prototype on, and for us to learn what we needed to improve. This was a fantastic experience for the whole team, and we discovered that real work can happen out in the open air!

Internal demos
As we were proving out our assumptions, we held demos within the business to show everyone what we were discovering. We found a lot of knowledge within different teams, and a source of requirements we hadn't anticipated. This was a good thing! It meant the checkout would perform well in different countries and different situations.

For example, we discovered that in some areas of Europe, customers chose to have their packages delivered to a local post office. Upon collection, their ID needed to match the details on the confirmation page at the end of the checkout, but because (as we discovered) some customers used an alias in checkout, they had trouble proving they were who they said they were! We solved this within the checkout design, knowing that while it was an edge case, addressing it would improve the experience for those customers.
Key findings
Here are some of the key things we learned from testing early prototypes with users:
- One of our login screens showed login and register in the same view; users preferred this
- Customers like delivery prediction dates
- Customers don’t understand Local Collection, but when we explained the concept, they said they would use it
- We should include logos to help users recognise unfamiliar parts, e.g. payment method logos
- Users thought “leave a note with Wiggle” would go to delivery partners — it doesn’t
- Users consistently missed the confirmation tick box at the end of the journey
- "Forgotten password" was too close to the password field, so it was often missed
The prototype is the easiest place to make the changes, so we took the findings and made changes directly.
Results
- Launched September 2017
- Delivered 3 times the value put into the original business case, with a conversion improvement of 2.5%.

Retrospective
We held a retrospective at the end of Phase 1, where the team reflected on what had happened and identified actions for improvement going forward. Those actions went into a backlog for further production work.
Improvements included:
- Working over fewer weeks, and in whole days rather than scattered hours, to improve focus
- Involving the project sponsor right from the very start!
- Spending more time sketching innovative solutions rather than going straight into wireframes