The user testing we conducted at the Obama campaign reduced form field validation errors by as much as 63%. Perhaps more surprisingly, we found that this reduction in user frustration lifted conversion rates on subsequent pages that we hadn't touched at all.
In past blog posts I discussed how optimizing page load, design, and copy through a/b testing can have dramatic effects on conversion rates. While a/b and multivariate testing are a great way to gather quantitative data, they don't give you qualitative data, which is essential to improving web apps. User testing to the rescue!
How we conducted user testing
For those not familiar with user testing, the idea is simply to observe users interacting with your web app in a controlled environment. We installed Silverback on a MacBook, which allowed us to record the screen and the camera while a user carried out a task. We put the laptop in a small conference room and set up extra lighting so that we could clearly see the user's face. We then worked closely with our volunteer coordinator, who found lots of retired people, students, and people in between to make test donations on laptops, tablets, and phones. After each test we interviewed the user and solicited their thoughts.
What we learned
We went into user testing with no expectations: as engineers we were very familiar with the way we used our own products, but we had no idea how actual users did. To our surprise, we uncovered several issues with our donate forms that we probably would never have found without user testing. Here is a list of the issues reported to us, ordered by frequency:
- The type was too hard for users to read
- Users did not know what format the credit card number should be entered in
- Users who were not employed did not know what to enter for the employer and occupation fields
Solutions and results
Credit card formatting
Adding formatting to the credit card number field reduced its error rate by 15%.
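As a sketch of what client-side card formatting can look like (the exact implementation we used isn't shown here; this example assumes 16-digit cards grouped in blocks of four, and the function name is illustrative):

```typescript
// A minimal sketch of credit card number formatting, assuming 16-digit
// cards grouped in fours. In a form you would call this from the field's
// input event and strip the spaces again before validation/submission.
function formatCardNumber(raw: string): string {
  // Strip everything that isn't a digit and cap at 16 digits.
  const digits = raw.replace(/\D/g, "").slice(0, 16);
  // Insert a space after every complete group of four digits.
  return digits.replace(/(\d{4})(?=\d)/g, "$1 ");
}

// formatCardNumber("4111111111111111") -> "4111 1111 1111 1111"
// formatCardNumber("41111")            -> "4111 1"
```

A production version would also account for card types with different groupings (American Express uses 4-6-5, for example).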
Employer and occupation fields
Our data showed that the employer and occupation fields had some of the highest error rates of all the fields. When we looked at the values people submitted, we saw things like "none of your business," so we assumed the error rate was so high because people were uncomfortable providing this information. However, we learned through user interviews that retired people and students were unsure what to put in the fields. To solve this problem, we added one line below the fields that read "If you are retired, please enter 'retired' in both fields."
The one line hint reduced the error rate on the employer field by 63% and the occupation field by 58%.
While spending so much time trying to reduce error rates on our donate form, we noticed a latent effect. After users made a donation, they were taken to an upsell page that asked them to save their payment information. When we reduced the error rate on the donation form, we got up to a 7% increase in conversions on the upsell page, without touching the upsell page at all. This is one of the data points that justified spending so much time improving the user experience of our forms.
The qualitative data that user testing provides made a big difference, both to our users' experience and to our conversion rates. We found that many of the assumptions we made about our donate forms were wrong, and in hindsight I'm glad we challenged them. Reducing user frustration with your products is a good thing to do for your users, but it can also help achieve business goals. User testing is worth every minute of your time.
Comments here: http://news.ycombinator.com/item?id=5133760