We also added a GA custom event for the number of errors that occurred.
Lastly, we added an event to measure how long it took to receive a response from the server after the form was submitted, grouped into 100ms buckets.
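A minimal sketch of how timings like this can be bucketed before being sent to GA (the event category, action, and function names here are illustrative, not the campaign's actual code):

```javascript
// Bucket a raw duration into 100ms groups, e.g. 347 -> "300-399ms",
// so GA groups timings into a manageable number of labels.
function bucketResponseTime(ms) {
  var lower = Math.floor(ms / 100) * 100;
  return lower + '-' + (lower + 99) + 'ms';
}

// Hypothetical wiring: time the submit round-trip and report the bucket.
function reportSubmitTiming(startMs, endMs) {
  var label = bucketResponseTime(endMs - startMs);
  // With classic GA this would be something like:
  // _gaq.push(['_trackEvent', 'DonateForm', 'ResponseTime', label]);
  return label;
}
```

Bucketing keeps the label cardinality low, which makes the GA event reports readable at a glance.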
Here is a screenshot of some of the events from our Google Analytics account:
We left these reporting functions in production for the rest of the campaign and found some interesting problems.
The first thing that we noticed was that our error rates were much higher than we thought they were and much higher than we were comfortable with.
The second thing we noticed was that a large percentage of users had an error on the radio button group for selecting the credit card type. Our first thought was that users either did not notice the credit card type radio buttons, or saw them and forgot to select one. We dug a little deeper with GA custom events and found that this was indeed the case: the vast majority of the errors were from users not selecting a credit card type at all.
Third, a lot of users had an error on the credit card number field. Our gut in this case was that it is difficult for users to enter a string of 15-16 digits correctly.
The fourth issue was that a high percentage of users had exactly two errors when they submitted the donate form. Coincidentally, the two fields with the next highest error rates (after credit card number) were employer and occupation. We didn't have any idea why this was happening.
Our first thought on the generally high error rate was that users may not know which fields are required. So we added some more GA events which reported whether a value was entered in each field at all. Sure enough, we saw that users were leaving fields blank, even though they were required. When we looked at our form we noticed that there was no indication of what was required. We created a variation which added an asterisk to all the required fields and a/b tested it.
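A sketch of the kind of check that can drive those per-field events (the field names and event wiring are hypothetical):

```javascript
// Return the names of required fields the user left blank, so each
// one can be reported as its own GA event before or on submit.
function findBlankFields(values, requiredFields) {
  return requiredFields.filter(function (name) {
    return !(values[name] || '').trim();
  });
}

// Hypothetical usage on submit:
// findBlankFields(formValues, ['amount', 'employer', 'occupation'])
//   .forEach(function (name) {
//     _gaq.push(['_trackEvent', 'DonateForm', 'BlankRequiredField', name]);
//   });
```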
By adding an asterisk to indicate required fields, we reduced the error rate by 9.8%.
To solve the credit card type selection problem we created a variation of the donate page that auto-selected the credit card type. This was possible because we only accepted 4 types of cards and each has a distinct number pattern: all MasterCards start with a 5, all Visas start with a 4, all Discover cards start with a 6, and all American Express cards start with a 3. We a/b tested this variation against the control. Below are the results.
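The detection itself can be a one-liner on the card number's first digit, using the standard leading digits for these four networks (3 for American Express, 4 for Visa, 5 for MasterCard, 6 for Discover); this sketch is illustrative, not the campaign's actual code:

```javascript
// Infer the card type from the first digit of the entered number.
// Returns null when the digit doesn't match an accepted card type.
function detectCardType(number) {
  switch (String(number).trim().charAt(0)) {
    case '3': return 'American Express';
    case '4': return 'Visa';
    case '5': return 'MasterCard';
    case '6': return 'Discover';
    default:  return null;
  }
}
```

Running this on each keystroke in the card number field lets the page select the matching radio button for the user.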
By automatically selecting the credit card type instead of asking a user to, we reduced the error rate by 13%.
For the third problem, we created an experiment to test our hypothesis that it is difficult for users to enter a string of 15-16 digits. We built a variation of the donate page that formatted the credit card numbers almost exactly how they appeared on a credit card. Below are the results.
By formatting the credit card number, thus making it easier for users to parse, we reduced the error rate on the field by 15%.
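One way to do this formatting is to group the digits the way they appear on the physical card: 4-4-4-4 for 16-digit numbers, 4-6-5 for 15-digit American Express numbers (a sketch under those assumptions, not the campaign's actual implementation):

```javascript
// Format a card number with the spacing printed on the physical card:
// 4-4-4-4 for most cards, 4-6-5 for 15-digit American Express numbers.
function formatCardNumber(input) {
  var digits = String(input).replace(/\D/g, '');
  var groups = digits.length === 15 ? [4, 6, 5] : [4, 4, 4, 4];
  var parts = [];
  var i = 0;
  for (var g = 0; g < groups.length && i < digits.length; g++) {
    parts.push(digits.slice(i, i + groups[g]));
    i += groups[g];
  }
  return parts.join(' ');
}
```

Reformatting the field's value as the user types makes it far easier to compare the entered number against the card in hand.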
Lastly, on the high occurrence of errors on the employer and occupation fields, we looked at data that people were entering into the fields on successful submission. Many of the responses were along the lines of “None of your business.” Campaigns are required by law to collect this information so while it may or may not have been our business, we had to collect it. At this point we figured that users just weren’t comfortable entering the information and that there wasn’t much we could do about it.
That all changed when we decided to do some user testing on our donate forms. We sat volunteers down at a computer and recorded them making donations using a Mac application called Silverback. (You can read much more in depth about this in my other blog post: User testing is surprisingly effective.) We found that retired people and students did not know what to enter into the fields.
We created a variation that included the hint, "If you are retired, enter retired in both fields," and a/b tested it against the control. The results are below.
By adding a helpful hint to the employer and occupation fields we were able to reduce the error rate by 63%.
As I’ve shown above, reducing user frustration can have a big effect on high value conversions. You might not know it, but frustration could be reducing conversions on your app, whether that conversion is usage, form submissions, time on site, bounce rate, repeat visits, or just about anything else. Quantifying frustration is relatively easy, and once you do, you can make big gains in reducing it with a/b testing.