Why Did Our Best Variation Fail?
We just completed a three-test project for www.checkineasy.com – an event planning & management platform. The tool allows event greeters to check in their guests very quickly using one or more mobile devices. The platform requires a free application download, a free account, and guest credits, which can be purchased later. When we set out to optimize the existing signup screen for more signups (of course), our initial reaction was that stylistically it was quite busy. We started asking ourselves whether simplifying the user interface and applying a handful of best practices could lift signup rates. It wasn’t long before our first variation launched and flopped. Here is what we did in variation B:
The Two Signup Pages That We Tested In Test 1 Of 3
1. Giving A Gift
The 50 Guest Credits were emphasized as something users would receive. (GoodUI #2).
2. Explaining The Pricing
The pricing and credit structure was explained in depth (replacing 3 existing benefits). (GoodUI #68).
3. Top Aligned Labels
Form labels were placed above the form fields.
4. Fewer Form Fields
The username field was removed (with the email address being passed on instead). (GoodUI #13).
5. Bigger Click Areas
The size of all buttons and form elements was increased. (GoodUI #38).
6. Keeping Focus
The standard footer with additional links was removed. (GoodUI #16).
7. Social Proof
Logos of existing customers were shown in the footer. (GoodUI #4).
So Why Did All Of These Great Ideas Fail? You Tell Us First!
Yes, variation B, which contained a number of our supposed “best practices”, resulted in a 21% drop in signups. It is not uncommon for what we think will perform well to have the opposite effect – that’s why we tested it. The key takeaway, of course, is that it’s important not to give up on the first try. And we didn’t: we designed two more tests to answer this question. But before we share them with you, we’d love to hear what you think the cause might have been. Please share in the comments section.
Update [Nov 5]: One More Variation To Provide Additional Clues
As many of the reader comments have already pointed out, we also started speculating that the control’s headline might have been more effective for various reasons. The 50 guest credit gift might have been irrelevant or even anxiety-inducing. Moreover, the control’s reference to a “trial” instead of an “account” might have carried more value. Finally, variation B might also have sent a conflicting message by offering a “FREE Account” and then showing an overwhelming amount of pricing-related messaging. We therefore created variation C, which reverted slightly back toward the control, and turned it into a new test. As the data started coming in (and we were acting on thin data in the interest of time), variation C performed better, at an insignificant -11% (ranging widely between -32% and +11%, with a p-value of 0.43). This was good news, as the needle started shifting closer to the control. It began confirming that some of the changes in variation B were in fact hurting, and we were beginning to identify why. We did not stop there. We ran two more variations, one of which finally generated a significant win.
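For readers curious where figures like “-11% with a p-value of 0.43” come from, they follow from a standard two-proportion comparison between the control and the variation. Here is a minimal sketch of that calculation in Python, using only the standard library. The visitor and signup counts below are hypothetical, chosen purely for illustration (the real counts are not published here):

```python
from math import sqrt, erf

def two_proportion_test(conv_a, n_a, conv_b, n_b):
    """Compare two signup rates with a two-sided z-test.

    Returns the relative lift of B over A and the p-value
    under the normal approximation.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both pages convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    lift = (p_b - p_a) / p_a
    return lift, p_value

# Hypothetical example: 1000 visitors per page, 100 vs 89 signups
lift, p = two_proportion_test(conv_a=100, n_a=1000, conv_b=89, n_b=1000)
print(f"relative lift: {lift:+.0%}, p-value: {p:.2f}")
```

With counts like these, the lift works out to about -11% with a p-value around 0.4 – in the same ballpark as our early variation C numbers, which is exactly why we treated them as thin data rather than a conclusion.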
Read The Complete Story In Issue #19 + With 2 More Variations
Be sure to grab the full story, where we show in detail how we managed to score a +15% signup rate with two more follow-up tests. The issue also includes: additional screens that were tested, the testing strategy used (Making Big Leaps), our reflections on the process, transparent data, and a blueprint template. Issue #19 now also comes with day snapshots, which show exactly what we did from one day to the next: