This is the first in a series of articles that will dive into improving conversion rates of your Formidable forms through optimization. To start the series, let’s get a basic understanding of what conversion rate optimization means. Then we can go on to explore ways to incorporate these practices. You'll learn how to A/B split test forms in Formidable. A/B testing, when done properly, is a continual process that over time yields higher and higher conversion rates.
Put it to the Test: Why optimize?
Web pages might have a variety of purposes, but forms almost always have the same one: to get filled out!
Whether or not this happens can depend on an incredibly diverse and overwhelming number of factors. Learning what those factors are, and then optimizing our forms based on what we learn, gives us a good chance at getting the results we’re seeking.
Of course, we do this already without even thinking about it. Many of Formidable’s form field options exist for the very purpose of optimizing our forms. How we lay out fields, write their labels and descriptions, and add other “clues” that help the user flow through our form with as little friction as possible: these are all examples of form optimization we don’t really think about as we go about building our forms.
Much of the time, whether or not a form is completed depends heavily on the messaging around it and throughout the page. However, those who specialize in Conversion Rate Optimization (CRO) will tell you that elements of the form itself can have an equally important impact on its conversion rate (CR).
Learning Through A/B Testing
To learn which factors are affecting your conversion rate, for better or worse, you need to test and compare variations in an organized way, then evaluate what you’ve learned and incorporate the appropriate changes.
The way to organize this testing is to compare the current version of your form with an alternate version, or what is called a variant. This is simply referred to as A/B testing, or split testing.
It is the process of making one change, or a specific set of changes, to the original version and measuring which form gets filled out more.
Once a winning version has been clearly identified, you then use this version as the original and create a new test that focuses on another specific change.
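To make “measuring which form gets filled out more” concrete, here is a minimal sketch in TypeScript. It is only an illustration, not a Formidable or analytics feature; the view and submission counts are made-up placeholders standing in for whatever numbers your analytics actually reports.

```ts
// Minimal sketch: comparing two form versions by conversion rate.
// The counts are placeholders; in practice they come from your analytics
// or from Formidable's entry counts for each form.

interface VariantStats {
  name: string;
  views: number;        // visitors who saw this version of the form
  submissions: number;  // visitors who completed it
}

function conversionRate(v: VariantStats): number {
  return v.views === 0 ? 0 : v.submissions / v.views;
}

function pickWinner(a: VariantStats, b: VariantStats): VariantStats {
  // The version with the higher conversion rate wins this round
  // and becomes the "original" for the next test.
  return conversionRate(a) >= conversionRate(b) ? a : b;
}

const original: VariantStats = { name: "A: 'Buy'", views: 1200, submissions: 96 };
const variant: VariantStats = { name: "B: 'Buy Now'", views: 1180, submissions: 118 };

console.log(pickWinner(original, variant).name); // "B: 'Buy Now'" (10% vs. 8%)
```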
Introducing the Hypothesis
To give our test a focus in order to clearly define a winner, we want to start with a hypothesis. The hypothesis is a statement that defines the specific goal, or desired outcome, of this test.
This could be in the form of a question:
Will conversions be higher if button text reads ‘Buy Now’ instead of just ‘Buy’?
Notice how the hypothesis addresses one specific change. It is very important to test only one change at a time. Sometimes that change could incorporate several elements, such as “Will conversions be higher with ‘Color scheme A’ or ‘Color scheme B’?”, because, while it involves many elements, it still tests only one overall aspect: the color scheme.
Some experts will occasionally go for what they call a “Big Win” by making sweeping changes to many elements at once. If the new version is the winner, they will then continue by refining future tests down to more specific changes, always seeking to improve CR.
What to A/B test
Let’s take a look at several of the common and not-so-common elements you could be testing in your forms.
While other people’s results can inspire what to test on your site, they should never be used as an excuse or “permission” to skip testing. The successes or failures discovered in tests conducted by others on their websites could reflect any number of unknown factors. For this reason, you must start from your own hypothesis and test it on your site with your visitors.
Submit Button
This is a biggie, of course, and probably the element you’ve seen the most anecdotal evidence about, one way or another.
“Never say Submit!” “Red works best,” “Flat instead of shadow,” “Add urgency,” “Blue works best!” “Benefit laced wording,” “Yellow works best!”...
Truth is, all or none of these could be important factors. Each depends heavily on your goals, products, audience and much more. Test them for sure, but do it in a methodical way. Based on what you know about your audience already, test each factor one at a time.
Also test adding icons to your buttons to give users important clues or to generate trust. A lock icon for submitting your email address or an icon indicating that clicking will open a new page could yield unexpected returns for minimal effort.
Privacy Copy
This is the little line you see on many email list signups saying something like “We hate spam…” or “We’ll never sell your email address…”, and you might be surprised by the results of some tests related to this text.
In recent tests by ContentVerve.com comparing no privacy statement with the phrase “100% privacy – we will never spam you” on a signup form, the version with the statement saw an 18% decrease in signups.
Now, before you run over to your site and start removing all of the privacy statements from your forms: in a later experiment they tested the phrase “We guarantee 100% privacy. Your information will not be shared” and saw nearly a 20% increase in signups vs. no statement!
The lesson here is that what matters isn’t simply whether a statement is included; it’s that the precise wording can have a significant impact on whether your forms get completed.
Total Number of Form Fields
Testing the total number of fields could be as simple as asking for only an email address vs. asking for a first name, last name, and email address in a simple signup form. Or it could be more complex, using conditional logic to hide and reveal fields based on previous answers, as in the sketch below.
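Formidable handles conditional logic in its form builder rather than in code, but if you are curious what the underlying idea looks like, here is a generic, hypothetical sketch in TypeScript; the field name and element ID are invented for illustration and are not part of Formidable.

```ts
// Illustrative only: reveal an extra field based on a previous answer,
// so most visitors see a shorter form. Field name and element ID are hypothetical.

const companyField = document.getElementById("company-name-row");

document.addEventListener("change", (event) => {
  const target = event.target as HTMLInputElement;
  if (target.name === "is_business" && companyField) {
    // Only ask for a company name when the visitor says they represent a business.
    companyField.style.display = target.value === "yes" ? "" : "none";
  }
});
```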
The amount of work required from the user is an important friction point affecting a user’s decision to complete your form. Reduce the friction = increase responses.
Of course, increasing friction could be of equal value, depending on the purpose of the form. If your form requires more work from a user, it could indicate that the users who do complete it are more interested in your offer.
If your form’s purpose is to generate new sales leads, you will want to test and perfect the balance between cooler, lower-friction leads and warmer, higher-friction leads.
How to A/B Split Test Forms in Formidable
To perform an A/B test, you need a way to randomly display the two different versions you are comparing.
Create two versions of your form on two separate pages. Then set up Google Analytics on your site and create an A/B test that rotates traffic evenly between the two pages. Then watch the results roll in.
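If you would rather handle the rotation yourself than rely on an analytics tool’s experiment feature, the split itself is straightforward. Below is an illustrative client-side sketch in TypeScript; the page URLs and cookie name are placeholders, and you still need to count views and submissions for each version in your analytics.

```ts
// Illustrative 50/50 split between two pages, each holding one version of the form.
// URLs and cookie name are placeholders.

const VARIANT_COOKIE = "form_ab_variant";

function getCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp("(?:^|; )" + name + "=([^;]*)"));
  return match ? decodeURIComponent(match[1]) : null;
}

function assignVariant(): "a" | "b" {
  const existing = getCookie(VARIANT_COOKIE);
  if (existing === "a" || existing === "b") {
    return existing; // returning visitors keep seeing the same version
  }
  const variant = Math.random() < 0.5 ? "a" : "b";
  // Remember the assignment for 30 days so the comparison stays clean.
  document.cookie = `${VARIANT_COOKIE}=${variant}; path=/; max-age=${60 * 60 * 24 * 30}`;
  return variant;
}

// Run on the page visitors land on; send the "b" group to the variant page.
if (assignVariant() === "b" && location.pathname === "/signup/") {
  location.replace("/signup-b/");
}
```

The cookie keeps each visitor on the same version across visits, so repeat views don’t blur the comparison between the two forms.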