I had been managing this client's Google Search ad campaign for about a month before I implemented one small change that increased the conversion rate on her lead gen campaign by 93%.
We started with a simple two-step opt-in form. The landing page had a button with a 'Request Now' call to action. When clicked, it sent visitors to a simple opt-in form asking for their name, email, and preferred time of the week for an appointment.
All of the copy was written by the client, even the call-to-action buttons. While this did convert, it wasn't converting as well as we had hoped: 1.63% isn't very good.
Anyway, I was just about to start testing different copy and calls to action when I had an idea. I had heard good things about VideoAsk and had read a lot about how powerful micro-commitments can be for conversion rate optimization. So I figured, what the hell, let's give it a shot!
I gave the client a rough outline and questions to ask in the VideoAsk. She made the video recordings.
The VideoAsk started with a few softball questions: simple yes-or-no questions and multiple-choice questions. Micro-commitments. Only then did we ask for the real information: name, email, and phone number.
What was the result after a few weeks?
Old variation: 244 clicks and 4 form submissions, a conversion rate of 1.63%.
New VideoAsk variation: 285 clicks and 9 VideoAsk submissions, a conversion rate of 3.15%, which is a 93% relative increase in conversion rate!
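If you want to sanity-check the math, here's a quick sketch in Python using just the raw numbers above (nothing client-specific):

```python
# Conversion rates and relative lift from the raw numbers above
old_clicks, old_conversions = 244, 4
new_clicks, new_conversions = 285, 9

old_rate = old_conversions / old_clicks            # roughly 1.6%
new_rate = new_conversions / new_clicks            # roughly 3.2%
relative_lift = (new_rate - old_rate) / old_rate   # roughly 0.93, i.e. ~93%

print(f"Old: {old_rate:.2%}  New: {new_rate:.2%}  Lift: {relative_lift:+.0%}")
```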
It turns out we're a lot like the frog in the 'frog in boiling water' metaphor. If you hit us with the big asks right away, name, email, and phone number, we're more likely to 'jump out of the water'. We have to be eased into it with some low-commitment softball questions that give us a sense of self-discovery.
Once you're a few questions in, you feel more obligated to complete the form and hand over your name, email, and phone number.
Does this simple, short test have a large enough sample size to show statistically significant results?
The statisticians out there would definitely say no, but I never claimed this was a statistically significant A/B test, or an A/B test at all for that matter.
There were a lot of variables that were slightly different between the two versions, not just the VideoAsk form. I'm not Google, and neither are my clients. We don't typically have enough traffic to run statistically significant A/B tests, nor the budget to do so.
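For the curious, here's a rough sketch of what checking significance would look like in Python, using SciPy's chi-square test on the counts above. This isn't something we ran as part of the campaign; it just illustrates why a sample this small can't call the result significant:

```python
from scipy.stats import chi2_contingency

# Contingency table: [conversions, non-conversions] for each variation
table = [
    [4, 244 - 4],   # old opt-in form
    [9, 285 - 9],   # VideoAsk variation
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"p-value: {p_value:.3f}")  # comes out well above the usual 0.05 threshold
```

With counts this small, the lift could easily be noise, which is exactly the point: we weren't running a lab-grade experiment, just a directional test.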
So does that mean we shouldn't try different things just to see how they go? I think not. I recommend reading this great article on the topic: 'A/A Testing: How I increased conversions 300% by doing absolutely nothing'.
As always, if you need help increasing your website's conversion rate, feel free to drop us a line.
Sign up for our newsletter and get a Free Google Ads Audit Conducted Over Zoom by a Seasoned Google Ads Expert With 10+ Years of Experience.