
Data-Driven Split Testing


THE FASTLANE TO BETTER ROI
































Most of us didn’t get into business to analyze order form conversion rates or webinar attendance. Whether you coach others, develop software for passionate entrepreneurs, or do something else entirely, only you know your product like the back of your hand.

Unfortunately, these days being great at something isn’t enough to make your voice heard. Sometimes hard marketing data is needed to encourage others to participate in your vision.


One of the many strengths of entrepreneurs worldwide is the ability to start something new, experiment, and press on while learning lessons from any failures along the way. Sadly, these lessons can be painful and costly.


To combat this painful learning curve, many have taken up the process of split testing. Through tests that they create, they can easily see what type of messaging their crowd prefers. Best of all, it’s included at no extra fee in most modern platforms like ONTRAPORT.





Where to Begin



Marketing campaigns do much more than bring in new leads. They provide you with data — the key to unlocking and optimizing future campaigns.


In this guide, we’re going to show you how to leverage analytical comparisons to fine tune your marketing and messaging. Say goodbye to guessing and hello to split testing.


So… what should you be split testing? Is A/B testing the same as multivariate testing? What are the most important elements to optimize? We’ll cover the answers to these questions and much more in this guide. It’s time to optimize, accelerate, and get back to your passion.


Let’s dive into the basics.



Multivariate vs. A/B Testing

What are they and which is right for me?





At its most basic level, split testing is comparing different items of the same type to see which performs better. Let’s get familiar with the two specific types of split testing and which suits you best. Each method has its advantages, and it’s up to you to decide!



Multivariate Testing

A multivariate split test is where you test a variety of components on a page and every possible combination of those components. As you can imagine, running this type of test requires many different versions of a page.


Due to the high number of tests taking place, you’ll need a large amount of traffic to come to a definitive conclusion. You’ll also need an appropriate budget for the time it takes to build all the page variations. Multivariate testing isn’t for everyone and can become too convoluted to be worthwhile for a smaller operation.















A/B Testing

An A/B split test compares only one component of a page, email, or advertisement. For example, to see how the subject line affects your open rates, you would change only the subject line while leaving the rest of the email unchanged. This type of testing provides scientific data that’s easy to interpret and act on.




Since this method is so simple, you can execute an efficient test without breaking the bank or your schedule. Marketers frequently run A/B/C/D tests, which work like an A/B test but with four different versions of the one component being tested. I highly suggest A/B testing for simple, affordable, and concise results.
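The mechanics of an A/B (or A/B/C/D) split are simple enough to sketch in a few lines. Here’s a minimal Python illustration of evenly and randomly assigning a recipient list to variants; the function and variable names are hypothetical, not part of any particular platform:

```python
import random

def split_into_variants(recipients, variants=("A", "B")):
    """Shuffle the list, then deal recipients round-robin so each
    variant gets an even, random share of the audience."""
    shuffled = list(recipients)
    random.shuffle(shuffled)
    groups = {v: [] for v in variants}
    for i, recipient in enumerate(shuffled):
        groups[variants[i % len(variants)]].append(recipient)
    return groups

# An A/B/C/D test is the same idea with four variants:
groups = split_into_variants(range(1000), variants=("A", "B", "C", "D"))
```

Platforms like ONTRAPORT handle this assignment for you; the point is only that each variant should see a comparable, randomly chosen slice of your audience.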











Advertisements

Almost all business owners have run ads via Facebook or Google at one point or another, and there’s immense value in this targeted method. When advertising through these platforms, you’ll want to make sure your campaigns are optimized to bring back as much data as possible. This will help guide your strategy moving forward and make sure your campaigns consistently hit the mark.


There are many different platforms that allow you to run and split test ads: Bing, Twitter, Facebook, Google, LinkedIn, etc. This guide isn’t a deep dive into each channel, but know this: let each split test run for at least 10 days to make sure your data is reliable and not just a fluke.



Ad Image



Images are arguably the most important component of your advertisements. You may have the best call to action and description on the web, but if your image doesn’t demand attention, the battle is already lost. When running a test on your ad images, make sure you’re using the exact same text and calls to action. This will ensure your test is scientific, and your results are accurate.





It’s been shown that images with human faces generally perform better. A good split test to run is comparing a picture of a man’s face to a woman’s face. If you can’t come up with a relevant image that involves a human, try comparing color. These are just a couple of ideas to get you started, but the possibilities are truly endless!




Ad Copy



Depending on the type of ad you’re running, you might come across character restrictions (website conversion ads have a 90-character maximum). If you’re running promoted posts, you won’t have to worry about ad length. As a general rule, shorter promoted posts tend to perform better, but this isn’t true for all businesses and niches. We suggest testing ad copy only after you’ve found a clear winner for your ad image.




Target Audience


Testing an audience takes a bit more effort. The particular audience you’re sending your ads to is going to affect everything from your click-through rate to your social shares. You’ll want to make sure you’re properly tracking your ad with UTM variables so you’ll know which target audience converts best. Once your ads are finished running, take the total number of opt-ins and divide it by the number of clicks on your ads. This will give you the conversion rate for that specific target, revealing which target gels better with your offer.
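That opt-ins-divided-by-clicks calculation looks like this in practice. Below is a short Python sketch; the audience names and figures are made up for illustration:

```python
def conversion_rate(opt_ins, clicks):
    """Conversion rate = opt-ins divided by ad clicks, as described above."""
    return opt_ins / clicks if clicks else 0.0

# Hypothetical UTM-tagged results for two target audiences
results = {
    "lookalike_audience": {"clicks": 400, "opt_ins": 48},
    "interest_audience": {"clicks": 350, "opt_ins": 63},
}

rates = {name: conversion_rate(r["opt_ins"], r["clicks"])
         for name, r in results.items()}
best = max(rates, key=rates.get)  # the target that gels best with your offer
```

Here the lookalike audience converts at 12% and the interest audience at 18%, so the interest audience wins this round.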


Mobile vs. Desktop

Does your specific offer and image convert better on different mediums? With roughly 50% of traffic coming from mobile devices, it’s important to know how your ads appear and perform on mobile. See if your ad performs better on mobile vs. desktop by creating a simple head-to-head split test in your ad provider of choice. To create the test, run one ad set on mobile exclusively while running the other ad set to only desktop users.













Landing Pages


It’s essential to optimize your landing pages for conversion. All of the traffic in the world can’t help a squeeze page that doesn’t convert.

While there are some best practices that work well as guidelines, it’s important to remember that just because a red submit button worked better for one person doesn’t mean that it’s going to work best for your page or offer. Always test and make sure that your landing pages are mobile-responsive by visiting them on your phone or tablet!



Form Location





Where is your form located on your landing page? Placement plays a large role in your opt-in rates. As a general rule, you want to keep your landing pages fairly plain — this helps direct a visitor’s attention to the form.





If the plain approach isn’t jibing with your crowd, add a countdown timer tied to your offer’s expiration date, or maybe a few brief testimonials. If after several tests you’re still not happy with your conversion rates, try directing a different segment of traffic to your page. A low conversion rate on a landing page is generally attributable to a poor page, asking for too much info (or sometimes not enough, depending on the offer), or low quality traffic that isn’t a fit for your offer.


Call to Action



What call to action are you using? How is it being delivered? Many marketers prefer the video call to action — users hit a landing page and are greeted by a brief video that provides a list of benefits and then asks the users to sign up. Another popular choice is a single sentence call to action. Test what works well with your crowd, and implement the winner!











Email


Email has long been hailed as the king of ROI in the marketing world. It’s inexpensive to deploy and gets you right in front of your audience. That being said, if you haven’t been split testing, you might be leaving money on the table.


We’ve all read the many, many posts out there about email best practices, and while they often contain useful insights, they offer little in the way of personalized advice. Just because another company’s open rate went up 30% after moving their send times to early Tuesday mornings doesn’t mean yours will do the same.


Many best practices articles don’t take into consideration the unique thumbprint that most lists and businesses have. Lists are going to respond differently based on a number of factors, including culture, geographic region, age, sex, occupation, and interests. Because of this, it’s important to take all advice you find on the Internet with a grain of salt and split test the results using your own business. For an in-depth analysis of email deliverability, check out this guide.


Now let’s get down to what to test in your email campaigns. Note that you should only test one of these items at a time (A/B test). Combining two or more tests will result in skewed data that can’t be evenly compared. Lights, camera, action!





Subject Line



For many audiences, short and sweet subjects (no more than three words) drive more opens. Meanwhile, others may prefer an in-depth description of what’s inside before opening. Test the punctuation and verbiage of your subject lines. Does your audience perk up when they see a subject ending with an exclamation mark? Maybe they prefer vulgar copy? A/B/C/D testing works great for discovering what your audience likes to see in subject lines.


Day/Time of Send



It’s important to test what time of day gets the best open rates. Different people check their inboxes at different times of the day. For example, a stay-at-home mom likely checks her email at a different time than a full-time lawyer. Think about your target customers and what time they would most likely check their email. Schedule an email to go out to half of your list at that time, then choose another time (maybe even another day) to send the same email to the other half of your list. Continue testing this until you find the sweet spot.


HTML vs Plain Text



Does your crowd prefer the straight-to-the-point style of plain text emails? Or do they prefer to look at something pretty and stylized such as an ONTRAmail template? Use the same content, subject line, and calls to action to see which type of email gathers the most clicks and opens, and use that moving forward.





Calls To Action



Opens are great, but clicks drive purchases (no one has ever purchased by simply opening an email, though it’s the first step). Split test your calls to action by using unique language in each version. Once you home in on copy that works, try driving traffic to different destinations and see what converts best.











Final Thoughts

There’s no end to the list of components you can split test, but we’ve given you a few major items to jump-start your split testing program. By testing and letting data drive your marketing decisions, you can ensure you’re getting the most bang for your buck.


A disclaimer: There will always be experts and specialists out there who will tell you what works best and what doesn’t. Many of these experts have access to thousands of different tests and view results at the macro level. Just remember that your list and your business are unlike anyone else’s, so listen to their advice, but confirm it with your own tests.


If you’ve been testing all of these items and nothing seems to work, ask your current customers for insight. Ask them what content they prefer and where they like to see it. Ask your brand advocates to take a survey filled with detailed engagement questions. The answers provided will help you optimize advertising and marketing efforts.



“You don’t want to make conclusions based on a small sample size. A good ballpark is to aim for at least 100 conversions per variation before looking at statistical confidence. If you have a lot of traffic, go for at least 250 conversions per variation. It’ll be more accurate if it’s 350-400 conversions per variation.”
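Once you have enough conversions per variation, you can check statistical confidence yourself. Below is a minimal two-proportion z-test in Python using only the standard library; this is a standard statistical technique, not something specific to any platform, and the sample numbers are illustrative:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.
    Returns the z statistic and the p-value; p below 0.05 is the usual
    threshold for declaring a winner."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return z, p_value

# Variation A: 120 conversions from 1,000 visitors (12%)
# Variation B: 150 conversions from 1,000 visitors (15%)
z, p = two_proportion_z_test(120, 1000, 150, 1000)
```

With these numbers the difference is only just significant (p right around 0.05), which is exactly why the quote above recommends waiting for hundreds of conversions per variation before trusting a result.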







About ONTRAPORT

We’re a software company that gives entrepreneurs and small businesses the online tools they need to grow their businesses instead of getting mired in the day-to-day. Since our product launched in 2008, our mission has been to support entrepreneurs in delivering their value to the world by removing the burden of technology. ONTRAPORT is an incredibly powerful all-in-one tool that fully automates your small business.


ONTRAPORT Founder and CEO Landon Ray created ONTRAPORT while running another small business. He wanted to run his business on one platform, and realized there was no such product out there! Landon decided that if he couldn’t find it, he’d build it. ONTRAPORT was created soon after.



Today, ONTRAPORT supports thousands of entrepreneurs across the world. We’ve been on Inc.’s 500/5000 list three years running, named twice as one of Forbes’ 100 Most Promising Companies, named as the SIIA Software CODiE Award Finalist for Best Relationship Management Solution and Best Marketing Automation Solution, and that’s not all.


Landon’s dream came true… and yours can, too! Connect with us on Facebook, Twitter, LinkedIn or Instagram and tell us what your dream looks like.




Don’t keep this info to yourself. Share it with your friends and colleagues!


Data-Driven Split Testing

You invested a lot more than just time into your marketing: you spent a small fortune on those ads, emails and landing pages. When you don't see a tangible return on those efforts, you feel like you're throwing money down the drain. Your marketing efforts are supposed to bring in revenue, but right now all they're doing is costing you money! Nothing is more disappointing.

Finding out what's working and what's not working in your marketing doesn't have to be complicated. Many small businesses have found huge success online with a simple technique that saves them time and money by showing them exactly what their audience responds to. How do they do it? Split testing. Split testing helps small businesses stop spending money promoting ineffective messages and saves the time they'd otherwise spend guessing what their audience would respond to.

  • Author: ONTRAPORT
  • Published: 2015-09-15 01:20:17
  • Words: 2362