Creative Testing at Scale: A Framework for Marketing Leaders

Consumer behavior and ad platform algorithms never stop evolving. What drove engagement and conversions last month might not move the needle today. To keep your campaigns effective and your strategies sharp, you’ll need more than intuition—you’ll also need a repeatable, data-driven approach to testing creative at scale. At Tuff, we specialize in growth marketing, and our proven framework for creative testing is designed to deliver insights that translate directly into revenue.

The Strategic Value of Creative Testing

For growth-minded marketing teams, creative testing isn’t a luxury; it’s a necessity. With millions of dollars at stake, every creative decision must be validated through data. Our method of testing empowers your team to identify what truly works and double down on strategies that yield the best return on investment. In an era where marketing budgets are scrutinized more than ever, creative testing ensures that your spend isn’t just well-allocated—it’s optimized.

By investing in a robust creative testing framework, marketing teams can achieve more efficient ad spend, improved campaign performance, and a deeper understanding of what drives their audience to action.

How to Build a Scalable Creative Testing Framework

We know your marketing ecosystem is complex. Managing diverse campaigns, handling significant budgets, and meeting the expectations of stakeholders demands a sophisticated approach. At Tuff, we’ve developed a streamlined testing framework that delivers clarity and impact. Here’s how you can put it into action.

Step 1: Conduct Research and Develop a Hypothesis 🔬

Your first step isn’t launching ads—it’s analyzing data. Dive deep into historical performance to identify patterns and opportunities. What type of imagery has historically driven the highest click-through rates? Does a specific message style or call-to-action consistently outperform others?

🙇 Example from Tuff: When we reviewed performance data for a B2B SaaS client, we found that ads featuring real employees, particularly the founder, had 35% higher engagement compared to more product-centric visuals. This led us to a hypothesis: “If we feature the founder prominently in our creative, we’ll see an uptick in click-through rates and conversions.” From there, we developed a series of ad variations to put this theory to the test.

The stronger your hypothesis, the more actionable your insights. At Tuff, we never base hypotheses on guesswork—only on concrete data.

Step 2: Define Crystal-Clear Success Metrics

Before you even think about running your test, lock down your metrics. Which KPIs will you use to determine whether a creative variation is a success or failure? For our campaigns, we often look at metrics like CTR, conversion rate, CPA, and return on ad spend (ROAS).

🙇 Example: In the founder-focused campaign, we zeroed in on CTR and volume of conversions as our North Star metrics. We set historical benchmarks to understand whether our creative was an incremental improvement or a game-changer.

Tip: Don’t get lost in the data. Focus on metrics that directly impact your bottom line. This keeps your creative analysis relevant and actionable.
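
To make that concrete, here’s a minimal sketch of how those metrics fall out of raw delivery numbers. The figures are invented for illustration, not client data:

```python
# Core creative-testing metrics from raw delivery numbers.
# All figures below are invented for illustration, not client data.
spend, impressions, clicks = 1_200.00, 90_000, 1_080
conversions, revenue = 45, 5_400.00

ctr = clicks / impressions   # click-through rate
cvr = conversions / clicks   # conversion rate
cpa = spend / conversions    # cost per acquisition
roas = revenue / spend       # return on ad spend

print(f"CTR {ctr:.2%} | CVR {cvr:.2%} | CPA ${cpa:.2f} | ROAS {roas:.1f}x")
```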

Step 3: Address Critical Questions Before Execution

A well-executed test starts with answering these foundational questions:

  1. How Much Budget Should You Allocate? For large-scale testing, dedicate 10-15% of your total monthly ad budget to experimentation. This budget gives you the bandwidth to test multiple creative concepts without sacrificing the performance of your primary campaigns.
  2. How Long Should the Test Run? The duration depends on the size of your audience and the budget. Large audiences paired with significant budgets can yield statistically significant data in 7-10 days. Niche audiences may require a full two weeks.
  3. Which Audience Should You Target? Choose a top-performing, upper-funnel audience for faster, more reliable data. This approach not only speeds up your learning but ensures you’re testing on an audience segment that matters.

Marketers need to move quickly but thoughtfully. Making informed decisions about budget and audience can make or break the success of your test.
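
If you want to sanity-check those timelines against your own account, the underlying math is a standard two-proportion power calculation. Here’s a back-of-the-envelope sketch (95% confidence, 80% power) that estimates how many impressions each variant needs and converts that into days using an assumed budget and CPM. Every input is a placeholder; with these particular numbers, it lands right in the 7-10 day window:

```python
# Back-of-the-envelope test-length estimate. All inputs are assumptions;
# plug in your own account's numbers.
from math import sqrt, ceil

baseline_ctr = 0.010        # assumed current CTR (1.0%)
lift = 0.20                 # smallest relative lift worth detecting (+20%)
variant_ctr = baseline_ctr * (1 + lift)

# Two-proportion power calculation: alpha = 0.05 (two-sided), power = 0.80
z_alpha, z_beta = 1.96, 0.84
p_bar = (baseline_ctr + variant_ctr) / 2
n_per_variant = ceil(
    (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
     + z_beta * sqrt(baseline_ctr * (1 - baseline_ctr)
                     + variant_ctr * (1 - variant_ctr))) ** 2
    / (variant_ctr - baseline_ctr) ** 2
)

# Convert impressions into days via an assumed test budget and CPM
daily_test_budget = 150.0   # e.g. ~12% of a $1,250/day account (assumption)
cpm = 12.0                  # assumed cost per 1,000 impressions
daily_imps_per_variant = daily_test_budget / cpm * 1000 / 2  # split two ways

print(f"{n_per_variant:,} impressions per variant "
      f"≈ {n_per_variant / daily_imps_per_variant:.0f} days")
```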

Step 4: Build a Well-Structured Campaign for Reliable Data

At scale, even small inefficiencies can lead to big problems. Here’s how to set up your campaign for maximum clarity:

  1. Create a Dedicated Campaign: House all creative variations within one campaign to control variables and simplify your reporting.
  2. Configure Ad Sets Thoughtfully: Each ad set should represent one variation. Distribute your budget evenly to ensure a fair test. This prevents any single creative from dominating the budget and skewing results.
  3. Leverage Advanced A/B Testing Tools: Meta’s A/B testing feature is a game-changer. It divides your audience evenly and prevents overlap, ensuring each creative variation gets equal exposure. The cleaner your data, the more confident you can be in your conclusions.

Clean, unbiased data leads to clear, actionable insights. Our structured approach ensures that you’re not just gathering data—you’re gathering the right data.
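
Written out as plain data, the structure looks like this. It’s a sketch of the shape of the test, not a real Meta API payload, and the names, budget, and variation list are all placeholders:

```python
# One dedicated campaign, one ad set per creative variation, budget
# split evenly. A sketch of the structure, not a Meta API payload.
TEST_BUDGET = 1_500.00  # total test budget in dollars (placeholder)
variations = ["founder_hook", "product_shot", "customer_quote"]  # placeholders

campaign = {
    "name": "creative-test-2024-q4",
    "ad_sets": [
        {
            "name": f"variant-{v}",
            "creative": v,
            "budget": round(TEST_BUDGET / len(variations), 2),  # fair split
            "audience": "top-performing prospecting",  # same for every variant
        }
        for v in variations
    ],
}
```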

Step 5: Analyze the Results with Precision

Once your test has run its course, it’s time to dig into the metrics. Did your winning creative truly outperform the others? Are the results statistically significant, or could they be due to chance?
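
If you want to put a hard number on “could this be chance,” a two-proportion z-test on clicks and impressions is the standard tool. Here’s a minimal sketch; the counts are hypothetical, chosen to mirror the kind of lift in the example that follows:

```python
# A minimal significance check. The click and impression counts below
# are hypothetical, invented for illustration.
from math import sqrt, erf

def two_proportion_z(clicks_a, imps_a, clicks_b, imps_b):
    """Two-sided z-test: is the CTR gap bigger than chance would produce?"""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)  # pooled CTR under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal tail
    return z, p_value

# Hypothetical counts: winning creative vs. the next best variant
z, p = two_proportion_z(615, 50_000, 500, 50_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 → unlikely to be luck
```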

🙇 Example Results: In the founder-focused campaign, we observed a 23% lift in CTR and an 18% increase in conversions compared to the next best variation. Compared to our historical benchmarks, this winning creative also delivered a 9% reduction in CPA. The data was clear: Founder-led visuals resonate deeply with our audience and drive more profitable results.

Don’t just celebrate your wins. Document why they worked and consider how to scale these learnings across your entire marketing operation.

Step 6: Document Findings and Make Optimization Ongoing

Your testing doesn’t end once you have a winner. To keep improving, treat each test as a learning opportunity. Document what worked, what didn’t, and why. Share these insights across teams to build a knowledge base that can inform future campaigns.

Make optimization a habit. At Tuff, we regularly revisit our creative strategy, using new data to inform our next round of testing. This cycle of continuous improvement keeps our clients ahead of the curve.

Navigating Challenges in Creative Testing

Even with a solid plan, testing at scale comes with hurdles. Here’s how to address common obstacles:

  1. Stakeholder Alignment: Big campaigns often involve multiple stakeholders, so it’s important to align on what we want to learn before testing begins. Are we validating messaging, testing new creative formats, or exploring audience behavior? Clear goals at the start help everyone understand why strategies and briefs are set up a certain way. Use simple reporting tools to keep communication transparent and show how the data connects to those goals.
  2. Data Fatigue: Analyzing vast amounts of data can be overwhelming. Simplify your focus to a few key metrics that matter most to your business objectives.
  3. Audience Overlap and Fatigue: Avoid overexposing the same audience to multiple variations. Rotate your creative regularly to keep content fresh and engaging.

Tuff’s Approach: We use segmentation strategies and rotate creative frequently to avoid fatigue while keeping messaging aligned with campaign goals.

Scaling Insights Across the Org

Once you’ve nailed down what works, it’s time to scale. Apply successful creative elements to different campaigns, regions, or even departments. Share your wins and learnings broadly to maximize their impact.

🙇 Scaling Example: We worked with a global retail brand to expand a winning social media campaign into email and paid search. By maintaining consistent messaging and visuals, the brand experienced a 15% lift in overall campaign performance.

Elevate Your Creative Testing for Growth

We believe that strategic, data-driven creative testing is the key to unlocking your marketing team’s full potential. Our framework is designed to maximize efficiency, deliver insights that drive growth, and make sure every dollar of your ad budget is working for you. If you’re ready to transform how you approach creative, let’s talk. Tuff is here to help you scale smarter and faster.

Book a call with us!

How to Set Up an A/B Test so You Get REAL Results


When analyzing ad creative, it can be hard to tell which element really makes a piece resonate with the viewer. Is it the bold color? The little bit of sass? The scroll-stopping typography? Or is it how you articulate the value you’re offering? The audience you’re addressing?

There are so many variables that it can feel overwhelming. That’s when it’s time to start A/B testing!

At Tuff, methodical A/B testing has allowed us both to make small yet powerful tweaks to well-performing ads and to generate powerful business-level insights. Most importantly, it helps us drive up click-through rates and drive down costs. This article will take you through our process, from initial creative development all the way to the implementation and analysis of A/B tests, so you can supercharge your ad creative too.

A/B Testing 101

An A/B test is when you run two versions of the same ad that differ in just one variable. You can then compare performance to see which version resonated more effectively with your audience and gain insights that will help your overall campaign.

Below is an example where we were A/B testing different hero images to see if people would be more likely to engage with language about “virtual photoshoots” or language that targets a more sophisticated creative demographic that might be looking for “photo assets.”

Here are the results: 

  • A Variant (photoshoot company)
    • CTR = 0.36%
    • Purchases = 20 
  • B Variant (photo assets) 
    • CTR = 0.24%
    • Purchases = 1 
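
Before declaring a winner on counts this small (a single purchase on the B side!), it’s worth a quick significance check. Here’s a small Bayesian sketch that simulates purchase rates from each variant’s results and counts how often A beats B. The impression counts are assumptions, since only CTRs and purchase totals are shown above:

```python
# With 20 vs. 1 purchases, a z-test gets shaky at such small counts.
# A quick Bayesian read instead: draw purchase rates from each variant's
# Beta posterior and count how often A beats B. Impression counts are
# assumptions; only CTRs and purchase totals are given above.
import random

IMPS_A, BUYS_A = 40_000, 20  # "photoshoot" variant (impressions assumed)
IMPS_B, BUYS_B = 40_000, 1   # "photo assets" variant (impressions assumed)

TRIALS = 100_000
wins = 0
for _ in range(TRIALS):
    # Beta(successes + 1, failures + 1): posterior under a flat prior
    rate_a = random.betavariate(BUYS_A + 1, IMPS_A - BUYS_A + 1)
    rate_b = random.betavariate(BUYS_B + 1, IMPS_B - BUYS_B + 1)
    wins += rate_a > rate_b

print(f"P(A beats B on purchase rate) ≈ {wins / TRIALS:.1%}")
```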

[Image: the A and B hero image variants side by side]

Notice that every variable is the same, from the logo color and placement to the lay-down imagery to the highlight color. While it would have been possible to truly isolate a single variable in this ad (like changing JUST one word), there was strategic decision-making behind our approach. So, before you put pen to paper, it’s important to ask yourself…

Where to Start?

A/B testing for A/B testing’s sake is just about as useless as hanging your clothes out to dry in a rainstorm. Without some forethought and a strategic outline, you’ll be left with results and a resounding “so what?”

So, at the outset of creative production, we’ll get our full team together. This usually consists of a growth marketer (the big-picture strategist), a campaign manager (the person deep in the day-to-day, on-platform optimizations), and a creative strategist (the person on the hook for developing creative that’s on-trend, on-brand, and platform-specific). We’ll ask ourselves four major questions:

1. Why do we want to run this test?

An A/B test that originates from a genuinely impactful question is far more effective. This can be as simple as: “is our creative more compelling when it features people or just products?” It can be as complex as: “should we position ourselves as a disruptor in the market or a powerful alternative to other stale options?”

2. How do we measure success?

It’s important to outline your KPIs for an A/B test before you develop your ad creative. 

If click-through rate (CTR) is the primary metric that determines success, then you’ll want to seriously consider your hook, your headline, and potentially your CTA as the elements that move the needle.

If you’re more focused on a metric like cost per action (CPA), then you’ll want to consider your holistic message and how well the information, look, and feel of the ad match the landing page experience.

Finally, if you’re considering a metric that tells you more about general awareness, like cost per 1,000 impressions (CPM) or watch rate, then you’ll need to talk about which elements of the creative will be responsible for driving those metrics and home in there.

3. What do we expect to learn from it?

Remember 8th grade science class? Then you’ll remember how important a hypothesis is. It’s critical to have a clear understanding of what you hope to learn, or what you expect to discover by running your A/B test so that when it concludes, you’re one step closer to driving optimal performance with your ad creative. This is closely related to point #1 above, but takes it one step further. Here’s what that looks like using the two examples.

“is our creative more compelling when it features people or just products?”

  • Hypothesis: the presence of people demonstrating our product in use is a crucial part of generating interest in it and, ultimately, sales. So, we expect to learn that educating viewers about HOW our product works is an important step in the user journey.

“should we position ourselves as a disruptor in the market or a powerful alternative to other stale options?”

  • Hypothesis: people are curious about us as disruptors, but are more efficiently compelled to sign up/request a demo when they understand how to compare us to one of our competitors or a “poor alternative.”

4. How are we going to go about it?

This is where the real cross-team fun begins! Before you begin your brief and define how to build your creative to get the findings you’re after, it’s critical to collaborate with your campaign managers to understand their approach to creating an appropriate environment within the right channel. Is this a test for Meta, LinkedIn, or YouTube? What’s the right audience? How about budget and duration? Aligning on these points will set you up for success. 

Developing Creative for Your A/B Test

One of the most popular elements to start A/B testing is the hook of a video. A hook refers to the first two seconds of a social video ad, because that’s the window in which most viewers either get engaged or keep scrolling. It’s basically the 2022 equivalent of the age-old newspaper term “above the fold.”

A/B testing can be done on a fresh ad idea (like the still images example in the previous section), but we’ve also found success using A/B testing to revamp a struggling ad that we believe still has potential. This is useful when we know a video ad has elements and messaging that have worked in other iterations, but for some reason it’s failing to draw an audience. That’s a great opportunity to develop an A/B test around a new hook. If we can land a successful hook, viewers will stick around for the valuable meat of the message.

An example of this was for our product photography partner, soona. On our first round of creative, we were optimistic about a testimonial concept that ended up underperforming. The ad had a lot going for it: the testimonial was sincere, the edit was fun, and the messaging was similar to ads we’d seen succeed in the past. So instead of scrapping the idea, we created an A/B test where we produced a B variant with a new hook.

We hypothesized that the ad struggled because the viewer didn’t learn what soona was quickly enough. So, for the B variant, we added a 2-second frame with copy explaining what soona is (a virtual product photography company), along with photo examples, before jumping into the testimonial.

Then, we isolated these two videos in a fresh campaign with a modest defined test budget to ensure that the performance of the original wasn’t impacted by an audience that had already seen it.

Here’s our original (on the right) and our B variant (on the left). The two videos were exactly the same except for the fresh 2 second hook in variant B. Headline and text copy remained exactly the same between the two:

[Image: side-by-side stills of the B variant hook and the original hook]

Ultimately, the results were surprising:

  • The original video (with Rob’s face as the hook) showed a significantly stronger click-through rate:
    • CTR = 0.25%
  • The B variant:
    • CTR = 0.11%

So, we proved our hypothesis wrong! And in the spirit of science, we didn’t ball our test up and throw it out; we dug in deeper to try to uncover what it was about this asset that lagged behind. Ultimately, we ran several additional tests and discovered that short, quippy, brightly-colored stills tend to generate our strongest click-through rates, reaching upwards of 0.8%.

Top Tips for Structuring Your Test Campaign

Let’s get into the nuts and bolts of how to set up your campaign structure in order to walk away with definitive findings like the ones we generated above.  

The first thing you’ll want to do when planning how to execute your A/B test is put together a strategic plan that outlines the following:

  1. How you’ll test the two variants against each other 
  2. Where the test will take place
  3. How long you’ll be running the test
  4. KPIs for success
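
Writing the plan down as a tiny record keeps everyone honest about what’s locked in before launch. Here’s a sketch of the four points above, filled in with the values from the test described below:

```python
# The four points above as a tiny test-plan record, so nothing stays
# implicit. Values are the ones this post lands on below.
test_plan = {
    "variable": "hero copy: 'virtual photoshoots' vs. 'photo assets'",
    "channel": "Facebook, using the built-in A/B testing tool",
    "duration": "10 days, or $500 spend per audience",
    "kpis": {"primary": "CTR",
             "secondary": ["purchases", "cost per purchase"]},
}
```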

There are two ways to implement an A/B test on paid social: using Facebook’s A/B testing tool, or isolating the variables and measuring the data manually. 

To use the A/B testing tool on Facebook, you’ll want to create two new ad sets using the same audience (one ad set with variable A and one ad set with variable B). From there, you’ll click the A/B test button in the main navigation, which will take you to a setup page that looks like this: 

[Image: Facebook’s A/B test setup screen]

For the A/B test above with the two different text graphics, we wanted to make sure we were isolating the two assets and testing them against the same audience on Facebook so that we could get concrete data on which graphic performed better. Since we know that KPIs vary for different audiences based on where they fall in the user funnel, we typically like to execute the A/B test in one prospecting audience and one retargeting audience. 

From there, we decided that we’d run the test for 10 days or until each audience reached $500 in spend, which would give us enough data for a statistically meaningful read on which variant was the winner. 

Our main KPI for this A/B test was CTR. We also layered on number of purchases and cost per purchase as secondary KPIs. 
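
That stopping rule is simple enough to express in a few lines. Here’s a minimal sketch, with the 10-day and $500-per-audience thresholds from our plan as defaults:

```python
# The stopping rule above as code: stop once the test has run 10 days,
# or once every audience has spent $500, whichever comes first.
def should_stop(days_running: int, spend_per_audience: list[float],
                max_days: int = 10, spend_cap: float = 500.0) -> bool:
    """True once the test has earned a read on the data."""
    return (days_running >= max_days
            or all(spend >= spend_cap for spend in spend_per_audience))

print(should_stop(7, [512.30, 468.10]))   # False: one audience under $500
print(should_stop(10, [350.00, 290.00]))  # True: hit the 10-day limit
```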

Think Like a (Creative) Scientist

Ultimately, to get REAL results from your A/B test, you’ve gotta keep an open, unbiased mind and lean in with curiosity. If your test doesn’t generate the results you were hoping for, change up the creative and test again! When you’re after insights greater than “which color button performs better,” you’re reckoning with a number of different variables. So, creating a new hypothesis and diving back into another test is the best way to keep inching closer to creative assets that unlock amazing performance. 

Ready to dive into an impactful A/B test with us? Let’s talk. 
