Spotlight On: Creative Testing Best Practices for Q3 2019


User acquisition advertising is evolving rapidly. Every quarter for the last few years, either Facebook or Google has made significant changes to their platforms that automate more and more of user acquisition advertising. Because these changes are available to everyone, competition has increased, and any competitive advantage that third-party ad tech tools once gave is gone.

Creative is the one thing the machines have not automated or started to automate, which makes it a UA manager's last competitive advantage.

This makes every aspect of creative vital to success. 

Most Ads Fail

Creative excellence isn’t easy. In our experience, built on more than a billion dollars of user acquisition ad spend, usually only one out of 20 ads can beat the current top-performing ad.

The reality is, most ads fail. The chart below shows the results from about 600 different ads. Spend was distributed based on performance. Out of those 600 ads, only a handful were responsible for the lion’s share of results.

If you can’t test ads quickly and affordably, your campaign performance is going to be crippled. But testing alone isn’t enough. You also have to generate enough creative to fuel that testing machine. And, because creative fatigues so quickly, you need 20 new creative concepts every month or possibly even every week. 

How to Test UA Creative in Q3 2019 Quickly and Affordably 

If most ads fail, the best way to find a breakout ad is to test a lot of creative. The secret is a disciplined, methodical practice that keeps that testing affordable and fast.

We use “prototype” ads to keep testing fast and affordable. Prototype ads are run for short periods of time to determine whether they will perform. These ads may bend or even break brand guidelines, but this doesn’t matter too much because they’ll typically get only 25,000-50,000 impressions or fewer, and then the winners will be polished and used to scale a campaign.
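To make the economics concrete, here is a rough sketch of what a prototype test might cost at those impression caps. The $8 CPM is purely an illustrative assumption, not a benchmark from our campaigns:

```python
# Rough cost of a prototype ad test.
# The $8 CPM below is an assumed placeholder, not a real benchmark.

def prototype_test_cost(impressions: int, cpm: float) -> float:
    """Return the spend needed to serve `impressions` at a given CPM ($ per 1,000)."""
    return impressions / 1000 * cpm

# Testing 20 prototypes at the low end of the 25,000-50,000 impression range:
cost_per_ad = prototype_test_cost(25_000, cpm=8.00)   # assumed $8 CPM
print(f"${cost_per_ad:,.2f} per ad")                  # $200.00 per ad
print(f"${cost_per_ad * 20:,.2f} for 20 prototypes")  # $4,000.00 for 20 prototypes
```

Capping impressions caps the downside: even a full slate of failed prototypes stays within a modest, predictable budget.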


20% Concepts / 80% Variations

Prototype ads can also be used for variations. When we test variations, we take a winning ad and use it essentially as a template. Then we test dozens of slight variations of the ad to see if we can squeeze better results from it.

80% of what we test is a variation. This minimizes losses that happen when you test bold new concepts but still leaves room to test enough of those new concepts to keep ads fresh and to keep creative teams from getting stuck in a rut. 
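As a back-of-the-envelope sketch, the 80/20 rule above can be applied to a monthly testing slate; the slate size of 20 is an assumption for illustration:

```python
# Splitting a monthly testing slate into variations vs. new concepts,
# following the 80/20 rule described above. The slate size is illustrative.

def split_slate(total_ads: int, variation_share: float = 0.8) -> tuple[int, int]:
    """Return (variations, new_concepts) for a testing slate of `total_ads`."""
    variations = round(total_ads * variation_share)
    return variations, total_ads - variations

variations, concepts = split_slate(20)
print(variations, concepts)  # 16 4
```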


Creative Testing Best Practices

We’ve developed a creative testing methodology to find winners faster and more affordably than is traditionally possible. Our core methodology right now (it is always evolving) is as follows:

  • Creative Audit
  • Competitive Audit
  • Creative Strategy
  • Creative History of Winners and Losers
  • Winner Variation Testing
  • Concept Refresh
  • Asset Folders for Winning Ads

Here are the details for each of these best practices:

Creative Audit

First, we’ll dig deep into the prior performance of a campaign, focusing on the creative assets. Doing an audit like this helps avoid repeating the same tests and mistakes made before. 

Competitive Audit

Competitors’ ads are a bank vault of creative insights – if you know what to look for. Facebook’s new Ads Library is a great way to see which ads your competitors have been running. But it lacks conversion data, impressions, and interaction data. So, we also use tools like Social Ad Scout, Connect Explore, SocialPeta, AdSpy, and others to get that information. 


Creative Strategy

This is a plan based on the creative audit and the competitive audit to reduce failure across the team. It should be an evolving map for how to do testing and messaging. 

Creative History of Winners and Losers

This is a document of everything we’ve tested, why we tested it, and the results of each test, updated weekly. This minimizes repeated tests and mistakes and builds on what’s worked.
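One lightweight way to keep such a history is a simple structured log. The field names and entries below are our own hypothetical illustration, not a prescribed schema:

```python
# A minimal sketch of a creative test history log.
# Field names and entries are hypothetical illustrations.

test_log = [
    {"week": "2019-W27", "ad": "gameplay_v3",    "why": "show fail state first",
     "result": "CPI -12% vs. control", "winner": True},
    {"week": "2019-W28", "ad": "testimonial_v1", "why": "UGC-style opening",
     "result": "CPI +30% vs. control", "winner": False},
]

def winners(log):
    """Return tests that beat the control, so future tests can build on them."""
    return [entry for entry in log if entry["winner"]]

def already_tested(log, idea: str) -> bool:
    """Guard against repeating a test of the same idea."""
    return any(entry["why"] == idea for entry in log)

print([e["ad"] for e in winners(test_log)])          # ['gameplay_v3']
print(already_tested(test_log, "UGC-style opening")) # True
```

A shared spreadsheet works just as well; the point is that every test records the hypothesis and the outcome, so the team never re-runs a test it has already lost.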

Winner Variation Testing

Once we have a winning ad, we’ll test every element to figure out which elements or combinations are making the ad work. 
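Testing every element quickly turns into a combinatorial exercise. A small sketch of enumerating those combinations, with hypothetical element names and options:

```python
# Enumerating variations of a winning ad by combining element options.
# The element names and options here are hypothetical examples.
from itertools import product

elements = {
    "hook":  ["fail_state", "win_state"],
    "music": ["upbeat", "none"],
    "cta":   ["Play Now", "Download Free"],
}

variations = [dict(zip(elements, combo)) for combo in product(*elements.values())]
print(len(variations))  # 8
```

Even three elements with two options each yield eight combinations, which is why variation testing benefits from the cheap, impression-capped prototype approach described earlier.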

Concept Refresh

After we’ve done the winner variation testing, we know which elements matter most so we can keep the anchor elements that are driving results, and refresh everything else in the ad. 

Asset Folders for Winning Ads

Whenever you get a winner, all the files — the videos, the Sketch files, the Illustrator files, the Photoshop files, the music files, all of it — get dumped into a folder. When you create variations, use the elements from the winning ads folder only.


Creative development, testing, and strategy are still best done by human beings. The algorithms at Google and Facebook may be able to test creative elements, but they still can’t create those elements. They can’t do competitive analysis, either. And they can’t plan out a coherent creative strategy.

If you’re an acquisition manager, focus on expanding your skills in those areas. You don’t necessarily have to become a creative, but you do need to show creatives how to become data-driven, and you need to be able to distill and interpret data for them so they can deliver better creative.

Brian Bowman is the Founder & CEO. He has profitably managed over $1B in online advertising spend and product development for leading online brands including Disney, ABC, and Yahoo.