The open-endedness of display ads can make testing and optimization challenging. The question “Where do I begin?” has run through the minds of account managers and designers alike for many years.

At AdBasis, we analyzed display ad metadata from more than 60 eCommerce and SaaS companies, comparing the variation metadata associated with ad design against performance. The goal of this analysis was to surface five useful insights for creating or iterating on display campaigns. With that in mind, this article is designed to give you practical action items that can be implemented quickly and easily.

But first, the sample size. How many ads did we look at?


So yes, we looked at a lot of ads — 151,456 in total. As part of this evaluation, we extracted design characteristics from these ads and are now going to discuss the elements that had the biggest impact on Click Through Rate (CTR) performance.

Our Approach:

The data for this study was compiled by identifying each experiment in our database where a specified criterion (an identified ad design element) was being directly tested.

Once these experiments were identified, the percent change in CTR was calculated for each experiment individually. AdBasis deliberately analyzed each experiment in isolation; no experiment was weighted more heavily than another. As a result, we were able to control for differences in ad group targeting, volume, market, timing, and so on.
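The per-experiment averaging described above can be sketched in a few lines of Python. This is only an illustration under stated assumptions — the experiment records, field names, and numbers here are hypothetical, not AdBasis's actual schema or data:

```python
# Sketch of the unweighted, per-experiment analysis described above.
# Each experiment compares a control variation against a test variation
# that differs only in the design element under study.
# All field names and figures below are illustrative.

def ctr(clicks, impressions):
    """Click-through rate: clicks divided by impressions."""
    return clicks / impressions

def percent_change(experiment):
    """Percent change in CTR of the test variation vs. its own control."""
    control = ctr(*experiment["control"])
    test = ctr(*experiment["test"])
    return (test - control) / control * 100

# Hypothetical experiments: (clicks, impressions) per variation.
experiments = [
    {"control": (90, 100_000), "test": (105, 100_000)},   # smaller campaign
    {"control": (400, 500_000), "test": (430, 500_000)},  # larger campaign
]

# Every experiment counts equally, regardless of traffic volume, which is
# what controls for differences in targeting, market, and timing.
changes = [percent_change(e) for e in experiments]
average_lift = sum(changes) / len(changes)
```

Because each experiment is reduced to its own percent change before averaging, a high-volume campaign cannot drown out the signal from a smaller one.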


1. Humans Work Best

When selecting the background or main images for your display ads, images that contain people are the most reliable bet. Across the A/B tests we examined for all 60+ brands as a whole, ads containing images of people had a 12 percent higher CTR than those that did not. The takeaway: people like people!

2. Use Photos Rather Than Designs

If using images of people isn’t a viable option for your brand, photos of other nouns (places or things) are a great alternative. Ads with a plain color or rectangle as the background do not perform as well as ads that use photos.

We studied all of our A/B tests that compared photo backgrounds against “designed” background images. In these cases, the average CTR of ad variations that used photos as the background image was 9.9 percent higher.



3. Use the Words “Now” or “Get” in Your Call to Action

There is some debate among marketers about which calls to action work best and ultimately capture the attention of potential customers. In our data, the action words “Get” (7 percent) and “Now” (8 percent) both yielded significant improvements in CTR. Instilling a sense of urgency in the ad increases CTR; the data supports it.


4. Include Images of Your Products

Whether you are selling a piece of software or a physical consumer product, showing it off will help. We queried our database looking for experiments that used product screenshots or product images.

Within experiments that tested product screenshots or product images vs. other image types, the ad variations containing product images improved CTR by 18.2 percent.

5. Blue Is the Best Button Color

We also queried our database for ads containing rectangles that made up less than 33 percent of the canvas; we consider these “buttons.” We then queried the typical color of buttons created in our tool and posed the question, “Which button color works best?”

The answer is blue. When blue buttons were compared against other colors, blue yielded an average 6.19 percent increase in CTR. No other color showed a consistent lift.
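The button heuristic above — a rectangle counts as a button when it covers less than 33 percent of the canvas — can be sketched as a simple area check. This is a minimal illustration, not the actual AdBasis query; the function name, shape representation, and ad dimensions are all assumptions:

```python
# Sketch of the "button" heuristic described above: a rectangle element
# is classified as a button when it covers less than 33% of the canvas.
# The shape representation and dimensions here are illustrative.

def is_button(rect_w, rect_h, canvas_w, canvas_h, threshold=0.33):
    """Classify a rectangle as a button by its area relative to the canvas."""
    rect_area = rect_w * rect_h
    canvas_area = canvas_w * canvas_h
    return rect_area / canvas_area < threshold

# A 300x250 medium-rectangle ad with a 120x40 call-to-action button:
print(is_button(120, 40, 300, 250))   # 4,800 / 75,000 = 6.4% -> True
# A color block covering most of the canvas is a background, not a button:
print(is_button(300, 200, 300, 250))  # 60,000 / 75,000 = 80% -> False
```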


Action Plan

This was a very interesting study for us to perform; we hope it is equally valuable to our readers. While hunches can help shape your ad design, be sure to test them and make decisions based on data.

These insights will give you a nice “jumping off” point, but be sure to test these findings for your brand individually!



  1. Olena October 28th, 2015

    Jason, thank you for the article. Really interesting article and results, but is number of clicks correct? It seems like ctr was only 0.1% and each ad on avg had just 2 clicks.

  2. Jason Puckett October 28th, 2015

    Hey Olena- Great question! The sample size is correct; 0.1% is about average for GDN CTR (in our database). Great question about the clicks per variation… the reason this is lower than you might expect is that these figures come from ad tests, and a large percentage of tested ad variations do not yield high click numbers and are ultimately thrown away by the advertiser. These figures do not include samples from after a winning ad variation has been found, which lowers the per-variation clicks. Hope this helps!

  3. Olena October 28th, 2015

    Thanks for the reply! The CTR point is clear, but I still feel a bit confused about the clicks. Could you help out?
    Option 1: the 151,456 ads = “failed” ads (0–1 clicks) + “good” ones that had enough stats for significant results (but there were really few of those then?).
    Option 2: the 151,456 ads are only the “failed” ads (0–1 clicks) stopped by advertisers, so the patterns in the article are based on other ads not included in those 151K, but with numbers sufficient for significant testing.
    Did I understand correctly? If it’s option #2, why is the number of those ads not included?

  4. Jason Puckett October 28th, 2015

    Hey Olena:

    I am more than happy to share the full report with you! Please email me at

  5. Jake December 3rd, 2015

    Jason, great article. This was very informative and helpful and will definitely save me time as I build my business.


Jason Puckett
Jason Puckett is the CEO & Founder of AdBasis. Ad creative optimization is the name of Jason's game. AdBasis is an A/B and Multivariate testing platform for search, display, remarketing & mobile ads. Jason is a digital marketing strategist, ad optimization expert and conversion rate enthusiast.