I recently attended the Facebook webinar “Get Holiday Ready: How to Build and Test Creative for Holiday Success.” Don’t get too jolly, though; I’m not here to tell you about the holidays. I’m here to talk about one slide that sparked a test (I only quoted the webinar for journalistic integrity). The slide said:

Ad sets that have both static and video assets see a 17% lift in conversion volume compared to ad sets that have static assets alone.  

Facebook emphasized that running static and video together would perform better than running either format alone. They didn’t share the corresponding statistic: the percent lift in conversion volume compared to ad sets that have video assets alone. Perhaps they left it out because most people don’t run video alone, but I’m not most people, and I was curious. That one slide sparked the idea for a test.

Let me back up and give you some history. My client, a B2C brand in the higher education vertical, has been running Facebook campaigns to drive leads for online graduate programs. We have been hearing our engine reps emphasize video. “Make it move!” is the directive we’ve walked away with from every conference, summit, and webinar we’ve attended over the past year. Animating templates for each graduate program and building ads to Instagram Stories specifications became our key creative focus. We bought into video so much that we paused static assets once we had enough video to run multiple variations in each graduate program’s ad set. After pausing our static assets, performance started to slump: slightly fewer impressions, slightly higher cost per lead (CPL). Nothing drastic, and we were running tests with dynamic creative and campaign budget optimization (CBO) at the time. Could those be dragging down CPL? Maybe impressions dipped because our budgets were slightly thinner than they had been historically?

Then we attended the Facebook webinar. “Might it be worth a test? Could the static assets help bump up performance?” we asked ourselves. We reintroduced static assets for one graduate program and immediately saw lead volume improve and cost per lead decrease. Realizing we could scale efficiently by adding static assets back, we invested 56% more in advertising this program on Facebook. Even though we increased spend only 56%, lead volume increased 107%.

Had we kept investment flat while running both animated and static assets, and seen the same CPL, we would have driven 58 leads, a 33% lift in lead volume over video alone. To recreate the slide from Facebook:

Ad sets that have both static and video assets see a 33% lift in conversion volume compared to ad sets that have video assets alone.  
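
That 33% falls out of simple spend normalization: if lead volume scales roughly with spend at a constant CPL, you can back out the lift we would have seen at equal investment. Here’s a quick back-of-the-envelope sketch using only the percentages reported above:

```python
# Spend-normalized lift, using the percentages from the test above:
# spend increased 56% and lead volume increased 107% after static was reintroduced.
spend_lift = 0.56
lead_lift = 1.07

# Holding spend (and CPL) flat, leads would scale down proportionally with spend.
lift_at_equal_spend = (1 + lead_lift) / (1 + spend_lift) - 1

print(f"Lift at equal spend: {lift_at_equal_spend:.0%}")  # ~33%
```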

We are getting to the point where we need to go beyond the A/B test. Even if video outperforms static in a head-to-head A/B test, we can’t just pause all static. We need to ask questions about the whole rather than about its individual parts: how does an ad set perform when it contains both video and static creative, as opposed to one format alone?

This is a shift in digital marketers’ thinking. We used to A/B test everything. Now we are asked to trust that the algorithm will show the right asset to the right person at the right time, but we can still test. We just have to learn to ask new questions: how does the whole perform if I change x?
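
To make that concrete, here’s a minimal sketch of the two ways of looking at the same question. All the numbers below are hypothetical and purely illustrative; the point is the shape of the comparison, not the figures. At the ad level, video can “win” the A/B test, while at the ad-set level the mixed lineup still comes out ahead:

```python
# Hypothetical illustration (not real data) of ad-level vs. ad-set-level questions.

# Ad-level view: within a mixed ad set, video may beat static head to head on CPL.
ads = [
    {"format": "video",  "spend": 3000.00, "leads": 28},
    {"format": "static", "spend": 2000.00, "leads": 16},
]
for ad in ads:
    print(f"{ad['format']:<6} CPL: ${ad['spend'] / ad['leads']:.2f}")

# Ad-set-level view: compare the whole mixed ad set against a video-only ad set
# at the same total spend.
video_only_set = {"spend": 5000.00, "leads": 33}  # hypothetical video-only baseline
mixed_set = {
    "spend": sum(ad["spend"] for ad in ads),
    "leads": sum(ad["leads"] for ad in ads),
}

for name, ad_set in [("video only", video_only_set), ("video + static", mixed_set)]:
    cpl = ad_set["spend"] / ad_set["leads"]
    print(f"{name:<14} leads: {ad_set['leads']:>2}, CPL: ${cpl:.2f}")
```

In this made-up example, static loses the ad-level comparison, yet the ad set that contains it drives more leads at a lower CPL than the video-only ad set. That is the question worth testing.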