Today’s post is by Terry Whalen, Managing Director at Sum Digital, a San Francisco digital marketing agency. Terry has managed millions in PPC spend for consumer and B2B advertisers since 2006.

AdWords Campaign Experiments (ACE) is split-testing functionality that Google introduced in June 2010. It was all the rage that summer, but it was difficult to use and the feature set was thin. Although ACE still carries the “beta” label, it has improved quite a bit since it was first introduced.

What ACE Doesn’t Do

You cannot experiment with campaign-level settings using ACE. This is a huge disappointment. Campaign-level experiments would mean we could test things like bidding strategies, bidding technologies, various campaign structures, geo-targeting, and day-part strategies.  Were ACE implemented in a more useful way, we could split-test a control campaign against an experiment campaign, which would allow users to test specific campaign-level settings or many at once. We could even pit two PPC managers against each other. As of June 2012 you can’t do this with ACE, which is an epic fail.

What ACE Does

ACE does allow for some other interesting types of testing, such as ad group structure tests, ad copy tests, and bid tests. For example, you could test how a percentage change in bids affects conversions and marginal cost per conversion.
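
To make the marginal cost-per-conversion idea concrete, here’s a quick illustration with made-up numbers (Python is used purely for the arithmetic; nothing below comes from AdWords itself):

```python
# Hypothetical control vs. experiment totals -- not real account data.
control = {"cost": 5000.00, "conversions": 100}       # control split
experiment = {"cost": 6200.00, "conversions": 118}    # experiment split (e.g., bids +20%)

avg_cpa_control = control["cost"] / control["conversions"]           # $50.00
avg_cpa_experiment = experiment["cost"] / experiment["conversions"]  # ~$52.54

# Marginal cost per conversion: the extra spend divided by the extra conversions
# the bid change produced -- the number that matters when deciding whether the
# higher bids are actually worth it.
marginal_cpa = (experiment["cost"] - control["cost"]) / (
    experiment["conversions"] - control["conversions"]
)
print(f"Control CPA:    ${avg_cpa_control:.2f}")
print(f"Experiment CPA: ${avg_cpa_experiment:.2f}")
print(f"Marginal CPA:   ${marginal_cpa:.2f}")         # $66.67
```

In this made-up case the experiment’s average CPA barely moves, but the incremental conversions cost $66.67 apiece.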

Here’s how to do it. The easy part – go into campaign settings, scroll down to the bottom of the page, click ‘Experiment (beta),’ and then ‘specify experiment settings.’ Name it and click ‘save.’ You don’t need a start and end date, since you’ll hit the ‘start’ button when you’re ready and the ‘stop’ button once you’ve gotten the data you need. And the default 50/50 split between control and experiment is perfect for most scenarios.

Once you click ‘save,’ you will see little experiment beaker icons – as long as you have navigated within a campaign where an experiment has been saved.  The default beaker icons indicate ‘control and experiment’ data. (More on that later.)

You will also see an additional ‘experiment’ option in the ‘segments’ dropdown menu in the ad groups, keywords and ads tabs, but again, only when you have navigated within a campaign where an experiment has been saved.

When you click the experiment option in the segment dropdown, you will see new row breakouts as follows: ‘outside experiment’ indicates data that was recorded while no experiment was running. (This means it’s not super-important to remember the date you started an experiment, since AdWords isolates that data as outside-of-experiment data.) ‘Control’ means you are looking at control data, and the ‘experiment’ row is just what it says. Importantly, remember that the ‘control’ row contains *all* control data: data from elements set as ‘control-only’ as well as data from elements labeled ‘control and experiment.’ Similarly, the ‘experiment’ row includes data from ‘experiment-only’ elements as well as from elements labeled ‘control and experiment.’ This means that if you are experimenting with only a subset of your campaign, your results may be “watered down” when viewed at a rolled-up level such as the campaign level.
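
Here’s a toy illustration of that watering-down effect (hypothetical numbers, and not how AdWords actually stores or reports the data). If only one of three keywords has an experiment change applied, the campaign-level rollup blends changed and unchanged traffic:

```python
# Toy data: only 'kw C' has an experiment bid change; the other keywords serve
# identically in both splits, so they show up in both the control and experiment rollups.
rows = [
    # (keyword, changed_in_experiment, control_conversions, experiment_conversions)
    ("kw A", False, 40, 40),
    ("kw B", False, 30, 30),
    ("kw C", True,  20, 28),   # the only keyword with a bid multiplier
]

control_total = sum(c for _, _, c, _ in rows)        # 90
experiment_total = sum(e for _, _, _, e in rows)     # 98

keyword_lift = (28 - 20) / 20                                         # +40% on kw C
campaign_lift = (experiment_total - control_total) / control_total    # roughly +9%

print(f"Lift on the changed keyword: {keyword_lift:.0%}")
print(f"Lift at the campaign level:  {campaign_lift:.1%}")
```

A 40% improvement on the one changed keyword shows up as roughly a 9% lift at the campaign level, which is why it pays to segment down to the elements you actually changed.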

Now that you have saved an experiment, you are ready to do the slightly harder part: set up your experiment test elements. To set your experiment variables, you need to remember to use the ‘experiment’ option in the segment drop-down – otherwise, your control/experiment rows will not be visible to you. Let’s say you are testing a 20% keyword-level bid increase and its effect on campaign performance. Pull the account down into AdWords Editor or get recent changes so the account is up-to-date. Now add the ‘max CPC multiplier’ column and set your max CPC (bid) multiplier. Note – the maximum allowed number of experiment elements per campaign is 1,000, so you may have to prioritize which keywords you choose to modify as experiments. (I have never discovered a way around the 1,000-element limit.) Upload your experiment changes.
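
If you have more candidate keywords than the 1,000-element limit allows, the prioritization itself is easy to script before you set the multipliers in AdWords Editor. Here’s a rough Python sketch; the CSV file and column names are hypothetical stand-ins for whatever keyword export you’re working from, not an official AdWords Editor format:

```python
import csv

MULTIPLIER = 1.2      # a 20% bid increase for the experiment split
MAX_ELEMENTS = 1000   # ACE's per-campaign cap on experiment elements

# Hypothetical keyword export with (at least) 'keyword' and 'cost' columns.
with open("keywords.csv", newline="") as f:
    keywords = list(csv.DictReader(f))

# Spend the 1,000 experiment slots on the keywords that matter most (here, by cost).
keywords.sort(key=lambda row: float(row["cost"]), reverse=True)
for row in keywords:
    row["experiment_max_cpc_multiplier"] = ""
for row in keywords[:MAX_ELEMENTS]:
    row["experiment_max_cpc_multiplier"] = MULTIPLIER

with open("keywords_with_multipliers.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(keywords[0].keys()))
    writer.writeheader()
    writer.writerows(keywords)
```

From there you still set the actual max CPC multipliers in AdWords Editor as described above; the script just decides which keywords get one.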

Once you have your experiment bid multipliers set, you are ready to click ‘start running experiment’ at the bottom of the campaign settings area for the campaign in question. Now you are split-testing traffic to 1,000 or fewer keywords in each campaign where you have an experiment running. As data accumulates, AdWords elegantly indicates whether there is a statistically significant difference between the control and experiment splits. Arrow icons that are grayed out mean there is no statistically significant difference for the data set you are viewing. Icons in dark blue signify a statistically significant increase in the metric, while icons in light blue indicate a statistically significant decrease. One arrow signifies 95% confidence; two arrows mean 99%; and three arrows indicate 99.9%. But in order to see the data, you must remember to click ‘experiment’ in the segments dropdown! The experiment break-out can be viewed at the keyword, ad, ad group or campaign level.
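
AdWords does this math for you, but if you want to sanity-check the arrows, or test a metric that isn’t flagged, the underlying idea is a simple two-proportion test. Here’s a rough Python sketch on conversion rate with made-up numbers; I’m assuming, not asserting, that Google’s indicators are based on something similar:

```python
from math import erf, sqrt

def conversion_rate_z_test(clicks_a, convs_a, clicks_b, convs_b):
    """Two-proportion z-test on conversion rate; returns (z, two-sided p-value)."""
    rate_a, rate_b = convs_a / clicks_a, convs_b / clicks_b
    pooled = (convs_a + convs_b) / (clicks_a + clicks_b)
    se = sqrt(pooled * (1 - pooled) * (1 / clicks_a + 1 / clicks_b))
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_value

# Made-up control vs. experiment data.
z, p = conversion_rate_z_test(clicks_a=4800, convs_a=96, clicks_b=5100, convs_b=145)

# Rough mapping to the ACE arrow icons.
if p < 0.001:
    arrows = 3   # 99.9% confidence
elif p < 0.01:
    arrows = 2   # 99% confidence
elif p < 0.05:
    arrows = 1   # 95% confidence
else:
    arrows = 0   # grayed out: no significant difference
print(f"z = {z:.2f}, p = {p:.4f}, arrows = {arrows}")
```

With these numbers the experiment’s conversion rate is higher at roughly 99% confidence, which would show as two arrows in the interface.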

Statistical significance indicators apply to whatever level of data you are looking at. As an example, at the campaign level there may be no statistical significance in your experiment data, while at the ad group or keyword level you do have significantly different results between control and experiment. You may still need to export to Excel to do your analysis, especially if you’d like to look at experiment data in a more granular fashion. For example, we once found that increasing bids by 20% for a certain client account caused a significant increase in conversions at a marginal cost per conversion only slightly above our average cost per conversion; but this was the case only for keywords with an average ad position of 2.5 or better, i.e., higher on the page (we excluded brand traffic, of course).
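
If Excel isn’t your thing, the same slicing is easy in Python with pandas. The file and column names below (‘segment,’ ‘avg_position,’ and so on) are hypothetical placeholders for whatever your exported keyword report actually contains; adjust them to match your export:

```python
import pandas as pd

# Hypothetical export of a keyword report segmented by experiment.
df = pd.read_csv("experiment_keyword_report.csv")

# Exclude brand traffic and keep keywords that averaged position 2.5 or better
# (i.e., higher on the page -- lower position numbers are better).
df = df[~df["keyword"].str.contains("brandname", case=False)]
df = df[df["avg_position"] <= 2.5]

summary = df.groupby("segment")[["cost", "conversions"]].sum()
summary["cpa"] = summary["cost"] / summary["conversions"]

extra_cost = summary.loc["experiment", "cost"] - summary.loc["control", "cost"]
extra_convs = summary.loc["experiment", "conversions"] - summary.loc["control", "conversions"]
print(summary)
print(f"Marginal CPA for this slice: ${extra_cost / extra_convs:.2f}")
```

The point is the slicing, not the tool: the keywords that can absorb a bid increase profitably are often hiding inside an average that says otherwise.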

Experiment Wrap-up

Once you feel that you have enough data (or once AdWords indicates that you do), you are ready to turn off the ACE functionality. Navigate to campaign settings and click ‘stop running experiment.’ Next, AdWords forces you to either ‘apply: launch changes fully’ or ‘delete: remove changes.’ If you have reasons to keep both control and experiment elements running, you are out of luck.


But there is a simple fix – go back into AdWords Editor and change your ‘control only’ and ‘experiment only’ splits to ‘control and experiment.’ Now you are free to click ‘launch’ or ‘delete’ and truly end your experiment while making no changes to the account. Google does specify which elements will be deleted upon clicking ‘launch’ or ‘delete,’ and if you reset everything back to ‘control and experiment,’ you should see zeros in the ‘the following experiment changes will be deleted’ notification.

One final note – once you end an experiment you will no longer be able to view experiment splits, so make sure to save the data in Excel if you think you may want to look at it again. Happy testing!

Terry Whalen, CPC Search

