Filed under PPCChat.

Julie F Bacchini hosted this week’s PPCChat session. It was an interesting session where PPCers shared their views on regular testing in their accounts, the kinds of tests they are running in search and social ad accounts, the experts’ favorite tests to run on any platform, and more.

Q1: Are you performing regular testing in your accounts? Has your ability to run tests changed in the last 12 months? If so, why do you think that is?

With RSAs being forced on us, ad testing has, IMO, become a bit of a nonsense these days. @stevegibsonppc

I’m still testing ad copy and LPs, and am hesitant to apply RSAs in my accounts. For accounts where we still bid manually we test audiences, demos, etc. Currently I’m trying the budget bid strategy in SA360. @Anna_Sorok

Yes and no. We have a lot of interest and enthusiasm for testing, but short campaigns + automated optimizations make for very muddy data. Lots of cross-channel creative/audience testing. @JuliaVyse

Testing has gone through a hiccup this last year due to Covid. Had to change messaging and create new ads. Ran these against existing creative. Covid creative now winding down. @Realicity

Yes, I am always testing something! That being said, I have found it more difficult to test things as automation has been “encouraged”. For example, ad testing is nearly impossible in G Ads and has been for a while. @NeptuneMoon

Still testing ad copy regularly. When you throw in RSA though it all starts to go to crap. Even when an RSA “wins” you have no idea why because the reporting is basically nonexistent. @robert_brady

I test the following regularly (as in one major test per quarter): 1. Creative: ads and landing pages 2. Keyword champions: picking a different variant or idea entirely. 3. Bidding strategies: if the client objective has changed, will factor that in. @navahf

Yes, extensively, but none are earth-shattering; it is mainly just these small one-offs that I’ve wanted to do for a long time. @JonKagan

Yes but with less visibility into what actually works (RSAs) and less control. Finding ourselves having to trust the black boxes more…not always easy. @AndrewCMiller

Yes! With the shifts in automation and the lack of visibility in search terms, our testing has changed. @snaptechmktg

Always Be Testing! Our accounts are typically lower volume or niche, so tests run for longer. The constant shifting and the lack of visibility in search terms has affected us. Also, “rotate indefinitely” not being a thing anymore – not that it worked anyway. @amaliaefowler

Yes, I am now testing frequently. Google has clearly signaled its move toward automation, so we have to test keyword match types, bid strategies, locations and more… @cmshah93

Yes, always testing something. No, we may not be able to run the exact same tests we used to or even necessarily measure it the exact same way but we have found ways to keep going. @selley2134

Lack of search term info stunts some tests (I’m less likely to use DSA as a starting campaign type and I’m more prone to use “proven” keyword concepts). I will be interested to see if @Hoffman8‘s theory around display placements comes to pass: will limit tests. @navahf

Kind of. Google is forcing a lot of automation on us so it’s been a little hard especially with testing ads because of RSAs. @ameetkhabra

Yes! Different automated bidding strategies, multi match type ad groups, different competitor strategies, ad types, campaign types (DSA)… the list goes on. Ad variations have been slightly more difficult to test, but otherwise no major issues. @sonika_chandra

Q2: What kinds of tests are you running in search ad accounts (Google or Microsoft Ads)?

Audience testing is major for us. On a generic keyword, like ‘rebate’ where many types are offered, which audience is most likely to generate a lead? On a generic keyword like ‘burger’ which audience is most likely to head to a drive thru? @JuliaVyse

Ad copy and bid strategy tests are the two we run the most often. Although we often have to do it before/after due to a lack of volume for simultaneous testing. @snaptechmktg

Usually ad copy, landing pages & bid strategy! Although the latter not as often. Given the frequency of changes my motto with bid strategy is if it ain’t broke, don’t test it. @amaliaefowler

(Google Ads) Device-specific campaigns, manual vs. automated bidding strategies with the same keywords and targeting, etc… @cmshah93

And in Microsoft, it’s all about audiences vs LinkedIn targeting. I love using both, but in separate ad groups so I don’t narrow the pool. @JuliaVyse

Bidding strategies, ad copy, landing pages, demographics/audiences, etc. The new one I have been adding is different keyword tests, seeing how different match types work across campaigns/accounts. @selley2134

Lately – bidding strategies. Wow can they tank performance, or even ads being served… Also doing more with audiences and RSAs (responsive search ads). @NeptuneMoon

We mainly test bid strategies, try to do ad copy as best as we can, and landing pages! @ameetkhabra

Microsoft has been putting out new things like crazy that I would love to start testing. Just waiting on the right client/budget. @selley2134

Mainly ad copy and landing pages. But I’m also layering audiences. @stevegibsonppc

Begrudgingly… smart campaigns. @JonKagan

Standard text and CTA tests for sure, but also experiments for landing page URLs for brand terms. Trying to answer “which pages convert best for [brand + modifier]?” Also testing LPs vs on-site pages for each. @AndrewCMiller

(will divide by network): @GoogleAds tests: 1. Keyword champion tests: picking different variants/match types. 2. Ad schedule/geotargets: helping budgets find their best chance at ROAS. 3. Display targets: campaign for audience vs topic to see which performs better. @navahf

Ad copy, bid strategies (specifically bidding down more on manual bidding), and keyword match types. @anna_arrow

4. Call only ads vs local service ads: some markets still outperform the LSA format & I don’t know until I test. 5. Ad vs extension copy: testing creative in ads vs extensions to see if it’s needed to hook in folks. 6. Campaign structure tests. @navahf

@MSAdvertising 1. Running campaigns that leverage ad group level schedules/location targets. 2. Layering in industry audiences as bid adjustments 3. DSA: especially excited about the static headline. I do a lot of the same tests as on the google side too. @navahf

Landing page tests, ad schedule tests to align with linear TV ads (capturing searches motivated by our commercials), and bid strategy tests (Google Ads/SA360). @Anna_Sorok

Lol oops, answered this already. Also plugging incrementality testing and, most importantly, PREPARING for future incrementality tests. Or as I like to say, “focusing on optimizing the crap out of it.” @nataliebarreda

Q3: Are there tests that you’d like to be running (or used to run) that have become more difficult or impossible with increased automation in Google Ads? How has your testing changed in the last year?

Ad copy testing has become quite frustrating now – which someone did mention earlier. Especially with RSAs. How does anyone efficiently know what’s working in those cases? @TheMarketingAnu

Nothing has become impossible but our parameters and definition of success have had to change. Often we’ve had to scrap results or ensure they’re not random due to a Google shift or external event. @snaptechmktg

I wouldn’t say we’ve had anything become impossible; with our low volume accounts, testing was always going to be more difficult. I will say it’s definitely been tough having to stop some / throw them out due to external circumstances. @amaliaefowler

I miss the days of testing ETAs without RSAs getting ALL of the impressions. @selley2134

Only match type-related changes have affected my tests; otherwise, my ability to test with automation has increased significantly… @cmshah93

Per my earlier answer, ad variant testing is virtually impossible to do. You have to pause the “winner” to even have another ad serve more than a tiny % of the time. Keyword testing is harder too – G will match pretty broadly to all types. @NeptuneMoon

Automation hasn’t really stopped particular testing for me. Honestly the big problem is clients and partners who throw a few creatives into one audience and see which one the algorithm picks and call that ‘testing’. It’s attitude more than tools. @JuliaVyse

I shared this before – in one account we wanted to test turning off keywords of a particular theme. Did so, and G Ads kept on serving queries for those terms against other keywords. No negative option to force it either. @NeptuneMoon

It just comes back to ad copy testing. It’s becoming harder these days which is super annoying. @ameetkhabra

This is going to sound trite, but not really. If pressed, my keyword champion tests are a tad tougher without search term data, but overall, I’ve shifted my management strategies to work within the new “restrictions”. @navahf

I love re-running non-search media absence and search media absence tests on overarching site performance. Helps reprove theories and show larger impact. @JonKagan

I am aggressively on team “Adopt Automation” so in my POV, if it’s been automated I can focus somewhere else that actually matters. Granted, I realize that comes with the privilege of managing a large account. @nataliebarreda

Also, we are running a lot of display ad creative tests. It’s becoming more difficult with Google’s push toward RDAs (responsive display ads). Google Ads just isn’t providing meaningful data to run tests through RDAs. @Anna_Sorok

Q4: What kinds of tests are you running in social ad accounts (such as Facebook, Twitter, LinkedIn, Snapchat, Quora, etc.)?

Social accounts are all about audience audience audience. Some light placement targeting at certain points, but it’s really about what audience is reached effectively within social channels. @JuliaVyse

On the social networks testing for me generally boils down to creative and audience. So many clients are still completely unprepared for the sheer volume of creative that is needed to run ads on social platforms… @NeptuneMoon

I tend to have a much shorter “lifespan” for tests, campaigns, and ad sets due to the nature of the platforms. With that said, here are my tests: 1. Creative: testing stock vs “on brand” creative 2. Lead Ads: quality is suspect, but they can work. @navahf

When it comes to social, it is unofficial/unsanctioned tests. Such as, can we scrape enough negative comments to turn them into fun ads. @JonKagan

We aren’t! Adjusting to Facebook shifts with iOS right now, and then we’ll be looking at how to test again. Brief pause. @snaptechmktg

Mainly creative + audiences. @ameetkhabra

We run mostly FB / IN outside of Google Ads and will often test creative. Right now with the iOS changes and shifts, we’re taking a pause on that testing. @amaliaefowler

3. Placements: which subset of a channel does better 4. Audiences: checking quality of broad audiences against customer list ones (shout out to @justunosocial for helping folks build those lists). @navahf

I am really interested in running paid influencer engagements on FB/IG. Anyone tried those? @360vardi

Q5: Are there tests that you’d like to be running on social platforms? Why aren’t you running them? Have they become more difficult or are they just not possible or is it something else?

Of course! Often lower budgets or lack of assets prevent us from running the scale of tests we’d like. But that’s okay – we do what we can with what we have. @snaptechmktg

Yes and no. I really like custom audience vs interest audience testing. My public sector clients can’t always do that for privacy reasons. @JuliaVyse

I’d love to be able to do more creative tests, but the logistical barrier (time to create, time to get approved, time to run) can be a challenge. One day all clients will magically believe in all our great work and auto-approve everything ^_^ @navahf

Lack of assets for social testing is HUGE and is even problematic for many of our enterprise clients. @beyondthepaid

At this point, due to brand safety, tests on social platforms are a lot like sticking your hand in a running snow blower to remove some blocked ice. On that specific note, here is what I did back in December. @JonKagan

More creative tests would be fun but getting the creative made and approved can be tough. @ameetkhabra

Q6: What is your absolute favorite test to run (any platform)? Why is it your favorite? Has platform automation changed your answer here?

Ad copy testing is still my favorite. The insights are so fascinating, and often come quickly. This has always been my favorite thing and it’s made way easier with @Adalysis @beyondthepaid

Landing Page Tests. Love seeing little (and Big) changes that make a real impact on Conversions. @Realicity

I love a good bid strategy test! @amaliaefowler

Usually boring old Google search, get the results/data the fastest. @JonKagan

Creative is the most fun to test – either images or ad copy! We get to do more with ad copy but are always surprised at the insights a fresh creative test can bring. @snaptechmktg

Post click testing is often overlooked, as we are so focused on getting the right clickers to then convert. But the landing page is SO important to this process. I love testing LP variations. This is one aspect that automation has not yet touched! @NeptuneMoon

It’s a tie between creative and keyword champions, but since the keyword is fading into obsolescence, I’ll give the nod to creative. I factor the following into creative tests: 1. Trends data (how people search/think/speak) 2. Demographics 3. Buyer personas. @navahf

Mine has always been ad copy. Messaging is a strong indicator from the consumer as to what’s important to them. Which is why losing the ability to do it as well is all the more frustrating. @TheMarketingAnu

I really like keyword vs DSA vs RLSA. just give me all the formats!!!! @JuliaVyse

Automation hasn’t really changed the answer – marketing innovations and changes in human behavior have. @navahf

Anybody else notice that ads are often displayed with 2 headlines vs 3? This kind of messes with a test. @amaliaefowler

That I can control, LP tests because they can have a huge impact. That I can’t control but influences my $$ A LOT would be incrementality tests. Not a ton to do during the test but if the results are in your favor it’s ULTIMATE BRAGGING RIGHTS. @nataliebarreda

Most tests against automation – it makes it feel like I have a real foe. @selley2134

I’m not sure if I know anymore. I loved ad copy testing but, as mentioned before, Google is making that harder. @ameetkhabra

Q7: What, if any, types of tests do you either not like or think are a waste of time? Why? Has platform automation changed your answer here?

People are about to come for me for this one: ad copy tests. Again, I think this comes from a place of privilege, but generally, as long as your copy is decent, I strongly believe there are much better places to focus your time. @nataliebarreda

I get frustrated by testing for the sake of testing. If you’re trying to figure out one thing, awesome! Testing all the things without setting up a testing framework is a waste of time and resources. Also, tests that go on too long due to volume issues are a waste as well. @navahf

Ad copy that’s too granular. This adjective vs that adjective. Keep it simple, brand vs offer. Move on. @JuliaVyse

The “let’s not run brand ads and see if organic catches everything” test. I have to run that one a couple times a year. Spoiler: it doesn’t. Run brand. @selley2134

Getting into the minor details in an account that is on an automated strategy is a waste. As @PatrickJGilbert has taught me in his delightful book (which I recommend), the algorithm needs room to breathe. Micromanaging schedule & device doesn’t allow that. @amaliaefowler

I will echo @navahf here too – make sure you have a good reason and plan for your testing. Don’t test to just check a box. Know why you’re doing it. @NeptuneMoon

Make sure there is a plan, a hypothesis, and a way to measure. Not having those makes the test a waste of time quickly, regardless of what you are testing. @snaptechmktg
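That advice – a plan, a hypothesis, and a way to measure – applies even to the simplest ad copy test. As a minimal illustrative sketch (not something any participant shared, and with purely hypothetical click/impression numbers), here is a two-proportion z-test in Python for checking whether two ad variants’ click-through rates really differ, using only the standard library:

```python
# Sketch: two-proportion z-test for comparing two ad variants' CTRs.
# All numbers below are hypothetical placeholders.
from math import sqrt, erfc

def two_proportion_z(clicks_a, imps_a, clicks_b, imps_b):
    """Return (z, two-sided p-value) for H0: CTR_A == CTR_B."""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    # Pooled proportion under the null hypothesis
    p = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(p * (1 - p) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = erfc(abs(z) / sqrt(2))
    return z, p_value

# Hypothetical example: variant A got 120 clicks on 4,000 impressions,
# variant B got 90 clicks on 4,100 impressions.
z, p = two_proportion_z(clicks_a=120, imps_a=4000, clicks_b=90, imps_b=4100)
print(f"z = {z:.2f}, p = {p:.3f}")  # declare a winner only if p < your alpha
```

The point is not the statistics itself but the discipline: deciding the metric, the variants, and the significance threshold before the test starts is what separates a real test from “throw creatives in and see what the algorithm picks.”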

