Here’s a challenge:
A high growth, lead generation startup reaches out to you with a Facebook ad problem.
- CPMs were 33% higher than they used to be
- CPC was nearly 99% more expensive than it used to be
- Leads were 131% more expensive, the highest they had ever been
- And the company wasn't nearly as profitable as it could be
All of that happened because Facebook ads became too expensive.
The Challenge
The client is already a savvy media buyer. Spending over $150,000 a month on Facebook advertising is not for the faint of heart.
Campaigns were profitable.
Numbers used to be mind blowing.
Everything was working.
Then it stopped.
So the client started doing what any media buyer would have done:
- They created new images
- They created new copy
- They tested segmentation
- They tested advertising angles
- They tested new optimization techniques
Some optimizations brought hope and things improved momentarily, but the gains didn't last long term.
Enter Amplifii: The Process
When jumping into any account, we start with a deep dive using our ad scorecard methodology.
The ad scorecard methodology includes 7 ingredients that craft a world-class advertising campaign:
1) Offer
2) Avatar
3) Funnel
4) Nurturing
5) Ad
6) Ad optimizations
7) Reporting
Most agencies jump straight into ad optimization; we take a holistic approach, auditing, analyzing, and making decisions based on the ad scorecard methodology.
Unfortunately, there is limited information we can share due to confidentiality, but here goes.
Here's where there WEREN'T any problems, so we didn't focus on these elements:
1) Offer: the offer was crafted to perfection. It targets a group of job seekers and is designed to help them find jobs. The value proposition, speed of finding employment, backend filtering systems and execution were all aligned. Check.
3) Funnel: the funnel simply converts. The sales process is partly online and partly offline. Both parts work and are proven. Check.
4) Nurturing: retargeting campaigns, email follow ups, phone interviews and a lot more is in place. Don’t touch it if it’s not the problem. Check.
7) Reporting: the reporting data and its accuracy were phenomenal. This isn't the case for most advertisers and small businesses. We usually start here, but were impressed at the quality and accuracy. So, check!
That leaves us with:
2) Avatar
5) Ad
6) Ad optimizations
2) Avatar: there were clear and distinct job seeker segments, but the advertising wasn't calling out the different types of job seekers: segments based on geographic location, skillset, and level of experience. This was an opportunity!
5) Ad: there are nuances to the personality types and types of job seekers in the marketplace. We found that even though it was clear internally who the ideal job seeker was, there was a disconnect with what ended up in the Facebook ads. Another opportunity!
6) Ad optimizations: while there were many ad setups, the ad account structure wasn't designed correctly: it was hard to make decisions on which tests performed and which didn't.
So across these 3 main opportunities, here are a few of the tests we ran (that we can share publicly):
- Avatar – define 3 types of avatars and focus on the largest percentage of the market we could tackle
- Avatar – design major ad angles based on those avatars
- Ad – design ad images for the respective avatars and respective ad angles
- Ad – design ad copy for the respective avatars and respective ad angles
- Ad optimizations – restructure the campaign and ad sets to isolate variables
- Ad optimizations – test groups of angles separately to isolate performance per angle/creative
- Ad optimizations – test placements aggressively while testing assumptions of past performance
- Ad optimizations – test optimization based on different pixel fires (ie: standard events in different steps of the funnel)
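To make the "isolate variables" restructure concrete, here's a minimal sketch of the idea: one ad set per avatar-and-angle combination, so each result reads on exactly one variable. The avatar and angle labels below are invented for illustration; they are not the client's actual segments.

```python
# Hypothetical restructure: cross every avatar with every ad angle so that
# each ad set tests exactly one (avatar, angle) pair in isolation.
from itertools import product

avatars = ["entry_level", "skilled_trade", "experienced"]  # assumed labels
angles = ["speed_to_hire", "local_jobs", "career_growth"]  # assumed angles

ad_sets = [
    {"name": f"{avatar}__{angle}", "avatar": avatar, "angle": angle}
    for avatar, angle in product(avatars, angles)
]

for ad_set in ad_sets:
    print(ad_set["name"])  # e.g. entry_level__speed_to_hire
```

With this layout, a winning ad set tells you which avatar/angle pairing won, rather than leaving several variables tangled in one result.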
One issue: from an optimization standpoint, making decisions based on ad performance was a real challenge. People completed the final action we were optimizing for as much as 7-14 days after the lead was generated.
This meant we turned off many ad sets that we assumed weren't performing, only to find out a few weeks later that they were responsible for a large percentage of performance… whoops.
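The lag problem is easy to see with a toy example (the numbers below are invented, not the client's data): if leads convert 7-14 days after they're generated, a snapshot taken too early shows zero conversions and makes a healthy ad set look like a loser.

```python
# Toy illustration of delayed conversions: each value is the number of days
# after lead generation at which that lead completed the final action.
conversion_days = [8, 9, 11, 12, 14]

def conversions_visible_by(day):
    """Conversions you can actually see if you judge the ad set on this day."""
    return sum(1 for d in conversion_days if d <= day)

print(conversions_visible_by(7))   # judged at day 7: 0 conversions visible
print(conversions_visible_by(14))  # full 14-day window: all 5 visible
```

Judged on day 7, this ad set appears to have produced nothing; judged after the full conversion window, it produced every conversion in the set. Any pause/scale decision made inside the lag window is working with incomplete data.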
The Outcome
This is the stuff we love sharing because this information is objective. The numbers speak for themselves.
The total budget provided was approximately $50,000; Amplifii spent $49,612.77 in a 30 day period.
Here are the main performance numbers, side-by-side:
- Unique link CTR: 1.51% (Amplifii) vs. 0.69% (original), a 118.8% improvement
- CPC: $3.49 (Amplifii) vs. $13.28 (original), a 74.7% improvement
- Unique website clicks: 15,915 (Amplifii) vs. 3,736 (original)
- Cost per main action: $1,152.79 (Amplifii) vs. $2,935.93 (original), a 60.7% reduction
- # of main actions: 36 (Amplifii) vs. 17 (original)
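For readers who want to check the arithmetic, the improvement figures are standard percent-change calculations over the numbers reported above; a minimal sketch:

```python
def improvement(new, old):
    """Percent improvement for a higher-is-better metric (e.g. CTR)."""
    return (new - old) / old * 100

def reduction(new, old):
    """Percent reduction for a lower-is-better metric (e.g. cost per action)."""
    return (old - new) / old * 100

print(round(improvement(1.51, 0.69), 1))      # CTR: 118.8
print(round(reduction(1152.79, 2935.93), 1))  # cost per main action: 60.7
```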
*The only reason further step-by-step funnel information isn't shared is to protect the client's funnel metrics, given their unique business model.
This is the bottom line:
- 118.8% improvement in CTR
- 74.7% improvement in CPC
- 60.7% reduction in ad costs