1. Tracking (the setup)
Goal: Make sure the campaign is designed and tracked so that any later analysis is trustworthy.
- Did we have a clear theory before we started?
- Example: "We believe LinkedIn will drive higher ACV (Annual Contract Value) than Facebook."
- Did we target the right segments?
- Was the reach too broad or too narrow?
- Were creative assets delivered on time?
- Did tracking pixels and attribution tags work correctly from day one?
- Ensure consistent naming (for example, not utm_source=fb one day and utm_source=facebook the next).
- This consistency is what makes channel and campaign performance comparable.
- The Facebook Pixel: invisible code placed on the company website.
- When a user clicks a Facebook ad and lands on the site, the Facebook Pixel fires.
- If the user buys a subscription, the pixel fires again to send that conversion back to Facebook.
- You work with engineers to make sure these pixels fire at exactly the right moments.
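The naming-consistency point above can be sketched in code. This is a minimal, hypothetical example of normalizing utm_source values before reporting; the alias map and function name are assumptions, not part of any standard library, and a real taxonomy would be maintained alongside the campaign tracking plan.

```python
# Hypothetical sketch: collapse inconsistent utm_source spellings to one
# canonical name so channel reports stay comparable. The alias map below
# is an assumption -- adjust it to your own naming taxonomy.
ALIASES = {
    "fb": "facebook",
    "face_book": "facebook",
    "li": "linkedin",
    "linked-in": "linkedin",
}

def normalize_utm_source(raw: str) -> str:
    """Lowercase, trim whitespace, and map known aliases to a canonical name."""
    cleaned = raw.strip().lower()
    return ALIASES.get(cleaned, cleaned)

print(normalize_utm_source("FB "))       # facebook
print(normalize_utm_source("facebook"))  # facebook
```

Running this check at ingestion time (rather than at analysis time) is what keeps "fb" and "facebook" from showing up as two different channels in the dashboard.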
2. Data quality (the optimization)
Goal: Ensure the live campaign data is accurate and actionable so you can optimize in-flight.
- Did we spend the budget evenly over the flight?
- Or did we panic‑spend at the end?
- When we saw ad A failing, how quickly did we pause or replace it?
- Did the sales team actually follow up on the leads generated?
- Or did leads sit in the CRM for days? (A very common failure point.)
- Duplication:
- Did the system count the same sale twice?
- Currency issues:
- Did we mix US Dollars and Swedish Krona (SEK) without converting?
- Missing costs:
- Do we have revenue data but no ad spend data, because spend was never uploaded or synced?
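The three failure modes above (duplicates, mixed currencies, missing costs) can be caught with simple checks. This is an illustrative sketch, not a real pipeline: the field names (txn_id, amount, currency), the sample records, and the SEK rate are all assumptions.

```python
# Hypothetical data-quality checks over a small sample of sale records.
# Field names and the exchange rate are illustrative assumptions.
SEK_TO_USD = 0.095  # example rate; in practice, pull from a rates service

sales = [
    {"txn_id": "A1", "amount": 500.0, "currency": "USD"},
    {"txn_id": "A1", "amount": 500.0, "currency": "USD"},  # duplicate record
    {"txn_id": "B2", "amount": 1000.0, "currency": "SEK"},
]

# 1. Duplication: keep only the first record per transaction id.
seen, deduped = set(), []
for sale in sales:
    if sale["txn_id"] not in seen:
        seen.add(sale["txn_id"])
        deduped.append(sale)

# 2. Currency: convert everything to USD before summing revenue.
def to_usd(sale):
    rate = SEK_TO_USD if sale["currency"] == "SEK" else 1.0
    return sale["amount"] * rate

revenue_usd = sum(to_usd(s) for s in deduped)

# 3. Missing costs: revenue without matching spend is not analyzable.
ad_spend_usd = None  # e.g., the spend sync failed
if ad_spend_usd is None:
    print(f"Revenue ${revenue_usd:.2f} recorded, but ad spend is missing")
```

Without the dedupe and conversion steps, this sample would report $2,000 of "revenue" instead of the correct $595.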
3. ROI (the impact)
Goal: Understand the true financial and business impact of the campaign.
- Hard financial metrics such as Cost per Acquisition (CPA) and revenue.
- Did this campaign generate incremental sales?
- Or did it just capture people who would have bought anyway?
- It is rarely as simple as looking at day‑one ROI, because of Lifetime Value (LTV).
- Example:
- A campaign looks negative on Day 1: it costs $100 to get a customer who pays $50.
- Knowing LTV, you flag this as a good campaign because that customer is expected to pay $600 over time.
- Capture the one key thing to do differently next time (for example, better audience definition, a different creative angle, or a changed bidding strategy).
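The day-one versus LTV comparison in the example above comes down to a small calculation. This sketch uses the numbers from the example ($100 CPA, $50 on day one, $600 expected LTV); the variable names are illustrative, and the LTV figure is an estimate, not guaranteed revenue.

```python
# Sketch of the day-1 vs. lifetime view, using the figures from the example.
cpa = 100.0          # cost to acquire one customer
day1_revenue = 50.0  # what the customer pays immediately
ltv = 600.0          # expected lifetime value (a forecast, not a certainty)

day1_roi = (day1_revenue - cpa) / cpa  # -0.5 -> looks like a loser
ltv_roi = (ltv - cpa) / cpa            # 5.0 -> actually a strong campaign

print(f"Day-1 ROI: {day1_roi:+.0%}")  # -50%
print(f"LTV ROI:   {ltv_roi:+.0%}")   # +500%
```

The same campaign flips from "kill it" to "scale it" purely based on which revenue horizon you measure, which is why the LTV assumption itself deserves scrutiny.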
Real‑world scenario: the influencer campaign (Lovable example)
"The Influencer Campaign" – e.g., for Lovable
- The marketing manager says: "We are sponsoring a YouTuber named TechGuru."
- You generate a unique tracking link for TechGuru.
- You verify that when someone clicks it, the visit registers correctly in your analytics dashboard.
- The video goes live and traffic spikes.
- In the dashboard you see 10,000 visits but 0 signups.
- You investigate and discover the landing page is broken on mobile devices.
- You alert the engineering team immediately so they can fix it and save the campaign.
- The campaign ends.
- You calculate ROI and look at incrementality:
- Did TechGuru simply cannibalize existing signups (people who would have signed up anyway)?
- Or did the campaign bring in truly new users?
- You conclude that 80% of users were brand‑new (incremental), so the campaign is a massive success.
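The incrementality conclusion above can be made concrete with a small calculation. This is a hypothetical sketch: the total signup count is an illustrative figure (the source gives only the 80% incremental share), and in practice the share would come from a holdout group or baseline comparison.

```python
# Illustrative sketch: splitting campaign signups into incremental vs.
# cannibalized. total_signups is an assumed figure; the 80% share is
# from the scenario, estimated in practice via a holdout or baseline.
total_signups = 1_000
incremental_share = 0.80

incremental = total_signups * incremental_share
cannibalized = total_signups - incremental

print(f"Truly new (incremental) signups: {incremental:.0f}")
print(f"Would have signed up anyway:     {cannibalized:.0f}")
```

Only the incremental 800 signups should be credited to TechGuru when computing the campaign's ROI; counting all 1,000 would overstate the payoff by 25%.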