Counterfactual Analysis: The Fastest Way to Answer “What If?” in MMM 

  • What if we hadn’t cut X channel?
  • What if we’d stayed with the plan?
  • What if we’d shifted spend earlier/later?

Every planning cycle ends with the exec team asking questions like these, and most marketing teams have no hard data to answer them.

That’s because most teams plan one budget and execute another – not because they’re sloppy, but because reality forces them to. CPMs spike, a channel underperforms, finance asks for cuts, or an opportunity shows up mid-quarter. Either way, plans change.

But when we diverge from the plan, we fall back on stories that, even when true, are not testable. They don’t tell you whether the decision you made helped or hurt, and they don’t help you make a better call next time.

Traditional reporting doesn’t help much here. It’s very good at telling you what happened. Spend went down. Revenue followed. ROAS changed. But it can’t tell you what would have happened if you’d made a different choice.

Then the debates start. Finance wants accountability. Marketing wants context. No one can quantify the impact of the deviation itself. Until you can measure the cost – or benefit – of the changes you made, you’re stuck. That’s why you need counterfactual analysis.

What Counterfactual Analysis Is – and Why It Changes Everything

Counterfactual analysis answers one simple question: “Compared to what?”

In MMM terms, that means holding everything else constant – seasonality, demand, macro effects, competitive pressure – and changing just one thing: spend level, timing, channel mix, or a commercial decision like promotions. The model then estimates the incremental difference between the world you lived in and the world you didn’t choose.

That’s fundamentally different from attribution, which answers who got credit. And it’s different from regular reporting, which answers what happened. Counterfactuals answer what would have happened instead.

For example: upper-funnel spend gets cut mid-quarter to protect margin. A few weeks later, revenue softens. In the post-mortem, the story is that demand slowed, brand weakened, and timing issues compounded. But without counterfactual analysis, you can’t answer the question that actually matters: what was the incremental cost of that cut?

Was the revenue decline mostly driven by external factors? Or did the spend reduction materially contribute? Was the cut a smart defensive move or a false economy?

This is also where finance starts to regain confidence. Instead of arguing about why something happened, you can estimate how much a specific choice helped or hurt.

How Counterfactual Analysis Works in Practice

The mechanics are straightforward. Your MMM needs three inputs:

  • Your original plan – the spend levels, timing, and channel mix you intended to execute
  • What you actually did – your real spend, captured in whatever reporting system you use
  • Observed results – the actual revenue, conversions, or KPI you’re measuring
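As a minimal sketch, those three inputs can be laid out side by side in a small table before any modeling happens. The channel, column names, and numbers below are illustrative assumptions, not the schema of any particular MMM tool:

```python
# Sketch: the three counterfactual inputs as weekly tables (pandas).
# All channel names, columns, and figures are illustrative assumptions.
import pandas as pd

weeks = pd.date_range("2024-07-01", periods=4, freq="W")

planned = pd.DataFrame({"week": weeks, "tv_spend": [1_250_000] * 4})  # original plan
actual = pd.DataFrame({"week": weeks, "tv_spend": [750_000] * 4})     # what was executed
observed = pd.DataFrame(
    {"week": weeks, "revenue": [2_100_000, 2_000_000, 1_900_000, 1_800_000]}
)  # ground truth

# The deviation from plan is just planned minus actual, week by week:
deviation = planned["tv_spend"] - actual["tv_spend"]
print(f"Total under-spend vs. plan: ${deviation.sum():,.0f}")
# → Total under-spend vs. plan: $2,000,000
```

Keeping plan, actuals, and observed results in one tidy structure is most of the work; the model does the rest.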

The model then runs three scenarios:

  • Scenario A: Predicted outcome if you’d followed the plan exactly
    • The model estimates what would have happened with your planned spend, holding everything else constant – seasonality, baseline demand, competitive effects, all of it.
  • Scenario B: Predicted outcome based on what you actually spent
    • Same model, same conditions, but with your real spend levels and timing.
  • Scenario C: What actually happened
    • This is ground truth. It shows you how accurate the model’s forecast was, and lets you validate the counterfactual.

The gap between Scenario A and Scenario B is the incremental impact of your deviation from plan.

For example: you planned $5M in TV for Q3. Budget cuts forced you down to $3M. The model estimates your plan would have driven $12M in incremental revenue. Your actual spend drove $9M. The $2M cut cost you roughly $3M in revenue.
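The comparison itself is just two runs of the same fitted model. The response curve below is a hypothetical stand-in, with its parameters back-solved so it reproduces the example’s numbers ($5M → ~$12M, $3M → ~$9M); a real MMM would supply its own fitted saturation curve:

```python
# Sketch of the Scenario A vs. Scenario B comparison from the TV example.

def predicted_incremental_revenue(spend_musd: float) -> float:
    # Hypothetical saturating response curve: revenue = a * spend^b.
    # a and b are back-solved so the curve matches the article's numbers
    # ($5M -> ~$12M, $3M -> ~$9M). A real MMM would use its fitted curve.
    a, b = 4.85, 0.563
    return a * spend_musd ** b

scenario_a = predicted_incremental_revenue(5.0)  # A: predicted under the plan
scenario_b = predicted_incremental_revenue(3.0)  # B: predicted under actual spend
delta = scenario_a - scenario_b                  # incremental cost of the cut

print(f"Incremental cost of the $2M cut: ${delta:.1f}M")
# → Incremental cost of the $2M cut: $3.0M
```

Scenario C – the revenue you actually observed – is then compared against Scenario B to sanity-check the model before you trust the delta.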

Now you know. The decision had a real, measurable cost. Maybe it was still the right call – protecting margin might have been more important. But at least you’re not guessing.

Counterfactuals as a Truth-Seeking Loop

When you integrate counterfactuals into your operating system, it creates a compounding loop: 

Forecast → Act → Compare plan vs. actual → Measure the delta → Learn

You start with a plan and a forecast. Then reality intervenes. After the fact, instead of asking “Why did this happen?” you ask “What was the incremental impact of the choices we made?”

Over time, that feedback loop tightens. Forecasts get better because they’re exposed to reality. Plans get less risky because you understand the true cost of deviations. And decisions get easier to defend because they’re grounded in measured impact.

And now you can go to your CFO and earn their trust, because you can demonstrate accountability. You can say, “Here’s what we planned, here’s what we did, and here’s the incremental impact of the difference.”

The goal isn’t perfect hindsight. No model will tell you exactly what would have happened. The goal is to make better next decisions and get smarter quarter after quarter.
