How to Communicate Data So It Drives Better Business Decisions 

“Storytelling with data” sounds great in theory – we all would love to have a single source of truth that always outputs one clear, actionable insight with no uncertainty.

But in practice, most real-world data doesn’t work that way. 

Especially in media mix modeling and incrementality work, the outputs are noisy, probabilistic, and have plenty of nuance. There’s often no single clean answer – only ranges, trade-offs, and competing plausible explanations.

And that’s fine. The problem comes when we force those outputs into a story that creates false clarity where there isn’t any.

The way we look at it at Recast is simple: the point of analysis is not to persuade but to inform better decision-making. Everything else is dangerous and counterproductive.

This piece will cover the lessons we’ve learned on how to better communicate data to execs and the common challenges we’ve seen teams face when doing so.

Uncertainty Is the Whole Point (Not a Caveat)

Every meaningful marketing analysis comes with uncertainty: A/B tests, MMM, lift studies, survey data, blended performance metrics. None of them produces absolute truth. They all produce ranges, probabilities, and trade-offs. And that’s exactly what they should do.

But you don’t need certainty to make decisions. You just need to get very clear about what you do know, what you don’t, and what that means for risk.

Let’s say a model tells you that streaming audio has a 5x ROI. That sounds great, but that number alone is useless. A good model will also tell you how much uncertainty comes with that estimate: say, a 90% credible interval ranging from 1x to 10x.

And that’s not a failure of the model. That’s the actual signal in the data. The uncertainty is the information. It tells you: “This channel might be undervalued – or it might be risky. Proceed, but proceed carefully.”
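As a toy illustration of that difference, here’s roughly what reporting a range instead of a bare point estimate looks like. The numbers and draws below are made up (a real posterior from an MMM would usually be skewed, and these figures only loosely match the example above); the point is the shape of the output:

```python
# Toy illustration with made-up numbers: summarizing posterior ROI draws
# into a point estimate plus a 90% credible interval, instead of
# reporting "5x ROI" on its own.
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for the posterior ROI samples a Bayesian MMM might produce
# for streaming audio; in practice these come from the fitted model,
# and the distribution would typically be skewed rather than normal.
roi_draws = rng.normal(loc=5.5, scale=2.7, size=4000)

point_estimate = np.median(roi_draws)
lower, upper = np.quantile(roi_draws, [0.05, 0.95])  # 90% credible interval

print(f"Streaming audio ROI: {point_estimate:.1f}x "
      f"(90% credible interval: {lower:.1f}x to {upper:.1f}x)")

# The probability the channel at least breaks even is often more useful
# to a decision-maker than the headline number.
print(f"P(ROI >= 1x): {np.mean(roi_draws >= 1.0):.0%}")
```

Reporting the interval and the break-even probability alongside the headline number is what turns “5x ROI” into something a decision-maker can actually weigh.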

We understand that this can be hard to manage, especially when you compare it to platform ROAS or Google Analytics dashboards: no confidence intervals, no error bars, just a number you can take to your exec team right away. Except that behind that number, those tools are hiding all the underlying assumptions, correlations, and noise.

Decision-makers need the whole picture – best case, worst case, and base case. That’s what allows you to make optimal bets across channels and budgets. But that requires accepting uncertainty and managing it well.

Thinking in Bets: A Better Mental Model for Marketers

Most marketers don’t realize it, but every budget decision you make is a bet. Whether you’re scaling Meta, testing Streaming Audio, or holding the line on paid search, you’re just placing bets with varying levels of expected return and risk.

Let’s say your model shows Meta with an ROI of 3–4x and narrow confidence intervals. Streaming Audio shows a possible ROI of 1–10x – higher upside, but also higher uncertainty. 

The right question isn’t: Which number is higher? It’s: What portfolio of bets gives us the best expected payoff, given our tolerance for risk and our appetite for learning?

That’s how investors think. It’s how marketers need to think, too.
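Here’s a rough sketch of what that portfolio framing looks like in code. The posteriors are invented stand-ins that loosely mirror the example above (Meta around 3–4x with a narrow interval, streaming audio with a wide roughly 1–10x range):

```python
# Toy illustration with made-up numbers: comparing budget splits as bets,
# not channels as point estimates.
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

meta_roi = rng.normal(loc=3.5, scale=0.3, size=n)               # narrow uncertainty
audio_roi = rng.lognormal(mean=np.log(3.2), sigma=0.7, size=n)  # wide uncertainty

budget = 1_000_000  # illustrative budget

for audio_share in (0.0, 0.2, 0.4):
    # Revenue under each posterior draw for this allocation
    revenue = budget * ((1 - audio_share) * meta_roi + audio_share * audio_roi)
    print(f"{audio_share:.0%} to streaming audio: "
          f"expected revenue ${revenue.mean():,.0f}, "
          f"5th-percentile revenue ${np.quantile(revenue, 0.05):,.0f}")
```

The specific numbers are made up, but the output is the point: each allocation has an expected payoff and a downside, and the “best” one depends on how much downside the business can tolerate.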

That’s what the Recast incrementality system is designed to do: use MMM to flag promising but under-tested channels, run experiments to validate their impact, and refine your portfolio based on what you learn. It’s a dynamic loop: test, learn, reallocate, repeat.
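As a toy sketch of the “test, learn” half of that loop (made-up numbers and a deliberately simple method, not Recast’s actual methodology), here’s how a single lift test can tighten a wide MMM estimate when both are treated as noisy measurements of the same underlying ROI:

```python
# Toy illustration: combining a wide MMM estimate with a lift-test result
# using a simple precision-weighted (conjugate normal) update.
import numpy as np

def combine(mmm_mean, mmm_sd, test_mean, test_sd):
    """Precision-weighted combination of two normal estimates."""
    mmm_prec, test_prec = 1 / mmm_sd**2, 1 / test_sd**2
    post_prec = mmm_prec + test_prec
    post_mean = (mmm_mean * mmm_prec + test_mean * test_prec) / post_prec
    return post_mean, np.sqrt(1 / post_prec)

# MMM says streaming audio ROI is roughly 5x, but with a wide interval
# (about 1x to 10x). A geo lift test comes back around 4x with less noise.
post_mean, post_sd = combine(mmm_mean=5.5, mmm_sd=2.7, test_mean=4.0, test_sd=1.0)

lo, hi = post_mean - 1.645 * post_sd, post_mean + 1.645 * post_sd
print(f"Updated ROI estimate: {post_mean:.1f}x (90% interval: {lo:.1f}x to {hi:.1f}x)")
```

The experiment doesn’t replace the model; it shrinks the uncertainty around exactly the channel where the uncertainty was the problem, which is what lets you reallocate with more confidence on the next pass.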

In modern marketing, optimization isn’t about having “the answer.” It’s about making better bets, more often, with a system that helps you learn as you go.

Now that we’ve covered uncertainty and thinking in bets, let’s talk about how to communicate them to leadership.

How to Make Data Useful Without Oversimplifying It

One of the most common traps we’ve seen is analysts trying to “clean” the data before it gets to the executive team. 

Looking at the example above, you or your team might feel pressure to strip away all the nuance and simplify the message because “the CEO just wants a number and a decision.” So instead of saying “streaming audio has a 5x ROI with a 90% credible interval ranging from 1x to 10x,” you say “streaming audio has a 5x ROI.” And that’s a problem.

The best analysts we’ve seen say: here’s what we know, here’s what we’re still uncertain about, and here’s the best course of action given those constraints.

From our experience, executives don’t reject MMM because they dislike nuance. They reject it because no one helped them make the conceptual shift:

  • They weren’t briefed on the limitations of attribution. 
  • They weren’t walked through how MMM works differently, or why the old systems weren’t showing the full picture. 
  • They weren’t given the space to reorient their mental models before being asked to make new decisions.

One clarifying note here: CEOs don’t want you to pontificate. They want a recommendation that incorporates real uncertainty. If the stakes are high – let’s say we’re talking about shifting millions in spend – “managing uncertainty” does not mean presenting a wall of caveats.

The answer is to do the work before the meeting. That means talking to stakeholders (finance, operations, brand, sales…), aligning on what an “acceptable risk” looks like for the business, and building the recommendation with those stakeholders before you ever walk into a room with the CEO.

Once you’re in the room, it’s not the time to explain how MMM works or why confidence intervals matter. It’s the time to say: “we’ve considered the risks, we’ve talked to the right people, and given this specific uncertainty, this is the bet we recommend.”

Prepare better. Don’t simplify.

How to Build Organizational Trust Through Data Clarity (Not Clean Stories)

When media mix modeling fails inside organizations, it’s usually not because the model is wrong – it’s because the communication fell flat.

Executives do not hate nuance, but they do hate surprises. If you walk into a meeting and show a slide that contradicts 18 months of Google Analytics data or platform ROAS – without preparing anyone for it – you’ve already lost the room.

But again, the solution isn’t to water down the results or force them into a tidy “story.” It’s to teach teams – especially leadership – to think probabilistically and manage uncertainty better.
