Drew Barontini

Product Builder

Issue #88
14m read

Data Narratives

I spend a lot of time looking at data. Every day, week, and month is filled with rhythms to review the data, find signals, and translate those signals into a strategic direction.

But data is such a broad term. When I think about data, I categorize it as:

  1. Quantitative data for numbers.
  2. Qualitative data for feedback.
  3. Intuitive data for judgment.

The first two are common, but intuition? I include it because your gut is a data point. And I’d argue it’s the strongest one. Having a strong intuition helps cut through the noise when you look at the mass of numbers and the chorus of feedback. A number alone shouldn’t justify a decision. One or more people’s feedback should strengthen the argument, but usually isn’t enough, either. But intuition is the multiplier—one you improve and cultivate through intentional critical thinking.

There’s a reason taste (or judgment or intuition) is such a popular topic when it comes to knowledge work and AI. Knowing what works, what looks good, and what to focus on is still a uniquely human endeavor.

I can ask Claude to pull data from PostHog (analytics) and generate a report. I can even ask it to review the data more deeply and give insights into what the data says.

But you can’t prompt insights.

Insight is earned. Clarity emerges from the slow and, often, painful process of thinking, writing, and learning—from the steady extraction of each chain of reasoning.

Just this past week Claude proposed an outlandish and overly confident theory about a data irregularity in our product. I sent it to a colleague and they immediately said “that can’t be right.” And it wasn’t! Claude confidently expressed what, to it, seemed a plausible reason for the irregularity. It was trying to translate the data into a coherent narrative, but an LLM is a highly advanced autocomplete. It went with the most confident response, not the right one. I asked Claude to detail its chain of reasoning to that conclusion and it pointed out the errors in its own logic.

An LLM won’t say “I don’t know” like a human. An LLM can never admit defeat. It’s trained on a corpus of data written confidently by humans. And how did those humans reach such confident conclusions? Through rigorous research, continuous debate, and following a logical chain of thought to question assumptions, validate theories, and make each iteration better than the last one.

The Claude Codification of Work skips that.

We just jump to the pretty artifact or the confident-sounding conclusion. An LLM is a caricature of reasoning, masking intelligence with imitation. It looks plausible until you poke, probe, and question its logic. Claude is like a kid in a candy store of information. And I wouldn’t trust my kids to make reasonable choices in a candy store. AI is a powerful tool, but not without its flaws (like us humans). It’s in the understanding of those flaws where we can use it without eroding our understanding.

The more I learn about intelligence, neuroscience, and the evolution of our brains, the more I realize there’s so much we don’t know or understand about the human mind.

Data—quantitative signals, qualitative feedback, intuitive reasoning—requires translation to become meaningful. If you only stare at numbers or listen to voices or follow your gut, you miss the holistic approach to finding the best path forward.

You need a narrative.

Humans love stories. Stories are vessels for meaning and understanding. They’re how I’ve developed a richer understanding of what is going on and where we’re going. Data without a narrative is meaningless information.

Data Narratives is a new way to find meaning in the data. You take data and frame it as a cohesive narrative to generate testable ideas and inform coherent strategies.

The three pillars are:

  1. Signal Collection to take measurement.
  2. Story Formation to find meaning.
  3. Strategy Translation to create motion.

Signal Collection

The term “signal” has a lot of meanings. I use it frequently when talking about product work because improving your ability to identify signals is how you move from observation to action. A signal incites an action.

There’s a spectrum of signals. Some signals are stronger than others; some require more time incubating; some are important, but not right now important.

Signals are information seeking attention.

The OAT Process

Signals move through distinct stages:

  1. You seek and notice the signals.
  2. You make a decision about the signal.
  3. You translate it into meaningful action.

Observation → Assessment → Translation

Let’s call it The OAT Process.
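The stages can be sketched as a tiny pipeline. This is illustrative only; the `Signal` shape, the sources, and the strength threshold are invented for the example:

```python
from dataclasses import dataclass

# Illustrative only: the Signal shape, sources, and threshold are invented.

@dataclass
class Signal:
    source: str        # e.g. "analytics", "support", "gut"
    description: str
    strength: float    # 0.0 (weak) to 1.0 (strong)

def observe(raw_inputs: list[dict]) -> list[Signal]:
    """Observation: collect everything; don't filter yet."""
    return [Signal(r["source"], r["description"], r.get("strength", 0.5))
            for r in raw_inputs]

def assess(signals: list[Signal], threshold: float = 0.6) -> list[Signal]:
    """Assessment: decide which signals deserve attention right now."""
    return [s for s in signals if s.strength >= threshold]

def translate(signals: list[Signal]) -> list[str]:
    """Translation: turn each retained signal into a concrete action."""
    return [f"Investigate: {s.description} (from {s.source})" for s in signals]

raw = [
    {"source": "analytics", "description": "trial conversion dropped", "strength": 0.9},
    {"source": "support", "description": "one-off billing question", "strength": 0.2},
]
actions = translate(assess(observe(raw)))
```

The point isn’t the code; it’s that each stage has a distinct job, and skipping one (usually Assessment) is how noise becomes action.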

Signal Collection is the measurement phase.

Think of a doctor. When they see a patient for the first time, they don’t prescribe a solution before taking measurements. They take vital signs like heart rate, blood pressure, blood oxygen, and temperature, as well as lab tests to check blood counts, cholesterol, glucose, and imaging results. These are the quantitative signals. They don’t tell the entire story, but they are the hard numbers to build from.

Then there are the questions. I’m sure you’ve been in a doctor’s office, repeating the same answers to an endless parade of medical personnel, including the doctor. They take your clinical history and listen to how you describe what you’re feeling. After all, no one can describe how you feel but you. These are the qualitative signals. They are rich descriptions that frame the hard numbers.

In isolation, quantitative and qualitative signals can steer you in the right direction.

But not always.

That’s where the intuitive signals come in. The doctor has built a unique intuition from years of experience. They’ve developed pattern recognition and mental models to read the hard numbers and rich descriptions, sensing an emerging pattern that matches previous cases. The best ones go further and validate their assumptions. If they don’t, they can lose a patient. This is their craft experience.

So they develop a narrative about you. They take the data as a whole and use it to craft the story that most likely leads to the right conclusion.

Software is the same process.

You capture product metrics as the hard numbers. You talk to customers and capture rich feedback. And then you leverage the collective intelligence of your team to act on those signals to create impact.

To head in the right direction, though, you need a clear narrative to guide you.

Story Formation

Our product offers a 7-day free trial. But before you begin the trial, we require a credit card through a simple Stripe checkout. We do this to prevent abuse of the system.

When the anonymous-to-trial conversion rate—the percentage of anonymous visitors who start the free trial—dropped, I noticed the signal in my regular weekly review.

✅ Observation

So I added a survey that pops up when users abandon the trial checkout. The data came in, with most people stating they’re not ready to commit right now. The survey had a set of options and then a free response. In the free response, some users mentioned the trial not being free. Huh? But it is free. So something must be making them think otherwise.

✅ Assessment

The trial activation modal shows two columns side by side. The left column is the value proposition, detailing what you get access to along with social proof. The right column is a short housekeeping list about the trial. Then the call-to-action button lets them continue to the Stripe checkout where they fill out credit card information without paying. This is where we were losing people.

So what’s the story? My theory: people are skimming the modal and hitting the checkout thinking they have to pay for the trial. So I updated the modal, removing the left column and simplifying the language to make it abundantly clear that they won’t be charged, that they’ll be reminded before the trial renews, and that they can cancel at any point. The modal was doing too much before. The core concern at that stage, for most people, was wanting to try the product for free before committing to a paid subscription. We weren’t designing the experience to make that clear.

✅ Translation

The hard numbers led to rich descriptions and a narrative based on my intuition.

We’re seeing the anonymous-to-trial conversion rate drop. We believe it means people are interpreting the “free trial” flow as not actually free (because they hit a credit card checkout too quickly and assume they’ll be charged). So we’ll test a simplified trial activation modal that makes “you won’t be charged + you’ll be reminded before renewal + you can cancel anytime” unmistakably clear, to see if trial starts (and checkout completion) increase.

I could be wrong. That’s okay! The creation of a narrative is how you formulate a testable theory so you can design an intervention. You can’t consider data in isolation. And you can’t commit to major changes without finding small probes you can place to measure the effects, increasing your confidence at each step. If the conversion rate doesn’t improve, then I still have a narrative to build from.

Maybe it’s a value problem, or the length of the trial, or natural attrition in the process.
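Those small probes are, at their core, controlled comparisons. As a minimal sketch (with made-up visitor counts), a standard two-proportion z-test can tell you whether a shift in the conversion rate is more than noise:

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-proportion z-test: did variant B convert differently than variant A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided, via normal CDF
    return z, p_value

# Hypothetical numbers: old modal converted 80 of 1,000 visitors; new modal 110 of 1,000.
z, p = two_proportion_z(80, 1000, 110, 1000)
# p < 0.05 here, so at this sample size the improvement is unlikely to be noise.
```

A probe like this doesn’t prove the narrative; it just tells you whether the movement you’re seeing deserves a place in it.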

If Signal Collection is the measurement, Story Formation is the meaning. It creates a holistic view so you can turn data into direction.

Strategy Translation

Increasing new trials and efficiently converting them is an important process, but larger strategic changes stem from larger signals.

We released a major new feature a few months ago. It created a new surface area for the product—one we’re really excited about. The reception has been amazing, and the signals for its impact are coming into focus now.

People who use this feature are converting at significantly higher rates, and they’re sticking around much longer, too. It’s exciting when you ship changes that meaningfully reshape the landscape of your product. It’s a major signal, and a clear opportunity to understand why it’s working so well. Luckily, this feature aligns directly with one of the strategic themes we’re working towards.

Strategy Translation is how you take the narrative you developed and translate it into the strategic direction of the product. You move from measurement to meaning to motion, charting a defined path forward.

In Work Formation, I talked about how work moves through distinct phases:

  1. Aggregation when ideas form clusters.
  2. Activation when clusters form structures.
  3. Stabilization when structures strengthen.

Stories mature through the same phases as the fidelity of data grows and your understanding from experimenting crystallizes.

The early signals create a narrative around an immature cluster of data. Then the signals develop into batches and streams as you develop and test your theories. Finally, you move in a specific direction against clear bets expressed as defined projects. Through each stage, the story gets sharper, like focusing a magnifying glass on a sample. Details become clear, experiments more intentional, and reasoning becomes focused design. Each iteration amplifies the value. It creates a frame you can use when determining what to focus on—a priority frame.

The Practice

The practice of Data Narratives turns mixed product signals into strategic direction through a repeatable logic chain. You’re not just reporting data; you’re telling a coherent story.

I call the practice Signal to Strategy.

See → Sense → Choose → Test → Learn

1. Gather Signals

Collect the numbers, feedback, and intuition.

  1. Numbers: What changed in the metrics?
  2. Feedback: What are customers, users, sales, support, or internal teams saying?
  3. Intuition: What are we sensing based on product judgment, team context, and strategic direction?

This is the raw material you work from.

What are we seeing, hearing, and sensing?

2. Write the Story

Now turn the raw material into a cohesive narrative.

Data Story Template

The story we’re seeing is [strategic interpretation]. The numbers show [quantitative signal], the feedback shows [qualitative signal], and our intuition suggests [strategic meaning].

Example:

The story we’re seeing is Feature X is becoming one of the stickiest parts of the product. The numbers show a 2x increase in trial conversions and a 10x increase in both weekly and monthly retention. The feedback shows customers want more usage and flexibility around the feature, and our intuition suggests it directly supports our strategic bet to expand the product surface.

This creates a shared interpretation of the data you can talk about with your team.

3. Explain the Direction

Once the story is clear, explain why it leads to a specific strategic direction.

Direction Logic Template

Because of this, we’re focusing on [strategic direction]. If this direction is right, we expect [leading outcome], followed by [lagging outcome], which we’ll validate through [signals].

Example:

Because of this, we’re focusing on expanding Feature X’s capabilities and making it more visible across the product experience. If this direction is right, we expect stronger repeat usage first, followed by reduced churn and ARR growth, which we’ll validate through feature usage, repeat usage, churn rate, and ARR growth.

This is the reasoning bridge from story to direction to expected outcome.

4. Place the Bet

The final step is to move your meaning into a clear and testable strategic bet.

Strategic Bet Template

We believe [strategic action] will create [expected behavior change] because [data story]. We’ll test this by [initiative / experiment] and measure [leading + lagging signals].

Example:

We believe expanding Feature X’s capabilities and exposing it more throughout the product will increase repeat engagement because users who engage with it are converting and retaining at significantly higher rates. We’ll test this by adding higher-usage capabilities and introducing Feature X in more product moments, measuring repeat usage, retention, churn, and ARR growth.

The narrative becomes operational.

5. Measure the Narrative

Once the bet is in motion, track whether the signals confirm, weaken, or change the story.

Measure:

  1. Leading Signals as the earliest signs the direction is working.
  2. Behavioral Signals as the deeper product behaviors you want to create.
  3. Lagging Outcomes as the business outcomes that should follow.

Measurement Template

We’ll measure this by watching [leading signals], [behavioral signals], and [lagging outcomes]. If [expected pattern] happens, the narrative is strengthened. If [counter-signal] happens, we’ll revise the story and adjust the bet.

Example:

We’ll measure this by watching feature engagement, repeat usage, and adoption of expanded capabilities as leading signals; recurring workflow usage and customer requests for more access as behavioral signals; and retention, churn, and ARR growth as lagging outcomes. If Feature X usage grows first, repeat behavior strengthens next, and retention improves over time, the narrative is strengthened. If usage grows without retention improving, we’ll revisit whether Feature X is truly a retention driver or simply a popular feature among already high-intent users.

Full Template

Each template is useful on its own as you build the narrative, but here’s a combined version:

Signal to Strategy Template

We’re seeing [story], supported by [numbers], [feedback], and [intuition]. Because of this, we’re focusing on [direction]. We believe [bet] will create [expected change]. We’ll test this through [initiative] and measure [signals]. If [expected pattern], we’ll continue. If [counter-signal], we’ll revise.

Example:

We’re seeing Feature X emerge as a possible retention driver, supported by a 2x increase in trial conversions, a 10x increase in weekly and monthly retention among users who engage with it, customer requests for more usage and capability, and our intuition that Feature X expands the product surface for recurring value. Because of this, we’re focusing on expanding Feature X’s capabilities and making it more visible across the product experience. We believe deeper capabilities and broader exposure will create stronger repeat engagement. We’ll test this through expanded Feature X functionality and new product entry points, measuring feature engagement, repeat usage, capability adoption, retention, churn, and ARR growth. If usage grows first, repeat behavior strengthens next, and retention improves over time, we’ll continue investing in Feature X. If usage grows without retention improving, we’ll revise the narrative and reassess whether Feature X is truly a retention driver or simply popular among already high-intent users.
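If you keep these narratives in docs or internal tooling, the combined template can be rendered as a simple fill-in function. A sketch; the slot names just mirror the bracketed fields above:

```python
# A sketch: the slot names mirror the bracketed fields in the template above.

SIGNAL_TO_STRATEGY = (
    "We're seeing {story}, supported by {numbers}, {feedback}, and {intuition}. "
    "Because of this, we're focusing on {direction}. "
    "We believe {bet} will create {expected_change}. "
    "We'll test this through {initiative} and measure {signals}. "
    "If {expected_pattern}, we'll continue. If {counter_signal}, we'll revise."
)

def signal_to_strategy(**slots: str) -> str:
    """Render the narrative; str.format raises KeyError if a slot is missing."""
    return SIGNAL_TO_STRATEGY.format(**slots)

narrative = signal_to_strategy(
    story="Feature X emerging as a retention driver",
    numbers="a 2x increase in trial conversions",
    feedback="requests for more capability",
    intuition="a belief that it expands the product surface",
    direction="expanding Feature X",
    bet="deeper capabilities and broader exposure",
    expected_change="stronger repeat engagement",
    initiative="new product entry points",
    signals="repeat usage, retention, churn, and ARR growth",
    expected_pattern="usage grows and retention improves",
    counter_signal="usage grows without retention improving",
)
```

A missing slot fails loudly rather than producing a narrative with a hole in it, which is exactly the property you want from a template like this.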

The Throughline

Data is more than numbers. It’s a combination of the quantitative, the qualitative, and the intuitive—the numbers, the voices, the senses. Each in isolation tells part of the story, like flashes of memories. It’s only when you stitch them together that you see the coherent narrative begin to emerge. The narrative contains assumptions to test, measure, and refine with each iteration.

Collect signals for each type of data.

Turn the data into a coherent story.

Use the story to define a direction.

Make a bet against your narrative.

Measure, learn, and repeat.

Connected Ideas

Data Narratives lives in the Clarity Codex of the Claritorium and Value Creation of Equilio, where it connects to other ideas.
