The psychology behind why stakeholders say yes (or no) to your analysis - 7 triggers


You know that feeling when you show someone your analysis and… nothing happens?

The numbers are solid.

Your work is spot-on.

Everything makes perfect sense.

But somehow your ideas just sit there.

Nobody's acting on them.

Most of the time, the reason is simple: data doesn't convince people - understanding how people think does.

The Situation

The analysts whose recommendations actually get implemented aren’t always the ones with the fanciest techniques or the cleanest data. They’re the ones who know a simple fact about decision-making: people decide with emotions first, then justify with logic later.

The Lesson

If you want your insights to drive action, you can’t just present "the truth".

We're wired to think the truth speaks for itself, but it doesn't.

You need to frame it in ways that match how our brains naturally process information.

Here are 7 simple approaches that work because they reduce cognitive effort, tap into built-in biases, and make it easier for people to say "yes".

Quick Hits: 7 Simple Tricks That Make People Actually Listen to Your Data

1. Talk About What They’re Losing (Not What They Could Gain)

Instead of: “This optimization could save us $50K annually”

Try: “We’re currently losing $50K per year to this inefficiency”

People hate losing money more than they like gaining it.

Reframe gains as preventing loss.

2. Show Them What Similar Companies Are Doing

Instead of: “Industry benchmarks show…”

Try: “Companies like ours who tried this saw…”

We naturally copy the behavior of our peers.

Make the change feel like the safe, obvious choice.

3. Use Real Numbers, Not Percentages

Instead of: “This represents a 23% improvement”

Try: “This means 2,300 more customers served every month”

Percentages make people think harder.

Real numbers feel concrete and immediate.

4. Don’t Make Current Processes Sound Stupid

Instead of: “We should change our approach”

Try: “Our current approach made sense when we started three years ago, but things have changed…”

Respect past decisions while opening the door to new ones.

5. Always Give Context

Instead of: “Customer satisfaction is 78%”

Try: “Customer satisfaction dropped from 84% to 78% this quarter, putting us below our main competitor at 82%”

Numbers alone don’t mean much - always compare to something.

6. Stick to Three Main Points

Instead of: “Here are seven key findings…”

Try: “Three big things jumped out from this analysis…”

Our brains process three ideas well.

More than that and attention drifts.

7. Mention the Work Behind It

Instead of: “This analysis shows…”

Try: “After looking at 18 months of customer data…”

A quick nod to the effort signals credibility - just don’t overdo it.

Why This Matters

Your analysis can be 100% correct and still fail if no one acts on it. Some great analyses and insights get shelved simply because they were presented as “optimization opportunities” instead of “here’s what we’re losing every day we wait.”

These aren’t manipulation tactics.

They’re human-first communication strategies that help good work get the attention it deserves.

Try This Week

Take one insight you’ve struggled to get traction on. Pick the approach that fits your biggest obstacle:

  • If people seem indifferent: Reframe with loss language (#1)
  • If they’re skeptical: Add social proof (#2) or mention your effort (#7)
  • If they’re overwhelmed: Simplify to three points (#6) and use concrete numbers (#3)
  • If they’re defensive: Honor current processes first (#4)

Rewrite just your opening line or subject line using that trigger - and see what happens.

One Thing to Remember

Your analysis is only as good as the action it creates. The gap between insights that change things and insights that get forgotten often comes down to how well you understand the person on the receiving end.

Share Your Take

Which of these triggers resonates most with your experience? Have you noticed other psychological patterns in how stakeholders respond to data?

That's it for this week. Hope you found this issue helpful.

Til next week,
Donabel

P.S.

I used to think “letting the data speak for itself” was the right approach. Turns out, data is terrible at speaking - it needs someone who understands both numbers and people, and who can bridge the gap between them.


Have you checked out these resources?

🔨 Build to Sell - 7 Days to Your First Digital Data Product
Check out the course →

🤖 Teach Data with AI
View the Newsletter →

🎗️ The Introverted Analyst's 5 Day Guide to Professional Visibility
Start Getting Recognition at Work →

