Your 5-Minute Guide to AI Prompting (No Fear Required)


Hey there,

So you’ve probably tried ChatGPT or Claude by now. Maybe you asked it to write some SQL and got something that sort of worked. Or maybe you’re still staring at that blank prompt box wondering what exactly you’re supposed to type.

Either way, you’re not alone.

But here’s what we sometimes fail to recognize:
Good prompting is just good communication.

That’s it. No magic formulas. No secret techniques.

You already know how to explain what you need to a colleague or document your analysis process. Same skills, different audience.

Here's What's Really Going On

Most prompting advice you'll see online isn't wrong, but it can be overwhelming when you're just starting out.

"Use few-shot learning!"
"Implement chain-of-thought reasoning!"
"Engineer your prompts!"

Those techniques work.
But if you're new to AI tools, they can make the whole thing feel impossibly complicated.

Here's the thing:
you don't need to master advanced techniques to get value from AI.

For most day-to-day data tasks, you just need to be clear about what you want.

Start simple.
Get comfortable.
The fancy stuff can come later.

What Actually Works

I use AI tools daily now - for SQL debugging, chart reviews, explaining analysis to stakeholders. Here’s what I’ve learned:

1. Give Context (Because AI Doesn’t Read Your Mind)

Think about it this way: if someone walked up to you and said “Cook food,” what would you do?

You’d ask questions, right?

  • Cook for how many people?
  • What occasion?
  • Any dietary restrictions?
  • What ingredients do you have?

Without context, even simple requests become impossible to fulfill properly.

AI has the same problem, just faster and more confidently wrong.

Instead of:

“Fix this SQL”

Try:

“This query should pull Q2 sales by region, but I’m getting duplicate rows. Can you spot the issue?”

The first version could mean anything:

  • Maybe the syntax is wrong.
  • Maybe it’s running slow.
  • Maybe the logic is off.

AI will guess, and probably guess wrong.

The second version tells AI exactly what success looks like and what’s currently broken. Now it can actually help.

Here’s what happens without context:
You get generic advice that doesn’t fit your situation. AI will give you textbook answers for problems you’re not actually having.

With context? You get targeted help that actually works.
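To make the duplicate-rows example concrete, here’s a minimal sketch of the kind of bug that prompt describes. The tables and column names are made up for illustration: a join on region alone fans out when the other table has multiple rows per region, and restricting the join (or aggregating) fixes it.

```python
import sqlite3

# Hypothetical tables, purely for illustration of the duplicate-row bug.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (id INTEGER, region TEXT, amount REAL, sale_date TEXT);
CREATE TABLE region_targets (region TEXT, quarter TEXT, target REAL);
INSERT INTO sales VALUES (1, 'West', 100.0, '2024-04-15'),
                         (2, 'West', 200.0, '2024-05-10');
-- Two target rows for the same region: the join below fans out.
INSERT INTO region_targets VALUES ('West', 'Q1', 500.0), ('West', 'Q2', 600.0);
""")

# Buggy: joining on region alone matches each sale to every quarter row.
buggy = conn.execute("""
    SELECT s.region, s.amount
    FROM sales s JOIN region_targets t ON s.region = t.region
""").fetchall()
print(len(buggy))  # 4 rows instead of 2 -- duplicates

# Fixed: restrict the join to the quarter we actually want, then aggregate.
fixed = conn.execute("""
    SELECT s.region, SUM(s.amount) AS q2_sales
    FROM sales s JOIN region_targets t
      ON s.region = t.region AND t.quarter = 'Q2'
    GROUP BY s.region
""").fetchall()
print(fixed)  # [('West', 300.0)]
```

Pasting the buggy query along with “I’m getting duplicate rows” is exactly the context that lets AI spot the fan-out join instead of guessing.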

2. Be Specific About What You Need

Don’t make the AI guess what you’re after:

  • “Explain this in simple terms”
  • “Find errors in my logic”
  • “Suggest three ways to visualize this data”
  • “Help me clean up this messy query”

The more specific you are about the deliverable, the better the response.

3. Follow Up When Needed

The first response often isn’t perfect. That’s fine.

“Make it simpler” or “Actually, I need more detail on the forecasting part” or “This doesn’t account for our data structure.”

It’s a conversation, not a one-shot deal.

A Simple Template That Works

Once you get the hang of providing context and being specific, here’s a simple starting structure I use for many task-related requests:

I’m working on [specific task].

Here’s what I’m trying to do: [your goal].

Here’s what I have: [data/code/situation].

I need [specific help].

Any issues, constraints or problems: [constraints or problems].

Example:

I’m working on a sales dashboard.
Here’s what I’m trying to do: show monthly trends by product category.
Here’s what I have: this messy SQL that started running slow.
I need help investigating and optimizing it.
Any issues, constraints or problems: can’t change the underlying tables; can’t add new indexes.

That covers all the context AI needs without overthinking it.
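If you find yourself pasting that same structure into the chat window over and over, you can even script it. Here’s a tiny illustrative sketch (the helper name and fields are just my own, not part of any tool):

```python
# Fill in the five-part template above and return a ready-to-paste prompt.
def build_prompt(task, goal, have, need, constraints):
    return (
        f"I'm working on {task}.\n"
        f"Here's what I'm trying to do: {goal}.\n"
        f"Here's what I have: {have}.\n"
        f"I need {need}.\n"
        f"Any issues, constraints or problems: {constraints}."
    )

prompt = build_prompt(
    task="a sales dashboard",
    goal="show monthly trends by product category",
    have="this messy SQL that started running slow",
    need="help investigating and optimizing it",
    constraints="can't change the underlying tables or add new indexes",
)
print(prompt)
```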

When It Doesn’t Work

Even with good context and clear requests, sometimes AI gives you wrong answers.

Confidently wrong answers.

When that happens, ask it to double-check:

“Walk me through your logic here”

or

“What could go wrong with this approach?”

AI tools are great at catching things you missed.

They’re terrible at business context and knowing when rules have exceptions.

Always review and sanity-check the output.

The Bottom Line

You already have the communication skills for this. You know how to explain problems, provide context, and ask follow-up questions.

AI prompting is just using those skills with a new tool.

The difference between people who get value from AI and those who don’t? It’s usually about approach, not technical skill.

Try This Today

Pick something small you’re working on:
- A SQL query that’s not quite right
- A chart that needs improvement
- Explaining your analysis to someone non-technical

Open ChatGPT or Claude.
Be specific about what you need help with.
See what happens.

Don’t overthink it.
Just try it.

Learning to use AI effectively is just another skill in your toolkit.

And honestly, you probably have better instincts for this than you think.

Chat soon,
Donabel


P.S. Speaking of AI prompting - I’m launching “Build to Sell: 7 Days to Your First Digital Data Product” on July 24th.

AI prompting is actually a core piece of the course, and
you’ll get a complete prompt library specifically for data professionals wanting to build their own digital data product.

Plus some bonuses I’m excited about:


The newsletter alone is a game-changer for people learning to use AI effectively in data education and training. Teach Data with AI has 2-3 new posts per week, many of them with accompanying full prompts (for example, how to explain 95% confidence in plain English).


Here's a sneak peek at the course content and the accompanying AI Prompt Library:
