Why People Buy: Three Lessons from a Market Research Meltdown

Nov 11, 2025

The oldest—and hardest—marketing research question is: “Why do people buy?”

Most people simply delegate the answer—feeding a US $140 billion research industry (ESOMAR).

The problem? Many results are skewed, unreliable, or plain wrong.

I learned that the hard way.

Recently, I ran a large global growth leadership study—a classic “driver analysis” that promises to reveal what really moves revenue.
Top agency. Thousands of respondents. 60+ countries.

The report looked fine.
It just didn’t make sense.

So I asked my friend Frank Buckler—who runs a boutique research firm—to run a neural network analysis.
The results flipped. Completely.

A key variable’s importance shifted from +9 percent to −2 percent.
Drivers that had come out as irrelevant shot to the top.

Marketing has a serious research problem.

So I spoke to several CMOs and researchers about why we still get growth leadership research so wrong.
Here’s what I learned.

Lesson 1: Learn how the growth leadership math works.

Ice-cream sales and drowning deaths rise at the same time.
The correlation is almost perfect.

Does that mean ice cream kills people? Of course not.
When it’s hot, more people buy ice cream—and more people go swimming.

Correlation and cause are different.
Yet many marketers still use correlation as proof of success.
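A quick way to see this for yourself is to simulate it. The sketch below (Python, with invented numbers, not real data) lets a hidden driver, temperature, push both ice-cream sales and drowning incidents. The two series end up strongly correlated even though neither causes the other.

```python
# Spurious correlation demo: a hidden confounder (temperature) drives both
# series, so they correlate strongly despite having no causal link.
# Illustrative sketch with invented numbers, not real data.
import numpy as np

rng = np.random.default_rng(42)

temperature = rng.uniform(10, 35, size=365)                      # daily temperature, °C
ice_cream_sales = 50 * temperature + rng.normal(0, 100, 365)     # depends only on temperature
drowning_incidents = 0.3 * temperature + rng.normal(0, 2, 365)   # depends only on temperature

r = np.corrcoef(ice_cream_sales, drowning_incidents)[0, 1]
print(f"Correlation between ice cream and drownings: {r:.2f}")   # roughly 0.7
```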

Traditional regression has flaws too.
It assumes something like: “Y changes by a constant amount when X changes by one unit.”

It’s a straight-line world.
We all know that’s nonsense.

Neural networks aren’t perfect, but they’re closer to how real decisions happen.
They don’t assume anything is linear.
They learn patterns directly from data—whether those patterns are straight, curved, or tangled like spaghetti.
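To make the contrast concrete, here is a minimal sketch (Python with scikit-learn, synthetic data) where a driver has diminishing returns. A straight-line regression assumes every extra unit of spend pays off the same; a small neural network learns the curve. The data, variable names, and model settings are illustrative assumptions, not figures from the study.

```python
# Linear regression vs. a small neural network on a nonlinear "driver":
# spend has diminishing returns (a curve), which a straight line cannot capture.
# Synthetic, illustrative data only.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

spend = rng.uniform(0, 10, size=500).reshape(-1, 1)                # e.g. media spend
revenue = np.log1p(spend).ravel() * 40 + rng.normal(0, 3, 500)     # diminishing returns

linear = LinearRegression().fit(spend, revenue)
neural = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                      random_state=0).fit(spend, revenue)

# The straight line implies the same payoff from every extra unit of spend;
# the network learns that the first units matter far more than the last.
for x in [1.0, 5.0, 9.0]:
    lin_pred = linear.predict([[x]])[0]
    nn_pred = neural.predict([[x]])[0]
    print(f"spend={x:>4}: linear={lin_pred:6.1f}  neural net={nn_pred:6.1f}")
```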

You don’t need to become a statistician.
You can google most of this.
But you need to ask the right growth leadership questions—or you may end up with expensive nonsense.

Lesson 2: Treat researchers like peers, not suppliers.

Many researchers are super smart. They would love to have a peer-level debate.
In reality, most get asked for a “quick analysis” and rarely see what happens to their work.

That kills curiosity.
It keeps the same old methods alive.

Take ad tracking.
System1 runs one of the world’s best ad-performance measurements.
That doesn’t mean people jump on it.

Insecure analysts stick to what’s safe.
This trickles down to agencies.
The whole system repeats itself: mediocre growth leadership insight, mediocre growth.

Lesson 3: Trust your gut, not the logo.

When my first results came back, I could have accepted them.
Big agency name.
Credibility guaranteed.

But they didn’t feel right.
So I dug deeper.

The new analysis revealed what the first had missed.
Don’t get me wrong—I’m not blaming the agency.
They’d happily run alternative methodologies.
The problem is that few clients ever ask.

The truth about “why people buy” isn’t simple.
If you’re serious about growth leadership, get your hands dirty.

Research is only as good as the curiosity behind it.