
R&D Spending Is the Most Misleading Number in Business

The government collects the real R&D split from every public company. It's locked away by federal law. Here's how to estimate it anyway.

Phil McKinney
The Innovators Studio is available on Apple, Spotify and YouTube. Subscribe Today.

Every public company's R&D number is a lie hiding in plain sight.

Not because anyone falsified it. Because the number was never built to tell the truth. It was built to satisfy an accounting standard written in 1974. And for fifty years, boards, analysts, and CEOs have been making billion-dollar innovation decisions based on a number designed by accountants to solve a different problem entirely.

Here's what makes this genuinely strange. The real number exists. The government has been collecting it from every major US company for decades. It would answer the question every innovation leader and investor actually needs answered. And it is locked away by federal law. Confidential. Never published. Never seen by the people who need it most.

It's sitting in a federal database right now. And there's a way to estimate it for any public company, without asking anyone's permission.

I know it exists because I spent years building it from the inside.


Why the R&D Signal Was Blurry

When I was running innovation at HP, we discovered this problem firsthand. We had found a correlation between R&D investment and gross margin that held up across decades of HP history. Better than anything Wall Street was using. But the signal was blurry. None of us could figure out why.

The answer came from a question someone on the team asked almost as an aside.

What if R&D isn't one thing?

Research and Development Are Not the Same Thing

Think about what actually lives inside a typical R&D budget.

There's a team somewhere investigating whether a new approach could enable a capability that doesn't exist yet. No product defined. No spec written. Asking whether something is even possible.

And there's a team building the next version of a product that ships in eighteen months. Spec locked. Timeline set. Engineering executing against a defined target.

Both show up on the same line in the budget. Both get called R&D. Both count equally toward the number that gets reviewed every quarter.

They are not the same thing.

One is Research. The other is Development.

Research is the work you do when you don't yet know what you're building. The output is understanding. New knowledge that might enable future products nobody has designed yet. You can't know exactly what you'll find. If you already knew, it wouldn't be research.

Development is the work you do when you know exactly what you're building. The spec exists. The product is defined. The question isn't what to make. It's whether it can be made, on time, at cost, at quality.

One creates the future. The other delivers the present. And for fifty years, every public company in America has been required to report them as one indistinguishable number.

When we split the HP data along that line, Research on one side and Development on the other, the signal sharpened immediately. Research spend, measured against gross margin three to five years later, was a meaningfully stronger predictor than the combined number had ever been.

The blur hadn't been in the gross margin data. It had been in the R&D number itself. Two fundamentally different things, averaged together, producing a number that looked precise and predicted almost nothing.

But splitting R from D at the company level was only the beginning. The model was still lying to us. Just more quietly.


Why Company-Level R&D Splits Still Mislead

Even with the split, something was still soft. HP wasn't one business. It was dozens. Printers, PCs, servers, software, each running on different timelines, different technology cycles, different competitive dynamics.

What if the R/D split meant something different depending on where it was applied?

We pushed it to the product line level. Then further, to the platform level within product lines.

Printers were the clearest example.

HP's printer business wasn't one story. There were platforms built on established technology. Mature ink systems, proven print head chemistry, products that had been shipping for years. And there were platforms built on genuinely new core technology. New chemistry. New mechanisms. New approaches to fundamental problems that nobody had solved yet.

Research investment by platform told a completely different story than Research investment by product category. The Research going into new technology platforms had a completely different relationship to future margin than Research going into mature platforms. Different time horizons. Different risk profiles. Different margin implications years down the road.

Laptops told the same story. A traditional consumer laptop line and a high-performance portable workstation weren't the same investment. One was Development-heavy. Defined product, known market, engineering executing against spec. The other had genuine Research behind it. Unsolved thermal problems, new form factor constraints, and materials questions that hadn't been answered yet.

When a single R&D assumption is applied across all of that, treating every dollar the same regardless of what it actually does, the signal disappears into the average. Peanut butter across the portfolio.

The model only got honest when it got specific. Research by platform and Development by platform, matched against the margin performance of those specific platforms years later. Which platforms were building future margin? Which ones were running on margin that past Research had already bought?

We could see it because we were inside the company. The question is whether anyone on the outside could ever see the same thing.

The R&D Data the Government Collects and Won't Release

Outside the internal budget process, everyone sees the same thing: a single line on the income statement.

The US government recognized decades ago that the combined R&D number was analytically useless. So they built a system to collect the real one.

The National Science Foundation runs a survey called the Business Enterprise Research and Development survey. The BERD survey. Every year, roughly 46,000 US companies are required to report their R&D spending broken into three categories: basic research, applied research, and experimental development. The split that every board and every investor needs to see. Mandatory. Collected. Verified.

And then locked away.

The firm-level data is confidential under federal law. The NSF publishes only industry-level aggregates. So every company fills out this survey and reports its real R/D split to the government. That data sits in a federal database. And the boards, investors, and analysts who need it most cannot access it.

Researchers at Northwestern and Boston University were given rare access to that confidential data. What they found is striking. When companies face financial pressure and cut R&D, they don't cut Development. They cut Research. Almost entirely. Development barely moves.

Every earnings squeeze. Every activist campaign. Every cost optimization program. Systematically targeting the one part of R&D that builds future margin. And because the combined number barely moves, nobody on the outside sees it happening.

That's not a coincidence. That's the accounting standard doing exactly what it was designed to do: produce one clean number for the income statement. It was never asked to protect the future.

How to Estimate the Research-to-Development Split Without Inside Access

So what can actually be done without access to the locked data?

More than most people realize.

Step 1. Find the industry baseline. The aggregate BERD data is public at the sector level. Ask an AI tool for the Research-to-Development ratio for the relevant industry. That's the benchmark. Everything else gets measured against it. A company spending 8% of its R&D on Research in an industry where the average is 25% is telling you something the combined number never would.

Step 2. Look at the gross margin trend compared to peers. Gross margin over time is the most honest external signal of Research health. A company with a declining margin relative to peers, while reporting flat or growing R&D spend, is almost certainly shifting the mix toward Development. The math works in the other direction, too. An AI tool can pull this comparison for any public company in minutes. This is exactly the signal that was invisible at HP until it was too late.

Step 3. Look at patent trends compared to peers over time. Patents are an imperfect but useful directional indicator. Not because more patents always means more Research. It doesn't. But a sustained decline in patent output relative to peers, alongside flat R&D spend, suggests the investment is maintaining existing products rather than creating new knowledge. Combined with the gross margin trend, it starts to triangulate where the split actually sits.

None of these three steps requires access to an internal budget. All of them can be done in an afternoon with public data and an AI tool. Together, they produce a working picture of the R/D split that the income statement was never designed to reveal.
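The three steps above can be sketched as a simple triangulation. Everything here is a hypothetical assumption, the function name, the inputs, and the example figures alike; it is a directional screen, not a measurement.

```python
# Hypothetical sketch of the three external signals described above.
# All names and figures are illustrative assumptions, not real company data.

def estimate_rd_split_signals(
    research_share: float,           # estimated Research share of company R&D
    industry_research_share: float,  # BERD industry-level baseline (Step 1)
    margin_trend: float,             # gross margin vs peers, pts/year (Step 2)
    patent_trend: float,             # patent output vs peers, %/year (Step 3)
) -> dict:
    """Combine the three public signals into a rough directional read."""
    signals = {
        "below_baseline": research_share < industry_research_share,
        "margin_eroding": margin_trend < 0,
        "patents_declining": patent_trend < 0,
    }
    # The more signals that fire together, the more likely the mix has
    # shifted from Research toward Development.
    signals["flags"] = sum(signals.values())
    return signals

# Example: 8% Research share in an industry averaging 25%, margin slipping
# half a point a year against peers, patents down 3% a year.
read = estimate_rd_split_signals(0.08, 0.25, -0.5, -3.0)
print(read["flags"])
```

Three flags firing does not prove anything about the split. It tells you where the combined R&D line deserves scrutiny that the income statement will never volunteer.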

What the R&D Split Revealed at HP That No One Outside Could See

When Mark Hurd took over as CEO in 2005, HP was spending $3.5 billion on R&D. Roughly 4% of revenue. By 2009, his last full year as CEO, that had dropped to $2.8 billion. Revenue had grown significantly over that period, so the percentage had fallen further still, to under 2.5%. Both the dollar amount and the ratio were declining simultaneously while the company got larger.

Wall Street tracked the combined number. The board reviewed it. Nobody raised a structural alarm.

The Research component within that total was well below the industry average for comparable technology companies. Not slightly. Significantly.

The margin consequences arrived years later. They always do.

What Happens When the Definition of Research Doesn't Exist

The R/D split gave us a real predictive signal. We ran with it. The conversations were sharper. But the team kept pulling on a thread that nobody expected.

When we looked closely at what was actually being called Research, project by project and budget line by budget line, things that didn't feel the same kept appearing. Work aimed at fundamental discovery. Work aimed at solving a specific defined problem using entirely new methods. Both labeled Research. Up close, they behaved differently, predicted different things, and when budgets got tight, got treated very differently.

So we went looking for the agreed definition. The official standard that would tell exactly where to draw the lines inside Research.

It didn't exist. Not the way we needed it to. And without it, everything we'd built was sitting on sand.

How do you build a predictive model on a definition that doesn't exist?

That's the next episode.



Endnotes/Sources

  1. "a number designed by accountants to solve a different problem entirely": Financial Accounting Standards Board, Statement of Financial Accounting Standards No. 2: Accounting for Research and Development Costs, October 1974, codified as FASB ASC 730. The standard was issued to resolve whether R&D costs should be expensed or capitalized — an accounting treatment question, not an analytical one. It requires all R&D to be expensed as incurred and reported as a single income statement line item. Companies are not required to disclose the split between research and development activities because both receive identical treatment. The standard solved the problem it was asked to solve.
  2. "locked away by federal law": National Center for Science and Engineering Statistics and Census Bureau, Business Enterprise Research and Development Survey, 2023, NSF 25-354 (Alexandria, VA: U.S. National Science Foundation, 2025). https://ncses.nsf.gov/pubs/nsf25354. The BERD Survey collects R&D spending disaggregated into basic research, applied research, and experimental development from approximately 46,000 US companies annually. Firm-level responses are confidential under Title 13 and Title 26 of the US Code and are never published at the company level. Only industry-level aggregates are released. The 2023 aggregates show US businesses spent $722 billion on R&D: basic research $43 billion (6%), applied research $110 billion (15%), experimental development $568 billion (79%). Summary figures are also published in the companion InfoBrief: NSF 25-353. https://ncses.nsf.gov/pubs/nsf25353.
  3. "they cut Research. Almost entirely. Development barely moves": Filippo Mezzanotti and Timothy Simcoe, "Research and/or Development? Financial Frictions and Innovation Investment," NBER Working Paper 31521, August 2023. doi:10.3386/w31521. https://www.nber.org/papers/w31521. Using confidential Census BERD data with firm-level disaggregation, the authors studied approximately 1,100 large US firms during the 2008 financial crisis. Companies facing refinancing pressure made larger cuts to R&D, but the reduction was concentrated almost entirely in basic and applied research. Development remained essentially unchanged. Mezzanotti is at the Kellogg School of Management, Northwestern University; Simcoe is at the Questrom School of Business, Boston University.
  4. "The Research component within that total was well below the industry average": HP fiscal 2009 annual report, filed with the SEC, confirms total R&D of $2.819 billion on revenue of approximately $114.6 billion (2.5% of revenue), down from $3.490 billion on revenue of approximately $91.7 billion (3.8%) in fiscal 2005, Hurd's first full year as CEO. The Research component within that total is estimated by applying the internal R/D split methodology developed by the Innovation Program Office during Phil McKinney's tenure as CTO. External secondary sources: "HP Faces Criticism for Shortchanging Research and Development," The Rational Walk, August 2020. https://rationalwalk.com/hp-faces-criticism-for-shortchanging-research-and-development. "Valuing Hewlett Packard vs. Tech Titans IBM, Dell and Xerox," Seeking Alpha, June 2015. https://seekingalpha.com/article/289436.
  5. "a broad and sustained shift away from Research and toward Development": Ashish Arora, Sharon Belenzon, and Andrea Patacconi, "Killing the Golden Goose? The Decline of Science in Corporate R&D," NBER Working Paper 20902, January 2015. doi:10.3386/w20902. https://www.nber.org/papers/w20902. The paper documents a shift away from scientific research by large US corporations from 1980 to 2007. Publications by company scientists declined across industries even as total R&D spending rose. The value attributable to scientific research dropped while the value attributable to patents held steady — consistent with firms preserving development output while withdrawing from research investment.


Phil McKinney is an innovator, podcaster, author, and speaker. He is the retired CTO of HP. Phil's book, Beyond The Obvious, shares his expertise and lessons learned on innovation and creativity.


