Why you should care

Because behind the jargon is something much smarter. 

Renny McPherson is a tech entrepreneur and former Marine intelligence officer. Joe Flood is a reporter and data analyst. They co-host the new data-wonk podcast Numbers & Narrative.

Just like “big data” before it, perhaps the most overused phrase in the Wall Street-Silicon Valley corridor in recent years has been “Moneyball for (fill in the blank).” Moneyball for sales and marketing, for startups, even for the military and academics — it’s so ubiquitous you’d think the phrase had origins far more recent than a 2003 Michael Lewis book and a 2011 Brad Pitt movie based on it.

The appeal of the phrase to consultants and pitchmen is clear: a three-syllable pop culture reference that stands in for clunky terms like “integrated data analysis.” But turning “Moneyball” into an all-things-for-all-people piece of jargon has obscured what a “Moneyball approach” really means — and how wrong a misuse of it can go.

For those who don’t know, Moneyball is the fantastic story of how the Oakland Athletics, one of the poorest teams in Major League Baseball, used statistical analysis to outperform far richer clubs. General Manager Billy Beane and his staff identified market inefficiencies (like how on-base percentage was underrated) and used those insights to acquire “undervalued assets” — little-known players with those skills.

But the book’s true lesson is not about data or statistics, per se. It’s that the scientific method — a process of questioning assumptions, testing hypotheses and adjusting accordingly — can yield huge benefits, especially for underdogs needing an edge. At its heart, Moneyball is about having the smarts to challenge orthodoxies and the humility to admit mistakes.

The problem is that it’s been perverted into an orthodoxy of its own. Bespectacled quants “hacking” their way to whiz-bang easy answers are good. Older, experienced employees and intuitive approaches are bad. Quantitative certainty is good; qualitative nuance is bad. A fantastic book has been transformed into a kind of algorithmically enabled snake oil.

Easy orthodoxies like this are the very antithesis of a real “Moneyball approach,” because the scientific method is a foe to shibboleths of all stripes. Some of the old, sad scouts whom Brad Pitt eviscerates around a conference table in the movie were (partially) right all along: Drafting players straight out of high school isn’t a bad investment, as Beane once thought, and catchers actually have a huge impact on pitchers’ performance through how they “frame” pitches, which affects whether an umpire calls a borderline pitch a ball or a strike.

These corrections aren’t a problem with Moneyball or Michael Lewis or Billy Beane, any more than Aristotle or Ptolemy were bad scientists for thinking the sun revolved around the Earth: The scientific method doesn’t produce answers in perpetuity; it creates hypotheses that last until new data and analysis disprove them.

You don’t have to go very far from Moneyball to see how badly turning an early set of statistical conclusions into an unchallenged orthodoxy can go — just read Michael Lewis’ other books. His first, Liar’s Poker, is about Lewis’ time working for Salomon Brothers during Wall Street’s “greed is good,” 1980s heyday. One fascinating character is bond trader John Meriwether, an early quant who used a Billy Beane-esque series of computer programs and data-driven insights to capture huge profits, and ultimately change how the financial world worked.

When Meriwether later set up his own hedge fund, he partnered with two Nobel Prize-winning godfathers of statistical finance, Myron Scholes and Robert Merton, gave free rein to his traders and used borrowed money to multiply their bets. The old gray-hairs at Salomon Brothers would have never taken such risks, but for a few years the system worked brilliantly. Then the firm, Long-Term Capital Management, went belly up in a fashion so spectacular it nearly crippled the financial system.

The same process occurred again a decade later when Wall Street’s predictive modeling systems, like the much-vaunted “value at risk” model, convinced the quant luminaries that there was no way the mortgage market could decline significantly. They, too, placed huge bets with borrowed money. Again, the once-dominant “wise old men” of Wall Street would have never allowed such aggressive risk-taking based on a poorly understood computer model, but the quants had no such compunction, and the resulting financial crisis remains with us seven years later. Once again, Lewis was there to chronicle the collapse in his book The Big Short, which profiles investors — some relying on common sense, others on data analysis — who bet against the bad mortgages and computer-model acolytes and made a killing.

This isn’t to say that statistical analysis has no value. In an increasingly complex, data-centric world it’s a necessity for any institution. But it’s also necessary to balance out statistics with other methods — even traditional, intuitive ones — and to not blindly trust in what the computers or consultants tell you. “What happened on Wall Street was sort of Moneyball gone wrong,” Lewis told me a few years ago. “It’s just as easy to misuse statistics as to misuse intuition.”

Data analysis is a great way for smart people to think hard about complicated issues. The problem is when a “Moneyball approach” becomes an excuse for lazy leaders to not think about the challenges they face.

This OZY encore was originally published March 17, 2015.
