For most of Major League Baseball’s history, how far a player hit a home run was anybody’s guess. But starting in 1990, measuring the distance home runs traveled became a science, as one researcher adapted a medical imaging technology into a quintessential baseball tool. Syd Mandelbaum helped transform what had previously been a speculative pissing contest into a legitimate statistic. It even helped settle some old scores.
On Thursday, the morning of MLB’s annual Home Run Derby, Mandelbaum, my friend’s dad and the man who first figured out how to calculate the distance a monster dinger traveled, explained the unexpectedly medical-inspired history of home run measurement. For years, Mandelbaum worked on medical imaging technologies used for in vitro fertilization (IVF) as well as for measuring the distance between blood cells. He recalls the moment he realized he could use microscopic measurements to solve a macroscopic problem.
“One day I was watching a Yankee game, and Jesse Barfield had hit a home run,” recalled Mandelbaum, who is now the founder and CEO of food waste reduction nonprofit Rock and Wrap It Up!. The Yankees’ color commentator at the time, Phil Rizzuto, speculated that Barfield had hit the homer 400 feet, sparking a live debate with his colleague Bobby Murcer. That was Mandelbaum’s eureka moment.
“All of a sudden, it occurred to me that if I could take the micro-measurement algorithms, they could become macro-measurement algorithms, as long as I learned how to correct for the difference of magnification within a ballpark,” he says. “I was able to use an overhead photo of a ballpark, taken directly down 90 degrees. I entered that into my computer, and we were able to enter X and Y measurements and measure specific distances on the photograph.”
The principle behind this innovative approach came directly from Mandelbaum’s work with Optech Instrument Corp., where he had developed a microscopic measurement technique that helped overcome a common problem with in vitro fertilization, a fertility technique where sperm and egg are mixed together in a petri dish: Doctors couldn’t accurately gauge the best part of an ovum’s zona pellucida — its protective membrane — to pierce during IVF, and often accidentally destroyed ova while attempting to fertilize them. By imposing a simple X and Y axis on an image and using the associated measurement algorithm, doctors could ensure that they aimed for the thinnest part of the zona pellucida, saving the valuable oocytes from accidental destruction.
So Mandelbaum scaled this technique up. He used the known distance from first to third base as an X axis (it’s always 127.28 feet) and the line from home plate to the center of the outfield wall (which is different for each field, but always known) as the Y axis.
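The calibration idea is simple enough to sketch in a few lines of code. Below is a hypothetical illustration (not Mandelbaum’s actual software) of how a known reference distance on each axis of an overhead photo yields a feet-per-pixel scale, which then converts the pixel position of a ball’s landing spot into a distance from home plate. All pixel coordinates here are invented for the example.

```python
from math import hypot

FIRST_TO_THIRD_FT = 127.28  # fixed by the geometry of the infield

def feet_per_pixel(p1, p2, known_ft):
    """Scale factor from two pixel points whose real-world distance is known."""
    return known_ft / hypot(p2[0] - p1[0], p2[1] - p1[1])

def homer_distance_ft(home_plate_px, landing_px, scale_x, scale_y):
    """Feet between home plate and the landing spot, with independent
    X and Y scale factors to correct for differences in magnification."""
    dx = (landing_px[0] - home_plate_px[0]) * scale_x
    dy = (landing_px[1] - home_plate_px[1]) * scale_y
    return hypot(dx, dy)

# Calibrate X from first base to third base (always 127.28 ft) and
# Y from home plate to the center-field wall (hypothetically 400 ft here).
sx = feet_per_pixel((620, 410), (380, 410), FIRST_TO_THIRD_FT)
sy = feet_per_pixel((500, 470), (500, 70), 400.0)

# Measure a hypothetical landing spot clicked on the photo.
print(round(homer_distance_ft((500, 470), (560, 90), sx, sy)), "feet")
```

In practice the photo must be taken looking straight down, as Mandelbaum notes, so that a single scale factor per axis holds across the whole frame; an oblique photo would need a full perspective correction rather than this simple linear scaling.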
“The first game that we used it in was April 16, 1991,” says Mandelbaum. He set up his tiny computer and a borrowed monitor in a suite at San Francisco’s Candlestick Park (with Grateful Dead manager Dennis McNally along for the ride) and produced the first in-game home run measurement: The San Francisco Giants’ Will Clark hit a 381-foot home run to right field against the team’s rival, the Los Angeles Dodgers. Shortly after the homer, an operations staffer for the Giants ran up to the booth to tell Mandelbaum that a physical measurement from the back of the fence matched the computer model.
“I knew that I had sold him.”
From that moment, Mandelbaum’s technique grew in popularity, and people started approaching him to retroactively measure home runs that had happened before the days of measurement. And while this measurement technique is no longer used today, the current technologies, like ESPN’s Home Run Tracker, use the same general idea and add a few more variables. So, when you tune in to the MLB Home Run Derby on Thursday night and that number pops up on your TV to tell you how far Aaron Judge or Giancarlo Stanton just hit a dinger, remember Syd Mandelbaum and his microscopic measurement algorithms.