Since 2015, Major League Baseball has used a system called Statcast to collect data about the game, measuring things like pitch speed and the launch angle of batted balls. One of Statcast’s most popular features is “projected HR distance”, which purports to tell how far a home run would have traveled if not for obstructions such as fans and outfield walls. These stats don’t have any impact on the game, but they do give commentators and fans more to talk about.
During the recent 2025 Home Run Derby, however, Statcast went a bit too far. In the first round of the contest, Brent Rooker and Cal Raleigh were tied at 17 home runs apiece, meaning the distance of each man’s longest home run would determine who advanced. However, both of their longest dingers had been measured at 470 feet, seemingly another tie. The announcers asked if Statcast could provide more precise distances, and shortly after, Raleigh was awarded the tiebreaker, his 470.6-foot home run besting Rooker’s 470.5-footer.
You might be thinking, “Can this system really measure a difference of a tenth of a foot?”, and you would be right to be skeptical. That’s 1.2 inches, or just over 3 centimeters, and it’s difficult to believe the system is actually that precise. But don’t worry, it got much worse.
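If you want to sanity-check that conversion, it’s one line of arithmetic; the Python below is purely illustrative:

```python
tenth_ft = 0.1                       # the claimed tiebreaker margin, in feet
print(f"{tenth_ft * 12:.1f} in")     # 1.2 in
print(f"{tenth_ft * 30.48:.3f} cm")  # 3.048 cm (1 ft = 30.48 cm exactly)
```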
A bit later, the following on-screen graphic was shown:

[Photo credit: @matttomic]
That table reads:
Brent Rooker: 470.5351740593
Cal Raleigh: 470.6171452141
You may remember significant figures from a high school science class. They’re used to convey the precision of a numerical result, with more significant digits indicating a more precise measurement. The basic idea is that because your measuring tools aren’t infinitely precise, you shouldn’t report digits you can’t trust.
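To make that concrete, here’s a minimal sketch of rounding to a chosen number of significant figures. The helper `round_sig` is my own invention, not anything Statcast uses, and four trustworthy digits is already a generous assumption for a ball-tracking system:

```python
import math

def round_sig(x: float, sig: int) -> float:
    """Round x to `sig` significant figures (illustrative helper)."""
    if x == 0:
        return 0.0
    # Shift the rounding position based on the magnitude of x
    return round(x, sig - int(math.floor(math.log10(abs(x)))) - 1)

print(round_sig(470.5351740593, 4))  # 470.5
print(round_sig(470.6171452141, 4))  # 470.6
```

Rounded to anything a measurement system could plausibly support, the two numbers are right back to looking like a tie.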
Someone at Statcast might need a refresher. While the graphic above lacks units, those numbers are in feet. That means Statcast has reported a measurement down to ten-billionths of a foot. That’s smaller than the width of a single hydrogen atom, and it’s very, very dumb.
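Here’s the back-of-the-envelope math, taking the diameter of a hydrogen atom to be roughly one angstrom (1e-10 meters):

```python
FT_TO_M = 0.3048                # exact, by definition
last_digit_ft = 1e-10           # place value of the final digit in 470.5351740593
last_digit_m = last_digit_ft * FT_TO_M

HYDROGEN_DIAMETER_M = 1.06e-10  # roughly twice the Bohr radius

print(f"{last_digit_m:.2e} m")             # 3.05e-11 m
print(last_digit_m < HYDROGEN_DIAMETER_M)  # True: finer than a hydrogen atom
```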

