More often than not, making predictions is an invitation to codify dumb opinions into novelty forecasts, flushed and forgotten as soon as they're made. The NCAA Tournament, though, provides us with a unique opportunity to evaluate just how full of crap expert opinions are. Specifically, we can compare the bracket accuracy of experts to the bracket accuracy of average non-experts.

For this project, I collected data on all 11 million completed brackets (out of 12 million which were started) on ESPN's Tournament Challenge. Then I identified the brackets of 53 experts from CBS Sports, ESPN, Fox Sports, SB Nation, SI, and USA Today. (I would have loved to have more, but experts making their entire bracket public is rarer than you'd think.) This list, while certainly not exhaustive, includes pundits, announcers, and sportswriters who regularly weigh in on college basketball-related topics and who made their brackets public. The table with the data about these experts' brackets is available here.

To evaluate how much better (or worse) the experts were at predicting this year's tournament, I considered three criteria: the number of games correctly predicted, the number of points earned for correct picks, and the number of Final Four teams correctly identified. Generally the experts' brackets were slightly better than the non-expert ones, although the evidence isn't especially overwhelming. The analysis suggests that next year you'll have just as good a chance of winning your office pool if you make your own picks as if you follow the experts.

11 Million Brackets Vs. ESPN, CBS, And Fox Experts: Who Was Better?

Points Earned Per Bracket

ESPN's scoring formula devalues correct upset picks in the early rounds (10 points for correct Round of 64 picks, 20 points for Round of 32, 40 points for Sweet Sixteen, etc.), but let's start here since the point of filling out a bracket is winning it. On the graph above, the 11 million non-expert brackets are represented by the red histogram, while the 53 expert brackets are shown as blue dots underneath the graph. Going by points, experts tend to fall slightly above the mean for non-experts.
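That doubling schedule means every round is worth the same 320-point total, which is why an early-round upset pick earns so little. A minimal sketch of the scoring, assuming the standard 10/20/40/80/160/320 schedule described above:

```python
# Sketch of ESPN-style bracket scoring. The point values follow the
# doubling schedule described in the text (round 1 = Round of 64).
POINTS_PER_PICK = {1: 10, 2: 20, 3: 40, 4: 80, 5: 160, 6: 320}
GAMES_PER_ROUND = {1: 32, 2: 16, 3: 8, 4: 4, 5: 2, 6: 1}

def bracket_score(correct_by_round):
    """correct_by_round maps round number -> number of correct picks."""
    return sum(POINTS_PER_PICK[r] * n for r, n in correct_by_round.items())

# Each round is worth 320 points total, so a perfect bracket earns
# 6 * 320 = 1920 points.
perfect = {r: games for r, games in GAMES_PER_ROUND.items()}
print(bracket_score(perfect))  # 1920
```

Nailing all 32 Round of 64 games earns the same 320 points as picking the champion alone, which is why brackets that merely ride the favorites deep into the tournament pile up points.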

The difference is substantial, though not huge. Expert brackets earned an average of 651.0 points, while non-experts earned 604.4 points. The difference of 46.6 points is statistically significant (SE = 11.6; p < .01), but it means that, on average, the experts picked only about one extra Elite Eight qualifying team (or two extra Sweet Sixteens) than the non-experts. Dan Wolken of USA Today earned the most points among the experts with 830, although there were 215,737 brackets (1.96 percent) that outperformed him.
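The significance claim can be sanity-checked from the summary numbers alone. This is a back-of-the-envelope two-tailed z-test using only the means and standard error reported above, not the raw bracket data:

```python
import math

# Check the significance claim using only the reported summary numbers:
# expert and non-expert mean scores, and the standard error of the gap.
expert_mean, nonexpert_mean, se = 651.0, 604.4, 11.6

diff = expert_mean - nonexpert_mean        # 46.6 points
z = diff / se                              # about 4 standard errors
p_two_sided = math.erfc(z / math.sqrt(2))  # two-tailed normal p-value

print(round(z, 2), p_two_sided < 0.01)  # 4.02 True
```

A gap of four standard errors is comfortably significant, which is consistent with the p < .01 reported in the text.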

(And in case you're wondering, the big spikes you see at the 680- and 80-point marks come from the massive number of brackets that picked all favorites and all underdogs, respectively.)


Number Of Correct Games

This graph shows the distribution of the number of games correctly picked by non-experts and experts. As you can see, it appears that experts are not substantially better at picking overall winners than you or me.

On average, experts accurately predicted 37.9 games (out of 63) while non-experts correctly picked 35.6. This difference is statistically significant (SE = 0.53; p < .01), although a difference of just 2.27 games probably wouldn't overwhelm most people. In total, there were 4,566 ESPN brackets (0.04 percent) with more correct picks than Jerry Palm and Luke Winn, who led the experts with 46 correct picks each.

Correct Final Four Picks

It's possible that experts aren't especially good at picking early upsets, but that they make up for it by doing well at predicting the Final Four. If that is typically the case, this year was a huge exception—which is obvious enough, given that a 7-seed and an 8-seed made the championship game. Not only did zero of the 53 experts pick UConn to win the national championship, but not a single one of them had UConn or Kentucky in the championship game!

Nine of the experts picked half of the Final Four teams correctly, and on average the experts picked just one Final Four team correctly. This is not that much better than the non-experts, who picked 0.87 Final Four qualifiers on average (SE = 0.08; p = .12). If Dayton, UCLA, or Stanford had won the South region instead of Florida, the experts would actually have looked worse at picking the Final Four (0.25 correct picks out of 4) than the non-experts (0.28).

The obvious critique of this analysis is that comparing experts to all the brackets on ESPN's Tournament Challenge isn't a fair comparison. This is true, but in many ways it should be generous to the experts. No good Cal Poly graduate can go without filling out at least one bracket where the Mustangs win the championship (you know, just in case). And then there are the other various bracket quirks, like the 31,449 people who think they're clever with an all-upset bracket. Since they're all left in this analysis, they should really make the experts look better in comparison.

Even though the gaps aren't huge, I do find that experts were statistically significantly better at picking games and earning points than non-experts. If I trim off the 9.8 percent of brackets with the fewest correct picks, the experts' number of correct picks becomes statistically indistinguishable from the non-experts'. Likewise, eliminating the worst 8.6 percent of brackets by points yields a similar result.
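The trimming procedure itself is simple: sort the public brackets by score, drop the lowest-scoring fraction, and re-run the comparison. A sketch of that step, using simulated stand-in scores (normal draws around the reported means), not the real ESPN data:

```python
import random
import statistics

# Illustrative stand-in data only: normal draws centered on the reported
# public mean score, NOT the actual 11 million ESPN brackets.
random.seed(1)
public_scores = [random.gauss(604.4, 160) for _ in range(50_000)]

def trim_worst(scores, frac):
    """Drop the lowest-scoring `frac` of brackets, keep the rest."""
    k = int(len(scores) * frac)
    return sorted(scores)[k:]

# Drop the worst 9.8 percent, as in the robustness check above.
trimmed = trim_worst(public_scores, 0.098)

# Trimming the bottom tail necessarily raises the public average,
# shrinking the expert/non-expert gap.
print(statistics.mean(trimmed) > statistics.mean(public_scores))  # True
```

The point of the check is that the experts' edge lives largely in not being terrible: once the joke and abandoned brackets at the bottom are removed, the remaining public looks a lot like the experts.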

So what's the takeaway? As a group, the experts are better than the public, but be mindful if you're just planning on copying one of their brackets. The average expert isn't far removed from just plain average.

Charts by Reuben Fischer-Baum

Stephen Pettigrew is a PhD candidate at Harvard University, where he studies political science and American politics. He also has a master's degree in statistics from Harvard. In his spare time, he writes about sports analytics, particularly in hockey and football.