Nate Silver

Someone's looking for a Pulitzer.

Nathaniel Read "Nate" Silver (1978–) is an American statistician and author, and probably also a witch.[1] After making a fuck-ton of money through online poker (in its relative infancy, when it mostly attracted people with more money than poker skill), he made a name for himself as a political statistician, founding the blog FiveThirtyEight to analyse US elections and becoming a staple at The New York Times. FiveThirtyEight later moved to a larger partnership with the sports network ESPN.[2] Silver ultimately left FiveThirtyEight in 2023 after being laid off by the Walt Disney Company;[3] by that September, the site had been folded into ABC News.[4]

In contrast to pundits, Silver's approach was to use both mathematics and evidence to make his predictions, notably Bayesian statistics, which take into account prior information on how states have voted. He puts it down to being a bit of a nerd for numbers,[5][6] and the approach has turned out to be remarkably successful.
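To make the "prior information" idea concrete, here is a minimal sketch of a Bayesian (Beta-Binomial) update. This is not FiveThirtyEight's actual model; the prior pseudo-counts and the poll numbers are invented purely for illustration.

```python
# Toy Bayesian update: NOT FiveThirtyEight's model. The prior stands in for
# "how the state voted before"; the poll is a made-up 100-person sample.
prior_alpha, prior_beta = 52.0, 48.0   # prior belief: state leans ~52% to the candidate
poll_support, poll_sample = 55, 100    # hypothetical new poll: 55 of 100 respondents

# Beta-Binomial conjugate update: just add the new counts to the prior counts.
post_alpha = prior_alpha + poll_support
post_beta = prior_beta + (poll_sample - poll_support)

posterior_mean = post_alpha / (post_alpha + post_beta)
print(f"Posterior mean vote share: {posterior_mean:.3f}")  # 0.535
```

The point of the prior is that one noisy poll pulls the estimate toward 55% without being allowed to drag it all the way there.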

Election predictions

With a combination of this skill[note 1] and a bit of luck, he correctly predicted the winner of all 50 states in the 2012 U.S. presidential election. This was notable because at the time, most mainstream media were calling the election "too close to call." Silver, on the other hand, actually looked at the data and found it told a different story: the incumbent Barack Obama had around a 90% chance of winning, and by a significant margin. It was the second time he had pulled this off, having gotten 49 of 50 states right in 2008.[note 2]

2016

Silver, while mostly accurate in his predictions, made several rather uncharacteristic mistakes during the 2016 elections. He indicated that Hillary Clinton would win the Michigan Democratic primary,[7] but it actually went to Bernie Sanders, although virtually every pollster got that one wrong, so it's not entirely his fault.[8][9]

What is his fault, however, is that he endorsed and co-wrote an article titled "Bernie Sanders may win Iowa and New Hampshire - and lose everything else,"[10] which certainly wasn't very accurate at all. (This prediction was so inaccurate that a fictional parody pundit named Carl Diggler was found to be better at predicting primary outcomes than Silver was.[11]) Another fuck-up occurred when he insisted that Donald Trump would not win the Republican presidential nomination,[12] despite repeated polling evidence showing that Trump was leading; Silver himself admitted he was dead wrong about that one.[13] He has since admitted that he acted like a pundit during that election cycle rather than sticking to his raw-numbers analysis.[14]

FiveThirtyEight's model assumes high correlations between the states, to the extent that a candidate winning a single swing state roughly doubles their probability of winning the election. For example, if Clinton won Nevada, her probability of winning in the model jumped to 91%, not because Nevada's 6 electoral votes are so very important, but because in the model the swing states behave in a very similar manner, so whoever wins one swing state is likely to win most of the others. In 2016 that assumption obviously favored Trump, who needed to win essentially all of the swing states plus at least one "blue wall" state.

FiveThirtyEight's final forecast for the 2016 election gave Clinton about a 70% chance of winning and Trump about 30%, but Trump won because he did in fact pull off the once-thought-impossible feat of taking most of the swing states (including, crucially, Florida and Pennsylvania) as well as two states thought to be solidly Democratic (Wisconsin and Michigan, the latter very narrowly indeed). Clinton did win Nevada, but losing those states demolished her supposedly impregnable "blue wall," and that, coupled with the loss of Florida, cost her the election. Silver's predicted result was therefore way off, although he was correct that one candidate would sweep most of the swing states; he just picked the wrong one.

It should be mentioned, however, that Silver frequently pointed out that Trump was only one standard-size polling error away from winning the Electoral College, and that the national popular vote (which went to Clinton by two points) was well within the margin of error of polls that had her up by roughly four points. In fact, in a "Dewey defeats Truman" election, Silver was pretty much the only mainstream source without partisan bias or motivation to give Trump more than a "yeah, and pigs can fly" chance on election day. But Democrats were shocked at losing an election "everybody" had them comfortably winning, Republicans still had an axe to grind with him over 2012, and the general public was apparently incapable of grasping that a 30% event happens roughly one time in three, so Silver and his website now have to deal with hecklers saying "Yeah, and you were wrong on the 2016 election, so shut up."
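A toy Monte Carlo sketch can show why correlated errors produce this "win one, win most of them" behavior. To be clear, the states, margins, standard deviations, and the majority-of-five winning rule below are all invented for illustration and have nothing to do with FiveThirtyEight's real methodology.

```python
# Toy simulation of correlated polling errors across swing states.
# NOT FiveThirtyEight's model; every number here is hypothetical.
import random

# Hypothetical polling margins (candidate A minus candidate B, in points).
STATE_MARGINS = {"FL": 0.5, "PA": 1.5, "MI": 2.0, "WI": 2.5, "NV": 1.0}

def simulate(n_sims=100_000, shared_sd=3.0, state_sd=2.0):
    wins = 0            # elections where A wins overall (toy rule: 3+ of 5 states)
    nv_wins = 0         # elections where A carries Nevada
    nv_and_overall = 0  # elections where A carries Nevada AND wins overall
    for _ in range(n_sims):
        national_error = random.gauss(0, shared_sd)  # one error hitting every state alike
        results = {
            state: margin + national_error + random.gauss(0, state_sd)
            for state, margin in STATE_MARGINS.items()
        }
        a_wins = sum(1 for v in results.values() if v > 0) >= 3
        wins += a_wins
        if results["NV"] > 0:
            nv_wins += 1
            nv_and_overall += a_wins
    print(f"P(A wins overall)             ~ {wins / n_sims:.2f}")
    print(f"P(A wins overall | A wins NV) ~ {nv_and_overall / nv_wins:.2f}")

simulate()
```

Because a single shared error term moves every state at once, learning that candidate A carried Nevada mostly tells you the shared error broke their way, which is why the conditional probability jumps so sharply.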

This was the worst possible result for Nate Silver personally. He correctly warned of a split result between the Electoral College and the popular vote, and repeatedly said that Trump had a better chance at winning than most gave him credit for (compare The Huffington Post claiming that Clinton had a 98% chance of winning).[note 3] However, because his model failed to predict the actual winner, Silver's reputation as an election-forecasting wizard has arguably gone down the tubes. Instead of being remembered for correctly predicting all 50 states between Obama and Romney, he may be remembered for his methodology (like everyone else's) failing to predict the final 2016 presidential result. Also, the sheer avalanche of election "updates" (usually overly-long blog posts stating the obvious) gave the impression that fivethirtyeight.com was massaging the "horse race" for clicks.

Conservative disdain

Because Silver both predicted a win for Obama and turned out to be staggeringly correct about it, he became something of a pariah and figure of hate for the US far right, who still bury their heads in the sand over that election.[15] Being openly gay certainly hasn't helped there: comments from Dean Chambers, who runs unskewedpolls.com, were almost entirely ad hominem in nature, dismissing Silver as nothing more than an "effeminate," "soft-sounding" "Castrat[o]."[16]

Predicting the world

He is described by Black Swan author Nassim Nicholas Taleb as "seriously knowing his shit,"[note 4] although the two appear to disagree on how predictable the world actually is. In his book The Signal and the Noise, Silver claims that the 9/11 terrorist attack was, to a degree, predictable — and furthermore that statistics may suggest another, even larger, attack is coming within the next decade or so — while Taleb's Black Swan theory would hold that such events are outside the realm of predictability.

Becoming the pundit he once disparaged

After 2016, Silver started drifting from his original focus on data-driven electoral analysis towards a new role as a purveyor of data-free political "hot takes" on Twitter, essentially becoming the very same kind of pundit that FiveThirtyEight was founded to counteract. As is typical for pundits, many of his political tweets were half-baked at best, on topics Silver had little to no expertise in, and lacking the sort of numerical analysis he was once known for.[17][18]

This was deeply apparent when Silver made some particularly stupid and irresponsible tweets during the COVID-19 pandemic, many of which bordered on the anti-vaxxer/anti-lockdown talking points popular among fringe extremists. Lowlights include: bizarrely comparing pandemic school closures to the Iraq War;[19] deciding he was smarter than public health professionals and making ill-thought-out criticism (based on a misunderstanding of one slide) of a report from scientists at the Advisory Committee on Immunization Practices on how best to allocate the vaccine;[20] questioning the decision to pause distribution of the Janssen COVID-19 vaccine over 6 cases of blood clots, claiming without evidence (and with much hyperbole) that "this is going to get people killed" and "it's going to create more vaccine hesitancy";[21] and complaining, on the basis of one poll, that "mixed messages" from the media and medical experts were unnecessarily increasing public fear of breakthrough cases (a stance that got plenty of push-back on Twitter from anyone with kids or who knew immunocompromised people).[22]

Silver still occasionally posts entries on his blog with the type of numerical analysis he was once famous for.[23] Sadly, however, he is now better known for his Twitter feuds and questionable political takes (some of which seem to ignore numerical evidence that contradicts his conclusions)[24] than for his earlier data-driven analysis.

Into the dark side

Though Silver had frequently been critical of political betting marketplaces in the past,[25] in July 2024 he was hired by Polymarket, a cryptocurrency-based event-forecasting (gambling) firm.[26] Two months earlier, Peter Thiel had invested $70 million in Polymarket.[27] Despite the new consulting job, Silver continued to run prediction models for the 2024 U.S. presidential election, and some commentators criticized this as a conflict of interest, since Polymarket let people gamble on the outcome of that very election.[28]

Notes

  1. Not that it's difficult, just very uncommon amongst the usual political talking heads.
  2. In both cases, "he successfully predicted the outcome" means that the candidate Nate Silver gave the highest probability of winning won the given state. Silver's predictions (and everyone else's) are probabilistic, so over enough elections the prediction should sometimes be "wrong": for example, if a candidate is given a 60% chance of winning in each of ten states, then over enough elections that candidate should lose an average of about four of them (see the sketch below these notes). Unfortunately, the US presidential election does not happen often enough to really assess the accuracy of prediction methods, and the popular media (and the public) do not understand statistics well enough to distinguish "Candidate A has a 60% chance of winning Virginia" from "Candidate A is guaranteed to win Virginia".
  3. As detailed in the previous note, the nature of a presidential election as a singular event makes "calling it right" essentially pointless as a measure of model quality. If person A's model predicts a fair coin will land on heads only 25% of the time, and person B's model predicts it will land on heads 99% of the time, then flipping the coin once and getting heads does not mean person B's model was somehow better.
  4. Okay, so the actual quote is "He's the real deal."
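A minimal sketch of the arithmetic in note 2, using the hypothetical 60%-per-state, ten-state setup from that note (nothing here comes from Silver's actual model): on average the favoured candidate loses about four of the ten states, and a "perfect" ten-for-ten night is rare even when the probabilities are exactly right.

```python
# Toy calibration check for note 2. The 60% figure and ten states are
# hypothetical; this is not FiveThirtyEight's model.
import random

def simulate_elections(n_elections=100_000, n_states=10, p_win=0.6):
    total_losses = 0
    clean_sweeps = 0
    for _ in range(n_elections):
        losses = sum(1 for _ in range(n_states) if random.random() >= p_win)
        total_losses += losses
        clean_sweeps += (losses == 0)
    print(f"Average states lost per election: {total_losses / n_elections:.2f}")   # ~4.0
    print(f"Share of elections with zero wrong calls: {clean_sweeps / n_elections:.3f}")  # ~0.006
```

In other words, a forecaster who calls all ten states for the favourite should expect to miss about four of them, which is exactly why a single election tells you very little about whether the underlying probabilities were any good.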

References

  1. As of press time, Nate Silver is probably a witch.
  2. And the Times has filled in the vacuum with The Upshot.
  3. Disney is shrinking FiveThirtyEight, and Nate Silver (and his models) are leaving, Sarah Scire and Laura Hazard Owen, Nieman Lab, 25 April 2023
  4. Welcome to the new 538 website, Nathaniel Rakich and Amelia Thomson-DeVeaux, ABC News 18 September 2023
  5. Nate Silver: it's the numbers, stupid, The Guardian
  6. xkcd
  7. http://projects.fivethirtyeight.com/election-2016/primary-forecast/michigan-democratic/
  8. http://fivethirtyeight.com/features/why-the-polls-missed-bernie-sanders-michigan-upset/
  9. http://fivethirtyeight.com/features/what-the-stunning-bernie-sanders-win-in-michigan-means/
  10. http://fivethirtyeight.com/datalab/bernie-sanders-could-win-iowa-and-new-hampshire-then-lose-everywhere-else/
  11. https://www.washingtonpost.com/posteverything/wp/2016/05/09/our-fictional-pundit-predicted-more-correct-primary-results-than-nate-silver-did/
  12. http://fivethirtyeight.com/features/dear-media-stop-freaking-out-about-donald-trumps-polls/
  13. http://fivethirtyeight.com/features/how-i-acted-like-a-pundit-and-screwed-up-on-donald-trump/
  14. http://fivethirtyeight.com/features/how-i-acted-like-a-pundit-and-screwed-up-on-donald-trump/
  15. Nearly 50 high-profile wingnuts.
  16. Shooting the Messenger’s Numbers: Nate Silver’s Struggle (The Redux), Gawker (archived at archive.is and web.archive.org)
  17. "The Fall of Nate Silver" by Aaron Timms, New Republic, 2019 November 18
  18. "Nate Silver Is Making This Up as He Goes" by Jacob Bacharach, TruthDig, 2019 October 30
  19. "Nate Silver prompts outrage by likening school closures to the Iraq War" by Gustaf Klander, Independent, 2022 January 6
  20. "Nate Silver Draws Criticism for COVID Vaccine Report Interpretation" by Matt Cannon, Newsweek, 2020 December 20
  21. "Nate Silver Feuds With CNN Medical Analyst on Twitter After She Dismisses His ‘Arrogant and Uninformed’ Take on J&J Pause" by Katherine Huggins, 2021 April 14th
  22. "Nate Silver Told to 'Shut Up' As Twitter COVID Musings Face Huge Backlash" by Brendan Cole, 2021, September 9
  23. "Fine, I'll run a regression analysis. But it won't make you happy.", Nate Silver, Substack, 2023 October 1
  24. "Enough of the Gospel According to Nate Silver", Electoral-Vote.com, 2024 February 20
  25. "Polymarket and Nate Silver Want to Reshape Political Forecasting" by Nitish Pahwa, Slate, 2024 August 16
  26. Polymarket Hires Nate Silver After Taking in $265M of Bets on U.S. Election: Report by Oliver Knight (Jul 17, 2024, 8:49 AM PDT) Yahoo! Finance.
  27. Peter Thiel Invests In Polymarket Political Betting Platform—But The Future Of Gambling On Elections Remains Unclear by Zachary Folk (May 14, 2024, 02:19pm EDT) Forbes.
  28. "Nate Silver faces backlash for pro-Trump model skewing" by Griffin Eckstein, Salon, 2024 September 6