In today’s Wall Street Journal, Carl Bialik writes about "statistical time travel" performed by number-crunching researchers.

"In recent years," writes Bialik, "statisticians have created time machines to answer a wide range of historical hypotheticals, from how today’s Supreme Court would have voted on Roe v. Wade to what sort of scientific papers Einstein might write today."

One of the researchers highlighted in the article is Princeton computer scientist David Blei, who has done a computational analysis of more than a hundred years’ worth of Science magazine.

This is how Bialik describes Blei’s research:

"His system identifies topics from scratch and assigns topic scores — say, 80% neuroscience and 20% philosophy, or 40% biology and 60% chemistry. Any papers that have the same topic scores could then be grouped together, even if they are decades apart and keywords or concepts didn’t yet exist. (Think of quarks or H1N1.)

"Here the critical bridge — the necessary overlap to relate past decades to the present — was keywords that were associated with others before they faded… Such techniques connected an 1880 paper on orangutan brains with a 1976 paper on monkey brains.

"That technique helps dig up research that was ahead of its time. For instance, these very time machines, including Dr. Blei’s, make use of so-called Bayesian statistics, which were developed decades before there was sufficient computing power to use them fully."
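The grouping step Bialik describes can be sketched in a few lines. In Blei's actual work the topics and scores are learned from the raw text by a topic model; the sketch below assumes the topic scores are already in hand (the labels and numbers here are illustrative, not from his system) and just shows how papers decades apart cluster together when their topic mixtures match.

```python
def similarity(a, b):
    """Cosine similarity between two topic-score vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

# Hypothetical topic scores: [neuroscience, philosophy].
# A real topic model would infer both the topics and the scores.
papers = {
    "1880 orangutan brains": [0.8, 0.2],
    "1976 monkey brains":    [0.8, 0.2],
    "1905 relativity":       [0.1, 0.9],
}

# Papers group together when their topic scores are close,
# even if they share no keywords (the 1880 paper predates
# most of the 1976 paper's vocabulary).
target = papers["1880 orangutan brains"]
matches = [name for name, scores in papers.items()
           if similarity(scores, target) > 0.95]
print(matches)
```

The threshold of 0.95 is arbitrary; the point is only that similarity is computed over topic proportions rather than shared words, which is what lets the method bridge decades.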

You can hear Blei talk about his work in this 2007 Google Tech Talk. His recent research includes papers on "finding latent sources in recorded music," "a computational approach to style in American poetry," and "augmenting social networks with text." That last paper was coauthored with his former student Jonathan Chang, now at Facebook, who in a recent blog post describes various visualizations he created of theonion.com's Twitter traffic.