In June of 2013, Columbia University political scientist Donald Green was approached by UCLA graduate student Michael LaCour with some remarkable findings about canvassing work done by the Los Angeles LGBT Center: Empathetic conversations between gay canvassers and local residents were able to make lasting converts to the cause of marriage equality. Straight canvassers also had an impact, but a much smaller one. LaCour’s findings stood in contrast to a large body of scholarship showing that canvassing rarely changes public opinion in any sustained way. The journalistic rule to be wary of a story that seems too good to be true is one that academics could benefit from as well.

“I thought they were so astonishing that the findings would only be credible if the study were replicated,” Green recently told the website Retraction Watch. LaCour and Green’s article, “When Contact Changes Minds: An Experiment on Transmission of Support for Gay Equality,” was published in the December 2014 issue of the prestigious journal Science. It immediately caught the attention of journalists and political activists, serving as the basis for an episode of This American Life and articles in The New York Times, The Wall Street Journal, Bloomberg Politics, and other publications. According to Ira Glass, host of This American Life, the study seemed to show that the canvassers of the LGBT Center had “invented something new, a new tool to change people’s opinions.”

Unfortunately, it increasingly looks like what was invented was not a new tool of persuasion but rather the evidence of the study itself. Confronted with challenges from subsequent researchers who have been unable to replicate the findings of the 2014 article, and with evidence that LaCour made false claims about funding for his research, Green has asked Science to retract the article.

In trying to make sense of this fiasco, it’s important to realize that the implicit trust Green placed in LaCour was perfectly normal and rational. While science includes gatekeeping measures to weed out inferior research, in their day-to-day collaborative activities scientists have to assume that the people they are working with are not pathological liars, that they won’t simply make up data. This is the kind of social cohesiveness that led one professor to tell This American Life, “I trust anything Don Green publishes.” In this particular case, that trust was misplaced, but some level of collegial confidence is the necessary lubricant that allows research to take place.

The publication of so dubious an article is likely to embarrass the many parties involved: not just Green but also the LGBT Center, the journal Science, and the fellow social scientists who greenlit the article during the peer review process, among others. Yet while the episode is an embarrassment to the journal, it is paradoxically a vindication of the discipline itself.

When he was first presented with such surprising findings, Green did exactly what he was supposed to do: he asked to see them replicated. “Michael LaCour and Dave Fleischer [of the LGBT Center] therefore conducted a second experiment in August of 2013,” Green told Retraction Watch, “and the results confirmed the initial findings. Convinced that the results were robust, I helped Michael LaCour write up the findings, especially the parts that had to do with the statistical interpretation of the experimental design.”

Because he didn’t have Institutional Review Board approval from Columbia University to conduct the research, Green relied on LaCour’s summary of his findings rather than an inspection of the raw data. In effect, even though he was skeptical of the results, Green decided to trust that LaCour was presenting his evidence in good faith—a not unnatural decision in academic fields that rely on collaborative work, where fact-checking one’s colleagues would slow down research.

The peer review process failed, but peer review shouldn’t be fetishized as the only gatekeeper between dubious junk and good science. Peer review is part of a larger process that includes subsequent research attempting to replicate findings. If a study is published in a peer-reviewed journal and has a significant finding (as the LaCour and Green paper did), then other social scientists will try to build on it. If they can’t duplicate the earlier findings, that in itself shows the results were not robust.

As the editors of Science noted in their response to Retraction Watch: “No peer review process is perfect, and in fact it is very difficult for peer reviewers to detect artful fraud. Fortunately, science is a self-correcting process; researchers publish work in the scholarly literature so that it can be further scrutinized, replicated, confirmed, rebutted or corrected. This is the way science advances.”