Earlier this month, a forthcoming article in the American Journal of Public Health made a major media splash by concluding that states with higher rates of gun ownership experienced higher homicide rates among police officers. In a nutshell, the conclusion among anti-gun pundits was that civilian gun ownership is killing our nation’s cops. Of course, the media had a field day.
The one pesky problem with the study’s conclusions was that they didn’t hold water. Economist and firearms advocate John Lott of the Crime Prevention Research Center quickly stepped up to the plate, explaining to anyone willing to listen—as, apparently, most of the media outlets that broke the story were not—that the study featured major problems with its experimental design. He even showed that when the flaws were corrected, the resulting numbers showed exactly the opposite result—that more guns lead to fewer police deaths!
There ought to have been a storm of retractions following this stunning takedown, but unsurprisingly most media outlets were not keen to change their tune. We asked Dr. Lott about the fallout from this study and the increasing amount of skewed “research” that anti-gunners have been cooking up recently.
America's 1st Freedom: What were the biggest things that the authors of the study on police deaths in the American Journal of Public Health did wrong?
John Lott: The main problem with the study was that the authors didn’t use the traditional control variables that people who do this type of research employ with so-called panel data (where you look at a lot of different places over time). The study never explains why the authors aren’t doing the analysis the way that it would normally be done, nor do they discuss whether their unusual way of conducting the study makes a difference. Nor does the study even mention previous research on this very question, which comes to the exact opposite conclusion. You would think that if a researcher is going to do a study in a very different way from others, he would provide some explanation for that.
One central problem can be explained this way. Suppose a state passes a gun-control law at the same time that crime rates are falling nationally. It would be a mistake to attribute the overall drop in national crime rates to the law that got passed. To account for that concern, researchers normally see whether the drop in the crime rate for the state that had the change is greater or less than the overall national change. This study doesn’t account for that basic control.
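To make that concrete, here is a minimal sketch, not drawn from the study or from Lott’s reanalysis, of the standard panel-data approach he describes: state fixed effects absorb each state’s baseline and year fixed effects absorb the national trend, so a law’s coefficient reflects how the adopting state’s change compares with the nationwide change. The data and variable names below are synthetic and purely illustrative.

```python
# A minimal sketch (synthetic data, hypothetical variable names) of the
# panel-data control described above: state fixed effects absorb each state's
# baseline, year fixed effects absorb the national trend, and the law's
# coefficient then measures how the adopting states' change compares with the
# nationwide change. This is an illustration, not the study's or Lott's model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
states = [f"S{i}" for i in range(20)]
years = range(2000, 2011)

rows = []
for s in states:
    baseline = rng.normal(5.0, 1.0)                 # each state's own crime level
    for t in years:
        national_trend = -0.2 * (t - 2000)          # crime is falling everywhere
        law = int(s in ("S0", "S1") and t >= 2005)  # two states pass a law in 2005
        rate = baseline + national_trend + rng.normal(0.0, 0.3)  # law has no true effect
        rows.append({"state": s, "year": t, "law": law, "crime_rate": rate})
df = pd.DataFrame(rows)

# Naive regression: with no year controls, the law soaks up the national decline.
naive = smf.ols("crime_rate ~ law", data=df).fit()

# Standard panel regression: state and year fixed effects separate the law's
# effect from baseline differences and the shared national trend.
panel = smf.ols("crime_rate ~ law + C(state) + C(year)", data=df).fit()

print("naive law coefficient:", round(naive.params["law"], 3))
print("panel law coefficient:", round(panel.params["law"], 3))
```

On this made-up data, the naive regression credits the national decline to the law, while the fixed-effects version correctly finds essentially no effect; that gap is the basic control the study is said to be missing.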
A1F: The authors of the study said that they couldn’t find enough data on firearm ownership rates from the Behavioral Risk Factor Surveillance System, so they substituted “firearm suicides as a percent of all suicides.” You indicated that this really doesn’t work as a proxy. Would you elaborate on why that is?
JL: Using firearm suicides as a percentage of all suicides makes no sense. That measure is much more likely to pick up the demographic characteristics of the groups that tend to use guns to commit suicide. For example, men and older people, as well as people in more rural areas, are more likely to commit suicide with a gun.
The Behavioral Risk Factor Surveillance System has some drawbacks in that it only covers three years (2001, 2002 and 2004) and that it leaves out Washington, D.C., which would clearly be problematic for the conclusions desired by public health researchers who want to question gun ownership.
A1F: This article was published in a major academic journal and passed the peer review process, despite what you show to be a pretty glaring hole in its methodology. Is that surprising to you? Do you ever get the impression that studies with anti-gun conclusions get fast-tracked without anyone examining the experimental design too closely?
JL: No. Unfortunately, the low quality of publications in public health journals does not surprise me. The quality of statistics in public health is very low—they seem to be much more concerned with the conclusion than how the research was conducted. The Crime Prevention Research Center website and my book More Guns, Less Crime both provide a number of examples of the problems with the public health research. Usually doing the statistical tests the way that they are traditionally done (and they are done that way for a reason) reverses the claimed results in these public health studies.
A1F: Did any of the original media outlets that trumpeted the results of the study ever reach out to you when you revealed its flaws? Did you notice any retractions, or any backpedaling whatsoever, or did they just pretend it hadn’t happened?
JL: No one reached out to me before or after writing up their pieces. One hopes that the media will provide balance and ask critics what they might think of the study, but I did not see any of the media discussions talking to anyone who might be critical of this report. When it comes to studies that have the right views on guns, there doesn’t seem to be any interest in providing balance.
A1F: There seems to be a new wave of firearm-related research, much of it coming from explicitly anti-gun sources like Mother Jones or various lobbyist groups. Does what you’re seeing now strike you as better or worse than in the old days when the CDC had a corner on the market? Do you think that people are as likely to be persuaded by it?
JL: I think that it is much worse in a couple of ways. The sheer volume of studies already coming out is unprecedented, and you will see a tidal wave of them next year, timed to coincide with the election. I think that Bloomberg is a much smarter opponent. He is working to ensure that all this research comes out; that it gets the proper news coverage, by working with Columbia University’s prestigious journalism school to make sure that reporters are “properly” trained to cover these studies; and that those studies are then combined with the political push that Bloomberg’s Everytown is doing.