Yesterday, a breaking story came out reporting that new research linked air pollution to higher coronavirus death rates. Here's an example of the story from the New York Times.
They even go so far as to claim, under the top photo, that Atlanta is likely to suffer more deaths than the adjacent Douglas County due to this effect.
I have not had time to read and digest the study. It could be a good one. It might be important. However, when I went to the website for the paper, I noted that there is a disclaimer immediately under the title that says "This article is a preprint and has not been certified by peer review. It reports new medical research that has yet to be evaluated and so should not be used to guide clinical practice" [my italics].
Peer review has been criticized for many reasons, the most common being the slowness of the process. This is particularly problematic during crises like the coronavirus pandemic, when we want to accelerate the race toward understanding and solutions.
It is probably essential that scientists share pre-peer-reviewed results in a transparent way at this time. In fact, this is not entirely unusual. Trained scientists have the capacity to evaluate papers and determine their strengths, weaknesses, and utility. No paper is perfect, so the reality is that we are constantly doing this as part of our research efforts.
What is concerning currently, however, is the public sharing of pre-peer-reviewed results and other observations. This has the potential to do serious harm as people seek treatments based on anecdotal findings or build their understanding on papers that may contain flaws or misinterpretations.
There was a time when I was taught that one should not publicize results prior to peer review. If you read science articles regularly, you will often see statements like, "in a paper published today in Science", which at least tells you that the paper has been through some level of peer review. It may not have been scrutinized by the broader scientific community, but it has had an initial check.
The paper above might be very good and important. It might be good, but need some changes in interpretation. Perhaps it will be rejected. I don't know. I have some capacity to evaluate portions of the paper, but I don't work with the sorts of datasets that are used for this type of research, so that's an area where I'm blind. I consider my expertise to be specialized and limited when I examine results in other fields.
As we move forward through this crisis, I encourage you to consider the status and significance of stories about scientific findings very carefully. Treat those based on anecdotal evidence with extreme caution and those based on pre-peer-reviewed results with at least some additional checking. For published results, consider that the nuances are often not reported. Look to experts who have sifted through the full spectrum of scientific results and have domain knowledge for information. The scientific process isn't perfect, but this helps you separate the wheat from the chaff and enables you to move forward with the best available information.