The proper answer is: not much. Any given report is really one set of eyeballs, and that is never good enough. You want comparable work by other eyeballs that also gives you a chance of checking the underlying data relied on.
It is all a work in progress, and great care must be applied when using new information. Are you really replicating? In medicine, that really matters.
I have often noted that several great mines have been located along highways driven by every geologist in the country. Remember those eyeballs. Are we paying someone to make a positive statement about muddy facts?
I would like to be more positive, but it really does not work that way.
Can You Trust What Medical Journals Publish?
November 29, 2019
I have repeatedly questioned, here at AT, the validity of medical journal claims on politically charged issues like air pollution, climate change, and global warming. More recently, I showed how a major medical journal violates basic rules of scientific inquiry.
There is another important problem with medical research as reported in medical journals and then often amplified by the lay press as big news: medical journal articles are often proven wrong, whether for unreliable results or for promoting treatments that are not beneficial or are no more efficacious than the treatments they propose to replace.
I was reminded of this problem recently by an article in Emergency Medicine News, a medical specialty newspaper, that reported on a study by Dr. Vinay Prasad: a comprehensive review of randomized clinical trials in the Journal of the American Medical Association, The Lancet, and the New England Journal of Medicine that identified 396 medical reversals. Reversals are cases where medical journal articles are found to be faulty, misleading, or just plain wrong.
When high-flying medical researchers on environmental issues use bad methods and report false results, the motivation is usually a political agenda. But when medical researchers report what turn out to be unreliable results in other areas, it is often due to bias, fallacious thinking, and a failure to test their results assiduously and repeat them to ensure that the hypothesis is valid and the results are verifiable.
Some "rules" turned out to be wrong; examples include tight blood sugar control, mechanical chest compressions, and protocols for the treatment of sepsis (infection with severe complications). The unreliability problem is troublesome, since the study shows that many recommended treatments and strategies are not efficacious.
Here are some additional specifics from the Prasad study:
- Mechanical chest compressions were no better than manual compressions for CPR. (JAMA. 2014;311[1]:53)
- Early and aggressive methods for care of patients with sepsis (severe infection) were no better than usual care. (JAMA. 2017;318[13]:1233)
- The REACT-2 trial found that routine use of immediate total-body CT provided no mortality benefit compared with conventional imaging and selective CT scanning in patients with severe trauma. (Lancet. 2016;388[10045]:673)
- Platelet transfusion after acute hemorrhagic stroke was found by the 2015 PATCH study to worsen survival: 68% in the platelet transfusion group versus 77% in the standard care group. (Lancet. 2016;387[10038]:2605)
The authors were alert enough to the problem that they created a website for best practices that, like other such practice websites, aims to alert physicians to the realities of research mistakes and misinformation.
Medical reversals and the rejection of medical protocols and suggested treatments are too common, and they are the result of bad methods and scientific dishonesty. Real scientific honesty would identify the problems, catch the unreliable information, and keep such studies from being published.
The reports of this or that new breakthrough should be assessed with care by the public and medical professionals.
In 2005, an obscure Greek physician, John Ioannidis, published a groundbreaking article on the unreliability of medical research, "Why Most Published Research Findings Are False," and he became famous — so famous that he is now at Stanford, heading a study project on scientific integrity funded by a philanthropist. What Ioannidis found was that medical research is driven by ambition, intellectual passion, and fallacious thinking. He didn't say researchers are dishonest; he just said they often put out false claims and make false assertions.
In these articles at AT, I have tried to warn readers of the problems of dishonesty and malfeasance in medical research, and to show the lay reader how to avoid being taken in by bad research methods or just plain cheating.
There are some basic rules to help avoid being taken in by charlatans.
- The study should be a human study, or, if it is an animal study, the limits of such a study should be declared.
- The study should follow basic rules about how to determine causation, and avoid the trap of claiming that "association" or "coincidence" is proof of causation.
- The study should avoid surveys and questionnaires as a source of "evidence" since recall bias is always a problem in survey or response studies.
- The study's claimed "effect" should always be measured by its magnitude, and the rule is that the magnitude of effect should be "robust" — at least two to three times the baseline rate.
- The study should establish a mechanism to explain the causal effect asserted — for example, ice cream consumption is associated with an increase in drowning deaths, but it does not cause those deaths; both simply rise in summer.
- Although I could argue that peer review and publication are not a good standard of reliability, the source of the research, the reputation of that source, and the reputation of the journal in which the research was published are often worth something. How much they are worth is the question.
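The magnitude-of-effect rule above can be made concrete with simple arithmetic. The sketch below uses hypothetical incidence numbers (not drawn from any study cited here) to show how relative risk is computed and why a small ratio is fragile while a two-to-three-fold ratio is harder to explain away by bias alone.

```python
# Illustrative relative-risk arithmetic for the "magnitude of effect" rule.
# All counts below are hypothetical, chosen only to show the calculation.

def relative_risk(exposed_cases, exposed_total, control_cases, control_total):
    """Relative risk = incidence among the exposed / incidence among controls."""
    exposed_rate = exposed_cases / exposed_total
    control_rate = control_cases / control_total
    return exposed_rate / control_rate

# A weak association: 12 cases per 1,000 exposed vs. 10 per 1,000 controls.
# A ratio this close to 1.0 can easily be produced by recall bias or confounding.
weak = relative_risk(12, 1000, 10, 1000)

# A robust association: 30 cases per 1,000 exposed vs. 10 per 1,000 controls.
# This meets the 2-3x rule of thumb for a "robust" magnitude of effect.
robust = relative_risk(30, 1000, 10, 1000)

print(f"weak RR = {weak:.1f}, robust RR = {robust:.1f}")
```

The point of the sketch is only the ratio: a study trumpeting a relative risk of 1.2 is reporting noise-sized evidence, while a relative risk of 3.0 at least clears the bar that skeptical readers should demand.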
The important thing is that professionals and citizens should be careful to question and evaluate what medical journals pronounce. Too often, the authors behind those pronouncements are overwhelmed by self-esteem and ambition.