Take two eye of newt and call me in the morning: Why systematic reviews of randomized controlled trials matter?
How can we determine whether an article about a particular area of research accurately assesses the validity of the research on which its conclusions are based? For instance, how can we trust a magazine article's claim that wearing compression socks improves running performance?

An author may attempt to increase the validity of a claim by stating that “the research shows …”. On its own, however, this is a weak basis for deciding whether to use a particular piece of equipment for racing. Even if the author provides a reference, they may be “cherry picking” the research that supports their argument because it fits their interests (social, political, financial, etc.). In the scientific community, this is known as selection bias.

It is also important not only to identify the research but also to question whether the results of the study are accurately described. For example, if a group of athletes were given compression socks prior to a 10 km running race and subsequently set personal records, we cannot conclude that the improvement was a result of wearing compression socks. That is, we cannot claim a direct “cause and effect”. An author who makes such a claim is dismissing confounding variables (other factors that may account for the difference) such as weather conditions, current training and nutritional status, etc. The only way to accurately determine a “cause and effect” relationship in a case such as this is to compare the results of athletes wearing the compression socks with those of a second group of athletes wearing placebo (sham) compression socks.

How then can we make an informed decision about which training equipment to use or which training program will optimize sport performance? The best ‘level of evidence’ we currently have is the systematic review of randomized controlled trials (RCTs). This type of review uses systematic methods to select studies in order to limit selection bias and provide reliable conclusions on a given topic (2).

Finally, what is so important about using a randomized controlled trial? Randomization is the process of allocating study participants into an intervention group and a control group (1). By randomizing subjects, we can account for confounding variables as well as limit selection bias (1). In addition, incorporating a control group allows us to compare the results of the actual intervention with a sham intervention, so we can determine whether there is a “cause and effect” relationship for a given intervention. If there is no statistically significant difference between the groups following the intervention, we cannot conclude that the intervention performs any better than the placebo.
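To make the randomize-and-compare idea concrete, here is a minimal sketch in Python. All of the finish times below are made up purely for illustration, not taken from any real study. It allocates subjects to two groups at random, then runs a simple permutation test, one common way to ask whether an observed difference in group means could plausibly have arisen by chance:

```python
import random
from statistics import mean

def randomize(subjects, seed=0):
    """Randomly allocate subjects into an intervention and a control group."""
    rng = random.Random(seed)
    shuffled = subjects[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

def permutation_test(a, b, n_iter=10_000, seed=0):
    """Two-sided permutation test on the difference in group means.

    Repeatedly re-shuffles the pooled outcomes into two groups and counts
    how often a difference at least as large as the observed one occurs.
    """
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = a + b
    count = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:len(a)]) - mean(pooled[len(a):]))
        if diff >= observed:
            count += 1
    return count / n_iter

# Hypothetical 10 km finish times in minutes (illustration only).
socks   = [41.2, 39.8, 43.1, 40.5, 42.0, 38.9]
placebo = [41.9, 40.1, 43.5, 40.8, 42.6, 39.4]
p = permutation_test(socks, placebo)
```

With samples this tiny the test has very little power, which echoes the point above: a single small comparison rarely settles a cause-and-effect question, and that is exactly why properly powered RCTs, and systematic reviews of them, matter.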

REFERENCES

1. Akobeng A. Understanding randomised controlled trials. Arch Dis Child 2005;90:840-844.

2. Liberati A, Altman D, Tetzlaff J, Mulrow C, Gøtzsche C, Ioannidis J, Clarke M, Devereaux P, Kleijnen J, Moher D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. J Clin Epidemiol 2009;62:e1-e34.

Michael Rosenblat

http://www.evidencebasedcoaching.ca
Evidence-Based Coaching: Making science work for athletes.
Last edited by: mrosenblat: Jan 26, 15 11:48
Re: Take two eye of newt and call me in the morning: Why systematic reviews of randomized controlled trials matter? [mrosenblat]
You mean we should read the research paper rather than a magazine article? Love it (seriously).

To go one step further, the article should be peer reviewed.

I, for one, read the actual article instead of the magazine version whenever realistically possible. Hell, I even downloaded the Floyd Landis USPS deposition against Lance. I read 60+ pages, much of it really, really boring. All that, just so I could be informed.

As a fun aside, check out "I fucking love science" or "FL science," as my wife makes us call it at the dinner table. It's concerned with... science of many varieties, but gives heaps of links to the sources of the information. I get lost there most evenings while my wife watches Grey's Anatomy or something similar.

Keep up the critical thinking!

TriDork

"Happiness is a myth. All you can hope for is to get laid once in a while, drunk once in a while and to eat chocolate every day"
Re: Take two eye of newt and call me in the morning: Why systematic reviews of randomized controlled trials matter? [mrosenblat]
In the context of giving out coaching advice, this is something people should keep in mind, both coaches and athletes. Nutritional supplements are constantly being recommended as performance aids, but most of them lack any scientific backing, even from poorly designed studies.


Dtyrrell
Re: Take two eye of newt and call me in the morning: Why systematic reviews of randomized controlled trials matter? [mrosenblat]
Well, yes and no. Randomization does cure many ills (we used to refer to it as 'go see Dr. Random'), but not all. It does not neutralize confounded variables. Even in the compression sock example, randomizing the sample population and including a control group won't randomize out the error due to other confounded factors. While this analysis technique is recommended, I worry that people put far too much faith in the 'significance' of the results. We see many 'one factor at a time' RCTs that show significance when there are actually second- and third-order factor interactions that are not being accounted for, rendering the significance suspect... meaning, for the reader: 'Your Mileage May Vary'.
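The 'Your Mileage May Vary' point can be illustrated with a small simulation in Python. Everything here is hypothetical: suppose the sock benefit only exists for runners who also train on hills, an interaction the trial never records. A one-factor comparison of group averages still detects a real aggregate effect, yet that average tells an individual flat-course runner nothing:

```python
import random
from statistics import mean

rng = random.Random(42)

def finish_time(socks, hills):
    """Simulated 10 km finish time in minutes.

    Baseline is noisy; the sock benefit (-1.5 min) exists ONLY for
    runners who train on hills -- a hidden interaction.
    """
    base = rng.gauss(42.0, 1.0)
    effect = -1.5 if (socks and hills) else 0.0
    return base + effect

# 60 runners per arm, half of each arm trains on hills (unrecorded).
hills_pattern = [True] * 30 + [False] * 30
socks_grp   = [finish_time(True,  h) for h in hills_pattern]
placebo_grp = [finish_time(False, h) for h in hills_pattern]

# A one-factor analysis sees only this aggregate gain (~0.75 min on
# average), which is real but applies to nobody who runs flat courses.
avg_gain = mean(placebo_grp) - mean(socks_grp)
```

The aggregate comparison is not wrong, it is just incomplete: without recording the interacting factor, the "significant" average effect can badly mispredict any one athlete's response.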
Re: Take two eye of newt and call me in the morning: Why systematic reviews of randomized controlled trials matter? [mrosenblat]
Ordinarily I try to avoid clicking links to websites/blogs of people posting threads here on ST when I suspect their intent, at least in part, was to drive traffic to that website/blog, but the detailed, scholarly manner of your post intrigued me. This time, I'm glad I did. You have some fantastic articles on your website that a nerd such as myself can enjoy. So, for what it is worth, thank you.
Re: Take two eye of newt and call me in the morning: Why systematic reviews of randomized controlled trials matter? [mrosenblat]
Do we need to undertake an RCT to determine the efficacy of parachutes?

:)
Re: Take two eye of newt and call me in the morning: Why systematic reviews of randomized controlled trials matter? [mrosenblat]
What you're saying is true, to an extent. However, if everyone (read: every coach) proceeded to make "generally safe claims" based on nothing other than an RCT, then coaching, training, and the practical application of research would still have us chugging away at trying to break sub-3:00 for the marathon instead of sub-2:00. A little extreme as an example, I know, but just making a point.

Limiting a claim or statement, which I'm interpreting as a practical application, limits creativity, outside-the-box thinking, and elevating the science. Take, for example, the past two years of running shoe and running technique investigations that have flooded the journals. They are there because ~8+ years ago enough people challenged conventional wisdom; it caught on, it then caught on with a bunch of researchers, and the researchers designed and conducted their studies. Then they went through the typical 1-2+ year process of getting the article completed and published. Another year for mainstream fanboys to pick it up and work only off the abstract, and voilà, you've got your claim. I realize you know this, but others who may be very polar in their application of research need to know that biology is messy, and even the RCT can be suspect. Just look at the RCTs for eggs, wine, cholesterol, etc.

The problem is when people make their ideological claims off of abstracts alone. Unless someone has the full article, they shouldn't make the claim. If people are going off the abstract only, they should say so. If the full article goes off the deep end or against conventional wisdom, take it with some critical thinking and state it.

BTW, cherry picking articles is how new theories and directions evolve; don't bite the hand that feeds you. Someone stating "the research shows" (when they should probably word it as "the investigation found," or something like that) is their way of saying they have some footing for their statement. Would you rather they say nothing at all? The compression sock analogy you give is valid, but your accuracy is off. Remember, that example study drives a retail market, which drives the consumer, which drives the athlete, which keeps websites like evidencebasedcoaching.ca and others relevant.

Nice blog- wish more were like it.

http://www.reathcon.com
Post deleted by VGT
Re: Take two eye of newt and call me in the morning: Why systematic reviews of randomized controlled trials matter? [Rob]
Evidence-based coaching/practice (EBP) is a method of incorporating the best available evidence to support an intervention (exercise, nutrition, equipment, etc.). This involves utilizing the scientific literature as well as clinical/field experience. However, this does not mean we should incorporate the literature blindly. We must critically appraise the literature by assessing its internal and external validity.

Internal validity reflects the quality of the study design, implementation, and data analysis in order to minimize the level of bias and determine a true ‘cause and effect’ relationship between an intervention and an outcome (1). External validity describes the circumstances under which the results of the research can be generalized.

LEVELS OF EVIDENCE

Level 1: Systematic reviews of randomized controlled trials
Level 2: Randomized controlled trials
Level 3: Case control studies
Level 4: Case series, case reports
Level 5: Editorials, expert opinion


Michael Rosenblat

http://www.evidencebasedcoaching.ca
Evidence-Based Coaching: Making science work for athletes.