Sport research/Critiquing the literature

Just because it's in black and white doesn't mean it's true. And just because it's published in a peer-reviewed journal (even an A-star ERA journal) doesn't mean it's good, either. You, the reader, have to make a judgement and be able to defend that judgement. Here are some tips to help you do that.

Reading the article
Here are some tips for going about reading a scientific article. You may find you prefer a different way, but here's a start in case you're a bit lost.


 * 1) Read the Abstract. It consists of a brief summary of the research questions and methods. It should also state the findings. Because it is short and often (poorly) written in dense jargon, you may need to read it a couple of times. Try to restate the abstract in your own non-technical language.
 * 2) Read the Introduction. This is the beginning of the article, appearing first after the Abstract. This contains information about the authors' interest in the research, why they chose the topic, their hypothesis, and methods. This part also sets out the operational definitions of variables.
 * 3) Read the Discussion section. Skip over the Methods section for the time being. The Discussion section will explain the main findings in great detail and should discuss any methodological problems or flaws that the researchers discovered.
 * 4) Read the Methods section. Now that you know the results and what the researchers claim the results mean, you are prepared to read about the Methods. This section explains the type of research and the techniques and methods used.
 * 5) Read the Results section. This is the most technically challenging part of a research report. But you already know the findings (from reading about them in the Discussion section). This section explains the statistical analyses that led the authors to their conclusions. It will test your knowledge of statistics, as well as research terms such as correlation coefficient, dependent and independent variables, subject variables, main effect, interaction, and inter-rater reliability, to name a few.
 * 6) Read the Conclusion. The last section of the report summarizes the findings but, more importantly, it sets out what the researchers think is the value of their research for real-life application. This section often contains suggestions for future research, including issues that the researchers became aware of in the course of the study.

Critiquing the article
There are lots of ways you can critique an article. Unless you are an expert in the field, however, critiquing is hard: how do you know what is good or bad? Better critiquing comes with experience, but there are ways to accelerate your ability to critique in a certain area. Firstly, be systematic in your approach (check out the checklist section below), or look at how quality of evidence is often rated (again, just below), and try to apply that to what you are reading.

In terms of specific content knowledge, the best tip is perhaps to talk to researchers in the area, and to read, read, read. But don't just read; take note of particular sections of existing publications. Introductions will often point out limitations in the literature (and what the authors intend to do about them), which will give you ideas for critiquing specific content. Also pay attention to how methodologies are justified (there is useful critique material here) and, of course, to how limitations are discussed within papers (this is all about critiquing and the applicability of findings). The more you read, the more broadly you read, and the more you attend to these areas, the better you will be able to critique an individual paper; in effect, you become a scientific critic.

Here is a section-by-section checklist of questions to help you evaluate the article. When specifically trying to rate the quality of evidence from a research article, consider tools such as GRADE and the PEDro scale (more about this in Research Design).


General
 * Is the article up to date? Old and irrelevant? Timeless?
 * Is the article original research? Supporting previous findings? A review of the literature?

Title
 * Does the title give you insight as to what the article is about?
 * Does the title tell you what, whom and how?
 * Does the title entice you to read further?

Abstract
 * Does the abstract contain a brief statement about the purpose, method, results, conclusion and clinical relevance?
 * After reading the abstract did you learn the essence of the article without the details?

Introduction
 * Is the problem clearly stated?
 * Is the literature complete, current and appropriate and is the review objective?
 * Are alternative views acknowledged?
 * Are statements appropriately supported?
 * Are there unsupported assumptions (especially any contributing to the article's research question)?
 * Did the author identify a “gap”?
 * Is the purpose clearly stated?
 * Is there a hypothesis or research question?

Method
 * Consider the level of evidence that the research design is able to offer (see Methods for establishing levels of evidence).
 * Consider sub-design principles (randomisation, cross-over designs, blinding, etc.) and how these may affect the level of evidence (consider the information presented in the Research Design section of this wiki).
 * Are subjects well-described?
 * How was the sample selected?
 * How large was the sample?
 * Was a control group used?
 * Is the instrumentation well-described? Calibrated?
 * Is the procedure laid out in detail?
 * Could someone replicate the study?
 * Is there internal validity?
 * Is the data analysis well-described? Appropriate?
 * What p-value was set as the criterion? (Or how were the statistics presented?)

Results
 * Are the measured data summarised?
 * Are results statistically significant?
 * Are results clinically/practically significant?

Discussion
 * Was the hypothesis supported? Rejected?
 * Does the author identify weaknesses/limitations of the experiment? Are there any that are not acknowledged?
 * Are statements appropriately supported?
 * Is further literature cited to address the findings? Are alternative views acknowledged?
 * Are the results applied clinically/practically?
 * Are suggestions for further research indicated?

Conclusion
 * Are the results briefly restated?
 * Do conclusions follow from the results?

Formatting and clarity
 * Is the article clear? Is it formatted appropriately?
 * Are the main points arranged in logical sequence?
 * Do you have difficulty following the argument? Is the case made?

Referencing
 * Are all references precise and clear?

About the Author
 * Is the article (or the author's work in general) well cited? Citation lists of central articles are generally very important to your research, and Google Scholar has a good function for keeping up to date this way.
 * Where is it published? Consider the quality of the journal and whether it is pay-to-publish.
 * Is the author well established in the field?

Potential conflicts of interest
 * Are there sponsors (funders) of the research (often a statement near the end of the article)? Could there be a financial or other conflict of interest?
 * Who employs the research team (e.g. brand/product-based research)?
 * Who publishes the research (e.g. ACSM publishes MSSE, NSCA publishes the Journal of Strength and Conditioning Research)?

Context of the findings
 * Are the results from a specific population? Can they be generalized to others? Think geographically as well as about the descriptors of the participant cohort.

Annotated bibliography
A good way to start critiquing the articles you have found is by creating an annotated bibliography. Janiesantoy provides a useful short video (6:22) on Writing an Annotated Bibliography.

When I do an annotated bibliography I tend to do it all in the notes section of EndNote, as this allows it to be searchable (within EndNote). I also tend to make dot points as opposed to full sentences, and try to use key terms across a number of articles to assist the search function. In creating an annotated bibliography (for my own purposes, not something that I necessarily publish), I would tend to:
 * make notes to summarise findings that are relevant to what I am looking at. For instance, sometimes I may only be interested in the method used in a study, not the outcome(s) of the research.
 * use key words to help search for things later (e.g. in the example above I might use the term "methods" as well as something about why the methods are important to me).
 * make comments on the good and bad points of the study as I see them. Essentially I make a dot-point version of the critique.

Bibliographic tools
Often it's a good idea to use an electronic bibliographic tool/reference manager to keep your references and annotations. Examples include EndNote and RefWorks. If you annotate directly into such a tool, it is often easy to search for certain words and to quickly format your references in a document. However, this doesn't apply to formatting online documents (at present, at least) and is not ideal for sharing annotations when it comes to creating the resource in Activity 2. So you may also want to consider using tools such as Delicious, WebCite, an electronic document, a wiki, or even pen and paper. Zotero is a free online alternative with increasing functionality; there is now little to distinguish it from commercially available systems, and it is perhaps more powerful.

Activity
Activities are mini-tasks that will give you some practice with the concepts of each section. Activities should appear here soon, if not, feel free to add some open access ones yourself.

Task

 * 1) Create an annotated bibliography with 10 key references to your research topic
 * 2) Share the annotated bibliography with the research group

Resources
Will Hopkins offers more information about finding out what's known, including a critique of different sources of information and how to evaluate them.

See Ben Rattray's bookmarks for website resources around annotated bibliographies