Evidence Synthesis for Decision Making 7: A Reviewer's Checklist

A E Ades, Deborah M Caldwell, Stefanie Reken, Nicky J Welton, Alex J Sutton, Sofia Dias*

*Corresponding author for this work

Research output: Contribution to journal › Article (Academic Journal) › peer-review



This checklist is for the review of evidence syntheses for treatment efficacy used in decision making based on either efficacy or cost-effectiveness. It is intended to be used for pairwise meta-analysis, indirect comparisons, and network meta-analysis, without distinction. It does not generate a quality rating and is not prescriptive. Instead, it focuses on a series of questions aimed at revealing the assumptions that the authors of the synthesis are expecting readers to accept, the adequacy of the arguments authors advance in support of their position, and the need for further analyses or sensitivity analyses. The checklist is intended primarily for those who review evidence syntheses, including indirect comparisons and network meta-analyses, in the context of decision making but will also be of value to those submitting syntheses for review, whether to decision-making bodies or journals. The checklist has 4 main headings: A) definition of the decision problem, B) methods of analysis and presentation of results, C) issues specific to network synthesis, and D) embedding the synthesis in a probabilistic cost-effectiveness model. The headings and implicit advice follow directly from the other tutorials in this series. A simple table is provided that could serve as a pro forma checklist.

Original language: English
Pages (from-to): 679-691
Number of pages: 13
Journal: Medical Decision Making
Issue number: 5
Publication status: Published - Jul 2013

Bibliographical note

Date of Acceptance: 20/12/2012


  • cost-effectiveness analysis
  • Bayesian meta-analysis
  • multiparameter evidence synthesis
  • meta-analysis
  • treatment comparison meta-analysis
  • ISPOR task force
  • systematic reviews
  • controlled trials
  • network meta-analysis
  • empirical evidence
  • competing interventions
  • methodological quality
  • randomised trials
  • bias
