Use of external evidence for design and Bayesian analysis of clinical trials: a qualitative study of trialists’ views

Gemma L Clayton*, Daisy Elliott, Julian P T Higgins, Hayley E Jones

*Corresponding author for this work

Research output: Contribution to journal › Article (Academic Journal) › peer-review


Abstract

Background
Evidence from previous studies is often used relatively informally in the design of clinical trials: for example, using a systematic review to indicate whether a gap in the current evidence base justifies a new trial. External evidence can be used more formally in both trial design and analysis, by explicitly incorporating a synthesis of it in a Bayesian framework. However, it is unclear how common this is in practice or the extent to which it is considered controversial. In this qualitative study, we explored trialists' attitudes towards, and experiences of, incorporating synthesised external evidence through the Bayesian design or analysis of a trial.
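To make concrete what incorporating a synthesis of external evidence in a Bayesian framework can mean in the simplest case, the sketch below combines a meta-analytic prior on an intervention effect (a log odds ratio) with a new trial estimate via a conjugate normal-normal update. This is a minimal illustration only; all numbers are assumed and are not taken from the article.

```python
import numpy as np

# Assumed, purely illustrative numbers.
# Prior from an external meta-analysis: log odds ratio and its standard error.
prior_mean, prior_se = -0.20, 0.15
# Estimate of the same log odds ratio from the new trial, with its standard error.
trial_mean, trial_se = -0.35, 0.25

# Conjugate normal-normal update: precision-weighted combination of prior and trial data.
prior_prec, trial_prec = prior_se**-2, trial_se**-2
post_prec = prior_prec + trial_prec
post_mean = (prior_prec * prior_mean + trial_prec * trial_mean) / post_prec
post_se = post_prec**-0.5

print(f"Posterior log OR: {post_mean:.3f} (SE {post_se:.3f})")
print(f"Posterior OR:     {np.exp(post_mean):.3f}")
```

The posterior simply pools the external synthesis and the trial result, weighted by their precisions; more elaborate approaches (e.g. downweighting or bias-adjusting the external evidence) build on the same idea.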

Methods
Semi-structured interviews were conducted with 16 trialists: 13 statisticians and three clinicians. Participants were recruited across several universities and trials units in the United Kingdom using snowball and purposeful sampling. Data were analysed using thematic analysis and techniques of constant comparison.

Results
Trialists used existing evidence in many ways in trial design, for example to demonstrate a gap in the evidence base and to inform parameters in sample size calculations. However, no one in our sample reported using such evidence in a Bayesian framework. Participants tended to equate Bayesian analysis with incorporation of prior information on the intervention effect and were less aware of the potential to incorporate data on other parameters. When introduced to the concepts, many trialists felt they could be making more use of existing data to inform the design and analysis of a trial in particular scenarios. For example, some felt existing data could be used more formally to inform background adverse event rates, rather than relying on clinical opinion as to whether there are potential safety concerns. However, several barriers to implementing these methods in practice were identified, including concerns about the relevance of external data, acceptability of Bayesian methods, lack of confidence in Bayesian methods and software, and practical issues, such as difficulties accessing relevant data.
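As an illustration of the kind of more formal use of external data for background adverse event rates mentioned above, the sketch below builds a Beta prior from hypothetical pooled external control-arm data and updates it with a hypothetical new trial control arm using a conjugate beta-binomial model. All counts are assumed for illustration and do not come from the article.

```python
from scipy import stats

# Assumed pooled external data on a background adverse event rate,
# e.g. events observed across previous control arms.
external_events, external_n = 30, 600
a0, b0 = 1 + external_events, 1 + external_n - external_events  # Beta prior from external data

# Assumed control-arm data from the new trial.
trial_events, trial_n = 4, 80

# Conjugate beta-binomial update: posterior for the event rate given prior and trial data.
posterior = stats.beta(a0 + trial_events, b0 + trial_n - trial_events)

lo, hi = posterior.ppf([0.025, 0.975])
print(f"Posterior mean event rate: {posterior.mean():.3f}")
print(f"95% credible interval:     ({lo:.3f}, {hi:.3f})")
```

Such a posterior gives a data-driven estimate of the background rate against which observed safety signals could be judged, rather than relying on clinical opinion alone.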

Conclusions
Despite trialists recognising that more formal use of external evidence could offer advantages over current approaches in some areas and be useful in sensitivity analyses, barriers to such use in practice remain.
Original language: English
Article number: 789
Number of pages: 9
Journal: Trials
Volume: 22
Issue number: 1
Early online date: 8 Nov 2021
DOIs
Publication status: Published - Dec 2021

Bibliographical note

Funding Information:
GLC was funded by a PhD studentship from the Medical Research Council (MRC) Hubs for Trials Methodology Research. DE was funded by the NIHR Biomedical Research Centre at University Hospitals Bristol and Weston NHS Foundation Trust and the University of Bristol. HEJ was supported by an MRC Career Development Award in Biostatistics (MR/M014533/1). JPTH is an NIHR senior investigator (NF-SI-0617-10145), was supported by the National Institute for Health Research (NIHR) Applied Research Collaboration West (ARC West) at University Hospitals Bristol and Weston NHS Foundation Trust and the NIHR Bristol Biomedical Research Centre at University Hospitals Bristol and Weston NHS Foundation Trust and the University of Bristol, and is a member of the MRC Integrative Epidemiology Unit at the University of Bristol.

Publisher Copyright:
© 2021, The Author(s).

Keywords

  • Evidence synthesis
  • Bayesian analysis
  • Trials
  • Qualitative
  • Informative prior distributions
  • Meta-epidemiology
