


Evaluating the Intervention: Chapter 12


In this excerpt (Chapter 12) from the PROTEUS-Practice Guide, you’ll learn about approaches to assess the implementation and effect of the patient-reported outcome (PRO) system.

This webpage contains the entire contents of Chapter 12. The full PROTEUS-Practice Guide is also available for download.

Key Points

  • Evaluate both implementation outcomes and patient clinical outcomes to capture the full impact of a patient-reported outcome measures (PROMs) program
  • Consider the pros/cons of experimental and non-experimental designs
  • Communicate findings about the intervention back to the health system and use them to improve future iterations of the program


Evaluating an intervention is a crucial aspect of gauging its success and determining areas for improvement. Measuring the PROM intervention’s effect both on implementation outcomes, such as acceptability and appropriateness, and on patient clinical outcomes, such as symptom burden, survival, and quality of life, can provide a fuller picture of how well the intervention met its objectives.

Different methods, both experimental and non-experimental, can be used to evaluate the implementation of PROM systems. These range from more controlled approaches (e.g., randomized controlled trials) to more flexible and adaptive ones (e.g., quality improvement studies). Quality improvement studies may be more appropriate during the initial implementation phases, while more rigorous designs may be better suited to evaluating the impact of established programs. Just as implementation science frameworks can guide the incorporation of PROs into the workflow (see Chapter 8, Incorporating in Clinical Workflow), they can also inform an evaluation strategy for the intervention.

Questions and Considerations

A. What outcomes can be evaluated?

Implementation outcomes

  • Implementation outcomes describe how well a program is delivered and received
  • Examples include acceptability, appropriateness, feasibility, adoption/uptake, fidelity, reach/penetration, costs, sustainability
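Several of these implementation outcomes can be computed directly from PROM administration logs. As a minimal illustration only (the record structure, field names, and functions below are hypothetical, not from the Guide), reach and fidelity might be tallied like this:

```python
# Hypothetical PROM administration records; all field names are illustrative.
records = [
    {"patient_id": 1, "visit": 1, "prom_offered": True,  "prom_completed": True},
    {"patient_id": 1, "visit": 2, "prom_offered": True,  "prom_completed": False},
    {"patient_id": 2, "visit": 1, "prom_offered": True,  "prom_completed": True},
    {"patient_id": 3, "visit": 1, "prom_offered": False, "prom_completed": False},
]

def reach(records, eligible_patients):
    """Share of eligible patients who completed at least one PROM."""
    completed = {r["patient_id"] for r in records if r["prom_completed"]}
    return len(completed) / len(eligible_patients)

def fidelity(records):
    """Share of offered PROMs that were actually completed."""
    offered = [r for r in records if r["prom_offered"]]
    return sum(r["prom_completed"] for r in offered) / len(offered)

print(f"Reach:    {reach(records, eligible_patients={1, 2, 3}):.0%}")
print(f"Fidelity: {fidelity(records):.0%}")
```

In practice these denominators (who counts as eligible, what counts as an offer) must be defined up front, since they drive the resulting metrics.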

Patient clinical outcomes

  • Patient clinical outcomes include those related to the patient’s health and health service use
  • Examples of clinical outcomes for evaluation include symptom burden, use of emergency services, survival, quality of life, physical function, satisfaction with care

B. What evaluation methods can be used?

Experimental designs

  • Experimental designs include randomized controlled trials, cluster-randomized trials, and similar approaches
  • These designs are considered the ‘gold standard’ in evaluating an intervention as they are typically rigorous and maximize internal validity
  • Experimental designs typically test efficacy rather than effectiveness, but some implementation science experimental designs have been adapted to evaluate effectiveness
  • Conducting an experimental evaluation requires substantial monetary and professional resources, typically requires grant support, and often needs institutional review and approval including ethical review
  • These studies may be more appropriate to evaluate established programs

Feasibility studies, quality improvement designs, quasi-experimental designs, and observational studies

  • Includes improvement research, plan-do-study-act cycles, cross-over, case-control, and time series designs. These approaches may be better suited to capturing the effectiveness of an intervention because they evaluate a hybrid of its implementation and its effect on patient outcomes
  • These studies may be more appropriate to evaluate initial implementations but can also be used to continuously improve established programs
  • The non-experimental nature of these designs increases the risk of bias and decreases generalizability
  • Both qualitative and quantitative methods can be used to evaluate the system
  • These approaches are typically lower cost but may still require some monetary and personnel resources to administer, as well as possibly requiring institutional and ethical approval
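A before-and-after comparison within a quality improvement cycle is one of the simplest of these designs. A minimal sketch, using made-up monthly emergency-department visit rates (the numbers are illustrative, not real data):

```python
from statistics import mean

# Hypothetical monthly ED-visit counts per 100 patients, before and
# after PROM rollout; values are illustrative only.
pre_rollout  = [12, 14, 13, 15, 14, 13]
post_rollout = [11, 10, 12, 9, 10, 11]

change = mean(post_rollout) - mean(pre_rollout)
print(f"Mean change after rollout: {change:+.1f} visits per 100 patients")

# Caveat: a simple pre/post difference ignores secular trends and
# seasonality; interrupted time-series or control groups guard
# against those biases.
```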

Figure 12.1: Example implementation and clinical outcomes


C. What should you do with the results of the intervention evaluation?

  • Inform PROM system improvement
  • Communicate learnings throughout the organization
  • Share learnings with the field. Consider disseminating results of the intervention through conferences, publications, or other communication channels to increase awareness of both the PROM program and efforts to evaluate it
  • Consider whether the outcomes meet the goals of the PROM system – and adjust the PROM strategy if they do not

Relevant Primary Resources

The information presented here is an overview of evaluating the intervention. For more detailed information please see the following sources:

Background and Citing the PROTEUS-Practice Guide

Nothing in this Guide should be construed to represent or warrant that persons using this Guide have complied with all applicable laws and regulations. All individuals and organizations using this template have the responsibility for complying with the applicable laws and regulations or regulatory requirements for the relevant jurisdiction.

Each chapter of the Guide lists the key foundational resources that informed its content. To appropriately recognize the foundational resources, we encourage you to cite both the Guide and the relevant foundational resource(s). Recommended citations are provided below.

Suggested Citation

The PROTEUS Guide to Implementing Patient-reported Outcomes in Clinical Practice
A synthesis of resources. Prepared by Crossnohere N, Brundage M, Snyder C, and the Advisory Group, 2023. Available at:

Further Reading

The Guide draws primarily from the foundational resources cited in each chapter. A selection of other relevant references is also available.
