Testing the Impact of an Asynchronous Online Training Program with Repeated Feedback

Aimee Woda, Cynthia Sherraden Bradley, Brandon Kyle Johnson, Jamie Hansen, Ann Loomis, Sylvia Pena, Maharaj Singh, Kristina Thomas Dreifuerst

Research output: Contribution to journal › Article › peer-review


Abstract

Background: Learning to debrief effectively with student learners can be challenging, and there is currently little evidence identifying the best way to train debriefers or to evaluate their competence with a particular debriefing method.

Purpose: The purpose of this study was to develop and test an asynchronous online distributed modular training program with repeated doses of formative feedback to teach debriefers how to implement Debriefing for Meaningful Learning (DML).

Methods: After completing the asynchronous distributed modular training program, debriefers self-evaluated their debriefing and submitted a recorded debriefing for expert evaluation and feedback using the DML Evaluation Scale (DMLES).

Results: Most debriefers were competent in DML debriefing after completing the modular training at time A, and DMLES scores increased with each debriefing submission.

Conclusion: The results of this study support the use of an asynchronous distributed modular training program for teaching debriefers how to implement DML.

Original language: English (US)
Pages (from-to): 254-259
Number of pages: 6
Journal: Nurse Educator
Volume: 48
Issue number: 5
DOIs
State: Published - Sep 1, 2023

Bibliographical note

Publisher Copyright:
© 2023 Lippincott Williams and Wilkins. All rights reserved.

Keywords

  • debriefing
  • nursing faculty
  • simulation
  • teaching
  • training

PubMed: MeSH publication types

  • Journal Article
