The concept of commitment to change (CTC) is based on the principles of encouraging and enabling participant reflection on personal goals and values. It is intended to instill in the participant an obligation to improve his/her personal practice behavior. CTC is a central feature of adult learning and promotes the development of expertise. CTC was first introduced within the framework of continuing medical education by Purkis in 1982, and since then, many CME studies have examined the usefulness of CTCs in explaining self-reported behavior change (Mazmanian et al. 1998, 2001; Lockyer et al. 2001; White et al. 2004).
A recent study published in Medical Teacher demonstrated that CTC can be used effectively and efficiently in large-group CME to enhance learning outcomes.
While there is some evidence in the literature that spontaneously generated commitments can lead to self-reported change, the relationship between CTC and self-reported behavior is critically important because self-reported change can be significantly related to actual behavior change.
At a time when CME is searching for meaningful outcomes to demonstrate the efficacy of educational interventions, the author noted that self-reported CTC appears to be a valuable construct: a potential means of reinforcing learning and of measuring positive educational outcomes. Accordingly, the author attempted to answer the following questions:
- Are clinicians who make commitments to change by choosing from a predefined list more likely to report changes in their medical practices than those who are not presented with such a list?
- Are these changes at least as durable as those made spontaneously by a control group? If so, this would present an efficient and highly replicable intervention to enhance CME outcomes. Currently, the literature does not consistently demonstrate the superiority of predefined lecture-derived commitments over those made spontaneously.
During the fall of 2009, email invitations to participate in a CME study were sent to all 4745 clinicians who registered for the Harvard Medical School-sponsored CME meeting "Current Clinical Issues in Primary Care" (Pri-Med) Conference in Boston. This study focused on one 45-minute lecture covering common ambulatory psychiatric issues within a course that offered 16 h of CME credit over 3 days. Those who responded with interest were sent an introductory packet explaining the basic premise of the study. On the day of the lecture, attendees were again invited to participate by enrolling at the back of the auditorium prior to the lecture; this group was randomized by assigning every other enrollee to the intervention group.
No monetary incentive was offered for participation in the study, but all participants received a discount coupon from a medical book publisher as a form of thanks following the intervention. Although recruitment was voluntary, enrollees were randomly assigned to the study groups.
In the invitation, participants were told that if they agreed to participate, they would be assigned to either the intervention or the control group and that those in the intervention group would be asked to stay for 5 min following the lecture.
Immediately following the lecture, participants in the intervention group were instructed to remain and to select up to three commitments, drawn from a predefined list based on the lecturer's objectives, using an Audience Response System (ARS). Participants assigned to the control group were asked to leave the lecture hall without formally making commitments; they were told only that they would be contacted by email in the near future and had no knowledge of the intervention group's tasks.
Seven business days following the lecture, the first follow-up email was sent to both the intervention and control groups. The intervention group's email reminded participants of the commitments they had selected and asked them to self-report whether they had "begun to implement," were "planning to implement," or had "decided not to implement" changes in their practice behaviors related to those CTCs.
The control group's email asked participants to self-report any changes in their practice behaviors as a result of the lecture. Both groups were questioned about their confidence in, and barriers to, behavior change. Additional follow-up emails were sent to both groups at 30 days post-lecture, reminding participants of their commitments and asking them to report whether they had "begun to," were "planning to," or had "decided not to" implement practice changes.
A total of 97 participants responded to the learning-related variables questionnaire, 57 in the intervention group and 40 in the control group. No significant differences were found between the control and intervention groups with respect to the learning-related variables at 7 days post-lecture. Specifically:
- About 91% (73) of participants in the intervention group and 32% (21) of participants in the control group made a CTC regarding their practice behavior at 7 days post-lecture; the proportion making commitments was thus greater in the intervention group than in the control group.
- At 30 days post-lecture, more participants in the intervention group relative to the control group reported contemplating or implementing changes in their practice behaviors (58% vs. 22%).
- When contemplation was excluded and only implementation of changes was considered, more participants in the intervention group than in the control group reported change (49% vs. 19%).
- More participants in the intervention group than in the control group reported moving from contemplating to implementing changes (19% vs. 3%).
This study strongly suggests that CTC processes are more likely to induce behavioral changes that persist over time than activities without a CTC component. The findings also suggest that CTCs can be based on a predefined set of suggested changes and still have a significant impact on reported behavior change.
CTC processes can readily be adapted to a large-audience CME lecture through handheld technologies such as Audience Response Systems, which have demonstrated success in motivating large audiences. With minimal training, learners can incorporate this new paradigm into their education. Furthermore, the database built from each learner's commitments provides a personalized needs assessment of the specific behavior changes that learner must address to improve the care of their patients. Using an ARS in this way meant linking each device to a specific user's responses, which required extra time from participants.
As technology evolves, most challenges to implementing CTC in CME activities will be more easily overcome, allowing further delineation of CTC's efficacy. If, as anticipated, all continuing education activities will need to demonstrate behavior change to qualify for accreditation, effective application of CTC to large audiences could help achieve this goal.
The future success of this approach depends greatly on the appropriate recruitment of learners and speakers, on advances in simple handheld technology, and on the ability to provide prompt, directed feedback that enables learners to better self-assess and to continue to positively evolve their clinical practices based on the CME they receive.
Expanding CTC processes to all lectures at an event will require the acceptance and participation of all of the speakers. This will depend on effective recruitment of these speakers and clear explanation of how the process works, and it must not be seen as an additional burden. Providing presenters with a report of commitments related to their lectures could potentially enhance the quality of future presentations. If a majority of the audience identified a behavior change in one area, this knowledge could serve as a quality improvement method, thereby enhancing future iterations of that lecture.