Increasing the satisfaction of general practitioners with continuing medical education programs: A method for quality improvement through increasing teacher-learner interaction
© Gercenshtein et al; licensee BioMed Central Ltd. 2002
Received: 28 May 2002
Accepted: 20 August 2002
Published: 20 August 2002
Continuing medical education (CME) for general practitioners relies on specialist-based teaching methods in many settings. Formal lectures by specialists may not meet the learning needs of practitioners and may cause dissatisfaction with traditional CME. Increasing learner involvement in teaching programs may improve learner satisfaction.
A quality improvement program for CME for 18 general practitioners in the Tel Aviv region was designed in response to dissatisfaction with traditional CME activities. A two-step strategy for change was developed. First, the CME participants selected the study topics relevant to them from a needs assessment and prepared background material on those topics. Second, specialist teachers were invited to answer questions arising from the preparation of the selected topics. Satisfaction with the traditional lecture program and with the new participatory program was assessed by questionnaire. The quality criteria included the relevance, importance, and applicability of the chosen CME topic to the participant's practice; the clarity of the presentation; the effective use of teaching aids by the lecturer; and the lecturer's potential to serve as a consultant to the participant.
The participatory model of CME significantly increased satisfaction with relevance, applicability and interest in CME topics compared to the traditional lecture format.
Increased learner participation in the selection and preparation of CME topics, and increased interaction between CME teachers and learners, result in increased satisfaction with teaching programs. Future study of the effect of this model on physician performance is required.
Keywords: continuing medical education, family medicine, general practitioners, lectures, quality improvement
General practitioners in many settings rely heavily on specialist-based continuing medical education (CME) methods. These include direct consultation with experts, reviews in journals and textbooks, and formal continuing education activities [1–3]. A traditional hierarchical relationship results in a one-way transfer of knowledge from specialists to general practitioners. However, general practitioners may wish to control their own educational agenda and to inform specialists of their learning needs. Learning centered on clinical cases is likely to be of greater use to family physicians than formal lectures. Some specialists may regard lectures as the principal method of transferring information, and few may have given serious consideration to alternative teaching methods. The disparity between what general practitioners want to learn from specialists and what specialists think they need can be a barrier to effective educational interaction if there is no negotiation between teachers and learners. This paper describes a quality improvement program that introduced a new method of CME with the objective of increasing satisfaction with CME in a group of board-certified general practitioners in Israel, in order to bridge this needs/wants gap.
Context of the problem
A group of 18 board-certified family physicians has been participating in a continuing medical education course in pediatric medicine in Tel Aviv, Israel for several years. The physicians had a mean seniority of 8.3 years in practice (s.d. 6.8 years). In the initial phase of the course, board-certified specialists in pediatrics were invited to give weekly lectures to the group. The course began with three traditional lectures on various pediatric topics, each followed by questions from the audience.
Outline of problem
A high degree of dissatisfaction was noted among participants in the traditional CME program, based on responses to open questions on a feedback sheet collected at the end of each weekly session. As a result, a decision was made to modify the teaching program.
Key measures for improvement
The course participants set the objectives for quality improvement in the teaching program after the start of the course. They chose to assess the relevance, importance, and applicability of the chosen CME topic to the participant's practice; the clarity of the presentation; the effective use of teaching aids by the lecturer; and the lecturer's potential to serve as a consultant to the participant.
Strategy for change
As a first step in modifying the CME program, the participants listed the pediatric topics that were most important to them (needs assessment). Three participants were selected to prepare each topic for presentation and to lead group discussion on it. One physician was designated to present theoretical material from the pediatric literature, the second to present the evidence base for the topic, and the third to present a relevant case from clinical practice. Important and unanswered questions from the discussion were recorded.
In a second step, the specialist invited to lecture to the group on a selected topic received the participants' questions that arose in the group discussion of that topic. The lecturer was free to construct the presentation as they chose but was expected to address the questions provided in advance. The lecture was divided into short segments to allow for several periods of free discussion and questions.
Process of gathering information
The interactive lectures were evaluated by completion of a feedback questionnaire examining the quality criteria described above. The questionnaire consisted of 6 questions relating to the ideal performance of a specialist lecturing to family physicians. Responses were given on a six-point Likert-type scale with a score of six denoting strongest agreement with the item. Space was provided for additional free-text comments at the end of the questionnaire. The results of the feedback from the two course periods (3 frontal lectures and 7 interactive lectures) were compared. Data from the questionnaires were entered and analysed using Epi-Info software. Mean scores for each question were compared using t-tests with significance set at the 0.05 level.
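The comparison described above can be sketched in code. The following is an illustrative example only, not the authors' Epi-Info analysis: it computes an independent two-sample (Welch's) t statistic on mean Likert scores for the two lecture formats. The response data below are synthetic, invented for the sketch; only the scale (six points, 6 = strongest agreement) and group size (n = 18) come from the text.

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom."""
    na, nb = len(a), len(b)
    mean_a, mean_b = sum(a) / na, sum(b) / nb
    var_a = sum((x - mean_a) ** 2 for x in a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (nb - 1)
    se2 = var_a / na + var_b / nb            # squared standard error of the difference
    t = (mean_b - mean_a) / math.sqrt(se2)   # positive t => group b scored higher
    # Welch-Satterthwaite approximation for degrees of freedom
    df = se2 ** 2 / ((var_a / na) ** 2 / (na - 1) + (var_b / nb) ** 2 / (nb - 1))
    return t, df

# Synthetic six-point Likert responses (6 = strongest agreement), one item,
# 18 respondents per condition -- NOT the study's actual data.
frontal = [3, 4, 2, 3, 4, 3, 2, 4, 3, 3, 4, 2, 3, 3, 4, 3, 2, 3]
interactive = [5, 6, 5, 4, 6, 5, 5, 6, 4, 5, 6, 5, 5, 4, 6, 5, 5, 6]

t, df = welch_t(frontal, interactive)
print(f"t = {t:.2f}, df = {df:.1f}")
```

In practice a statistics package (such as the Epi-Info software named in the text) would also report the P value; the sketch stops at the test statistic to stay self-contained.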
Effects of change
Mean scores of items on the satisfaction-with-CME scale (n = 18); significance of the difference between interactive and frontal lectures:

- The topic of the lecture was relevant to my practice. (P < 0.005)
- I would consult the lecturer regarding one of my patients. (P < 0.005)
- I learned theoretical material on an important topic. (P < 0.05)
- I learned new things that I can apply in my practice. (P < 0.005)
- The lecturer presented the material in an interesting way. (P < 0.005)
- The lecturer used audiovisual aids in an effective way. (P < 0.05)
Analysis and interpretation
The results show significantly higher satisfaction with interactive lectures compared to frontal lectures in all categories. Although the involvement of the course participants in setting the objectives for the intervention, and the construction of the intervention after the start of the course, may be considered sources of bias in a classical research design, they are inherent in a quality improvement program. The disadvantages of the new method were described in the free-text comments included in the participants' feedback. The new method required considerable time for preparation of the teaching sessions and discussions, as well as the goodwill and cooperation of both the participants and the lecturers. Some family physicians did not participate actively in the discussions but felt that the sessions contributed more to them than the old method did. Occasionally the discussion focused on trivial issues, or on issues relevant only to a minority of the group, yet these occupied the group's time.
Adult learning theory and theories of how professionals maintain and develop competence emphasize the importance of self-directed learning and point to clinical practice and problem solving as key areas of interest. Many studies have shown that patient care provokes frequent information needs. General practitioners often rely on specialist-based CME because textbooks, journals, and other existing information tools are not adequate for answering clinical questions that arise in practice. Textbooks may become outdated quickly, journals are often not useful in daily practice, and both methods are time-consuming and expensive [10, 13]. Computer systems that have been developed to help doctors are not widely used [10–12]. General practitioners have clear views of the content and style of teaching they wish to receive from their specialist colleagues. The satisfaction questionnaire developed by the family physicians in this program reflected a wish that teaching be directly related to their clinical work.
In this quality improvement program, the family physicians' desire for two-way interaction and for effective mutual education was achieved by family physicians expressing clearly what they wanted from their specialist colleagues and by specialists developing greater educational expertise. Prior needs assessment is important for informing and directing the educational process. 
Two models of educational interaction between family physicians and specialists are described here. The first model is based on traditional didactic lectures given by specialists to general practitioners. General practitioners may dislike didactic lectures, but specialists often prefer this method. The second model consists of interactive sessions centered on clinical cases. This model was popular with both family physicians and specialists. Heale et al. found that traditional lectures and large group lectures were the least preferred method of CME. A transition from passive to interactive learning groups was also recommended in another study.
Systematic reviews [17–21] of educational interventions have shown that continuing medical education can improve clinical performance and patient outcomes by changing doctors' behavior. The most effective methods described in these reviews include learning linked to clinical practice, interactive educational meetings, outreach events, and strategies that involve multiple educational interventions (for example, outreach plus reminders). The least effective methods are those most commonly used in general practice continuing medical education, namely, lecture-format teaching and unsolicited printed material (including clinical guidelines).
This report has described one method of increasing satisfaction with CME by increasing interaction between teachers and learners. Further study is required to test the association between the increase in satisfaction and changes in physician knowledge, competence and performance in this population.
- Slawson DC, Shaughnessy AF: Obtaining useful information from expert based sources. BMJ. 1997, 314: 947-949.
- Ely JW, Osheroff JA, Ebell MH, Bergus GR, Levy BT, Chambliss ML, Evans ER: Analysis of questions asked by family doctors regarding patient care. BMJ. 1999, 319: 358-361.
- Cantillon P, Jones R: Does continuing medical education in general practice make a difference?. BMJ. 1999, 318: 1276-1279.
- Little P: What do Wessex general practitioners think about the structure of hospital vocational training?. BMJ. 1994, 308: 1337-1339.
- Marshall MN: Qualitative study of educational interaction between general practitioners and specialists. BMJ. 1998, 316: 442-445.
- Bayley TJ: The hospital component of vocational training for general practice. BMJ. 1994, 308: 1339-1340.
- Marshall MN: How well do general practitioners and hospital consultants work together? A qualitative study of cooperation and conflict within the medical profession. Br J Gen Pract. 1998, 48: 1379-1382.
- Marshall MN: How well do GPs and hospital consultants work together? A survey of the professional relationship. Fam Pract. 1999, 16: 33-38. 10.1093/fampra/16.1.33.
- Holm HA: Quality issues in continuing medical education. BMJ. 1999, 318: 1276-1279.
- Smith R: What clinical information do doctors need?. BMJ. 1996, 313: 1062-1068.
- Swinglehurst DA, Pierce M, Fuller JC: A clinical informaticist to support primary care decision making. Qual Health Care. 2001, 10: 245-249. 10.1136/qhc.0100245...
- Ely JW, Osheroff JA, Ebell MH, Chambliss ML, Vinson DC, Stevermer JJ, Pifer EA: Obstacles to answering doctors' questions about patient care with evidence: qualitative study. BMJ. 2002, 324: 710. 10.1136/bmj.324.7339.710.
- Chambliss ML, Conley J: Answering clinical questions. J Fam Pract. 1996, 43: 140-144.
- Gelula MH, Sandlow LJ: Use of focus groups for identifying specialty needs of primary care physicians. J Contin Educ Health Prof. 1998, 18: 224-226.
- Heale J, Davis D, Norman G, Woodward C, Neufeld V, Dodd P: A randomized controlled trial assessing the impact of problem-based versus didactic teaching methods in CME. Proc Annu Conf Res Med Educ. 1988, 27: 72-77.
- Eliasson G, Mattsson B: From teaching to learning. Experiences of small CME group work in general practice in Sweden. Scand J Prim Health Care. 1999, 17: 196-200. 10.1080/028134399750002403.
- Davis DA, Thomson MA, Oxman AD, Haynes RB: Evidence for the effectiveness of CME. A review of fifty randomised controlled trials. JAMA. 1992, 268: 1111-1117. 10.1001/jama.268.9.1111.
- Davis D: Does CME work? An analysis of the effect of educational activities on physician performance or health care outcomes. Int J Psychiatry Med. 1998, 28: 21-39.
- Oxman AD, Thomson MA, Davis DA, Haynes RB: No magic bullets: a systematic review of 102 trials of interventions. Can Med Assoc J. 1995, 153: 1423-1427.
- Davis DA, Thomson MA, Oxman AD, Haynes RB: Changing physician performance: a systematic review of continuing medical education strategies. JAMA. 1995, 274: 700-705. 10.1001/jama.274.9.700.
- Kerwick S, Jones RH: Educational interventions in primary care psychiatry: a review. Primary Care Psychiatry. 1996, 2: 107-117.
- The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1471-2296/3/15/prepub
This article is published under license to BioMed Central Ltd. This is an Open Access article: verbatim copying and redistribution of this article are permitted in all media for any purpose, provided this notice is preserved along with the article's original URL.