Promoting sustainability in quality improvement: an evaluation of a web-based continuing education program in blood pressure measurement

Abstract

Background

The accuracy of blood pressure measurement is variable in office-based settings. Even when staff training programs are effective, knowledge and skills decay over time, supporting the need for ongoing staff training. We evaluated whether a web-based continuing education program in blood pressure measurement reinforced knowledge and skills among clinical staff and promoted sustainability of an existing quality improvement program.

Methods

Medical assistants and nurses at six primary care clinics within a health system enrolled in a 30-min online educational program designed to refresh their knowledge of blood pressure measurement. A 20-question pre- and post-intervention survey addressed learners’ knowledge and attitudes. Direct observation of blood pressure measurement technique before and after the intervention was performed. Differences in responses to pre- and post-module knowledge and attitudes questions and in observation data were analyzed using chi-square tests and simple logistic regression.

Results

All 88 clinical staff members participated in the program and completed the evaluation survey. Participants answered 80.6% of questions correctly before the module and 93.4% afterwards (p < 0.01). Scores improved significantly among staff from all job types. Licensed practical nurses and staff who had been in their current job at least a year were more likely to answer questions correctly than registered nurses and those in their current job less than a year. Attitudes toward correct blood pressure measurement were high at baseline and did not improve significantly. Prior to the intervention, staff adhered to 9 of 18 elements of the recommended technique during at least 90% of observations. Following the program, staff were more likely to explain the protocol, provide a rest period, measure an average blood pressure, and record the average blood pressure, but less likely to measure blood pressure with the arm at heart level and use the right arm.

Conclusions

We designed, implemented, and evaluated a web-based educational program to improve knowledge, skills, and attitudes in blood pressure measurement and use of an automated device among nurses and medical assistants in ambulatory care. The program reinforced knowledge related to recommended blood pressure measurement technique.

Trial registration

Retrospectively registered with ClinicalTrials.gov on March 22, 2012; registration number NCT01566864.

Background

Office-based blood pressure measurement is variable, even among clinical staff accustomed to measuring blood pressure [1,2,3]. Failure to adhere to American Heart Association (AHA) blood pressure measurement guidelines has been shown for practicing primary care physicians and nurses [2, 4]. Poor measurement technique can lead to both under- and over-treatment of hypertension, and may contribute to suboptimal control [4]. Lack of supervised training, knowledge of correct technique, and degradation of knowledge and skills have been cited as contributing factors [5,6,7,8]. Staff training programs may minimize measurement error, especially when integrated with the use of automated blood pressure devices [9, 10] as part of comprehensive quality improvement “bundles” [11, 12].

However, implementation and sustainability of quality improvement programs such as these may be hindered by factors such as lack of integration with existing workflow, degradation of knowledge or skills, lack of resources, and competing demands [13]. For example, in the area of blood pressure measurement, decay in staff knowledge and skills necessitates ongoing refresher training in order to ensure program sustainability [14]. Structured ongoing training in correct technique has been effective in improving knowledge and behaviors among clinical staff, but programs described in the literature have been conducted in-person over the course of several hours, consuming both staff and teaching faculty time [1, 5].

Sustainability, defined as “making an innovation routine until it reaches obsolescence,” may be conceptualized as depending on two conditions: institutionalisation and routinisation [15, 16]. Institutionalisation refers to the process by which organizations provide the resources and conditions needed to support a new practice. Routinisation refers to the process by which a new practice becomes a routine activity for workers within that organization. In other words, a program’s sustainability depends both on the extent to which an organization integrates a new practice into its physical infrastructure, policies, and management activities; and on workers developing a culture in which the new practice is treated as, “the way we do things here.” Although both conditions must be present to ensure a new practice’s sustainability, organizational leaders typically have more direct control over processes related to institutionalisation [16].

Web-based training has gained popularity as a modality for delivering up-to-date, standardized, accessible content that can be integrated with existing training systems, facilitating both institutionalisation across multiple sites and routinisation [17]. Web-based training can be disseminated across disparate care centers, offering the promise of delivering targeted, up-to-date training in line with organizational goals [18, 19]. As hospitals and physician practices continue to consolidate, the need for integrated, standardized training will only increase [20]. To our knowledge, no published studies have addressed the use of an online educational program to support sustainable quality improvement across a health system.

We designed, implemented, and evaluated a concise web-based educational program to improve knowledge, skills, and attitudes in blood pressure measurement and use of an automated device among nurses and medical assistants in ambulatory care. In this paper we report on the effectiveness as well as the institutionalisation of this intervention.

Methods

Context and setting

The study was conducted as part of Project ReD CHiP (Reducing Disparities and Controlling Hypertension in Primary Care), a pragmatic trial designed to evaluate multiple interventions for improving blood pressure control and reducing hypertension-related racial disparities conducted in six primary care practices within the Johns Hopkins Community Physicians (JHCP) health system [11].

In 2011, approximately 2 years prior to this study, we trained clinic staff to use the Omron HEM-907XL automated blood pressure measurement device in a manner consistent with AHA guidelines [21]. As part of the practice network’s protocol, staff had been instructed to position patients in accordance with AHA guidelines, to rest patients for 3 min, and then to obtain three consecutive blood pressure measurements using the automated features of the device [21]. The online training program described in this study was developed to address the organization’s need to train new employees and provide ongoing training for existing staff.

Study design

We conducted a pre- and post-intervention assessment of knowledge, attitudes, and behaviors among JHCP staff who participated in this online training program. All certified medical assistants (CMAs), licensed practical nurses (LPNs), and registered nurses (RNs) at intervention practices were invited to enroll. Prior to enrolling in this study, all staff members had already been trained and deemed proficient in performing the ReD CHiP blood pressure measurement process, including use of the automated device, which had become the study clinics’ standard blood pressure measurement technique. Initial training at hire consisted of group didactics and individual tests of proficiency via direct observation by JHCP nurse educators.

Ethics, consent, and permissions

This study was approved and HIPAA waiver granted by the Institutional Review Board at Johns Hopkins University School of Medicine (NA_00037622) and the Johns Hopkins Community Physicians Education Committee. As the procedures were classified by the IRB as comprising quality improvement endeavors, written consent was not obtained.

Continuing education program

The training program consisted of two 15-min videos and 20 pre- and post-module multiple choice questions that were delivered via Course Development (LearnShare LLC, Maumee, Ohio), an online educational platform available to the entire health system. All questions were vetted by program leadership and the web development team. For this study, clinical staff members viewed the videos during a 1-month period (April–May 2013) and were required to score ≥ 80% on knowledge questions to pass. Those who did not score at least 80% were required to re-watch the training modules and re-take the post-module test. Module topics included the rationale behind blood pressure measurement, patient positioning, the importance of a rest period and taking multiple measurements, key features of the automated device, troubleshooting device errors, and measuring blood pressure in emergency settings and in special populations such as obese and physically disabled patients. To promote peer validation, two CMAs from one of the practices demonstrated correct blood pressure measurement techniques in the training videos [13].
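To make the pass/re-take rule concrete, the sketch below shows one way the ≥ 80% threshold and re-take loop could be expressed. The function names and the test-delivery callable are illustrative assumptions, not features of the LearnShare platform.

```python
# Minimal sketch of the pass/re-take rule described above; names and structure
# are illustrative, not taken from the LearnShare platform.

PASS_THRESHOLD = 0.80  # learners had to score >= 80% on the post-module test


def passed(answers: list[bool]) -> bool:
    """Return True if the share of correct answers meets the pass threshold."""
    return sum(answers) / len(answers) >= PASS_THRESHOLD


def complete_module(take_post_test) -> int:
    """Repeat the post-module test until the learner passes; return attempt count.

    `take_post_test` is any callable returning per-question booleans; in the
    actual program, learners re-watched the videos before each re-take.
    """
    attempts = 0
    while True:
        attempts += 1
        if passed(take_post_test()):
            return attempts
```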

Data collection

Knowledge and attitude outcomes

Knowledge and attitude outcomes were assessed using pre- and post-module questions, which consisted of a 15-question knowledge test and a 5-item survey of participants’ attitudes towards the blood pressure measurement program (Additional file 1). The knowledge test covered the importance of blood pressure control, patient positioning, the rest period, and use of the automated device. Attitude questions related to the importance of following AHA guidelines in all patients. A 5-point Likert scale from “1 – strongly agree” to “5 – strongly disagree” was used for responses to attitude questions.

Behavior outcomes

To assess participants’ behavior, research assistants (RAs) observed a semi-random convenience sample of CMAs measuring blood pressure during the 4–8 weeks before and after the intervention, with a goal of observing 2 to 3 patients per CMA at the clinical site. RAs were instructed to obtain observations during both morning and afternoon sessions throughout the week. RAs approached the first patient who presented to the clinic for registration. If the patient agreed to be observed, RAs collected demographic information and accompanied the patient to the exam room, where the RA recorded observations about blood pressure measurement using a structured form (Additional file 2) until the intake process ended. Information about the staff member conducting the blood pressure measurement was not recorded. After completing an observation, the RA approached the next patient presenting for registration and enrolled that patient only if he or she would be seen by a different CMA. To minimize the risk of selection bias and of the Hawthorne effect, RAs were instructed to tell clinic staff that they were observing all aspects of the patient intake process rather than blood pressure measurement specifically. To establish inter-rater reliability among observers, RAs rated four pre-recorded videos of mock office visits prior to deployment to the clinics. Concordance rates among the four RAs and three investigators (LB, SF, RB) were ≥ 85%.
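As a rough illustration of how such a concordance figure can be computed, the sketch below calculates pairwise percent agreement on a yes/no observation checklist. The raters and ratings are hypothetical, not the study’s reliability data.

```python
# Illustrative calculation of pairwise percent agreement among observers who
# rated the same mock-visit checklist items; data and rater names are hypothetical.
from itertools import combinations


def percent_agreement(a: list[bool], b: list[bool]) -> float:
    """Share of checklist items on which two raters gave the same rating."""
    return sum(x == y for x, y in zip(a, b)) / len(a)


# Each rater scores the same items (e.g., "rest period provided") across mock videos.
ratings = {
    "RA1": [True, True, False, True, True],
    "RA2": [True, True, False, True, False],
    "RA3": [True, False, False, True, True],
}

for r1, r2 in combinations(ratings, 2):
    print(r1, r2, f"{percent_agreement(ratings[r1], ratings[r2]):.0%}")
```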

Modifying variables

Staff participant demographic information was gathered from practice network administrative data, including gender, age, race, ethnicity, practice site, time in current job, and job title (CMA vs. LPN vs. RN). Variables for observer, site, and time of day also were recorded with direct observations.

Data analysis

Responses to knowledge questions were classified as correct or incorrect and tallied by individual participant and overall, separately for the pre- and post-module assessments. For behavior outcomes, independent variables were observer and site, and dependent variables were aspects of blood pressure measurement. After the intervention, we learned that an incorrect answer had been keyed as correct in the pre-assessment but not in the post-assessment. The question, regarding patient positioning, gave the correct answer as “the patient’s arm should be held away from his/her body” rather than “the patient’s back should be supported during blood pressure measurement.” This question was therefore removed from the data analysis.

We also conducted a post-hoc, secondary analysis to assess whether the timing of an educational topic, the amount of time allocated to it, or the quality of the educational content was associated with staff members’ knowledge. To assess timing and duration, we coded the time at which each aspect of blood pressure measurement technique was introduced and the amount of time allocated to discussing that technique. To assess the quality of educational content, we first coded whether the module only verbally described an aspect of recommended blood pressure measurement technique or whether it also explained the underlying reason why that aspect of technique is important. We also coded whether the module described a given aspect of technique verbally, visually, with written descriptors, or with a combination of these methods.
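A hedged sketch of how this content-coding scheme might be represented is shown below; the field names and the single example row are assumptions for illustration, not the study’s codebook.

```python
# Hypothetical structure for coding each learning objective in the module;
# field names and example values are illustrative only.
from dataclasses import dataclass, field


@dataclass
class TopicCoding:
    topic: str                      # aspect of BP measurement technique
    introduced_at_sec: int          # when the topic first appears in the module
    duration_sec: int               # time allocated to discussing the topic
    modalities: list[str] = field(default_factory=list)  # "verbal", "visual", "text"
    reason_explained: bool = False  # does the module explain why the step matters?


example = TopicCoding(
    topic="arm supported at heart level",
    introduced_at_sec=210,
    duration_sec=25,
    modalities=["verbal", "visual"],
    reason_explained=True,
)
```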

Differences in responses to pre- and post-module knowledge and attitudes questions and in observation data were analyzed using chi-square tests and simple logistic regression. STATA 11IC (Stata Corp, College Station, TX) was used to perform all analyses.
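For readers who want to reproduce the general approach, the sketch below runs a chi-square test and a simple logistic regression on an illustrative pre/post table of correct versus incorrect answers. The counts are back-calculated from the reported percentages and are not the study dataset, and the code uses Python rather than the Stata commands actually employed.

```python
# Hedged sketch of the analyses named above, on illustrative counts
# back-calculated from the reported percentages (80.6% pre, 93.4% post,
# 88 participants x 14 scored questions); not the study's actual data.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
import statsmodels.formula.api as smf

# Rows: pre-module, post-module. Columns: correct, incorrect.
table = np.array([[993, 239],
                  [1151, 81]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.4f}")

# Simple logistic regression: odds of a correct answer, post vs. pre module.
df = pd.DataFrame({
    "correct": [1] * 993 + [0] * 239 + [1] * 1151 + [0] * 81,
    "post":    [0] * 1232 + [1] * 1232,
})
fit = smf.logit("correct ~ post", data=df).fit(disp=False)
print(fit.params, fit.pvalues)
```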

Results

All 88 invited clinical staff members completed the continuing education program, including the pre- and post-test (100% response; Table 1). Ninety-eight percent of participants were female, and 73% were younger than 46 years of age. Seventy-two percent were CMAs, 22% were RNs, and 7% were LPNs. Most (52%) had been in their current job for at least 3 years. On average, module completion took 26.9 min.

Table 1 Characteristics of participants in the educational program

Knowledge and attitudes

Mean attitude scores toward correct blood pressure measurement were positive at baseline (mean score 4.2 out of 5) and did not improve significantly post-module (mean score 4.3, p = 0.33) (Table 2, Additional file 3). Participants answered 80.6% of knowledge questions correctly before viewing the module and 93.4% correctly after viewing the module (p < 0.01) (Table 1). Scores improved significantly among staff from all job types. LPNs and staff who had been in their current job at least a year were more likely to answer questions correctly than RNs (p < 0.01) and those in their current job less than a year, respectively (p = 0.04). Demographic factors including age, gender, race, and ethnicity were not associated with higher scores. No baseline differences in knowledge were found between the 3 sites with predominantly underserved populations and the other 3 sites. No correlation was seen between the time allocated to a learning objective in the video and the score on the knowledge assessment. All learning objectives were presented using verbal plus visual and/or text-based instruction, and all except one provided the reasoning behind the technique as well as the recommendation.

Table 2 Assessment of participants’ knowledge and attitudes, pre- and post-program, N = 88

The greatest improvement in knowledge was seen in questions pertaining to measuring blood pressure in patients wearing long sleeves, handling device errors, turning the device off, measuring blood pressure in obese patients, and selecting the appropriate cuff size.

Behaviors

Two hundred ten observations were completed: 122 (2.7 observations per CMA) prior to the continuing education program and 88 (2 observations per CMA) afterwards (Table 3). Prior to the educational program, staff adhered to the following elements of the protocol during at least 90% of observations: no vaccine given during blood pressure measurement (100%), no fingerstick during blood pressure measurement (97%), back supported (98%), legs uncrossed (94%), correct cuff size used (97%), bare arm or thin sleeve used (93%), arm at heart level (98%), P-set to auto (98%), and patient in position throughout measurement (93%). Following the training program, staff were significantly more likely to explain the protocol to patients (59% pre- vs. 80% post-training, p < 0.01), measure an average blood pressure (66% vs. 97%, p < 0.01), provide a rest period prior to blood pressure measurement (66% vs. 85%, p < 0.01), and record in the electronic medical record (EMR) the average of 3 readings (63% vs. 95%, p < 0.01). Following training, staff were less likely to measure blood pressure with the arm at heart level (98% vs. 69%, p < 0.01) and to use the right arm to measure blood pressure (64% vs. 42%, p < 0.01); patients were less likely to remain still during cuff inflation (84% vs. 53%, p < 0.01).

Table 3 Assessment of participants’ blood pressure measurement technique, pre- and post-program (N = 210)
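To illustrate how one of the pre/post behavior comparisons above could be tested, the sketch below applies a chi-square test to counts approximated from the reported percentages for “explained the protocol” (59% of 122 observations pre-training vs. 80% of 88 post-training). These counts are approximations, not the raw observation dataset.

```python
# Illustrative chi-square test for a single observed behavior, using counts
# approximated from the reported percentages; not the study's raw observation data.
from scipy.stats import chi2_contingency

explained_protocol = [
    [72, 122 - 72],  # pre-training:  ~59% of 122 observations
    [70, 88 - 70],   # post-training: ~80% of 88 observations
]
chi2, p, dof, expected = chi2_contingency(explained_protocol)
print(f"explained protocol: chi-square = {chi2:.2f}, p = {p:.4f}")
```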

Discussion

Provider and staff education is a critical component of implementing and institutionalizing quality improvement programs. Health system leaders must disseminate information about new initiatives, protocols, and techniques effectively in order to ensure that clinicians and staff members provide standardized care across settings [18]. Major barriers to delivering effective training include the logistical difficulties and expense associated with maintaining an adequate pool of educators, coordinating training sessions with providers’ clinical schedules, standardizing training across sites, and ensuring ongoing training for staff. Online educational platforms can help organizations address these gaps. However, little is known about how these tools can facilitate adoption, implementation, and sustainability of quality improvement initiatives [10, 16].

This study highlights important features of online education through the perspective of a program designed to improve blood pressure measurement. Knowledge of correct blood pressure measurement technique improved significantly among both medical assistants and nurses at these practices following a brief online continuing education program. Direct observation data revealed that, following completion of the program, clinical staff were significantly more likely to explain the protocol to patients, provide a rest period prior to blood pressure measurement, and measure and record the average blood pressure, but less likely to measure blood pressure with the arm at heart level and to use the right arm to measure blood pressure.

Sustainable healthcare interventions tend to be those that have a defining mission, are well integrated into the organizational culture, have an institutional champion, are adaptable, and can be conducted without straining existing resources [22, 23]. Our program had a succinct goal, involved senior leadership in the organization, was integrated with the existing training structure both at the time of hire and for ongoing training, and included a platform for rapid dissemination without a strain on resources [11]. The educational modules and the implementation processes are modifiable when an issue arises, such as an incorrectly keyed answer or new medical information. Since adoption, this program has been used annually for 4 years to train and retrain staff at the six sites, demonstrating the routinisation of the initiative.

We found baseline knowledge gaps in basic blood pressure measurement among active clinical staff, even though all had been deemed proficient in the ReD CHiP program’s techniques at program implementation or at the time of hire (if they were employed after implementation). This is consistent with prior literature demonstrating knowledge gaps among outpatient staff [1, 2]. The greatest improvement in knowledge was seen for items modeled using multiple modalities in the educational program, including verbal, text, and visual cues: patient positioning, selecting the appropriate cuff size, and explaining the process to the patient.

The training program did not improve attitudes towards accurate blood pressure measurement. Baseline attitudes toward accurate blood pressure measurement were quite high, perhaps due to ongoing training associated with the quality improvement project. Formats other than online training may be more effective in changing attitudes toward effective blood pressure measurement. Awareness of organization and site-level barriers and tailoring interventions to address these obstacles may be one way to change attitudes and behaviors of clinical staff [24].

Direct observation data revealed that completing the training program was associated with improvement in certain steps of the blood pressure protocol, including explaining the protocol to patients, providing a rest period, using average mode, and recording the average reading in the EMR. Several of these were linked behaviors, as using average mode on the device entailed a rest period and calculated the average blood pressure for recording in the EMR. Subsequent to the training, staff were less likely to use the right arm for blood pressure measurement and to support the arm at heart level. This latter finding may be due to the incorrect answer keyed in the pre-assessment; failure to emphasize this information in the training program may also be responsible. Studies of both online and in-person training programs have described a decline in immediate post-program knowledge gains back to pre-training values several months after the program, indicating a role for supervised practice to promote knowledge retention [25, 26].

Limitations of this study include the short follow-up time for assessing knowledge, attitude, and behavior outcomes, and the restriction to a single health system and a single specialty. We do not know the generalisability of our findings to other specialties or practices, nor whether knowledge gains were maintained among staff. As this training program was purely an educational intervention and did not address other potential barriers to blood pressure control such as knowledge of guidelines and clinical inertia, we do not know whether improvements in knowledge and behaviors will translate into improved blood pressure control. Prior studies have shown that adherence to AHA blood pressure guidelines impacts treatment decisions as well as measured blood pressure values [2]. The Hawthorne effect is a concern with any direct observation data; we tried to mitigate it by not revealing to clinical staff that the blood pressure measurement process was being observed. Different observers conducted the pre- and post-program observations, which may have introduced measurement bias; we tried to minimize this through a rigorous training program and quality-control assessment using mock observations.

Conclusions

In conclusion, a brief online continuing education program, integrated into an existing organizational training structure, was associated with improvements in knowledge of correct blood pressure measurement techniques, in explaining the protocol to patients, and in taking multiple measurements using an automated blood pressure device among medical assistants and nurses at six practice sites. To our knowledge, this is the first published study to address use of an online educational program towards sustainable quality improvement across a health system. Such training holds promise for promoting the sustainability of education that requires ongoing reinforcement of knowledge. However, improving attitudes and skills may require a multimodal approach, including a combination of web-based and live training tailored to the existing organizational culture.

Abbreviations

AHA:

American Heart Association

CMA:

Certified medical assistant

EMR:

Electronic medical record

JHCP:

Johns Hopkins Community Physicians

LPN:

Licensed practical nurse

Project ReD CHiP:

Reducing Disparities and Controlling Hypertension in Primary Care

RA:

Research assistant

RN:

Registered nurse

References

  1. Bolen SD, Samuels TA, Yeh HC, et al. Failure to intensify antihypertensive treatment by primary care providers: a cohort study in adults with diabetes mellitus and hypertension. J Gen Intern Med. 2008;23(5):543–50.

  2. Ray GM, Nawarskas JJ, Anderson JR. Blood pressure monitoring technique impacts hypertension treatment. J Gen Intern Med. 2012;27(6):623–9.

  3. Dickson BK, Hajjar I. Blood pressure measurement education and evaluation program improves measurement accuracy in community-based nurses: a pilot study. J Am Acad Nurse Pract. 2007;19:93–102.

  4. Sebo P, Pechere-Bertschi A, Herrmann FR, Haller DM, Bovier P. Blood pressure measurements are unreliable to diagnose hypertension in primary care. J Hypertens. 2014;32:509–17.

  5. Myers MG, Valdivieso MA. Use of an automated blood pressure recording device, the BpTRU, to reduce the “white coat effect” in routine practice. Am J Hypertens. 2003;16(6):494–7.

  6. Baillie L, Curzio J. A survey of first year student nurses’ experiences of learning blood pressure measurement. Nurse Educ Pract. 2009;9:61–71.

  7. Armstrong RS. Nurses’ knowledge of error in blood pressure measurement. Int J Nurs Pract. 2002;8(3):118–26.

  8. Smith KK, Gilcreast D, Pierce K. Evaluation of staff’s retention of ACLS and BLS skills. Resuscitation. 2008;78(1):59–65.

  9. Grim CM, Grim CE. A curriculum for the training and certification of blood pressure measurement for health care providers. Can J Cardiol. 1995;11(H):38–42.

  10. Myers MG, Goodwin M, Dawes M, et al. Conventional versus automated measurement of blood pressure in primary care patients with systolic hypertension: randomised parallel design controlled trial. BMJ. 2013;342:d286.

  11. Cooper LA, Marsteller JA, Noronha GJ, et al. A multi-level system quality improvement intervention to reduce racial disparities in hypertension care and control: study protocol. Implement Sci. 2013;8:60.

  12. Umscheid CA, Townsend RR. Is it time for a blood pressure measurement “bundle”? J Gen Intern Med. 2012;27(6):615–7.

  13. Stirman SW, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012;7:17.

  14. Bruce NG, Shaper AG, Walker M, Wannamethee G. Observer bias in blood pressure studies. J Hypertens. 1988;6:375–80.

  15. Greenhalgh T, Macfarlane F, Barton-Sweeney C, Woodard F. If we build it, will it stay? A case study of the sustainability of whole-system change in London. Milbank Q. 2012;90(3):516–47.

  16. Slaghuis SS, Strating MM, Bal RA, Nieboer AP. A framework and a measurement instrument for sustainability of work practices in long-term care. BMC Health Serv Res. 2011;11:314.

  17. Ruiz JG, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med. 2006;81(3):207–12.

  18. Compas C, Hopkins KA, Tonsley E. Best practices in implementing and sustaining quality of care: a review of the quality improvement literature. Res Gerontol Nurs. 2008;1(3):209–16.

  19. Cook DA. Where are we with web-based learning in medical education? Med Teach. 2006;28(7):594–8.

  20. Gaynor M, Town R. The impact of hospital consolidation-update. Robert Wood Johnson Foundation, The Synthesis Project, Policy Brief 9. 2012. http://www.rwjf.org/content/dam/farm/reports/issue_briefs/2012/rwjf73261. Accessed 17 Dec 2017.

  21. Pickering TG, Hall JE, Appel LJ, et al. Recommendations for blood pressure measurement in humans and experimental animals. Part 1: blood pressure measurement in humans: a statement for professionals from the Subcommittee of Professional and Public Education of the American Heart Association Council on High Blood Pressure Research. Hypertension. 2005;45:142–61.

  22. Ringerman E, Flint LJ, Hughes DE. An innovative educational program: the peer competency validator model. J Nurs Staff Dev. 2006;22(3):114–21.

  23. Handler J, Lackland DT. Translation of hypertension treatment guidelines into practice: a review of implementation. J Am Soc Hypertens. 2011;5(4):197–207.

  24. Baker R, Camosso-Stefinovic J, Gillies C, et al. Tailored interventions to overcome identified barriers to change: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2010;(3):CD005470. https://doi.org/10.1002/14651858.CD005470.pub2.

  25. Bell DS, Harless CE, Higa JK, et al. Knowledge retention after an online tutorial: a randomized educational experiment among resident physicians. J Gen Intern Med. 2008;23(8):1164–71.

  26. Gass DA, Curry L. Physicians’ and nurses’ retention of knowledge and skills after training in cardiopulmonary resuscitation. Can Med Assoc J. 1983;128:550–1.

Acknowledgements

The authors wish to acknowledge invaluable assistance from Lindsey Hutzler, Gabriel Shaya, and Mekam Okoye, who performed direct observations at clinic sites.

Funding

This work was funded by grants from the National Heart, Lung, and Blood Institute (P50 HL0105187 and K24 HL83113).

Availability of data and materials

The de-identified dataset supporting the conclusions of this article is included as a supplementary file. Training videos are available upon request submitted to the corresponding author.

Author information

Authors and Affiliations

Authors

Contributions

LB, SF, LC, CL, TH, KD, and RB designed the study. LB, SF, and RB participated in data collection. LB, RB, SF, and LC participated in data analysis and interpretation. LB and RB drafted the article. All authors approved the final manuscript.

Corresponding author

Correspondence to Lauren Block.

Ethics declarations

Ethics approval and consent to participate

This study was approved, and a HIPAA waiver granted, by the Institutional Review Board (IRB) at Johns Hopkins University School of Medicine (NA_00037622) and by the Johns Hopkins Community Physicians Education Committee. Because the IRB classified the procedures as quality improvement endeavors, the need for written consent was waived. All of the clinical staff members who appeared in the videos signed media release forms. Verbal consent was obtained from the patients who were observed during intake, using an IRB-approved patient script for requesting permission to observe. As per the IRB-approved CMA Observation Protocol, we did not obtain consent from the CMAs to observe them.

Consent for publication

No identifying information relating to individual participants is included herein.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional file 1:

Pre- and post-intervention surveys are the questionnaires used to assess knowledge and attitudes of clinical staff before and following completion of the continuing education program. (DOCX 3714 kb)

Additional file 2:

Observation Form is the form used by RAs to record demographic information and observations about blood pressure measurement during the intake process. (DOCX 17 kb)

Additional file 3:

Knowledge assessment data is the deidentified dataset containing the results of our pre- and post-module knowledge assessment. (XLS 441 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Block, L., Flynn, S.J., Cooper, L.A. et al. Promoting sustainability in quality improvement: an evaluation of a web-based continuing education program in blood pressure measurement. BMC Fam Pract 19, 13 (2018). https://doi.org/10.1186/s12875-017-0682-5
