Published on 27.07.2020 in Vol 8, No 7 (2020): July

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/16764.
Embedding “Smart” Disease Coding Within Routine Electronic Medical Record Workflow: Prospective Single-Arm Trial


Original Paper

Corresponding Author:

Dee Mangin, MBChB, DPH, MRNZCGP, FRNZCGP

Department of Family Medicine

McMaster University

David Braley Health Sciences Centre, 5th Floor

100 Main Street West

Hamilton, ON, L8P 1H6

Canada

Phone: 1 905 525 9140 ext 21219

Email: mangind@mcmaster.ca


Background: Electronic medical record (EMR) chronic disease measurement can help direct primary care prevention and treatment strategies and plan health services resource management. Incomplete data and poor consistency of coded disease values within EMR problem lists are widespread issues that limit primary and secondary uses of these data. These issues were shared by the McMaster University Sentinel and Information Collaboration (MUSIC), a primary care practice-based research network (PBRN) located in Hamilton, Ontario, Canada.

Objective: We sought to develop and evaluate the effectiveness of new EMR interface tools aimed at improving the quantity and the consistency of disease codes recorded within the disease registry across the MUSIC PBRN.

Methods: We used a single-arm prospective trial design with preintervention and postintervention data analysis to assess the effect of the intervention on disease recording volume and quality. The MUSIC network holds data on over 75,080 patients, 37,212 of whom are currently rostered. Four MUSIC network clinician champions were involved in gap analysis of the disease coding process and in the iterative design of new interface tools. We leveraged terminology standards and factored EMR workflow and usability into a new interface solution that aimed to optimize code selection volume and quality while minimizing physician time burden. The intervention was integrated into usual clinical workflow during routine billing activities.

Results: After implementation of the new interface (June 25, 2017), we assessed the disease registry codes at 3 and 6 months (intervention period) and compared their volume and quality with preintervention levels (baseline period). A total of 17,496 International Classification of Diseases, 9th Revision (ICD9) code values were recorded in the disease registry during the 11.5-year (2006 to mid-2017) baseline period. A large gain in disease recording occurred in the intervention period (8516/17,496, 48.67% over baseline), bringing the total to 26,774 codes. The coding rate rose from a baseline average of 127 codes per month to 1546 codes per month in the intervention period, an increase of 1419 codes per month, or a factor of 11.2. The proportion of preferred ICD9 codes increased by 17.03 percentage points in the intervention period (baseline: 11,007/17,496, 62.91% vs intervention: 7417/9278, 79.94%; χ²₁=819.4; P<.001). A total of 45.03% (4178/9278) of disease codes were entered by way of the new screen prompt tools, with a significant increase between quarters (Jul-Sep: 2507/6140, 40.83% vs Oct-Dec: 1671/3148, 53.08%; χ²₁=126.2; P<.001).

Conclusions: The introduction of clinician co-designed, workflow-embedded disease coding tools is a highly effective solution to the issues of low disease coding rates and poor code consistency in EMRs. The substantial effect observed in a routine care environment demonstrates usability, and the intervention details described here should be generalizable to other settings. Significant improvements in problem list coding within primary care EMRs can be realized with minimal disruption to routine clinical workflow.

JMIR Med Inform 2020;8(7):e16764

doi:10.2196/16764


Primary care is at the center of health care delivery and coordination and is critically positioned to achieve better population health outcomes and address health inequity within clinical care [1,2]. Chronic disease and multimorbidity are increasingly prevalent in primary care populations [3-6]. Chronic disease identification at the individual level helps to inform better patient care and flags the potential burden of illness and of patients’ care experience. Chronic disease measurement at the practice and population level can help direct prevention strategies and plan health services resource management [3,4,7,8].

The uptake of electronic medical records (EMRs) internationally is high [9]. In Canada, 83% of primary care physicians are using EMRs [10]. Data within primary care EMRs support care for the individual patient. Aggregated, these data may also support practice-based and population health initiatives to understand, target, and deliver care [11], supporting both epidemiological research and quality improvement [11-13]. The Canadian Primary Care Sentinel Surveillance Network (CPCSSN) is one of several national networks that aggregate EMR data to support this work [7,8,14,15]. However, data completeness and consistency of coded values within EMR problem lists or disease registries limit primary and secondary uses of these data [4,16-20].

Primary care clinicians manage, on average, 3 problems per 10- to 15-minute consultation. They have limited time to devote to clinical encounter tasks and even less time for additional data recording and quality tasks that do not relate to individual patient care workflow [21,22]. Primary care physicians spend around half of their clinic time on EMR tasks, plus 1 to 2 hours of after-clinic work [21,22]. Administrative tasks, including billing, account for around half of the time spent interacting with the EMR.

Primary care practice-based research networks (PBRNs) are clinician collectives focused on asking and answering research questions relevant to their practice context, often using aggregate, routinely collected EMR data. A PBRN offers an ideal setting to imagine and trial interventions that could improve data quality, while not interrupting clinician workflow.

The McMaster University Sentinel and Information Collaboration (MUSIC) PBRN in Hamilton, Ontario, Canada, contributes deidentified EMR data to the CPCSSN national network. Validated algorithms estimate chronic disease prevalence using disease registry codes, billing codes, and medication data [23]. MUSIC network data showed a low prevalence of, and variability in, disease registry codes relative to the patient population being served.

Our network has been previously successful in implementing an automated, electronic sentinel influenza reporting program integrated into the EMR [24]. We hypothesized that, if co-designed with clinicians, embedding “smart” disease recording within usual EMR clinical workflow could improve disease registry coding volume and quality without any significant burden for clinicians. In this paper, we describe the design and development of disease coding tools embedded within the EMR and the results of a trial of their implementation, assessing their effect on disease code volume and consistency.


We conducted a pragmatic trial of an intervention aimed at improving the quantity and the consistency of coded disease data recorded within the disease registry across the MUSIC PBRN.

Setting

The study was set within the MUSIC practice-based research network. The MUSIC network holds data on over 75,080 patients, 37,212 currently rostered, from a broad range of neighborhoods within Hamilton, Ontario, Canada, and the surrounding area. All clinicians use the open source EMR, Open Source Clinical Application and Resources (OSCAR).

Study Design

We used a single-arm prospective trial design with preintervention and postintervention data analysis to assess the effect of the intervention on disease recording volume and quality.

Intervention Development

We discussed the project rationale with project stakeholders, including clinicians, clinic executives, and MUSIC network staff, to establish project support. There were 5 key aspects to our intervention development: literature review, as-is state investigation of the EMR interface, user engagement in design, standardization of disease codes, and iterative prototype feedback cycles.

Literature Review

We first conducted a nonexhaustive literature review to inform the interface design, noting barriers and facilitators for EMR meaningful use [13,25-28]. Prior research demonstrated the concept of leveraging billing workflow for disease-related data improvement [18] and the disease code morbidities most relevant to primary care [27].

As-Is State Investigation

The research team investigated the EMR interface for disease data capture within the OSCAR disease registry and within the billing module. Multiple disease registry issues were flagged, including poor visibility of the disease recording tools, which required navigating away from the main documentation workflow. International Classification of Diseases, 9th Revision (ICD9) code selection was cumbersome because of nonintuitive term names arranged in a large, flat list that lacked organization.

The billing module is an obligatory part of clinical workflow and requires use of provincially issued diagnostic billing codes. The disease coding component of the billing module was explored for its capacity to be leveraged in disease registry code capture, and challenges to this plan were detected. Similar to the ICD9 coding tools, tools for selecting billing codes lacked clinician-friendly naming, quick-pick lists, or an easy method for search and selection of common conditions. Provincial diagnostic billing codes often lacked specificity, bundling several related conditions together, precluding their use in specific disease identification. Of particular note, the last inputted diagnostic code used to bill the previous patient encounter remained populated in the field, satisfying that portion of the data entry criteria for the billing process and providing little incentive for clinicians to choose the diagnostic code best matched to the current patient encounter.

User Engagement in Design

We engaged 4 clinicians as project advisors and champions. Semistructured interviews with champions identified issues that were possibly contributing to the low volume of disease registry codes and lack of code consistency; these fell into categories of people (physician users), process (workflow and optimized use), and technology (interface).

Stated issues included lack of awareness of how to optimally use disease coding tools, along with time constraints related to clinical workflows and data collection activities. Champions noted a lack of confidence in optimal code selection for both billing codes and disease registry codes, as coding tools were not well supported with search and retrieval tools or quick-pick lists that featured organized and complete sets of preferred terms presented in clinician-friendly formats. Issues of time inefficiency and workflow redundancy related to the need to separately select ICD9 code values for the disease registry when a billing diagnostic code value is already mandated for creating a billing invoice. Champions also reasoned that a firm clinical diagnosis does not always occur at the patient’s first billed encounter for the problem. Disease registry interface issues identified by physicians echoed many of the same constraints and barriers that researchers noted during the as-is state investigation, including low visibility of the disease registry module within the EMR and its lack of integration within clinical documentation workflow.

Standardized Disease Codes

We found that the Clinician-Friendly Pick-List Guide for clinical assessment [29], a terminology standard available for licensed use from the Canadian Institute for Health Information (CIHI), provided a good basis for composing clinician-friendly chronic disease quick-pick lists for both the billing diagnostic codes and the disease registry codes. We created a reference table of 1:1 matches between provincial diagnostic billing codes and the best equivalent ICD9 code, to be leveraged for disease registry code capture in the new interface solution (Multimedia Appendix 1).
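
As a concrete illustration of this reference table, the minimal sketch below shows how such a 1:1 mapping could be represented in code; the specific code pairs and term names are hypothetical placeholders for illustration only, not the actual mappings listed in Multimedia Appendix 1.

```python
# Minimal sketch of the billing-to-registry reference table.
# Entries below are illustrative placeholders; the actual 1:1 matches between
# provincial diagnostic billing codes and ICD9 codes are in Multimedia Appendix 1.

BILLING_TO_ICD9 = {
    # provincial diagnostic billing code: (ICD9 disease registry code, clinician-friendly term)
    "250": ("250", "Diabetes mellitus"),
    "401": ("401", "Hypertension"),
    "493": ("493", "Asthma"),
}

def registry_match(billing_code: str):
    """Return the (ICD9 code, term) pair matched to a billing code, or None if unmatched."""
    return BILLING_TO_ICD9.get(billing_code)
```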

Iterative Design and Feedback Cycles

We developed wire-framed interface prototypes designed to address clinician-noted EMR interface constraints and to increase integration of the disease registry coding into the routine billing process workflow. We sought prototype feedback from clinical champions on (1) the selection of specific codes and their outward-facing names within quick-pick lists, (2) the interface ease of use and its fit into the clinical documentation workflow, and (3) the comprehensibility of data coding interface inputs, screen prompts, and outputs.

The OSCAR EMR service provider contributed substantially to the development of design features that were mindful of the constraints of the EMR platform. A functioning prototype of the interface solution was hosted on a project server and presented to the larger group of clinician end users, with support by clinical champions. This step allowed for consideration of other important design perspectives that were factored into the final interface solution and training of clinician end users.

Intervention Description

The final EMR interface solution (Figures 1 and 2) addressed the key issues identified by champions, incorporating disease coding prompts within usual workflow, ease of use, and minimal time burden.

Figure 1. The quick-pick list for disease registry data entry with a pop-up prompt embedded within the billing module.
Figure 2. Screenshot of the billing diagnostic quick-pick list.
Disease Code Quick-Pick Lists

We renamed the ICD disease registry codes with 51 front-facing clinician-friendly terms for common chronic conditions in primary care, guided by the CIHI list and clinical champion feedback. We organized the codes into a quick-pick list with clinically logical groupings and inserted this within the billing module (Figure 1) and the disease registry module. A total of 44 billing diagnostic codes were selected for closest equivalence to the disease registry codes (Multimedia Appendix 1) and fitted with new clinician-friendly term names. Where codes comprised multiple conditions, the one most relevant to the matched ICD9 code formed the leading portion of the term name. These were presented as an easily accessible drop-down list (quick-pick list) within the billing module to be used during obligatory billing activities (Figure 2).

Disease Registry Code Prompt Within the Billing Module

The table of billing diagnostic codes matched to ICD9 disease registry codes was posted to the back end of the EMR for automatic nomination of an equivalent disease registry ICD9 code via a pop-up window prompt (Figure 1). The timing of the prompt coincides with clinical cognitive processes around diagnosis and obligatory billing documentation tasks for clinical encounters. When one of the quick-pick billing diagnostic codes is selected, a pop-up screen appears that asks, “Do you want to add [term name] to the disease registry?” with “Yes” and “No” button selections. If the matching ICD9 code value is already in the patient’s disease registry, no prompt is presented. Clicking on “Yes” adds the underlying ICD9 code value to the patient’s disease registry. If “No” is clicked and the same billing code for the same patient is selected at a later consultation, the screen prompt is presented again up to 3 times, after which it is no longer presented. This repeated prompt was suggested by the clinician advisors who gave feedback that diagnosis is not always confirmed at the first presentation for a condition and that 3 times offers a reasonable opportunity to select a disease code without creating undue burden or contributing to alert fatigue.
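
A minimal sketch of the prompt logic described above, reusing the BILLING_TO_ICD9 table sketched earlier; the object and function names are assumptions for illustration and do not reflect the OSCAR implementation.

```python
MAX_PROMPTS = 3  # stop prompting for a given patient-code pair after 3 declines

def maybe_prompt_for_registry(patient, billing_code, prompt_user):
    """Offer to add the matched ICD9 code to the patient's disease registry.

    Assumes `patient` exposes a `disease_registry` set of ICD9 codes and a
    `decline_counts` dict keyed by ICD9 code, and that `prompt_user(question)`
    returns True for "Yes" and False for "No".
    """
    match = registry_match(billing_code)
    if match is None:
        return  # billing code is not on the chronic disease quick-pick list
    icd9, term = match
    if icd9 in patient.disease_registry:
        return  # already coded: no prompt is shown
    if patient.decline_counts.get(icd9, 0) >= MAX_PROMPTS:
        return  # suppressed after repeated declines to limit alert fatigue
    if prompt_user(f"Do you want to add {term} to the disease registry?"):
        patient.disease_registry.add(icd9)
    else:
        patient.decline_counts[icd9] = patient.decline_counts.get(icd9, 0) + 1
```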

Once the billing module interface changes were implemented, each clinic site hosted group training sessions for clinician end users that reinforced project rationale and described optimized use of new interface features. Clinician champions at each site encouraged and supported their peers in using the new tools. End users provided interface experience feedback to the project team via clinician champions.

Outcome Measures

Primary Outcome

The primary outcome was the change in the total number of disease registry codes in the MUSIC data set relative to the number expected from the preintervention coding rate, used to assess whether the intervention had been successful.

Secondary Outcomes

The secondary outcomes were (1) data consistency, assessed by comparing the proportion of ICD9 codes that matched to the preferred codes at baseline and during the 6-month postintervention phase; (2) usability of the new interface coding tools, assessed by comparing counts of the mode by which the new codes were being added (interface prompts versus other means, eg, direct keying in); and (3) patient characteristics, including the number of patients with disease registry codes identified in their records and whether new codes were added to patients’ partially completed disease registries or de novo, to patients’ disease registries with no previous disease code entries.

Data Collection Period

We implemented the EMR interface changes on June 25, 2017. The preintervention data set includes all disease registry codes added between January 23, 2006, and June 24, 2017 (baseline period). The intervention data set includes all codes collected on or after the implementation date of June 25, 2017 (intervention period).

We compared the baseline period codes to the intervention period codes at 3 and 6 months after initiation of the intervention to assess their volume and quality.


Primary Outcome

During the 11.5-year baseline period (2006 to mid-2017), 17,496 ICD9 code values were recorded in the disease registry, an average collection rate of 127 codes per month. After implementation of the new interface features, 9278 codes were added over 6 months, 8516 more than the expected volume of 762 codes. The intervention therefore increased disease registry codes by 48.67% (8516/17,496) over the baseline total. The intervention period coding rate averaged 1546 codes per month, an increase of 1419 codes per month over the baseline rate (127 codes per month), or a factor of 11.2 (Figure 3). More codes were added in the first 3 months of the intervention period (6138/9278) than in the last 3 months (3140/9278).
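
For transparency, the primary outcome arithmetic can be reproduced from the reported counts; the short sketch below assumes a baseline duration of roughly 138 months (January 2006 to June 2017).

```python
baseline_codes = 17_496      # ICD9 codes recorded January 2006 to June 2017
baseline_months = 138        # ~11.5-year baseline period
intervention_codes = 9_278   # codes recorded during the 6-month intervention period

baseline_rate = round(baseline_codes / baseline_months)   # 127 codes per month
expected_codes = baseline_rate * 6                        # 762 codes expected over 6 months
excess_codes = intervention_codes - expected_codes        # 8516 codes above expectation
gain_over_baseline = excess_codes / baseline_codes        # ~0.4867, ie, 48.67% of the baseline total
intervention_rate = round(intervention_codes / 6)         # 1546 codes per month
rate_increase_factor = (intervention_rate - baseline_rate) / baseline_rate  # ~11.2

print(baseline_rate, expected_codes, excess_codes,
      f"{gain_over_baseline:.2%}", intervention_rate, round(rate_increase_factor, 1))
```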

Figure 3. Disease registry monthly code collection rates of baseline and intervention periods.

Secondary Outcomes

Data Consistency

We found a statistically significant increase of 17.03 percentage points (χ²₁=819.4; P<.001) in the proportion of preferred ICD9 codes selected in the intervention period (7417/9278, 79.94%) compared with the baseline period (11,007/17,496, 62.91%) (Table 1). This shifted the overall proportion of preferred ICD codes from 62.91% (11,007/17,496) to 68.81% (18,424/26,774).

Table 1. Proportion of preferred International Classification of Diseases, 9th Revision codes used in the baseline and intervention periods.
Period | Preferred ICD^a term codes, n (%) (n=18,424)^b | Nonpreferred ICD term codes, n (%) (n=8350)^c | Total codes, n (N=26,774)
Baseline period | 11,007 (62.91) | 6489 (37.09) | 17,496
Postintervention period | 7417 (79.94) | 1861 (20.06) | 9278
Proportional change | 3590 (17.03) | 4628 (–17.03) | N/A^d

^a ICD: International Classification of Diseases.

^b 68.81% of total codes.

^c 31.19% of total codes.

^d N/A: not applicable.
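
The data consistency comparison can be checked against the Table 1 counts with a standard chi-square test of independence; the sketch below uses scipy (an assumed dependency for this illustration) and reproduces a statistic close to the reported χ²₁=819.4.

```python
from scipy.stats import chi2_contingency

# Preferred vs nonpreferred ICD9 code counts by period (Table 1)
observed = [
    [11_007, 6_489],  # baseline period
    [7_417, 1_861],   # postintervention period
]

# correction=False gives the uncorrected Pearson chi-square on this 2x2 table
chi2, p, dof, _ = chi2_contingency(observed, correction=False)
print(f"chi-square (df={dof}) = {chi2:.1f}, P = {p:.2g}")  # ~819, P < .001
```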

Usability of Coding Tools

Over the 6-month follow-up period, 45.03% (4178/9278) of codes were added via the new screen prompt triggered by the quick-pick list billing codes, with a significant rise in this proportion from the first 3 months to the last 3 months (2507/6140, 40.83% vs 1671/3148, 53.08%; χ²₁=126.2; P<.001). The remaining codes were added directly through (1) the quick-pick list of 51 clinician-friendly disease registry terms positioned within the final screen of the billing module, (2) the quick-pick list in the disease registry module itself, or (3) manual entry of the selected codes into the designated field of the disease registry module.

Patient Characteristics

A total of 12,459 unique patients had one or more disease registry codes in their record; 28.78% (3486/12,459) had codes recorded during the postintervention period. Among these 3486 patients with postintervention codes, 1527 (43.80%) had no previous disease registry codes in their record, indicating that use of the new disease coding tools was balanced between extending partially completed disease registries and creating new registries for patients (Multimedia Appendix 1). Demographic characteristics of patients with disease registry coding can be found in Multimedia Appendix 1.


Principal Results

Our study demonstrates that embedding clinician co-designed EMR disease recording tools into routine workflow, reinforced by training and peer support, results in substantial improvements in the quantity and quality of disease registry coding. In just 6 months, total disease registry codes increased by 53.03% (9278/17,496) over the baseline total, a gain of 48.67% (8516/17,496) beyond the volume expected from the previous 11.5-year coding rate. More codes were added in the first 3 months of the intervention period than in the second 3 months, while the proportion of codes added via the new screen prompt triggered by the billing diagnosis code for that encounter increased in the second 3 months. These findings might be expected: the gap in disease registry coding narrows as codes are added to a given patient’s problem list for existing but uncoded diseases, so eventually only new disorders identified at subsequent encounters need to be added.

The consistency of codes also increased, with a greater selection of preferred codes added to the disease registry within the intervention period compared with the baseline period. Having a more consistent set of disease codes improves the quality and thereby the value of the data set, supporting both population health research and quality improvement initiatives. The use of the new tools over the older, less systematic ways of entering disease registry codes suggests that this is an acceptable way to substantially increase disease coding and quality.

Strengths

We used a pragmatic, iterative approach to a primary care EMR enhancement project, with clinician end users involved in design at each step. We applied multiple methods to thoroughly inform the design, including potential solutions from the literature, a national reference standard, and the local EMR service provider. The solution was fitted to routine clinical documentation workflow to limit the burden on clinicians. The 6-month follow-up provides a useful and informative assessment of the longitudinal benefit of the intervention. Given the pace of change in health informatics and periodic shifts in billing code definitions, gathering follow-up data over this targeted period avoids most potential process and contextual confounders.

Limitations

While the 6-month evaluation period avoids the confounders highlighted above, it also provides a limited scope with which to measure the long-term success of the interface change. Further longitudinal evaluation will help illuminate any extinction of effect as the coding gap closes and whether the predicted further increase in the overall consistency of codes is supported by the data.

This solution of prompting physicians to add disease registry codes as part of the billing documentation workflow limits coding to patients attending medical appointments. Other solutions for completing the disease registry for patients who attend infrequently will need to be devised to ensure representative problem list data for this group. Disease registry back coding of patients using validated algorithmic case definitions (eg, those offered by CPCSSN [23]) integrated with clinician input may offer a further opportunity to assign missing disease registry codes to inactive patients.

The intervention development and implementation had 5 key aspects of design, as well as training and peer support in implementation. It is not possible to determine the relative contribution of each to the overall effectiveness.

Comparison With Prior Work

Leading electronic health researchers have identified knowledge and research gaps in primary care EMRs, specifically the need for reliable disease and multimorbidity metrics to inform optimal management of patients’ clinical problems and population-level health strategies [30]. These issues were addressed in this research, first with identification of EMR design constraints affecting disease coding, followed by development, implementation, and evaluation of new data collection tools toward improved data quantity and quality.

Similar to other reported findings [17,19,31,32], we identified data quality issues in the MUSIC EMR data set that limit confidence in the use of chronic disease data for practice-based initiatives and research. Previous research on problem list design identified the benefit of incorporating the problem list into the clinical documentation routine [18,26]; this need was echoed in the feedback from the MUSIC clinicians who were consulted in the design of the EMR interface improvement.

EMR usability studies have generated a myriad of clinician observations that identify navigation, safety, and cognitive load issues associated with EMRs [33]. This research underscores the importance of clinician input in EMR design and redesign projects. Continuous engagement of clinician end users in EMR implementation projects [34] or EMR use enhancement projects [35,36] has previously been reported to increase the projects’ likelihood of success [37]. Clinicians in the role of project champions and change management agents have proven essential for the encouragement of advanced EMR feature use [38].

In our study, the application of local physician co-design, which saw key clinician input into solution development, implementation planning, training components, and championing of new coding features, conceivably translated into an interface solution reasonably fitted to clinician workflow, leading to acceptability and uptake. Our study demonstrates that development and delivery of a relevant and usable solution for improving chronic disease recording is attainable.

Conclusion

Our pragmatic approach to EMR interface redesign resulted in substantial gains in disease code quantity and quality, providing a much-improved data set for asking and answering clinically important research questions. Clinician involvement in the intervention design, training, and peer support resulted in an accepted solution that placed little burden on clinicians. The often-used maxim, “If we want evidence-based practice, we need practice-based evidence” [39], requires that PBRN data be of adequate quality and quantity for this task. This study demonstrates that significant improvements in problem list coding within primary care EMRs can be achieved with minimal disruption to routine clinical workflow.

Acknowledgments

The authors would like to thank the primary care clinicians and patients of the MUSIC PBRN who contribute their data to the network through which the study data were generated and willingly contributed to this project. We also acknowledge Krzysztof Adamczyk, the information technology lead for the MUSIC network, and Ronnie Cheng, OSCAR program developer for the MUSIC network. We thank Kathy De Caire, Kati Ivanyi, Doug Oliver, and Jill Berridge for their executive support, and Casey Irvin for his help in the creation of figures and tables. We acknowledge the support of the McMaster University Department of Family Medicine for this PBRN.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Supplementary Tables 1-5.

DOCX File , 17 KB

  1. Starfield B, Shi L, Macinko J. Contribution of primary care to health systems and health. Milbank Q 2005;83(3):457-502 [FREE Full text] [CrossRef] [Medline]
  2. Starfield B, Gérvas J, Mangin D. Clinical care and health disparities. Annu Rev Public Health 2012 Apr;33:89-106. [CrossRef] [Medline]
  3. Smith SM, Wallace E, O'Dowd T, Fortin M. Interventions for improving outcomes in patients with multimorbidity in primary care and community settings. Cochrane Database Syst Rev 2016 Mar 14;3:CD006560 [FREE Full text] [CrossRef] [Medline]
  4. Chronic Disease Management in Primary Health Care: A Demonstration of EMR Data for Quality and Health System Monitoring. Canadian Institute for Health Information. 2014 Jan.   URL: https://secure.cihi.ca/free_products/Burden-of-Chronic-Diseases_PHC_2014_AiB_EN-web.pdf [accessed 2019-10-09]
  5. Seniors and the Health Care System: What Is the Impact of Multiple Chronic Conditions? Canadian Institute for Health Information. 2011 Jan.   URL: https://secure.cihi.ca/free_products/air-chronic_disease_aib_en.pdf [accessed 2020-07-07]
  6. Barnett K, Mercer SW, Norbury M, Watt G, Wyke S, Guthrie B. Epidemiology of multimorbidity and implications for health care, research, and medical education: a cross-sectional study. Lancet 2012 Jul 7;380(9836):37-43 [FREE Full text] [CrossRef] [Medline]
  7. Springate DA, Kontopantelis E, Ashcroft DM, Olier I, Parisi R, Chamapiwa E, et al. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records. PLoS One 2014;9(6):e99825 [FREE Full text] [CrossRef] [Medline]
  8. Nicholson K, Terry AL, Fortin M, Williamson T, Thind A. Understanding multimorbidity in primary health care. Can Fam Physician 2015 Oct;61(10):918, e489-918, e490 [FREE Full text] [Medline]
  9. Schoen C, Osborn R, Squires D, Doty M, Rasmussen P, Pierson R, et al. A survey of primary care doctors in ten countries shows progress in use of health information technology, less in other areas. Health Aff (Millwood) 2012 Dec;31(12):2805-2816 [FREE Full text] [CrossRef] [Medline]
  10. 2018 Canadian Physician Survey. Canada Health Infoway. 2018 Dec.   URL: https://infoway-inforoute.ca/en/component/edocman/resources/reports/benefits-evaluation/3643-2018-canadian-physician-survey [accessed 2020-07-07]
  11. Vaghefi I, Hughes JB, Law S, Lortie M, Leaver C, Lapointe L. Understanding the Impact of Electronic Medical Record Use on Practice-Based Population Health Management: A Mixed-Method Study. JMIR Med Inform 2016 Apr 04;4(2):e10 [FREE Full text] [CrossRef] [Medline]
  12. Gentil M, Cuggia M, Fiquet L, Hagenbourger C, Le Berre T, Banâtre A, et al. Factors influencing the development of primary care data collection projects from electronic health records: a systematic review of the literature. BMC Med Inform Decis Mak 2017 Sep 25;17(1):139 [FREE Full text] [CrossRef] [Medline]
  13. Paré G, Raymond L, Guinea AOD, Poba-Nzaou P, Trudel M, Marsan J, et al. Electronic health record usage behaviors in primary care medical practices: A survey of family physicians in Canada. Int J Med Inform 2015 Oct;84(10):857-867. [CrossRef] [Medline]
  14. Birtwhistle R, Queenan JA. Update from CPCSSN. Can Fam Physician 2016 Oct;62(10):851 [FREE Full text] [Medline]
  15. Birtwhistle R, Keshavjee K, Lambert-Lanning A, Godwin M, Greiver M, Manca D, et al. Building a pan-Canadian primary care sentinel surveillance network: initial development and moving forward. J Am Board Fam Med 2009;22(4):412-422 [FREE Full text] [CrossRef] [Medline]
  16. Coleman N, Halas G, Peeler W, Casaclang N, Williamson T, Katz A. From patient care to research: a validation study examining the factors contributing to data quality in a primary care electronic medical record database. BMC Fam Pract 2015 Feb 05;16:11 [FREE Full text] [CrossRef] [Medline]
  17. Singer A, Yakubovich S, Kroeker AL, Dufault B, Duarte R, Katz A. Data quality of electronic medical records in Manitoba: do problem lists accurately reflect chronic disease billing diagnoses? J Am Med Inform Assoc 2016 Nov;23(6):1107-1112. [CrossRef] [Medline]
  18. Wright A, McCoy AB, Hickman TT, Hilaire DS, Borbolla D, Bowes WA, et al. Problem list completeness in electronic health records: A multi-site study and assessment of success factors. Int J Med Inform 2015 Oct;84(10):784-790 [FREE Full text] [CrossRef] [Medline]
  19. Greiver M, Sullivan F, Kalia S, Aliarzadeh B, Sharma D, Bernard S, et al. Agreement between hospital and primary care on diagnostic labeling for COPD and heart failure in Toronto, Canada: a cross-sectional observational study. NPJ Prim Care Respir Med 2018 Mar 09;28(1):9 [FREE Full text] [CrossRef] [Medline]
  20. Greiver M, Wintemute K, Aliarzadeh B, Martin K, Khan S, Jackson D, et al. Implementation of data management and effect on chronic disease coding in a primary care organisation: A parallel cohort observational study. J Innov Health Inform 2016 Oct 12;23(3):843 [FREE Full text] [CrossRef] [Medline]
  21. Sinsky C, Colligan L, Li L, Prgomet M, Reynolds S, Goeders L, et al. Allocation of Physician Time in Ambulatory Practice: A Time and Motion Study in 4 Specialties. Ann Intern Med 2016 Dec 06;165(11):753-760. [CrossRef] [Medline]
  22. Arndt BG, Beasley JW, Watkinson MD, Temte JL, Tuan W, Sinsky CA, et al. Tethered to the EHR: Primary Care Physician Workload Assessment Using EHR Event Log Data and Time-Motion Observations. Ann Fam Med 2017 Sep;15(5):419-426 [FREE Full text] [CrossRef] [Medline]
  23. Williamson T, Green ME, Birtwhistle R, Khan S, Garies S, Wong ST, et al. Validating the 8 CPCSSN case definitions for chronic disease surveillance in a primary care database of electronic health records. Ann Fam Med 2014 Jul;12(4):367-372 [FREE Full text] [CrossRef] [Medline]
  24. Price D, Chan D, Greaves N. Physician surveillance of influenza: collaboration between primary care and public health. Can Fam Physician 2014 Jan;60(1):e7-15 [FREE Full text] [Medline]
  25. Chowdhry SM, Mishuris RG, Mann D. Problem-oriented charting: A review. Int J Med Inform 2017 Jul;103:95-102. [CrossRef] [Medline]
  26. Simons SMJ, Cillessen FHJM, Hazelzet JA. Determinants of a successful problem list to support the implementation of the problem-oriented medical record according to recent literature. BMC Med Inform Decis Mak 2016 Aug 02;16:102 [FREE Full text] [CrossRef] [Medline]
  27. Tonelli M, Wiebe N, Fortin M, Guthrie B, Hemmelgarn BR, James MT, Alberta Kidney Disease Network. Methods for identifying 30 chronic conditions: application to administrative data. BMC Med Inform Decis Mak 2015 Apr 17;15:31 [FREE Full text] [CrossRef] [Medline]
  28. Rahal RM, Mercer J, Kuziemsky C, Yaya S. Primary Care Physicians' Experience Using Advanced Electronic Medical Record Features to Support Chronic Disease Prevention and Management: Qualitative Study. JMIR Med Inform 2019 Nov 29;7(4):e13318 [FREE Full text] [CrossRef] [Medline]
  29. Canadian Institute for Health Information. Clinician-Friendly Pick-List Guide. Pan-Canadian Primary Health Care Electronic Medical Record Content Standard, Version 3. 2014.   URL: https://secure.cihi.ca/free_products/PHC_EMR_Content_Standard_V3_PickListGuide_EN.pdf [accessed 2020-07-07]
  30. Terry AL, Stewart M, Fortin M, Wong ST, Grava-Gubins I, Ashley L, et al. Stepping Up to the Plate: An Agenda for Research and Policy Action on Electronic Medical Records in Canadian Primary Healthcare. Healthc Policy 2016 Nov;12(2):19-32 [FREE Full text] [Medline]
  31. Price M, Davies I, Rusk R, Lesperance M, Weber J. Applying STOPP Guidelines in Primary Care Through Electronic Medical Record Decision Support: Randomized Control Trial Highlighting the Importance of Data Quality. JMIR Med Inform 2017 Jun 15;5(2):e15 [FREE Full text] [CrossRef] [Medline]
  32. Sollie A, Sijmons RH, Helsper C, Numans ME. Reusability of coded data in the primary care electronic medical record: A dynamic cohort study concerning cancer diagnoses. Int J Med Inform 2017 Mar;99:45-52. [CrossRef] [Medline]
  33. Zahabi M, Kaber DB, Swangnetr M. Usability and Safety in Electronic Medical Records Interface Design: A Review of Recent Literature and Guideline Formulation. Hum Factors 2015 Aug;57(5):805-834. [CrossRef] [Medline]
  34. Goodison R, Borycki EM, Kushniruk AW. Use of Agile Project Methodology in Health Care IT Implementations: A Scoping Review. Stud Health Technol Inform 2019;257:140-145. [Medline]
  35. Jones M, Talebi R, Littlejohn J, Bosnic O, Aprile J. An Optimization Program to Help Practices Assess Data Quality and Workflow With Their Electronic Medical Records: Observational Study. JMIR Hum Factors 2018 Dec 21;5(4):e30 [FREE Full text] [CrossRef] [Medline]
  36. Tran K, Leblanc K, Valentinis A, Kavanagh D, Zahr N, Ivers NM. Evaluating the Usability and Perceived Impact of an Electronic Medical Record Toolkit for Atrial Fibrillation Management in Primary Care: A Mixed-Methods Study Incorporating Human Factors Design. JMIR Hum Factors 2016 Feb 17;3(1):e7 [FREE Full text] [CrossRef] [Medline]
  37. Gill R, Borycki EM. The Use of Case Studies in Systems Implementations Within Health Care Settings: A Scoping Review. Stud Health Technol Inform 2017;234:142-149. [Medline]
  38. Terry AL, Ryan BL, McKay S, Oates M, Strong J, McRobert K, et al. Towards optimal electronic medical record use: perspectives of advanced users. Fam Pract 2018 Sep 18;35(5):607-611. [CrossRef] [Medline]
  39. Green LW. Making research relevant: if it is an evidence-based practice, where's the practice-based evidence? Fam Pract 2008 Dec;25 Suppl 1:i20-i24. [CrossRef] [Medline]


CIHI: Canadian Institute for Health Information
CPCSSN: Canadian Primary Care Sentinel Surveillance Network
EMR: electronic medical record
ICD9: International Classification of Diseases, 9th Revision
MUSIC: McMaster University Sentinel and Information Collaboration
OSCAR: Open Source Clinical Application and Resources
PBRN: practice-based research network


Edited by G Eysenbach; submitted 22.10.19; peer-reviewed by D Gunasekeran, C Fincham; comments to author 17.12.19; revised version received 21.02.20; accepted 10.04.20; published 27.07.20

Copyright

©Dee Mangin, Jennifer Lawson, Krzysztof Adamczyk, Dale Guenter. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 27.07.2020.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Informatics, is properly cited. The complete bibliographic information, a link to the original publication on http://medinform.jmir.org/, as well as this copyright and license information must be included.