Published in Vol 13 (2025)

This is a member publication of Imperial College London (Jisc)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/58125
A Novel Framework to Assess Clinical Information in Digital Health Technologies: Cross-Sectional Survey Study

1Department of Primary Care and Public Health, Imperial College London, White City Campus, 80–92 Wood Lane, London, United Kingdom

2Centre for Healthcare and Communities, Coventry University, Richard Crossman, Priory Street, Coventry, United Kingdom

3Global Health Research Group, School of Medicine, University College Dublin, Dublin, Ireland

4School of Life Course and Population Sciences, King’s College London, London, United Kingdom

Corresponding Author:

Kayode Philip Fadahunsi, PhD


Background: Digital health is a critical driver of quality, safety, and efficiency in health care. However, poor quality of clinical information in digital health technologies (DHTs) can compromise the quality and safety of care. The Clinical Information Quality (CLIQ) framework was developed, based on a systematic review of the literature and an international eDelphi study, as a tool to assess the quality of clinical information in DHTs.

Objective: This study aimed to assess the applicability, internal consistency, and construct validity of the CLIQ framework.

Methods: This study was conducted as a cross-sectional survey of health care professionals across the United Kingdom who regularly use SystmOne electronic health records. Participants were invited through emails and social media platforms. The CLIQ questionnaire was administered as a web-based survey. Spearman correlation coefficients were computed to investigate the monotonic relationship between the dimensions in the CLIQ framework. The Cronbach α coefficients were computed to assess the internal consistency of the global scale (ie, CLIQ framework) and the subscales (ie, the informativeness, availability, and usability categories). Confirmatory factor analysis was used to assess the extent to which the survey data supported the construct validity of the CLIQ framework.

Results: A total of 109 health care professionals completed the survey, of whom almost two-thirds (67/109, 61.5%) were doctors and almost a quarter (26/109, 23.9%) were nurses or advanced nurse practitioners. Overall, the CLIQ dimensions had good quality scores except for portability, which had a modest score. The inter-item correlations were all positive and not likely due to chance. The Cronbach α coefficient for the overall CLIQ framework was 0.89 (95% CI 0.85‐0.92). The confirmatory factor analysis provided modest support for the construct validity of the CLIQ framework, with a comparative fit index of 0.86 and a standardized root mean square residual of 0.08.

Conclusions: The CLIQ framework demonstrated high reliability and modest construct validity. The CLIQ framework offers a pragmatic approach to assessing the quality of clinical information in DHTs and could be applied as part of information quality assurance systems in health care settings to improve the quality of health information.

JMIR Med Inform 2025;13:e58125

doi:10.2196/58125


Digital health is critical to the quality, safety, and efficiency of health care services [1]. Digital health technologies (DHTs) can enhance the delivery of health care services in several ways [2]. Electronic health records (EHRs) make medical records readily available at the point of care [3]. Electronic prescribing systems reduce the incidence of medication errors [4]. Clinical decision support systems support clinicians in decision-making [5,6]. Mobile health (mHealth) apps support self-management of chronic diseases [7].

However, poor quality of clinical information in DHTs can compromise the quality and safety of care [8,9]. A systematic review of the literature reported widespread incidents of delayed, missing, partial, and wrong information in DHTs, resulting in adverse outcomes and deaths [10]. Most of the information quality problems reported in the included studies were drawn from incident reporting systems [10]. While retrospective lessons from adverse events in incident reporting systems can be useful, it is more important to identify and address information quality problems proactively, before adverse events occur; hence the need for an information quality framework.

In another systematic review [11], we identified 10 existing information quality frameworks for DHTs, comprising 5 frameworks for EHRs [12-16] and one each for clinical decision support systems [17], cloud-based information systems [18], electronic medical records [19], mobile and web-based telemedicine apps [20], and primary care databases [13]. Although these frameworks identified dimensions that are relevant to assessing clinical information in DHTs, most were developed without input from clinicians and did not provide a tool that can be used to assess DHTs [11]. To overcome these limitations, we developed the Clinical Information Quality (CLIQ) framework for DHTs.

The CLIQ framework identifies, defines, and integrates 14 dimensions that are relevant to assessing clinical information in DHTs [11,21]. The dimensions in the CLIQ framework are grouped into 3 categories—informativeness, availability, and usability. The informativeness category relates to the usefulness of clinical information in patient care. Dimensions in the informativeness category include accuracy, completeness, interpretability, plausibility, relevance, and trustworthiness. The availability category relates to the functionality of the DHTs holding clinical information. Dimensions in this category include accessibility, portability, searchability, security, and timeliness. The usability category—comprising conformance, consistency of presentation, and maintainability—concerns the ease of use of clinical information. The definitions of the dimensions in the CLIQ framework are shown in Table 1.
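For illustration, the grouping of the 14 dimensions into the 3 categories described above can be encoded as a simple mapping (a sketch only; the dictionary keys are our own spellings of the terms in Table 1):

```python
# The 14 CLIQ dimensions grouped into their 3 categories, as listed in Table 1
CLIQ_DIMENSIONS = {
    "informativeness": [
        "accuracy", "completeness", "interpretability",
        "plausibility", "relevance", "trustworthiness",
    ],
    "availability": [
        "accessibility", "portability", "searchability",
        "security", "timeliness",
    ],
    "usability": [
        "conformance", "consistency_of_presentation", "maintainability",
    ],
}

# Sanity check: the framework defines 14 dimensions in 3 categories
assert len(CLIQ_DIMENSIONS) == 3
assert sum(len(d) for d in CLIQ_DIMENSIONS.values()) == 14
```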

The CLIQ framework was developed in 4 successive stages as shown in Figure 1. An initial CLIQ framework was developed through a systematic review and qualitative synthesis of existing information quality frameworks for DHTs [11]. A CLIQ assessment questionnaire was then developed based on the CLIQ framework and further evidence from the literature [21]. The questionnaire offers a pragmatic approach to assessing clinical information in DHTs based on relatable clinical scenarios. The framework and the accompanying questionnaire were revised via an international eDelphi study among 35 clinicians from 10 different countries [21].

The development and modification of the CLIQ framework through a systematic review of literature and an international eDelphi study resulted in an evidence-based and user-friendly framework that is grounded in the literature and the views of clinicians who are users of clinical information. However, these approaches addressed only the content and face validity of the CLIQ framework and not its applicability, reliability, and construct validity. Therefore, the aim of this study is to assess the applicability, internal consistency, and construct validity of the CLIQ framework.

Table 1. Clinical information quality framework for digital health technologies.

Clinical information quality dimensions and their definitions

Informativeness: the usefulness of digital information for clinical purposes
- Accuracy: the extent to which information is accurate.
- Completeness: the extent to which no required information is missing.
- Interpretability: the extent to which information can be interpreted.
- Plausibility: the extent to which information makes sense based on clinical knowledge.
- Trustworthiness: the extent to which the source of information is trustworthy and verifiable.
- Relevance: the extent to which information is useful for patient care.

Availability: the functionality of the system holding clinical information
- Accessibility: the extent to which information is accessible.
- Portability: the extent to which information can be moved or transferred between different systems.
- Searchability: the extent to which needed information can be found.
- Security: the extent to which information is protected from unauthorized access, corruption, and damage.
- Timeliness: the extent to which information is up to date.

Usability: the ease of use of clinical information
- Conformance: the extent to which information is presented in a format that complies with institutional, national, or international standards.
- Consistency of presentation: the extent to which presentation of information adheres to the same set of institutional, national, or international standards.
- Maintainability: the extent to which information can be maintained (eg, modified, corrected, updated, adapted, and upgraded) to achieve intended improvement.
Figure 1. Stages of CLIQ framework development. CLIQ: Clinical Information Quality.

Study Design

This study was conducted as a web-based cross-sectional survey of health care professionals across the United Kingdom who use SystmOne EHRs (The Phoenix Partnership Ltd). The cross-sectional survey approach allowed the assessment of the applicability, internal consistency, and construct validity of the CLIQ framework. The web-based approach offered a convenient, affordable, and pragmatic way of conducting the study and collecting data. The choice of SystmOne was informed by its wide use across different health care settings in the United Kingdom, including general practices, urgent care centers, social care services, hospitals, and prison medical services, to manage about 61 million EHRs [22]. CHERRIES (Checklist for Reporting Results of Internet E-Surveys) was used to guide the survey report [23].

Study Participants

Eligibility criteria for the study included being a health care professional with a clinical role and a regular user of SystmOne (defined as using SystmOne as part of routine professional activities to document clinical information). Administrative staff using SystmOne and health care professionals using SystmOne occasionally (eg, to check clinical records) were not eligible to participate in the study. Participants were invited through emails and social media platforms, including Facebook, LinkedIn, WhatsApp, Yammer, Digital Health Networks, and the Future National Health System Collaboration platform. Although there is no consensus about the adequate sample size needed to validate a questionnaire, the literature recommends recruiting at least 10 participants for each item of the scale being validated as a rule of thumb [24]. As there are 14 dimensions of the CLIQ framework, this was equivalent to 140 participants in our study.

Data Collection

Health care professionals were invited to use the CLIQ framework to evaluate the information quality of SystmOne. The survey questionnaire comprised the 14 items of the CLIQ framework and 2 questions about the occupation of the respondents to assess their eligibility (Multimedia Appendix 1). Each question was displayed on a separate mobile or computer screen to the participants. The responses were made mandatory to avoid missing data, which could limit the assessment of the construct validity [25]. Participants were able to change their answers using the back button. The questionnaire was administered through Qualtrics, a web-based survey platform, with bot detection and prevention of multiple submissions turned on. An invitation containing a link to the participant information sheet, the consent form, and the questionnaire was shared with the health care professionals electronically through the channels described earlier. Two reminders, at least 2 weeks apart, were sent to encourage participation. Health care professionals were also encouraged to share the invitation with colleagues. Data collection took place between February 27 and June 7, 2022.

Data Analysis

The survey results were downloaded from Qualtrics in Excel format and imported into SPSS (version 20; IBM Corp). Only completed entries were analyzed, as incomplete entries stopped at the consent stage. SPSS was used to conduct descriptive statistical analyses and compute correlation coefficients and Cronbach α scores.

The 3 options for each dimension (eg, accurate, partly accurate, and not accurate) were recoded into the integers 1, 2, and 3, representing low, modest, and good quality, respectively. A descriptive statistical analysis was conducted and interpreted to demonstrate the applicability of the CLIQ framework. The distribution of the responses was expressed as frequencies, percentages, means, and SDs.

Spearman correlation coefficients were computed to investigate the monotonic relationship between the dimensions in the CLIQ framework. The ordinal nature of the data informed the choice of Spearman coefficients. Correlation coefficients of 0.1‐0.2 were regarded as poor, 0.3‐0.5 as fair, 0.6‐0.7 as moderate, and 0.8‐0.9 as very strong [26].
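As a sketch of these two steps, the recoding and a Spearman coefficient (Pearson correlation computed on average ranks) can be reproduced in plain Python. This is illustrative only; the study's actual analysis was run in SPSS, and the response labels follow the accuracy example given above:

```python
# Recode the 3 response options (accuracy example from the text) as 1, 2, 3
RECODE = {"not accurate": 1, "partly accurate": 2, "accurate": 3}
responses = ["accurate", "partly accurate", "accurate", "not accurate"]
coded = [RECODE[r] for r in responses]  # [3, 2, 3, 1]

def ranks(xs):
    """Assign average ranks (ties share the mean of their positions)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # 1-based average rank of the tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho: Pearson correlation of the rank-transformed data."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Identical ratings across two dimensions give rho = 1, and perfectly reversed ratings give rho = -1, which brackets the positive inter-item correlations reported in the Results.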

The Cronbach α coefficient was computed to assess the internal consistency of the global scale (ie, CLIQ framework) and the subscales (ie, the informativeness, availability, and usability categories). A Cronbach α coefficient of 0.7 or above was regarded as an indication of the reliability of the scale, and an alpha coefficient between 0.6 and 0.7 was considered acceptable [26].
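The Cronbach α computation can be sketched from its standard formula, α = k/(k-1) × (1 - Σ item variances / variance of total scores). This is a minimal illustration, not the study's SPSS procedure:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach alpha for a scale.

    items: one list of responses per item (all lists the same length,
    one entry per respondent), eg, the 14 CLIQ dimension scores.
    """
    k = len(items)
    totals = [sum(vals) for vals in zip(*items)]  # per-respondent total score
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))
```

Perfectly consistent items (every respondent scoring each item identically) yield α = 1, and less consistent response patterns pull α down toward the 0.6-0.7 boundary described above.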

The SPSS data file was subsequently exported to SPSS Amos (version 28; IBM Corp) to assess the construct validity of the CLIQ framework. Confirmatory factor analysis, a structural equation modeling technique, was used to assess the extent to which the survey data supported the construct validity of the CLIQ framework [27]. Confirmatory factor analysis was adopted because the CLIQ framework has multiple subscales (ie, informativeness, availability, and usability categories) that were predetermined [11].

The maximum likelihood estimation method was used for the confirmatory factor analysis. The model fit was assessed based on the Standardized Root Mean Square Residual (SRMR) and Comparative Fit Index (CFI) as recommended in the literature for studies with a sample size of less than 250 [28]. A CFI greater than 0.9 and an SRMR less than 0.08 indicate model fit [28].
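The SRMR itself is the root mean square of the standardized residuals between the sample and model-implied covariance matrices over the lower triangle. A minimal sketch of that definition (illustrative only; the study's confirmatory factor analysis was run in SPSS Amos):

```python
import math

def srmr(sample_cov, implied_cov):
    """Standardized root mean square residual between a sample covariance
    matrix and a model-implied covariance matrix (lower triangle only)."""
    p = len(sample_cov)
    total, count = 0.0, 0
    for i in range(p):
        for j in range(i + 1):
            # Standardize each covariance to a correlation before differencing
            s = sample_cov[i][j] / math.sqrt(sample_cov[i][i] * sample_cov[j][j])
            m = implied_cov[i][j] / math.sqrt(implied_cov[i][i] * implied_cov[j][j])
            total += (s - m) ** 2
            count += 1
    return math.sqrt(total / count)
```

A model that reproduces the sample covariances exactly has SRMR = 0; larger residuals push SRMR above the 0.08 cutoff noted above.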

Ethical Considerations

Informed consent was obtained from each participant at the beginning of the web-based survey after they had been provided with the participant information sheet, containing detailed information about the study objectives, expectation of the participants, duties of the researchers, and relevant contacts. Participation was voluntary. A secure web-based survey platform was used for data collection as already described. No personal information that could be used to identify the participants was collected in the survey. Survey responses were anonymized with codes automatically assigned to participants by the survey platform. The results were securely stored in Imperial College Shared Drive. Participants could withdraw from the study at any time without giving any reasons. However, once the survey was submitted, the data could not be withdrawn as the responses were anonymous. Participants were not compensated for taking part. Ethical approval was obtained from the Imperial College research ethics committee (21IC7415).


Participants’ Characteristics

A total of 109 health care professionals completed the survey, with almost two-thirds (67, 61.5%) being doctors and almost a quarter (26, 23.9%) being nurses or advanced nurse practitioners. The rest of the participants had other clinical occupations. Table 2 shows the distribution of the participants by occupation.

Table 2. Distribution of the occupation of the survey participants (N=109).

Occupation: n (%)
Doctors: 67 (61.5)
Nurses and advanced nurse practitioners: 26 (23.9)
Health care assistants: 5 (4.6)
Physiotherapists and occupational therapists: 4 (3.7)
Pharmacists: 2 (1.8)
Podiatrists: 2 (1.8)
Physician associates: 1 (0.9)
Therapy support workers: 1 (0.9)
Community health workers: 1 (0.9)

Participants’ Assessment of Quality of the Dimensions

The mean quality score assigned to each dimension ranged from 2.2 for portability to 2.9 for security of clinical information in SystmOne (1, 2, and 3 indicate low, modest, and good quality, respectively). Most participants (97/109, 89%) ranked security of clinical information in SystmOne as good, while just over a third (42/109, 38.5%) ranked portability as good. The summary of the assessment results is shown in Table 3.

Table 3. Clinical information quality of SystmOne as assessed by the participants. Columns: Good, n (%); Modest, n (%); Low, n (%); Mean score (SD).

Accuracy: 73 (67); 32 (29.4); 4 (3.7); 2.6 (0.6)
Completeness: 63 (57.8); 43 (39.4); 3 (2.8); 2.6 (0.6)
Interpretability: 82 (75.2); 26 (23.9); 1 (0.9); 2.7 (0.5)
Plausibility: 92 (84.4); 17 (15.6); 0; 2.8 (0.4)
Relevance: 89 (81.7); 20 (18.3); 0; 2.8 (0.4)
Trustworthiness: 94 (86.2); 15 (13.8); 0; 2.9 (0.3)
Accessibility: 77 (70.6); 31 (28.4); 1 (0.9); 2.7 (0.5)
Portability: 42 (38.5); 49 (45); 18 (16.5); 2.2 (0.7)
Searchability: 56 (51.4); 49 (45); 4 (3.7); 2.4 (0.6)
Security: 97 (89); 12 (11); 0; 2.9 (0.3)
Timeliness: 71 (65.1); 35 (32.1); 3 (2.8); 2.6 (0.5)
Conformance: 77 (70.6); 27 (24.8); 5 (4.6); 2.7 (0.6)
Consistency of presentation: 77 (70.6); 22 (20.2); 10 (9.2); 2.6 (0.7)
Maintainability: 70 (64.2); 33 (30.3); 6 (5.5); 2.6 (0.6)
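The reported mean scores follow directly from the frequency distribution, since responses were coded 1 (low), 2 (modest), and 3 (good). As a worked check against the counts reported for security and portability:

```python
# Response counts taken from Table 3 (N=109 per dimension)
counts = {
    "security":    {"good": 97, "modest": 12, "low": 0},
    "portability": {"good": 42, "modest": 49, "low": 18},
}

def mean_score(c):
    """Mean quality score implied by a row of response counts."""
    n = c["good"] + c["modest"] + c["low"]
    return (3 * c["good"] + 2 * c["modest"] + 1 * c["low"]) / n

round(mean_score(counts["security"]), 1)     # 2.9, matching Table 3
round(mean_score(counts["portability"]), 1)  # 2.2, matching Table 3
```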

Inter-item Correlation

There were positive correlations between all possible pairs of dimensions in the CLIQ framework (Multimedia Appendix 2). All the correlation coefficients were statistically significant (indicated by asterisks in Multimedia Appendix 2) except for the correlations of portability with each of plausibility, relevance, trustworthiness, and security. There was a strong, statistically significant correlation between conformance and consistency of presentation (Spearman ρ=0.751; P<.001).

Internal Consistency of the CLIQ Framework

The Cronbach α coefficients and corresponding 95% CIs for the informativeness, availability, and usability subscales were 0.78 (95% CI 0.70‐0.84), 0.69 (95% CI 0.58‐0.77), and 0.83 (95% CI 0.77‐0.88), respectively. When security was removed from the availability subscale, the Cronbach α coefficient for that subscale increased marginally to 0.70 (95% CI 0.60‐0.78). The Cronbach α coefficient for the overall CLIQ framework was 0.89 (95% CI 0.85‐0.92).

Construct Validity of the CLIQ Framework

Although the chi-square goodness-of-fit test (χ²=155.69; χ²/df=2.10; P<.001) did not demonstrate model fit, the CFI (0.86) and SRMR (0.08) suggest that the model fits the data modestly. All the factor loadings (ie, covariance estimates) were positive and statistically significant, as shown in Table 4. Significance tests (SE, critical ratio, and P value) were not reported for the first item in each category (ie, accuracy, accessibility, and conformance) because their unstandardized covariance values were fixed at 1, which is a standard procedure in confirmatory factor analysis.

The statistical significance of positive standardized covariance estimates for all information quality dimensions supports the placement of the dimensions in their respective categories. This is further illustrated in the path diagram shown in Figure 2. The high estimated covariances between the latent variables and the observed variables support the construct validity of the model. However, the high covariance estimates between the latent variables (informativeness, availability, and usability) indicate overlap between the categories.

Table 4. Standardized and unstandardized estimates of the covariance. Columns: Standardized covariance estimate; Unstandardized covariance estimate; SE; Critical ratio; P value.

Accuracy: 0.64; 1.00; N/Aa; N/A; N/A
Completeness: 0.76; 1.18; 0.18; 6.41; <.001
Interpretability: 0.59; 0.77; 0.15; 5.27; <.001
Plausibility: 0.52; 0.53; 0.11; 4.72; <.001
Relevance: 0.56; 0.61; 0.12; 5.00; <.001
Trustworthiness: 0.59; 0.57; 0.11; 5.24; <.001
Accessibility: 0.60; 1.00; N/A; N/A; N/A
Portability: 0.45; 1.13; 0.28; 3.99; <.001
Searchability: 0.72; 1.43; 0.25; 5.66; <.001
Security: 0.46; 0.50; 0.13; 4.03; <.001
Timeliness: 0.56; 1.05; 0.22; 4.73; <.001
Conformance: 0.82; 1.00; N/A; N/A; N/A
Consistency: 0.85; 1.20; 0.12; 9.64; <.001
Maintainability: 0.73; 0.94; 0.12; 8.05; <.001

aN/A: not applicable.

Figure 2. Path diagram for confirmatory factor analysis. The rectangular shapes represent the observed variables (ie, information quality dimensions) that were directly measured in the survey. The circular shapes represent the latent variables (ie, information quality categories) that could be inferred from the measured variables. The abbreviation "e" represents error. The connecting arrows represent covariances, with the accompanying numbers representing their estimates.

Principal Findings

The study assessed the applicability, internal consistency, and construct validity of the CLIQ framework based on the pilot CLIQ assessment of SystmOne EHRs. Overall, the CLIQ dimensions had good quality scores except for portability, which had a modest score. The inter-item correlations were all positive and not likely due to chance. The Cronbach α scores demonstrated good internal consistency of the CLIQ framework and its informativeness, availability, and usability subscales. The results of the confirmatory factor analysis provided modest support for the construct validity of the CLIQ framework.

Comparison With Previous Literature

The combination of good security and modest portability of clinical information in SystmOne indicates the possibility of a trade-off between security and portability. Portability might have been limited inadvertently to improve the security of clinical information. Similar trade-offs, such as between accessibility and security, have been documented in the literature [29]. Software developers need to be vigilant to identify potential trade-offs that may compromise the quality and safety of care.

The positive and mostly significant correlations between the items in the scale indicated a close relationship between the dimensions [30]. This was not unexpected because all dimensions are components of the same CLIQ framework. The generally low values of the correlations demonstrated that the dimensions, although related, are distinct [30]. The high correlation between conformance and consistency of presentation is understandable, as both concern the presentation of information. Similarly, the high covariance between the latent variables (informativeness, availability, and usability), which indicates that the categories are not entirely distinct, is not unusual in information quality research. Categories in information quality frameworks are not mutually exclusive, as each dimension often fits into multiple categories [31]. However, there is scope for model revision to further explore this overlap in future CLIQ research.

The overall good internal consistency of the CLIQ framework and its subscales (ie, informativeness, availability, and usability) indicates the reliability of the CLIQ framework [32]. Although the study could not rely on fit indices that are strongly influenced by sample size, such as the chi-square goodness-of-fit test [23], the SRMR, the index least affected by sample size [28], supports the model fit.

This study used similar methods to a study on the validation of the Modified Enlight Suite (MES), a generic mHealth assessment questionnaire [25]. The MES study demonstrated overall good internal consistency and modest construct validity of the MES based on a survey of more than a thousand medical students and health care professionals who assessed a freely downloadable COVID-19 app in Ireland [25]. The difference in uptake between the 2 studies is probably related to the choice of DHTs and study populations. This study's population was limited to health care professionals using SystmOne. In contrast, the MES study was open to all health care professionals and medical students who could download the Irish COVID-19 app.

Strengths and Weaknesses

To the best of our knowledge, the CLIQ framework is the first information quality framework for DHTs for which internal consistency and construct validity have been assessed. Only the face and content validity of most existing information quality frameworks have been assessed [11]. This study went a step further by exploring the internal consistency and construct validity of the CLIQ framework.

Although the choice of SystmOne EHRs ensured the participation of a multidisciplinary population of health care professionals in the testing of the CLIQ framework, allied health professionals were less represented. In addition, the voluntary nature of recruitment created a potential for systematic bias, because those who were engaged with digital solutions were more likely to respond. The use of mandatory responses, although useful for survey data completeness, could introduce bias if participants provided less thoughtful responses just to complete the survey. Finally, the low response rate limited the assessment of the construct validity of the CLIQ framework [24]. The low uptake was probably due to the busy schedules of the health care professionals at a time when the National Health Service was under immense pressure due to the COVID-19 pandemic. Nevertheless, the sample size was sufficient to assess the CLIQ framework's applicability and reliability, as demonstrated by the narrow CIs for the Cronbach α coefficients. Future studies on the validity of the CLIQ framework need to consider and address these limitations.

Implications for Practice

Information Quality Assurance System

The CLIQ research highlights the importance of information quality and its relevance to the quality and safety of care. Therefore, establishing a robust system for information quality assurance in health care institutions is essential. Such an information quality assurance system entails regular checks and monitoring, data validation, and information quality audits [33]. The CLIQ framework could be a useful tool for information quality audits. The information quality assurance system should be integrated into the information governance system, where one already exists, to prevent duplication and fragmentation. Designating clinicians with additional informatics training to oversee such an information quality assurance system is necessary for its successful implementation because they understand both the clinical and technological aspects of patient care [34]. Although health care institutions within high-income countries have designated clinical informatics roles, such as the chief clinical information officer, the job description is still evolving [35]. It is vital both to introduce these roles in health facilities and to expand their responsibilities to include information quality assurance.

Informatics Training and Education for Health Care Professionals

The CLIQ framework demonstrates that information quality is multidimensional, and understanding its meaning, relevance, and assessment requires some training. The clinicians who participated in the international eDelphi study expressed concerns that health care professionals without informatics training might be unfamiliar with information quality–related terms [21]. In addition, information quality problems, such as missing and inaccurate information, could result from human errors [36]. Informatics training could provide health care professionals with knowledge about the meaning, relevance, and assessment of information quality. Training could also be used to address information quality problems through the promotion of good practice, such as proper documentation and adequate record keeping. Informatics training should be included in health care professionals' prequalification and postqualification education to keep them up to date with the ever-changing information landscape in digital health [37]. However, care should be taken not to overload an already-complex medical curriculum with extensive informatics training [37]. Rather, efforts should be made to provide a level of informatics training commensurate with individual and institutional needs. Although informatics training could build health professionals' competencies around information quality, mistakes will continue to occur because "to err is human" [38]. Therefore, it is vital to institute a system that reduces the incidence and impact of errors.

Automated Error Detection and Data Validation Systems

The quality of clinical information could be improved by setting up automated error detection and data validation systems in DHTs. Machine learning and artificial intelligence algorithms could be applied for automated data validation and real-time error detection with errors flagged during data entry [39]. Automated error detection and data validation systems could reduce the likelihood of wrong data entry as well as prompt early correction of errors before patient safety is compromised. However, lack of trust in machine learning and artificial intelligence could make health care professionals ignore or override automated warnings and alerts [40].
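As an illustration of the simplest form of such validation (rule-based range checks rather than machine learning), values outside clinically plausible ranges could be flagged at the point of entry. The field names and limits below are hypothetical, chosen only to show the pattern:

```python
# Hypothetical plausibility ranges for a few adult observations
# (field names and limits are illustrative, not clinical guidance)
PLAUSIBLE_RANGES = {
    "systolic_bp_mmHg": (60, 260),
    "heart_rate_bpm": (20, 250),
    "temperature_celsius": (30.0, 43.5),
}

def validate_entry(entry):
    """Return a list of warnings for values outside their plausible range."""
    warnings = []
    for field, value in entry.items():
        low, high = PLAUSIBLE_RANGES.get(field, (float("-inf"), float("inf")))
        if not low <= value <= high:
            warnings.append(f"{field}={value} outside plausible range [{low}, {high}]")
    return warnings
```

A likely transcription error such as a systolic blood pressure of 1200 would be flagged during data entry, prompting correction before the record is saved.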

Interoperability and Integration of Digital Health Systems

Interoperability enhances the seamless flow of information across different DHTs, such as EHRs, laboratory information systems, and electronic triage systems [41]. Seamless health information exchange between multiple platforms improves the quality of clinical information [42]. Communication between systems enables timely access to up-to-date information in or near real time [42]. Accuracy is enhanced when information is obtained directly from the source, eliminating possible changes during transcription [41]. Portability and interoperability could be improved through the adoption of clinical data models (eg, openEHR), standardized terminologies (eg, Systematized Nomenclature of Medicine Clinical Terms [SNOMED CT]), and clinical coding systems (eg, International Classification of Diseases, Tenth Revision [ICD-10]), which promote consistent presentation of clinical information [43,44]. Although interoperability of digital health systems is desirable, its implementation often requires substantial cost. Concerns about the security of information may also be an obstacle because of increased access to information [41]. Security concerns could be addressed by obtaining prior informed consent for information sharing from patients when they register with the health service.
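As a sketch of what dual coding with a standardized terminology and a coding system enables, a problem-list entry might carry both a SNOMED CT and an ICD-10 code so that different systems can each look up the code they understand. The structure below is illustrative, and the code values should be verified against current SNOMED CT and ICD-10 releases before any real use:

```python
# Illustrative coded problem-list entry (code values are believed correct
# but should be verified against current terminology releases)
entry = {
    "problem": "Essential hypertension",
    "codings": [
        {"system": "SNOMED CT", "code": "38341003"},
        {"system": "ICD-10", "code": "I10"},
    ],
}

def codes_for(entry, system):
    """Return the codes recorded for a given terminology system, if any."""
    return [c["code"] for c in entry["codings"] if c["system"] == system]
```

A receiving system that classifies by ICD-10 and a decision support system keyed on SNOMED CT could both resolve the same entry without transcription, which is the accuracy benefit described above.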

User-Friendly Design Interface

The quality of data entry could be improved with a user-friendly design interface [45]. Accuracy and completeness of the information in DHTs could be enhanced with features such as drop-down menus, prepopulated fields, and validation prompts. Conversely, a faulty design interface can affect consistency of presentation and conformance; for example, a drop-down menu offering different units of the same medication can lead to medication errors [10]. Searchability of information improves when the design interface allows smooth navigation through the digital system.

Implications for Policy

Information Quality Requirements

Policies and guidelines are needed to define and communicate information quality requirements for DHTs. Policies should communicate the official position of an institution on information quality requirements while guidelines should provide clear guidance on how to meet the requirements. The CLIQ framework could be used as a conceptual guide for information quality requirements. Policies and guidelines are needed to address issues relating to information quality, including interoperability, privacy and confidentiality, information format, design interface, training, and so forth. These issues should be addressed collectively in a single information quality policy or individually with different guidelines based on the needs of the institution. It is essential to consider whether the information is used locally, nationally, or internationally. The process of formulating, implementing, and disseminating the policies needs to involve all stakeholders, as described subsequently.

Collaboration and Partnerships Among Information Stakeholders

The development of the CLIQ framework demonstrates the importance of multidisciplinary and international collaboration in the development of information quality standards. The CLIQ framework was developed with input from scores of professionals across more than 10 countries. Establishing and implementing information quality standards requires collaboration among health care organizations, software developers and vendors, regulatory bodies, patient support groups, and professional organizations. Such collaboration will ensure buy-in from all information stakeholders and facilitate the implementation process.

Certification and Regulation

Enforcement of information quality standards for DHTs requires regulatory oversight. Regulatory bodies for DHTs in different countries and regions, such as the United States Food and Drug Administration and the Medicines and Healthcare products Regulatory Agency, need to include information quality requirements among their prequalification criteria for DHTs. The information quality of DHTs should be assessed using an evidence-based approach, such as the CLIQ framework, before they are certified for use in health care facilities. Only DHTs that meet the specified prequalification criteria, including information quality requirements, should be certified by the regulatory bodies.

Implications for Future Research

Evidence-Based Approach

The CLIQ framework was developed and validated using an approach that combined evidence from the literature with empirical studies. We conducted a systematic review of frameworks, rather than interventions, based on the BeHeMoTh procedure [46]. The international eDelphi study obtained quantitative and qualitative evidence from clinicians across 10 countries [21]. The pilot CLIQ assessment used a systematic approach to investigate the reliability and validity of this novel tool. Overall, the evidence-based approach used to develop the CLIQ framework provides a methodological model that could be adopted in future research to develop pragmatic frameworks.

Information Quality Research

The systematic review of the literature showed a dearth of information quality research relating to DHTs [11]. As DHTs are used more widely worldwide, information quality research needs to keep pace with the growth in technological innovation. The CLIQ framework demonstrates the importance of information quality research in addressing contemporary issues relating to DHTs: it includes several information quality dimensions that were hitherto absent from most existing information quality frameworks for DHTs. Searchability and maintainability have become more relevant as technological advances make it possible to capture, process, and store an increasing volume of digital information. Searchability, that is, locating needed information within the wide range of available information, is thus essential, and maintainability is important to ensure that information quality is not sacrificed as the quantity of information grows. Similarly, portability has become more relevant than ever with the current trend toward integration and interoperability of DHTs.

Patient Perspective on Information Quality

This study focused on health care professionals as end users of clinical information in DHTs. However, patients have also become end users of clinical information in DHTs: they are increasingly viewing and contributing information, especially since the COVID-19 pandemic [47]. Multiple DHTs, such as askmyGP (Evergreen Health Solution Ltd), eConsult (eConsult Health Ltd), and Accurx (Accurx Ltd), allow patients to enter text in their own words and upload photographs that can be saved in the EHRs. In addition, many wearable devices and mobile apps now generate consumer health data that can be imported into the EHRs. It would therefore be useful to explore patients’ perspectives on information quality in future studies and incorporate their views and suggestions into the CLIQ framework.

Conclusions

The CLIQ research highlights the importance of information quality and its relevance to the quality and safety of care. The CLIQ framework demonstrated high reliability and modest construct validity. It offers a pragmatic approach to assessing the quality of clinical information in DHTs and could be applied as part of information quality assurance systems in health care settings to improve the quality of health information.

Acknowledgments

The authors are grateful to all health care professionals who participated in this survey. ALN is supported by the National Institute for Health and Care Research (NIHR) Patient Safety Translational Research Centre and the NIHR Biomedical Research Centre. AM and JC are supported by NIHR Applied Research Collaboration Northwest London. The views expressed in this publication are those of the authors and not necessarily those of the NIHR or the Department of Health and Social Care. KPF’s doctoral research was funded by the Federal Government of Nigeria.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Clinical Information Quality (CLIQ) questionnaire.

PDF File, 132 KB

Multimedia Appendix 2

Inter-item correlation matrix.

DOCX File, 22 KB

Checklist 1

Checklist for Reporting Results of Internet E-Surveys.

DOCX File, 19 KB

  1. World Health Organization. Global strategy on digital health 2020-2025. 2021. URL: http://apps.who.int/bookorders [Accessed 2025-05-05]
  2. Gomes M, Murray E, Raftery J. Economic evaluation of digital health interventions: methodological issues and recommendations for practice. Pharmacoeconomics. Apr 2022;40(4):367-378. [CrossRef] [Medline]
  3. Institute of Medicine. Key Capabilities of an Electronic Health Record System: Letter Report. National Academies Press; 2003. URL: https://www.ncbi.nlm.nih.gov/books/NBK221802/ [Accessed 2025-05-16]
  4. Graf A, Fehring L, Henningsen M, Zinner M. Going digital in Germany: an exploration of physicians’ attitudes towards the introduction of electronic prescriptions—a mixed methods approach. Int J Med Inform. Jun 2023;174:105063. [CrossRef] [Medline]
  5. Sutton RT, Pincock D, Baumgart DC, Sadowski DC, Fedorak RN, Kroeker KI. An overview of clinical decision support systems: benefits, risks, and strategies for success. NPJ Digit Med. 2020;3(1):17. [CrossRef] [Medline]
  6. Teufel A, Binder H. Clinical Decision Support Systems. Visc Med. Dec 2021;37(6):491-498. [CrossRef] [Medline]
  7. World Health Organization. mHealth New Horizons for Health Through Mobile Technologies. World Health Organization; 2011. URL: https://iris.who.int/handle/10665/44607 [Accessed 2025-05-16]
  8. Meeks DW, Smith MW, Taylor L, Sittig DF, Scott JM, Singh H. An analysis of electronic health record-related patient safety concerns. J Am Med Inform Assoc. 2014;21(6):1053-1059. [CrossRef] [Medline]
  9. Magrabi F, Ong MS, Runciman W, Coiera E. Patient safety problems associated with healthcare information technology: an analysis of adverse events reported to the US Food and Drug Administration. AMIA Annu Symp Proc. 2011;2011(1):853-857. [Medline]
  10. Kim MO, Coiera E, Magrabi F. Problems with health information technology and their effects on care delivery and patient outcomes: a systematic review. J Am Med Inform Assoc. Mar 1, 2017;24(2):246-250. [CrossRef] [Medline]
  11. Fadahunsi KP, O’Connor S, Akinlua JT, et al. Information quality frameworks for digital health technologies: systematic review. J Med Internet Res. May 17, 2021;23(5):e23479. [CrossRef] [Medline]
  12. Almutiry OS. Data Quality Assessment Instrument For Electronic Health Record Systems in Saudi Arabia. University of Southampton; 2017.
  13. Dungey S, Beloff N, Puri S, Boggon R, Williams T, Tate AR. A pragmatic approach for measuring data quality in primary care databases. Presented at: 2014 IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI); Jun 1-4, 2014; Valencia, Spain. [CrossRef]
  14. Weiskopf NG, Weng C. Methods and dimensions of electronic health record data quality assessment: enabling reuse for clinical research. J Am Med Inform Assoc. Jan 1, 2013;20(1):144-151. [CrossRef]
  15. Stetson PD, Bakken S, Wrenn JO, Siegler EL. Assessing electronic note quality using the Physician Documentation Quality Instrument (PDQI-9). Appl Clin Inform. 2012;3(2):164-174. [CrossRef] [Medline]
  16. Kahn MG, Callahan TJ, Barnard J, et al. A harmonized data quality assessment terminology and framework for the secondary use of electronic health record data. EGEMS (Wash DC). 2016;4(1):1244. [CrossRef] [Medline]
  17. McCormack JL, Ash JS. Clinician perspectives on the quality of patient data used for clinical decision support: a qualitative study. AMIA Annu Symp Proc. 2012;2012:1302-1309. [Medline]
  18. Almutiry O, Wills G, Alwabel A, Crowder R, Walters R. Toward a framework for data quality in cloud-based health information system. Presented at: 2013 International Conference on Information Society (i-Society); Jun 24-26, 2013; Toronto, ON, Canada.
  19. Bowen M. EMR Data Quality: Evaluation Guide. eHealth Observatory; 2012.
  20. Bolt T, Kano S, Kodate A. Information quality in home care coordination services. J Telemed Telecare. Jul 2007;13(Suppl 1):7-9. [CrossRef]
  21. Fadahunsi KP, Wark PA, Mastellos N, et al. Assessment of clinical information quality in digital health technologies: international eDelphi study. J Med Internet Res. Dec 6, 2022;24(12):e41889. [CrossRef] [Medline]
  22. TPP. Clinical software to revolutionise healthcare. 2024. URL: https://tpp-uk.com/ [Accessed 2025-05-05]
  23. Eysenbach G. Improving the quality of web surveys: the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). J Med Internet Res. Sep 29, 2004;6(3):e34. [CrossRef] [Medline]
  24. Boateng GO, Neilands TB, Frongillo EA, Melgar-Quiñonez HR, Young SL. Best practices for developing and validating scales for health, social, and behavioral research: a primer. Front Public Health. 2018;6:149. [CrossRef] [Medline]
  25. Woulfe F, Fadahunsi KP, O’Grady M, et al. Modification and validation of an mHealth app quality assessment methodology for international use: cross-sectional and eDelphi studies. JMIR Form Res. Aug 19, 2022;6(8):e36912. [CrossRef] [Medline]
  26. Taherdoost H. Validity and reliability of the research instrument; how to test the validation of a questionnaire/survey in a research. Int J Acad Res Manag. 2018;5(3):28-36.
  27. Ullman JB, Bentler PM. Structural equation modeling. In: Weiner I, Schinka JA, Velicer WF, editors. Handbook of Psychology. John Wiley & Sons, Inc; 2013:661-619.
  28. Hu LT, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Struct Equation Model. Jan 1999;6(1):1-55. [CrossRef]
  29. Eppler MJ, Wittig D. Conceptualizing information quality: a review of information quality frameworks from the last ten years. Presented at: Fifth Conference on Information Quality (IQ 2000); Jan 2000:1-14; Massachusetts Institute of Technology, Cambridge, MA.
  30. Akoglu H. User’s guide to correlation coefficients. Turk J Emerg Med. Sep 2018;18(3):91-93. [CrossRef] [Medline]
  31. Wang RY, Strong DM. Beyond accuracy: what data quality means to data consumers. J Manage Inf Syst. Mar 1996;12(4):5-33. [CrossRef]
  32. Chan YH. Biostatistics 104: correlational analysis. Singapore Med J. Dec 2003;44(12):614-619. [Medline]
  33. World Health Organization. Data quality assurance: module 1: framework and metrics. 2022. URL: http://apps.who.int/bookorders [Accessed 2025-05-05]
  34. Lee J. The chief medical informatics officer’s (CMIO) view: clinical, technical and leadership acumen. In: Nursing Informatics, 5th Edition. Springer; 2022:99-110. [CrossRef]
  35. Sridharan S, Priestman W, Sebire NJ. Chief Information Officer team evolution in university hospitals: interaction of the three ‘C’s (CIO, CCIO, CRIO). BMJ Health Care Inform. Apr 2018;25(2):88-91. [CrossRef]
  36. Magrabi F, Liaw ST, Arachi D, Runciman W, Coiera E, Kidd MR. Identifying patient safety problems associated with information technology in general practice: an analysis of incident reports. BMJ Qual Saf. Nov 2016;25(11):870-880. [CrossRef] [Medline]
  37. Davies A, Hassey A, Williams J, Moulton G. Creation of a core competency framework for clinical informatics: from genesis to maintaining relevance. Int J Med Inform. Dec 2022;168:104905. [CrossRef] [Medline]
  38. Institute of Medicine. To Err Is Human: Building a Safer Health System. Vol 11. Institute of Medicine; 2000:312. ISBN: 0309261740
  39. Khan S, Yairi T. A review on the application of deep learning in system health management. Mech Syst Signal Process. Jul 2018;107:241-265. [CrossRef]
  40. Nicora G, Rios M, Abu-Hanna A, Bellazzi R. Evaluating pointwise reliability of machine learning prediction. J Biomed Inform. Mar 2022;127:103996. [CrossRef] [Medline]
  41. Iroju O, Soriyan A, Gambo I, Olaleke J. Interoperability in healthcare: benefits, challenges and resolutions. Int J Innov Appl Stud. 2013;(1):262-270.
  42. Lehne M, Sass J, Essenwanger A, Schepers J, Thun S. Why digital medicine depends on interoperability. NPJ Digit Med. 2019;2(1):79. [CrossRef] [Medline]
  43. Kalra D, Beale T, Heard S. The openEHR Foundation. Stud Health Technol Inform. 2005;115:153-173.
  44. Gaudet-Blavignac C, Foufi V, Bjelogrlic M, Lovis C. Use of the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT) for processing free text in health care: systematic scoping review. J Med Internet Res. Jan 26, 2021;23(1):e24594. [CrossRef] [Medline]
  45. Wilbanks BA, Moss JA. Impact of data entry interface design on cognitive workload, documentation correctness, and documentation efficiency. AMIA Jt Summits Transl Sci Proc. 2021;2021:634-643. [Medline]
  46. Fadahunsi KP, Akinlua JT, O’Connor S, et al. Protocol for a systematic review and qualitative synthesis of information quality frameworks in eHealth. BMJ Open. Mar 5, 2019;9(3):e024722. [CrossRef] [Medline]
  47. Clarke G, Pariza P, Wolters A. How has COVID-19 affected service delivery in GP practices that offered remote consultations before the pandemic. The Health Foundation. 2020:1-15. URL: https://www.health.org.uk/news-and-comment/charts-and-infographics/how-has-covid-19-affected-service-delivery-in-gp-practices [Accessed 2025-05-05]


CFI: comparative fit index
CHERRIES: Checklist for Reporting Results of Internet E-Surveys
CLIQ: clinical information quality
DHT: digital health technology
EHR: electronic health record
MES: Modified Enlight Suite
SRMR: standardized root mean square residual


Edited by Arriel Benis; submitted 06.03.24; peer-reviewed by Mani Abdul Karim, Piyush Mahapatra; final revised version received 13.04.25; accepted 14.04.25; published 30.05.25.

Copyright

© Kayode Philip Fadahunsi, Petra A Wark, Nikolaos Mastellos, Ana Luisa Neves, Joseph Gallagher, Azeem Majeed, Josip Car. Originally published in JMIR Medical Informatics (https://medinform.jmir.org), 30.5.2025.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Informatics, is properly cited. The complete bibliographic information, a link to the original publication on https://medinform.jmir.org/, as well as this copyright and license information must be included.