Innovative use of multiple methods to assess health informatics system standards and interoperability
Wednesday, November 6, 2013
Andrew Westdorp, BS, Public Health Research, Social & Scientific Systems, Inc., Silver Spring, MD
Lydia Rogers, MBA, Public Health Research, Social & Scientific Systems, Inc., Silver Spring, MD
Jean Wilson, MA, Public Health Research, Social & Scientific Systems, Inc., Silver Spring, MD
Susan J. Griffey, DrPH, BSN, FNP, Evaluation Center, Public Health Research group, Social & Scientific Systems, Inc., Silver Spring, MD
Background: Health informatics systems (HIS) are essential to the collection and management of health research data. While HIS are proliferating rapidly, little has been published on the evaluation of system standards, functional requirements, and interoperability. We developed an innovative evaluation approach using best practices to assess 4 HIS created in different settings but supporting the same health data collection effort. Best practices included defining the evaluation criteria, objectives, and framework early on; involving stakeholders and openly communicating findings; and combining existing methods in new ways.
Objective/Purpose: The approach was designed to yield evidence on HIS functional standards and interoperability characteristics that could serve as a model for future large-scale health data collections, especially in government settings.
Methods: We designed a multi-method informatics evaluation to 1) benchmark system functionality against a set of specified standards, and 2) test the 4 informatics systems using real-case scenarios, leveraging collaborations among stakeholders (system funders, developers, implementers) and evaluators. Evaluators used benchmarking to review the 4 systems' documentation against relevant standards. Through scenario-testing with mock user roles (system administrators, managers, data collectors), we collected qualitative and quantitative information about the systems' performance in practice.
Results: Benchmarking defined key standards for the 4 systems. Review of system and user documentation allowed evaluators to identify the strongest functional areas, such as case management, and where gaps existed, such as specimen tracking. System documents and test environments made feasible a limited examination of interoperable modules across the HIS; this identified the need for further prototype testing to pinpoint system functions and modules that could be shared. Scenario-testing with real use-cases provided objective information and qualitative input from system users. Together, benchmarking and scenario-testing identified mismatches in functionality (e.g., case management) between system documentation and real use.
Discussion/Conclusions: This evaluation approach identified critical functional standards and gaps in characteristics across the 4 informatics systems, both open-source and proprietary, for specific remediation efforts.
Learning Areas:
Communication and informatics
Conduct evaluation related to programs, research, and other areas of practice
Planning of health education strategies, interventions, and programs
Public health or related research
Learning Objectives:
Describe how collaboration can be used effectively in working to improve public health through information technology and informatics;
Demonstrate how best practices in program evaluation can be applied to the assessment of health IT systems;
Describe how innovative evaluation approaches can be applied to identify gaps and address issues with system functionality and interoperability.
Keywords: Information Systems, Evaluation
Presenting author's disclosure statement: Qualified on the content I am responsible for because: I am an experienced health research evaluator, working at NIH, with extensive experience in data collection, survey methodology, web/IT usability testing, and cognitive testing. I am currently a lead member of an evaluation of health informatics systems for NIH.
Any relevant financial relationships? No
I agree to comply with the American Public Health Association Conflict of Interest and Commercial Support Guidelines, and to disclose to the participants any off-label or experimental uses of a commercial product or service discussed in my presentation.