
Test Results Summary for 2014 Edition EHR Certification
17-3199-R-0022-PRA V1.0, November 17, 2017

©2017 InfoGard. May be reproduced only in its original entirety, without revision.

ONC HIT Certification Program
Test Results Summary for 2014 Edition EHR Certification

Part 1: Product and Developer Information

1.1 Certified Product Information
Product Name: CGM CLINICAL™
Product Version: 8.2.19115
Domain: Ambulatory
Test Type: Modular

1.2 Developer/Vendor Information
Developer/Vendor Name: CompuGroup Medical, Inc.
Developer/Vendor Contact: Chris Lohl
Address: 3300 N. Central Ave. Ste. 2100, Phoenix, AZ 85012
Website: https://www.cgm.com/us/index.en.jsp
Email: [email protected]
Phone: (888) 627-7633

Part 2: ONC-Authorized Certification Body Information

2.1 ONC-Authorized Certification Body Information
ONC-ACB Name: InfoGard Laboratories, Inc.
ONC-ACB Contact: Adam Hardcastle
Address: 709 Fiero Lane Suite 25, San Luis Obispo, CA 93401
Website: www.infogard.com
Email: [email protected]
Phone: (805) 783-0810

This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:
ONC-ACB Authorized Representative: Adam Hardcastle
Function/Title: EHR Certification Body Manager
Signature and Date: 11/17/2017


2.2 Gap Certification
The following identifies criterion or criteria certified via gap certification (§170.314):
No gap certification
Criteria: (a)(1), (a)(6), (a)(7), (a)(17), (a)(18), (a)(19), (a)(20), (b)(5)*, (d)(1), (d)(5), (d)(6), (d)(8), (d)(9), (f)(1), (f)(7)**, (h)(1), (h)(2), (h)(3)
*Gap certification allowed for Inpatient setting only
**Gap certification allowed for Ambulatory setting only

2.3 Inherited Certification
The following identifies criterion or criteria certified via inherited certification (§170.314):
No inherited certification
Criteria: (a)(1), (a)(2), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(8), (a)(9), (a)(10), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16) Inpt. only, (a)(17) Inpt. only, (a)(18), (a)(19), (a)(20), (b)(1), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6) Inpt. only, (b)(7), (b)(8), (b)(9), (c)(1), (c)(2), (c)(3), (d)(1), (d)(2), (d)(3), (d)(4), (d)(5), (d)(6), (d)(7), (d)(8), (d)(9) Optional, (e)(1), (e)(2) Amb. only, (e)(3) Amb. only, (f)(1), (f)(2), (f)(3), (f)(4) Inpt. only, (f)(5) Amb. only, (f)(6) Amb. only, (f)(7) Amb. only, (g)(1), (g)(2), (g)(3), (g)(4), (h)(1), (h)(2), (h)(3)


Part 3: NVLAP-Accredited Testing Laboratory Information

3.1 NVLAP-Accredited Testing Laboratory Information

ATL Name: InfoGard Laboratories, Inc.
ATL Contact: Milton Padilla
Accreditation Number: NVLAP Lab Code 100432-0
Address: 709 Fiero Lane Suite 25, San Luis Obispo, CA 93401
Website: www.infogard.com
Email: [email protected]
Phone: (805) 783-0810
For more information on scope of accreditation, please reference http://ts.nist.gov/Standards/scopes/1004320.htm

Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:
ATL Authorized Representative: Mark Shin
Function/Title: EHR Approved Signatory
Signature and Date: 11/17/17

3.2 Test Information
Report Number: 17-3199-R-0022 V1.1
Test Date(s): 10/13/2017

3.2.1 Additional Software Relied Upon for Certification

Additional Software: Alere Analytics
Applicable Criteria: 314(a)(8) and 314(c)(1)-(3)
Functionality provided by Additional Software: Clinical Decision Support rules

Additional Software: SureScripts with connectivity via CGM eRx™
Applicable Criteria: 314(b)(3), 314(a)(10), and 314(a)(2)
Functionality provided by Additional Software: eRX, Drug database


3.2.2 Test Tools
No test tools required

Test Tool / Version
Cypress
ePrescribing Validation Tool
HL7 CDA Cancer Registry Reporting Validation Tool
HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool
HL7 v2 Immunization Information System (IIS) Reporting Validation Tool
HL7 v2 Laboratory Results Interface (LRI) Validation Tool
HL7 v2 Syndromic Surveillance Reporting Validation Tool
Transport Testing Tool
Direct Certificate Discovery Tool
Edge Testing Tool
1.0.6

3.2.3 Test Data
Alteration (customization) to the test data was necessary and is described in Appendix [insert appendix letter]
No alteration (customization) to the test data was necessary

3.2.4 Standards

3.2.4.1 Multiple Standards Permitted
The following identifies the standard(s) that has been successfully tested where more than one standard is permitted

Criterion # / Standard Successfully Tested

(a)(8)(ii)(A)(2)
§170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
§170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(13)
§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
§170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree


(a)(15)(i)
§170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
§170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(16)(ii)
§170.210(g) Network Time Protocol Version 3 (RFC 1305)
§170.210(g) Network Time Protocol Version 4 (RFC 5905)

(b)(2)(i)(A)
§170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(b)(7)(i)
§170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(b)(8)(i)
§170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(e)(1)(i)
Annex A of the FIPS Publication 140-2

(e)(1)(ii)(A)(2)
§170.210(g) Network Time Protocol Version 3 (RFC 1305)
§170.210(g) Network Time Protocol Version 4 (RFC 5905)

(e)(3)(ii)
Annex A of the FIPS Publication 140-2

Common MU Data Set (15)
§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
§170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT-4)

None of the criteria and corresponding standards listed above are applicable

3.2.4.2 Newer Versions of Standards
The following identifies the newer version of a minimum standard(s) that has been successfully tested
No newer version of a minimum standard was tested

Newer Version / Applicable Criteria


3.2.5 Optional Functionality

No optional functionality tested

Criterion # / Optional Functionality Successfully Tested
(a)(4)(iii): Plot and display growth charts
(b)(1)(i)(B): Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(1)(i)(C): Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(b)(2)(ii)(B): Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(2)(ii)(C): Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(e)(1): View, download and transmit data to a third party using the standard specified at §170.202(d) (Edge Protocol IG version 1.1)
(f)(3): Ambulatory setting only – Create syndrome-based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)
(f)(7): Ambulatory setting only – transmission to public health agencies – syndromic surveillance – Create Data Elements
Common MU Data Set (15): Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)
Common MU Data Set (15): Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD-10-PCS)


3.2.6 2014 Edition Certification Criteria* Successfully Tested

Criteria # (TP**/TD*** versions 1.2 and 1.4):
(a)(1), (a)(2), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(8), (a)(9), (a)(10), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16) Inpt. only, (a)(17) Inpt. only, (a)(18), (a)(19), (a)(20), (b)(1), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6) Inpt. only, (b)(7), (b)(8), (b)(9), (c)(1), (c)(2), (c)(3), (d)(1), (d)(2), (d)(3), (d)(4), (d)(5), (d)(6), (d)(7), (d)(8), (d)(9) Optional, (e)(1), (e)(2) Amb. only, (e)(3) Amb. only, (f)(1), (f)(2), (f)(3), (f)(4) Inpt. only, (f)(5) Optional & Amb. only, (f)(6) Optional & Amb. only, (f)(7) Amb. only, (g)(1), (g)(2), (g)(3), (g)(4), (h)(1), (h)(2), (h)(3)

*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)


3.2.7 2014 Clinical Quality Measures*
Type of Clinical Quality Measures Successfully Tested: Ambulatory / Inpatient / No CQMs tested

Ambulatory CQMs (CMS IDs): 2, 22, 50, 52, 56, 61, 62, 64, 65, 66, 68, 69, 74, 75, 77, 82, 90, 117, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 153, 154, 155, 156, 157, 158, 159, 160, 161, 163, 164, 165, 166, 167, 169, 177, 179, 182

Inpatient CQMs (CMS IDs): 9, 26, 30, 31, 32, 53, 55, 60, 71, 72, 73, 91, 100, 102, 104, 105, 107, 108, 109, 110, 111, 113, 114, 171, 172, 178, 185, 188, 190

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)


3.2.8 Automated Numerator Recording and Measure Calculation

3.2.8.1 Automated Numerator Recording
Automated Numerator Recording was not tested
Automated Numerator Recording Successfully Tested: (a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (a)(18), (a)(19), (a)(20), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (b)(8), (b)(9), (e)(1), (e)(2), (e)(3)

3.2.8.2 Automated Measure Calculation
Automated Measure Calculation was not tested
Automated Measure Calculation Successfully Tested: (a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (a)(18), (a)(19), (a)(20), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (b)(8), (b)(9), (e)(1), (e)(2), (e)(3)

3.2.9 Attestation
Attestation Forms (as applicable) / Appendix
Safety-Enhanced Design*: Appendix A
Quality Management System**: Appendix B
Privacy and Security: Appendix C
*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (a)(18), (a)(19), (a)(20), (b)(3), (b)(4), (b)(9)
**Required for every EHR product


Appendix A: Safety Enhanced Design


Safety-enhanced design §170.314(g)(3)


© Copyright 2014 CompuGroup Medical, Inc. All rights reserved. | May not be reproduced without prior written permission. | www.CGMus.com

EXECUTIVE SUMMARY

A usability test of CGM CLINICAL™, Version 8.2 was conducted by CompuGroup Medical at the CGM Irvine facility. Testing took place in two waves, during the weeks of December 2, 2013 and February 24, 2014. The usability test followed the NISTIR 7741 User Centered Design approach.1 The purpose of this test was to test and validate the usability of the current user interface and to provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, 5 healthcare providers matching the target demographic criteria served as participants and used the EHRUT in simulated but representative tasks. This study collected performance data on 30 tasks typically conducted on an EHR:

List of Tasks

Certification Criteria Task Wave of Testing

§170.314(a)(6) Medication list Enter a medication 1

§170.314(a)(6) Medication list Change medication dosage 1

§170.314(a)(6) Medication list Discontinue a medication 1

§170.314(a)(6) Medication list View patient’s medication history 1

§170.314(a)(6) Medication list Mark that a patient is not taking any medications 1

§170.314(a)(7) Medication allergy list Enter a medication allergy 1

§170.314(a)(7) Medication allergy list Change a medication allergy 1

§170.314(a)(7) Medication allergy list Inactivate a medication allergy 1

§170.314(a)(7) Medication allergy list Indicate no known drug allergies 1

§170.314(b)(3) eRx Prescribe a medication 1

§170.314(b)(3) eRx Prescribe a medication with a different pharmacy 1

§170.314(b)(3) eRx Change the dose of a medication and eRx 1

§170.314(a)(2) Drug-drug, drug-allergy interactions checks

Trigger a major drug-drug interaction and cancel the order 2

§170.314(a)(2) Drug-drug, drug-allergy interactions checks

Prescribe a medication that will not trigger an interaction and stop 2

§170.314(a)(2) Drug-drug, drug-allergy interactions checks

Trigger a drug-allergy Interaction and override it 2

§170.314(a)(2) Drug-drug, drug-allergy interactions checks

Change the drug-drug interaction settings 2

§170.314(a)(1) Computerized provider order entry

Place and print a prescription (medication) 2

§170.314(a)(1) Computerized provider order entry

Enter a medication as a Non-CPOE prescription 2

§170.314(a)(1) Computerized provider order entry

Place a lab order and a radiology order 2

§170.314(a)(1) Computerized provider order entry

Place and change a lab order and print the order 2

1 Robert M. Schumacher (User Centric, Inc.) and Svetlana Z. Lowry (Information Access Division, Information Technology Laboratory, National Institute of Standards and Technology), NISTIR 7741: NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records (November 2010), pp. 2-62.


§170.314(a)(1) Computerized provider order entry

Place and change a radiology order and print the order 2

§170.314(b)(4) Clinical information reconciliation

Reconcile medications 2

§170.314(b)(4) Clinical information reconciliation

Reconcile medication allergies 2

§170.314(b)(4) Clinical information reconciliation

Reconcile problems 2

§170.314(a)(8) Clinical decision support

Trigger a CDS intervention from entered problem list data 2

§170.314(a)(8) Clinical decision support

Trigger a CDS intervention from entered prescription 2

§170.314(a)(8) Clinical decision support

Trigger a CDS intervention from entered medication allergy data 2

§170.314(a)(8) Clinical decision support

Trigger a CDS intervention from entered demographics 2

§170.314(a)(8) Clinical decision support

Trigger a CDS intervention from entered lab results 2

§170.314(a)(8) Clinical decision support

Trigger a CDS intervention from entered vital signs 2

During the approximately 90 minute one-on-one usability tests, each participant was greeted by the moderator; participants had previously signed a usability test user agreement. Participants had prior experience with the EHR. Prior to each set of tasks, training was provided in the same manner as what a real end user would receive when purchasing or upgrading CGM CLINICAL™. The moderator introduced the test, and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the moderator, along with the data logger, recorded user performance data on paper and electronically. A trainer also observed the test. The moderator did not give the participant assistance in how to complete the task. Participant screens, head shots and audio were recorded for subsequent analysis. The following types of data were collected for each participant:

Number of tasks successfully completed within the allotted time without assistance

Time to complete the tasks

Number and types of errors

Path deviations

Participant’s verbalizations

Participant’s satisfaction ratings of the system

Steps taken to complete each task along with deviations from the optimal path

All participant data was de-identified – no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a closing questionnaire and System Usability Scale form, and were compensated for their time in the amount of $200/hour. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records,


were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.

Task / Success % / Error % / Task Time (seconds): AVG, STD / Efficiency (Task Time): AVG, STD / Efficiency (Deviations): AVG, STD / Task Rating

Enter a medication 80.00% 20.00% 42.50 8.74 1.02 0.21 1.00 0.00 1.20

Change medication dosage 40.00% 20.00% 38.00 21.21 1.31 0.73 1.90 1.27 1.80

Discontinue a medication 60.00% 20.00% 17.67 2.08 1.22 0.14 1.00 0.00 1.90

View patient’s medication history

80.00% 20.00% 6.75 2.63 1.00 0.29 1.13 0.25 1.00

Mark that a patient is not taking any medications

60.00% 40.00% 12.67 2.52 1.00 0.16 1.00 0.00 1.00

Enter a medication allergy 100.00% 0.00% 33.00 8.57 1.40 0.36 1.03 0.06 1.20

Change a medication allergy 100.00% 0.00% 39.80 16.02 1.24 0.50 1.10 0.15 1.20

Inactivate a medication allergy 100.00% 0.00% 24.00 6.04 1.71 0.43 1.10 0.22 1.00

Indicate no known drug allergies 100.00% 0.00% 29.50 4.80 1.69 0.27 1.03 0.06 1.00

Prescribe a medication 50.00% 50.00% 66.50 10.61 1.00 0.16 1.00 0.00 1.75

Prescribe a medication with a different pharmacy

60.00% 40.00% 87.67 36.53 1.16 0.48 1.52 0.89 1.60

Change the dose of a medication and eRx

40.00% 60.00% 62.50 13.44 1.11 0.24 1.00 0.00 1.80

Trigger a major drug-drug interaction and cancel the order

100.00% 0.00% 40.00 5.96 1.67 0.25 1.03 0.07 1.20

Prescribe a medication that will not trigger an interaction and stop

80.00% 0.00% 26.00 4.97 1.37 0.26 1.13 0.14 1.20

Trigger a drug-allergy Interaction and override it

80.00% 20.00% 46.00 10.23 1.31 0.29 1.00 0.00 1.40

Change the drug-drug interaction settings

40.00% 0.00% 16.00 0.00 2.00 0.00 1.00 0.00 1.20

Place and print a prescription (medication)

66.67% 33.33% 48.50 3.54 1.52 0.11 1.00 0.00 1.50

Enter a medication as a Non-CPOE prescription

80.00% 20.00% 30.25 5.12 1.32 0.22 1.00 0.00 1.60

Place a lab order and a radiology order

0.00% 75.00% n/a n/a n/a n/a n/a n/a 3.20

Place and change a lab order and print the order

0.00% 60.00% n/a n/a n/a n/a n/a n/a 2.40

Place and change a radiology order and print the order

0.00% 33.33% n/a n/a n/a n/a n/a n/a 2.50

Reconcile medications 66.67% 33.33% 131.00 8.49 1.72 0.11 1.05 0.07 2.00

Reconcile medication allergies 66.67% 33.33% 79.50 7.78 1.15 0.11 1.00 0.00 1.33

Reconcile problems 50.00% 50.00% 35.00 n/a 1.21 n/a 1.00 n/a 1.00

Trigger a CDS intervention from entered problem list data

100.00% 0.00% 48.20 12.19 1.30 0.33 1.06 0.13 1.40


Trigger a CDS intervention from entered prescription

0.00% 60.00% n/a n/a n/a n/a n/a n/a 1.80

Trigger a CDS intervention from entered medication allergy data

100.00% 0.00% 46.60 13.90 1.23 0.37 1.17 0.29 1.20

Trigger a CDS intervention from entered demographics

100.00% 0.00% 61.40 25.85 1.08 0.45 1.53 1.02 1.20

Trigger a CDS intervention from entered lab results

100.00% 0.00% 31.80 8.76 1.18 0.32 1.00 0.00 1.00

Trigger a CDS intervention from entered vital signs

100.00% 0.00% 41.60 11.59 1.09 0.30 1.00 0.00 1.20

The System Usability Scale score for subjective satisfaction with the system, based on performance with these tasks, was 70.625.2

In addition to the performance data, the following qualitative observations were made:

Major findings:

Generally, participants in the usability study said that they liked the functionality of CGM CLINICAL™ and would recommend it to their colleagues. The majority of the tasks were rated between "very easy" and "easy"; more than one participant commented that the product is improving all the time and that they saw many helpful changes to the workflow. Participants did not offer much commentary on specific areas of the software. Based on the quantitative data, the stronger areas of the software are the medication allergy list, clinical decision support, and drug-interaction checking functionality. The test participants specifically mentioned that they found the CPOE sections for lab and radiology orders troublesome, which the quantitative data also reflects. Judging by the quantitative data, many of the issues were caused by busy screens or extra drop-downs, which led to less streamlined workflows. Additionally, Meaningful Use-specific workflows yielded lower success rates, most likely a result of unfamiliarity with the process and requirements. These workflows will become more habitual over time through repetition for providers meeting the Meaningful Use requirements.

2 See Tullis, T. & Albert, W. (2008). Measuring the User Experience. Burlington, MA: Morgan Kaufmann (p. 149). Broadly interpreted, scores under 60 represent systems with poor usability; scores over 80 would be considered above average.


Areas for improvement:

Based on the quantitative findings, verbal reports of the participants, and observations from the data logger, the following changes would most likely improve the overall usability of CGM CLINICAL™.

Medication List – Make the various options to modify or discontinue a medication more obvious.

Medication Allergy List – Streamline the workflow to deactivate a medication allergy.

eRx – Modify the sig screen so that users can directly print or electronically prescribe from it. Change the "add" button next to the favorites list drop-down to make it more obvious, for example by making the font bold. When using the "New Rx" search, prevent users from exiting the search screen in a way that leads them to a blank sig screen. This last change would also assist in the usability of the CPOE for medication orders section.

Drug-Drug, Drug-Allergy Interactions Checks – Have all drug interaction alerts appear on one screen at the same time.

Computerized Provider Order Entry – Modify and clean up the interfaces for the radiology and lab order screens so that no scrolling is required and the workflow is obvious. Remove the "Place Order" button in the middle of the screen for lab orders.

Clinical Information Reconciliation – Make the font larger for each of the lists to be reconciled, using the available white space on the interface. For structured summary of care documents, parse the data so that only the applicable section shows when performing reconciliation, and allow users to add the medication, allergy, or problem with one click. Show a record of what has been added to or removed from the record as reconciliation is being performed. On the final approve reconciliation screen, make it more obvious what is now going to be within the EHR, perhaps by bolding the applicable items.

Clinical Decision Support – Make the CDS alert a brighter red so it is more obvious. Add additional CDS alerts to areas of the software outside of the encounter so that providers do not need to navigate to the encounter to see whether there is an alert.

INTRODUCTION


The EHRUT tested for this study was CGM CLINICAL™, Version 8.2. This product is designed to present medical information to healthcare providers in an ambulatory setting. The EHRUT is an easily adaptable integrated PM and EHR solution intended to support a broad range of specialties that allows providers to document patient health information and facilitates information sharing across multiple settings. The usability testing attempted to represent realistic exercises and conditions. The purpose of this study was to test and validate the usability of the current user interface, and provide evidence of usability in the EHRUT. To this end, measures of effectiveness, efficiency and user satisfaction, such as task time and number of clicks per task, were captured during the usability testing.

METHOD

PARTICIPANTS
A total of 5 participants were tested on the EHRUT(s). Participants in the test were MDs and MD/PhDs. Participants were recruited by CompuGroup Medical and were compensated for their time in the amount of $200/hour. In addition, participants had no direct connection to the development of, or the organization producing, the EHRUT(s). Participants were not from the testing or supplier organization. Participants were given the opportunity to have the same orientation and level of training as the actual end users would have received. For the test purposes, end-user characteristics were identified and translated into a recruitment screener used to solicit potential participants; an example of a screener is provided in Appendix 1. Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, and computing experience. Participant names were replaced with Participant IDs so that an individual's data cannot be tied back to individual identities.

Part. # | Sex | Age Range | Race/Ethnic Group | Position and Title | Specialty | Time Spent in Position | Hours/week spent on computer | Method used to Document Patient Records | General EHR Use | EHR Knowledge
1 | M | 40 to 59 | Asian | MD, PhD | Gastroenterology | 16 years | 40+ | All Electronic | Daily, 40 hrs/week | Good to Excellent
2 | M | 60 to 74 | Caucasian | MD | Neurology | 31 years | 40+ | All Electronic | Daily | Excellent
4 | M | 60 to 74 | Caucasian | MD | Primary Care/GP | 33 years | 40+ | All Electronic | Daily | Excellent
5 | M | 60 to 74 | Caucasian | MD | Cardiology | 31 years | 21 to 30 | All Electronic | Daily | Average
6 | F | 40 to 59 | Asian | MD | Pain Management | 17 years | 21 to 30 | Some Paper, Some Electronic | 5-7 Days/Week | Good


Five participants (matching the demographics in the section on Participants) were recruited and five participated in the usability test. Zero participants failed to show for the study. For each wave, participants were scheduled for ninety-minute sessions with at least 30 minutes in between each session for debrief by the administrator(s) and data logger(s), and to reset systems to proper test conditions. A spreadsheet was used to keep track of the participant schedule.

STUDY DESIGN
Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to identify areas where improvements must be made. During the usability test, participants interacted with one EHR. Each participant used the system in the same location and was provided with the same instructions. The system was evaluated for effectiveness, efficiency and satisfaction as defined by measures collected and analyzed for each participant:

Number of tasks successfully completed within the allotted time without assistance

Time to complete the tasks

Number and types of errors

Path deviations

Participant’s verbalizations (comments)

Participant's satisfaction ratings of the system

TASKS
A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

List of Tasks

Certification Criteria Task Wave of Testing

§170.314(a)(6) Medication list Enter a medication 1

§170.314(a)(6) Medication list Change medication dosage 1

§170.314(a)(6) Medication list Discontinue a medication 1

§170.314(a)(6) Medication list View patient’s medication history 1

§170.314(a)(6) Medication list Mark that a patient is not taking any medications 1

§170.314(a)(7) Medication allergy list Enter a medication allergy 1

§170.314(a)(7) Medication allergy list Change a medication allergy 1

§170.314(a)(7) Medication allergy list Inactivate a medication allergy 1

§170.314(a)(7) Medication allergy list Indicate no known drug allergies 1

§170.314(b)(3) eRx Prescribe a medication 1

§170.314(b)(3) eRx Prescribe a medication with a different pharmacy 1


§170.314(b)(3) eRx Change the dose of a medication and eRx 1

§170.314(a)(2) Drug-drug, drug-allergy interactions checks

Trigger a major drug-drug interaction and cancel the order 2

§170.314(a)(2) Drug-drug, drug-allergy interactions checks

Prescribe a medication that will not trigger an interaction and stop 2

§170.314(a)(2) Drug-drug, drug-allergy interactions checks

Trigger a drug-allergy Interaction and override it 2

§170.314(a)(2) Drug-drug, drug-allergy interactions checks

Change the drug-drug interaction settings 2

§170.314(a)(1) Computerized provider order entry

Place and print a prescription (medication) 2

§170.314(a)(1) Computerized provider order entry

Enter a medication as a Non-CPOE prescription 2

§170.314(a)(1) Computerized provider order entry

Place a lab order and a radiology order 2

§170.314(a)(1) Computerized provider order entry

Place and change a lab order and print the order 2

§170.314(a)(1) Computerized provider order entry

Place and change a radiology order and print the order 2

§170.314(b)(4) Clinical information reconciliation

Reconcile medications 2

§170.314(b)(4) Clinical information reconciliation

Reconcile medication allergies 2

§170.314(b)(4) Clinical information reconciliation

Reconcile problems 2

§170.314(a)(8) Clinical decision support

Trigger a CDS intervention from entered problem list data 2

§170.314(a)(8) Clinical decision support

Trigger a CDS intervention from entered prescription 2

§170.314(a)(8) Clinical decision support

Trigger a CDS intervention from entered medication allergy data 2

§170.314(a)(8) Clinical decision support

Trigger a CDS intervention from entered demographics 2

§170.314(a)(8) Clinical decision support

Trigger a CDS intervention from entered lab results 2

§170.314(a)(8) Clinical decision support

Trigger a CDS intervention from entered vital signs 2

Tasks were selected based on their frequency of use, criticality of function, and the likelihood that they would be troublesome for users. Tasks should always be constructed in light of the study objectives.

PROCEDURES
Upon arrival, participants were greeted and brought to the testing room. Participants had been assigned a participant ID previously, during the recruitment portion. To ensure that the test ran smoothly, three staff members participated in this test: the usability moderator, who served as the main administrator; the data logger; and a trainer.


The moderator moderated the session, including administering instructions and tasks. The data logger took notes on task times, task success, path deviations, number and type of errors, and comments. Participants were instructed to perform the tasks (see specific instructions below):

As quickly as possible making as few errors and deviations as possible.

Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.

Without using a think-aloud technique.

Task timing began once the administrator finished reading the question. The task time was stopped once the participant indicated they had successfully completed the task. Scoring is discussed below. Following the session, each participant was given the Closing Questionnaire and System Usability Scale form (see Appendices 6 and 7), compensated for their time, and thanked for their participation. Participants' demographic information, task success rate, time on task, errors, deviations, and verbalizations were recorded into a spreadsheet.

TEST LOCATION
The test facility included a waiting area and a quiet testing room with tables and chairs, a computer for the participant, and a recording computer for the moderator. Only the participant, moderator, and trainer were in the test room. The data logger worked remotely, where they could see the participant's screen and face shot and listen to the audio of the session. To ensure that the environment was comfortable for users, noise levels were kept to a minimum, with the ambient temperature within a normal range. All of the safety instructions and evacuation procedures were valid, in place, and visible to the participants.

TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in the CGM Irvine facility. For testing, the computer used was an HP EliteBook 8470p running Windows 7 Professional. The participants used a mouse and keyboard when interacting with the EHRUT. The EHRUT used an Acer 24" LCD monitor with 1920 x 1080 resolution and True Color (32 bit) settings. The application was set up by the vendor. The application itself was running on a SQL platform using a test database over a LAN connection. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size or resolution).

TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:

1. Usability Test User Agreement
2. Addendum to Usability Test User Agreement


3. Moderator's Guide
4. Closing Questionnaire
5. System Usability Scale Questionnaire

Examples of these documents can be found in Appendices 2-7, respectively. The Moderator's Guide was devised so as to be able to capture the required data. The participant's interaction with the EHRUT was captured and recorded digitally with screen capture software running on the test machine. A webcam recorded each participant's facial expressions synced with the screen capture, and verbal comments were recorded as well.

PARTICIPANT INSTRUCTIONS
Thank you for participating in this study. Our session today will last approximately 90 minutes. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not do anything more than asked. If you get lost or have difficulty, we cannot answer any questions or help you with the task within the system itself. We ask that you offer your honest opinions regarding the usability of these functions, but please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely. The product version you will be using today is still in the development stages, so some of the data may not make sense. During the tasks you will be performing in each session, we will ask you to make decisions that may contradict how you would normally practice. Please follow the tasks as we have outlined them, because we have designed them to test the usability of specific workflows. We are recording the audio and video of our session today. Your voice, actions and number of clicks will be recorded. All of the information that you provide will be kept confidential, and your name will not be associated with your comments at any time. We will be utilizing TeamViewer today for the session, which will allow you to watch a brief training of the Clinical areas you will be performing tasks in; once the training is completed, I will ask you to review the task you will be performing before we begin, then you will take control of the session so that you can complete each task.

Following the procedural instructions, participants were shown the EHR and were given time to explore the system. Once this was complete, the administrator gave the following instructions: During these tasks we will ask you to make decisions that may differ from how you would practice. Please follow the tasks as we have outlined them because we have designed them to test the usability of specific workflows. Please take a few moments to review Task _ and let me know when you are ready and we will begin. Once the participant was ready to begin, the moderator read the task and instructed the participant to begin, and to say "Stop" when he/she was done.


USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability testing. The goals of the test were to assess:

1. Effectiveness of CGM CLINICAL™ by measuring participant success rates and errors
2. Efficiency of CGM CLINICAL™ by measuring the average task time and path deviations
3. Satisfaction with CGM CLINICAL™ by measuring ease of use ratings

DATA SCORING
The following table details how tasks were scored, errors evaluated, and the time data analyzed:

Measures / Rationale and Scoring

Effectiveness: Task Success
A task was counted as a "Success" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage. Task times and number of clicks were recorded for successes.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "Failure." No task times were taken for errors. If the participant completed the task incorrectly, this was counted as an error. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.

Efficiency: Task Deviations
The participant's path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. Efficiency was only counted for tasks that were successfully completed.

Efficiency: Task Time
Each task was timed from when the moderator said "Start" until the participant said "Stop." If he or she failed to say "Stop," the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Standard deviation was also calculated. Task time efficiency was calculated by dividing the observed task time by the optimal time.

Satisfaction: Task Rating
Participant's subjective impression of the ease of use of the application was measured by administering both a simple post-task question as well as a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Simple) to 5 (Difficult). These data are averaged across participants.
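To make the scoring rules above concrete, the following is a minimal sketch of how one task's metrics could be computed from per-participant attempt records. It is illustrative only, not the analysis tooling used in this study; the field names and sample values are hypothetical.

from statistics import mean, stdev

# One record per participant attempt of a single task (hypothetical sample data).
attempts = [
    {"success": True,  "error": False, "time": 40.0, "steps": 6, "optimal_steps": 6, "optimal_time": 30.0, "rating": 1},
    {"success": True,  "error": False, "time": 45.0, "steps": 7, "optimal_steps": 6, "optimal_time": 30.0, "rating": 2},
    {"success": False, "error": True,  "time": None, "steps": 9, "optimal_steps": 6, "optimal_time": 30.0, "rating": 2},
]

def score_task(attempts):
    n = len(attempts)
    successes = [a for a in attempts if a["success"]]
    # Success and error rates: counts divided by number of attempts, as percentages.
    success_pct = 100.0 * len(successes) / n
    error_pct = 100.0 * sum(a["error"] for a in attempts) / n
    # Task time statistics use successful attempts only (no times were taken for failures).
    times = [a["time"] for a in successes]
    avg_time = mean(times) if times else None
    std_time = stdev(times) if len(times) > 1 else 0.0
    # Task time efficiency: observed time divided by optimal time.
    time_eff = [a["time"] / a["optimal_time"] for a in successes]
    # Path deviation ratio: observed steps divided by optimal steps (successes only).
    dev_ratio = [a["steps"] / a["optimal_steps"] for a in successes]
    # Post-task ease-of-use rating (1 = Simple, 5 = Difficult), averaged across participants.
    avg_rating = mean(a["rating"] for a in attempts)
    return {
        "success_pct": success_pct,
        "error_pct": error_pct,
        "avg_time": avg_time,
        "std_time": std_time,
        "avg_time_efficiency": mean(time_eff) if time_eff else None,
        "avg_path_deviation": mean(dev_ratio) if dev_ratio else None,
        "avg_rating": avg_rating,
    }

print(score_task(attempts))

Running the sketch on the sample records prints the task's success and error percentages, the mean and standard deviation of successful task times, the mean time-efficiency and path-deviation ratios, and the mean post-task rating, matching the columns reported in the results tables.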

Page 23: Test Results Summary for 2014 Edition EHR …connect.ul.com/rs/365-LEA-623/images/17-3199-R-0022-PRA...Test Results Summary for 2014 Edition EHR Certification 17-3199-R-0022-PRA V1.0,

13 | CompuGroup Medical

Safety-Enhanced Design

© Copyright 2014 CompuGroup Medical, Inc. All rights reserved. | May not be reproduced without prior written permission. | www.CGMus.com

To measure participants’ confidence in and likeability of CGM CLINICAL™ overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included, “I think I would like to use this system frequently,” “I thought the system was easy to use,” and “I would imagine that most people would learn to use this system very quickly.” See full System Usability Score questionnaire in Appendix 7.
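The SUS result reported in the Executive Summary (70.625) comes from this standard 10-item scale. As a reference point, the conventional SUS scoring arithmetic looks like the sketch below; this is the published scoring method, shown for illustration rather than the study's own questionnaire tooling, and the sample responses are made up.

def sus_score(responses):
    # responses: 10 answers, each 1 (strongly disagree) to 5 (strongly agree),
    # in questionnaire order (item 1 first).
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd-numbered (positively worded) items score r - 1;
        # even-numbered (negatively worded) items score 5 - r.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw total to 0-100

# Example: one participant's answers; a study-level score is the mean across participants.
print(sus_score([4, 2, 4, 2, 4, 2, 5, 2, 4, 2]))  # prints 77.5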

RESULTS

DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions had their data excluded from the analyses. The usability testing results for the EHRUT are detailed below:

Task / Success % / Error % / Task Time (seconds): AVG, STD / Efficiency (Task Time): AVG, STD / Efficiency (Deviations): AVG, STD / Task Rating

Enter a medication 80.00% 20.00% 42.50 8.74 1.02 0.21 1.00 0.00 1.20

Change medication dosage 40.00% 20.00% 38.00 21.21 1.31 0.73 1.90 1.27 1.80

Discontinue a medication 60.00% 20.00% 17.67 2.08 1.22 0.14 1.00 0.00 1.90

View patient’s medication history

80.00% 20.00% 6.75 2.63 1.00 0.29 1.13 0.25 1.00

Mark that a patient is not taking any medications

60.00% 40.00% 12.67 2.52 1.00 0.16 1.00 0.00 1.00

Enter a medication allergy 100.00% 0.00% 33.00 8.57 1.40 0.36 1.03 0.06 1.20

Change a medication allergy 100.00% 0.00% 39.80 16.02 1.24 0.50 1.10 0.15 1.20

Inactivate a medication allergy 100.00% 0.00% 24.00 6.04 1.71 0.43 1.10 0.22 1.00

Indicate no known drug allergies 100.00% 0.00% 29.50 4.80 1.69 0.27 1.03 0.06 1.00

Prescribe a medication 50.00% 50.00% 66.50 10.61 1.00 0.16 1.00 0.00 1.75

Prescribe a medication with a different pharmacy

60.00% 40.00% 87.67 36.53 1.16 0.48 1.52 0.89 1.60

Change the dose of a medication and eRx

40.00% 60.00% 62.50 13.44 1.11 0.24 1.00 0.00 1.80

Trigger a major drug-drug interaction and cancel the order

100.00% 0.00% 40.00 5.96 1.67 0.25 1.03 0.07 1.20

Prescribe a medication that will not trigger an interaction and stop

80.00% 0.00% 26.00 4.97 1.37 0.26 1.13 0.14 1.20

Trigger a drug-allergy Interaction and override it

80.00% 20.00% 46.00 10.23 1.31 0.29 1.00 0.00 1.40

Change the drug-drug interaction settings

40.00% 0.00% 16.00 0.00 2.00 0.00 1.00 0.00 1.20

Place and print a prescription (medication) 66.67% 33.33% 48.50 3.54 1.52 0.11 1.00 0.00 1.50


Enter a medication as a Non-CPOE prescription

80.00% 20.00% 30.25 5.12 1.32 0.22 1.00 0.00 1.60

Place a lab order and a radiology order

0.00% 75.00% n/a n/a n/a n/a n/a n/a 3.20

Place and change a lab order and print the order

0.00% 60.00% n/a n/a n/a n/a n/a n/a 2.40

Place and change a radiology order and print the order

0.00% 33.33% n/a n/a n/a n/a n/a n/a 2.50

Reconcile medications 66.67% 33.33% 131.00 8.49 1.72 0.11 1.05 0.07 2.00

Reconcile medication allergies 66.67% 33.33% 79.50 7.78 1.15 0.11 1.00 0.00 1.33

Reconcile problems 50.00% 50.00% 35.00 n/a 1.21 n/a 1.00 n/a 1.00

Trigger a CDS intervention from entered problem list data

100.00% 0.00% 48.20 12.19 1.30 0.33 1.06 0.13 1.40

Trigger a CDS intervention from entered prescription

0.00% 60.00% n/a n/a n/a n/a n/a n/a 1.80

Trigger a CDS intervention from entered medication allergy data

100.00% 0.00% 46.60 13.90 1.23 0.37 1.17 0.29 1.20

Trigger a CDS intervention from entered demographics

100.00% 0.00% 61.40 25.85 1.08 0.45 1.53 1.02 1.20

Trigger a CDS intervention from entered lab results

100.00% 0.00% 31.80 8.76 1.18 0.32 1.00 0.00 1.00

Trigger a CDS intervention from entered vital signs

100.00% 0.00% 41.60 11.59 1.09 0.30 1.00 0.00 1.20

EFFECTIVENESS
The areas of CGM CLINICAL™ with the highest level of success were the medication allergy list, drug-interaction checks, and clinical decision support. All the tasks for the medication allergy list were passed with a 100% success rate and only minor deviations. For the drug-interaction checks tasks, all but one task had a 0% error rate. The first task (trigger a severe drug-drug interaction and cancel the order) had a success rate of 100%. Although the tasks to "prescribe a medication that will not trigger an interaction and stop" and "change the drug-drug interaction settings" had success rates below 100%, the only failures were caused by participants going over the allotted time. The task to change the drug-drug interaction settings is not one that would typically be performed as a regular workflow; most practices would most likely set up the practice settings when setting up the EHR and rarely change them. Additionally, it is the type of task that an administrator would usually do, so it is understandable that many of the providers would struggle. Finally, only one participant was unable to complete the task to trigger a drug-allergy interaction and override it. That participant originally clicked on the area to review drug-drug interactions and could not figure out how to get back to view the drug-allergy interactions. Given the relatively high success rate for these tasks, the drug-interaction alerts and workflow are relatively easy to navigate, but the layout of how they are presented could use some modification.


With the exception of the task to trigger a clinical decision support intervention from an entered prescription, all of the clinical decision support tasks had a success rate of 100%. For the task to trigger a CDS alert off of a prescription, two of the participants were able to complete it but, due to issues getting back to the encounter to view the alert button, went over the allotted time. The other three participants did not actually go through the process to prescribe the medication, and therefore the CDS alert did not trigger. All three of those participants were able to recognize that the alert was not triggered and did not click on the CDS alert, per the instructions. Changes to the e-prescribing workflow in general could help alleviate many if not all of these issues, as the data for those tasks show similar usability issues. In terms of path deviation data, participants struggled some with the task to trigger a CDS alert off of entered demographic data. As with the task for CDS off of prescription information, participants struggled with getting back to the encounter to view the CDS alert. Tasks where the data entry was right in the encounter were by far more successful in this respect.

The areas of CGM CLINICAL™ not reaching the highest level of success were the medication list, eRx, CPOE, and clinical information reconciliation. For the medication list functionality, the low success rate was often caused by participants going over the allotted time. With the exception of the task to mark that a patient is not taking any medications, each task had an error rate of 20%, meaning that each task only had one participant with any errors. For the task to mark that a patient was not taking any medications, the participants with errors correctly inactivated the medications on the list but did not actively select the option "not taking any medications," which is a required workflow for Meaningful Use. Training and repetition should help alleviate this issue, as this step is not a typical workflow that providers are in the habit of doing. For the other tasks, failures were the result of participants deviating off the path and then not figuring out how to fix their mistakes, such as entering the wrong start date. The options for altering medications caused confusion for the participants and caused them to deviate from the optimal path. For example, participants would use "modify medication" as opposed to "modify medication hx". Making the options for changing medications, both status and information, clearer would streamline the workflow.

As mentioned previously when discussing clinical decision support for an entered prescription, tasks for e-prescribing orders had errors when participants failed to actually electronically prescribe a medication. Generally, all participants were able to enter the order and correctly pass the sig screen, but more than one participant didn't follow through with submitting for the tasks to electronically prescribe an alternative medication, electronically prescribe a medication with a different pharmacy, and change the dose of a medication and eRx. A change to the sig screen where a user could directly e-prescribe would help with these issues. Additionally, participants had some difficulty using the favorites list. They would select the correct medication from the drop-down, but then not use the "add" function and just select "new Rx". This caused confusion, as participants would then just exit out of the search screen, going to the sig screen, most likely because they believed using the favorites list would automatically add the medication. Some of these issues were most likely caused by participants being unfamiliar with the favorites list functionality, but changes to the interface could also help.

During the clinical information reconciliation tasks, participants did not always add or remove the proper medications, allergies, or problems. In general this seemed to be caused by confusion over what the reconciled list contained, and changes to the screen to confirm the list would be helpful. Since this functionality is brand new, some of the issues will also most likely be alleviated with training and repetition, which will allow users to become more familiar with the workflow requirements.


CPOE for medication orders had an acceptable success rate; however, the Non-CPOE checkbox caused some confusion, and one of the participants did not seem to understand when to use it. This issue, like all of those relating to Meaningful Use specific workflows, should be alleviated with training. The participants most likely did not understand the reason for the checkbox, but since most were able to complete the tasks correctly, the workflow itself does not seem to be a problem. CPOE for labs and radiology was the least successful area of the software, and users especially encountered challenges with the lab order area. The lab order tasks required participants to scroll to the bottom of the screen to complete the steps to add a diagnosis and place the order, but many did not do so; because they could not see the areas they needed to complete, they could not figure out how to actually place the order. With radiology, the main issue was participants going over the allotted time, which was often because they deviated from the optimal path. As with lab orders, participants would forget to add the diagnosis code, causing them to have to go back and add it. Additionally, one user forgot to remove the original radiology order for the place-and-change radiology order task. For both areas of the software, changes are needed to the workflow so that users clearly know what needs to happen to complete the task; some of the scrolling that is currently required should also be removed.

EFFICIENCY

Overall, users who were able to complete the usability tasks did so with decent task time efficiency scores (close to 1). The strongest area in this respect was eRx. Second to this was clinical decision support, most likely because it relied on previous workflows that participants were used to. Participants had no trouble finding the CDS button and actually viewing the CDS alert, as long as they could navigate back to the encounter. The medication list tasks had a high failure rate due to participants going over the allotted task time, but overall users were very efficient when it came to these tasks. The users who could not complete the tasks within the allotted time most likely were not as used to the application or the medication list functionality as those who could successfully complete them. As mentioned previously, the different options to "Modify Medication", "Modify Medication Hx" and "Discontinue Medication" seemed to prove confusing. Similarly, the clinical information reconciliation tasks for allergies and problems yielded high efficiency scores. The medication reconciliation task gave a lower score in this respect, which suggests that users probably just need to become familiar with the workflow, especially given the much improved scores after just one task. Additionally, time was added in some cases when participants were trying to figure out what was on each of the lists (the summary of care and the EHR list), as well as when scrolling to view the proper section of the summary of care document. Participants generally were able to complete the medication allergy tasks efficiently, and all users were able to complete them within the allotted time. For the first two tasks, participants hesitated when selecting the proper medication allergy on the search screen since there were so many allergies to choose from. The tasks with lower efficiency scores were the second two, which both required the participant to deactivate an allergy. The drug-interaction checking tasks also had decent efficiency scores, with the main exception of the task to change the drug-drug interaction settings. As explained above, this is understandable, as it is a workflow not used during the regular course of practice and would not really affect a provider in the day to day. Slightly lower efficiency scores for the other tasks can most likely be attributed to the requirement to click on each separate interaction type; having everything displayed on one screen automatically could help a lot with this. Finally, as mentioned above, participants had challenges efficiently completing the CPOE tasks, especially those for lab and radiology orders, due to busy screens and unclear workflows.
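One common way to define a task time efficiency score of this kind, assumed here purely for illustration rather than taken from the test procedure, is the ratio of the mean observed completion time to the time budgeted for the optimal path, so that a value near 1 indicates participants worked at roughly the expected pace:

\[
\text{efficiency ratio} = \frac{\bar{t}_{\text{observed}}}{t_{\text{optimal}}}
\]

where $\bar{t}_{\text{observed}}$ is the mean task time of the participants who completed the task and $t_{\text{optimal}}$ is the time allotted for the optimal path.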

SATISFACTION

The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system, based on performance with these tasks, at 70.625. Broadly interpreted, scores under 60 represent systems with poor usability; scores over 80 would be considered above average. On an individual task level, users ranked most tasks, on average, 1-2 (between "very easy" and "easy"). The tasks slightly outside this ranking were:

Place a lab order and a radiology order

Place and change a lab order and print the order

Place and change a radiology order and print the order

Reconcile medications

The task ranking scores support the previously discussed quantitative data in these cases. For clinical information reconciliation, participants rated the task to reconcile allergies and problems much easier than the task to reconcile medications, which shows, once again, that once participants get used to this brand new workflow they will have very few challenges.
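For context, a SUS score such as the 70.625 reported above follows from the standard System Usability Scale scoring rule applied to the 10-item questionnaire reproduced in Appendix 7. The individual item responses behind this particular figure are not reported in this summary, so the expression below shows only the general rule:

\[
\text{SUS} = 2.5\left[\sum_{i\in\{1,3,5,7,9\}} (r_i - 1) + \sum_{i\in\{2,4,6,8,10\}} (5 - r_i)\right]
\]

where $r_i \in \{1,\dots,5\}$ is a participant's response to item $i$; each participant's score falls between 0 and 100, and the reported value is the average across participants.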

MAJOR FINDINGS

Generally, participants in the usability study said that they liked the functionality of CGM CLINICAL™ and would recommend it to their colleagues. The majority of the tasks were rated between "very easy" and "easy"; more than one participant commented that the product is improving all the time and that they saw many helpful changes to the workflow. Participants in the usability study did not have much to say in the way of commentary on specific areas of the software. Based on the quantitative data, the stronger areas of the software are the medication allergy list, clinical decision support, and drug-interaction checking functionality. The test participants specifically mentioned that they found the CPOE for lab and radiology order sections challenging, which the quantitative data also reflects. Judging by the quantitative data, many of the challenges were caused by busy screens or extra drop-downs, which led to less streamlined workflows. Additionally, Meaningful Use specific workflows yielded lower success rates, which is most likely a result of unfamiliarity with the process and requirements. These workflows will become more of a habit over time through repetition for providers meeting the Meaningful Use requirements.

AREAS FOR IMPROVEMENT

Based on the quantitative findings, verbal reports of the participants, and observations from the data logger, the following changes would most likely improve the overall usability of CGM CLINICAL™.

Medication List – Make the various options to modify or discontinue a medication more obvious.

Medication Allergy List – Streamline the workflow to deactivate a medication allergy.

eRx – Modify the sig screen so that users can directly print or electronically prescribe from it. Change the "add" button next to the favorites list drop-down to make it more obvious; for example, make the font bold. When using the "New Rx" search, prevent users from exiting out of the search screen, which leads them to a blank sig screen. This last change would also assist in the usability of the CPOE for medication orders section.

Drug-Drug, Drug-Allergy Interaction Checks – Have all drug interaction alerts appear on one screen at the same time.

Computerized Provider Order Entry – Modify and clean up the interfaces for the radiology and lab order screens so that no scrolling is required and the workflow is obvious. Remove the "Place Order" button in the middle of the screen for lab orders.

Clinical Information Reconciliation – Make the font larger for each of the lists to be reconciled, using the available white space on the interface. For structured summary of care documents, parse the data so that only the applicable section shows when performing reconciliation, and make it so that users can add the medication, allergy, or problem with one click. Show a record of what has been added to or removed from the record as reconciliation is being performed. On the final approve reconciliation screen, make it more obvious what is now going to be within the EHR, perhaps by bolding the applicable items.

Clinical Decision Support – Make the CDS alert a brighter red color so it is more obvious. Add additional CDS alerts to areas of the software outside of the encounter so that providers do not need to navigate to the encounter to see if there is an alert.


APPENDICES

The following appendices include supplemental data for this usability test report. Following is a list of the appendices provided:

1. Sample Recruiting Screener
2. Usability Test User Agreement
3. Addendum to Usability Test User Agreement
4. Usability Testing Moderator Introduction
5. Example Moderator's Guide (Medication List)
6. Closing Questionnaire
7. System Usability Scale Questionnaire


APPENDIX 1 – SAMPLE RECRUITING SCREENER

Thank you for your interest in CGM Clinical's Usability Study. Your willingness to participate and provide feedback is extremely important to us. Please take a moment to complete the following questionnaire. We are collecting this information because we are required to use a diverse group of participants for this study. While unlikely, it is possible that not all interested providers will be able to participate. These studies will become more prevalent as we continue to stay ahead of regulations and maintain industry certifications. If, for some reason, you are unable to participate in this study, we will keep your information on file as an interested party for future studies.

We are conducting the usability testing in two phases. Participants will come to our Irvine office and complete the test cases with us on two separate occasions. The following lists the requirements to be tested in each phase and the timeline for each. Participants must be willing to complete both phases.

Phase 1 Timeframe: First Week Dec. (12/2/13 – 12/6/13) Duration: Approx. 2 Hours

170.314(a)(6) Medication List for Ambulatory

170.314(a)(7) Medication Allergy List for Ambulatory

170.314(b)(3) Electronic Prescribing

Phase 2 Timeframe: TBD (Possible first of the year) Duration: Approx. 3 Hours

170.314(b)(4) Clinical Information Reconciliation

170.314(a)(1) Computerized Provider Order Entry

170.314(a)(2) Drug Interaction

170.314(a)(8) Clinical Decision Support

Please return your completed questionnaire to: xxx via email at [email protected] or via fax at xxx-xxx-xxxx


Usability Testing Questionnaire

First Name:

Last Name:

Practice Name:

Practice Address:

Phone Number:

Email Address:

1. Have you participated in a focus group or usability test in the past 6 months?

☐ Yes

☐ No

2. Do you, or does anyone in your home, work in marketing research, usability research, or web design?

☐ Yes

☐ No

3. Do you, or does anyone in your home, have a commercial or research interest in an electronic health record software or consulting company?

☐ Yes

☐ No

4. Which of the following describes your age?

☐ 23 to 39

☐ 40 to 59

☐ 60 to 74

☐ 75 and older

5. Which of the following best describes your race or ethnic group?

☐ Caucasian

☐ Asian

☐ Black/African-American

☐ Others

6. What is your sex?

☐ Male

☐ Female


7. What is your professional title? [You must be a healthcare provider who can prescribe in order to participate in this particular study.]

☐MD

☐DO

☐NP

☐PA

☐Other – please list.

8. How long have you held this position?

9. What is your specialty?

10. What is your work affiliation and environment?

☐ Private Practice

☐ Health System

☐ Government Clinic

☐ Other - please describe.

11. Do you require any assistive technologies to use a computer? If so, please describe. [For example, if you wear corrective lenses, require text to speech, utilize Windows Accessibility Features]

☐ Yes

☐ No

12. Besides reading email or accessing the CGM Clinical, what other activities do you do on the computer? [e.g., research; reading news; shopping/banking; digital pictures; programming/word processing, etc.]


13. About how many hours per week do you spend on the computer?

☐ 1 to 10

☐ 11 to 20

☐ 21 to 30

☐ 31 to 40

☐ 40 plus

14. What computer platform/operating systems do you usually use?

☐ Windows

☐ Mac

☐ Linux

☐ Other

15. In the last month, how often have you used an electronic health record? [e.g. daily, x times per week/month, etc.]

16. How many years have you used an electronic health record?

17. How long have you used CGM Clinical (formerly Alteer Office)?

18. How many EHRs do you use or are you familiar with?

19. How would you rate your ability to learn software applications?

☐ Excellent

☐ Good

☐ Average

☐ Poor


20. How would you rate your overall knowledge of CGM Clinical?

☐ Excellent

☐ Good

☐ Average

☐ Poor

21. How does your practice typically document patient records?

☐ On paper

☐ Some paper, some electronic

☐ All electronic

22. Do you currently use the "Orders" functionality in the CGM Clinical for lab and radiology orders?

☐ Yes

☐ No

☐ Sometimes

23. Do you currently use the e-prescribing functionality in the EHR?

☐ Yes

☐ No

☐ Sometimes

24. Will you be able to participate in usability testing on location at our Irvine office for both sessions?

☐ Yes

☐ No

25. If yes, what times of day are you available [morning, afternoon, evening]? What days of the week are best for you?

We are planning to conduct usability testing on site in Irvine; we may be conducting remote usability testing in the future. Therefore, we would also like to gather some information regarding your ability to participate in remote usability testing via WebEx. Please complete the following questions:

1. Do you have a private work area to perform usability testing free of distractions [such as a private office]?

☐ Yes

☐ No


2. What internet connectivity options do you have available to you in the area where you would be performing remote usability testing?

☐ DSL

☐ T1

☐ Fiber

☐ Wireless Card

☐ Other

3. Does the computer you will use for usability testing have a camera?

☐ Yes

☐ No

4. How would you rate your internet bandwidth/connectivity speed in your office?

☐ Excellent

☐ Good

☐ Fair

☐ Poor

Thank you for taking the time to complete this questionnaire, and thank you again for your interest and willingness to participate in the Meaningful Use 2 Usability Study. We will be contacting you to schedule your test time. We look forward to working with you.

Please return your completed questionnaire to: xxxx via email at [email protected] or via fax at xxx-xxx-xxxx

Best Regards,
CGM Clinical Product Team


APPENDIX 2 – USABILITY TEST USER AGREEMENT

This Usability Test User Agreement ("Agreement") is made and entered into on November ___, 2013 ("Effective Date") by and between CompuGroup Medical, Inc., a Delaware corporation, having its principal place of business at 125 High Street, 8th Floor, Boston, MA 02110 ("CGM") and the person or entity executing this Agreement ("User"). From time to time, CGM and User shall collectively be referred to herein as "parties," and individually as "party." This Agreement shall become effective upon execution of this Agreement by the User (the "Effective Date"), whose authorized signature below shall also serve to acknowledge the User's acceptance of the terms and conditions herein, on behalf of User and User's medical practice.

RECITALS

WHEREAS, CGM licenses electronic medical records software;

WHEREAS, User has certain knowledge and skill with regard to these electronic medical records software products; and

WHEREAS, CGM desires to sponsor a usability test group to evaluate an electronic health record system for the purpose of promoting collaboration, knowledge acquisition and overall product improvement (the "Purpose").

NOW, THEREFORE, in consideration of the foregoing and the promises and mutual covenants contained herein and for other good and valuable consideration, the receipt and adequacy of which are hereby acknowledged by the Parties, and intending to be legally bound hereby, the Parties hereto agree as follows:

AGREEMENT

The recitals set forth above in this Agreement are, by this reference, incorporated into and deemed a part of this Agreement.

1. Term. The term of this Agreement shall commence on the Effective Date stated above and shall end on the earlier of February 28, 2014 or when terminated by either party upon giving thirty (30) days’ written notice to the other party. Notwithstanding any termination, the obligations of the parties concerning confidentiality will, with respect to Confidential Information (as defined below) that constitutes a “trade secret” (as that term is defined under applicable law), be perpetual, and will, with respect to other Confidential Information, remain in full force and effect during the term and for five (5) years following the receipt of the Confidential Information or the termination of this Agreement, whichever is later.

2. Obligations.

2.1. Obligation of CGM. CGM shall use commercially reasonable efforts to host two (2) usability test sessions of approximately two (2) to three (3) hours in length during the term of this Agreement ("Usability Test Sessions"). CGM shall compensate User at an hourly rate of two hundred U.S. dollars (US$200.00) per hour of in-person participation in every Usability Test Session and shall additionally pay or reimburse User for all reasonable, preapproved expenses related to User's attendance at Usability Test Sessions. All expenses must be supported by appropriate detailed receipts, or in the absence of a receipt, an explanation together with any indirect supporting evidence (such as a credit card statement) must be provided, before reimbursement will be authorized. User participation in Usability Test Sessions shall be paid based on rounding up to the nearest ten (10) minute increment.


2.2. Obligation of User. User agrees to use best efforts to attend in person each in-person Usability Test Session hosted by CGM during the term of this Agreement. User will be trained on the system then asked to perform several tasks using a prototype and give feedback. User will adhere to all requirements regarding protection and use of Confidential Information.

3. Confidential Information.

3.1. Definition. “Confidential Information” means: (i) the terms and conditions of this Agreement; (ii) all information marked as “Confidential,” “Proprietary” or a similar legend if disclosed in writing or other tangible form; (iii) all information identified as “confidential,” “proprietary” or the like at the time of disclosure if information is disclosed orally; or (iv) all information User knows or reasonably should know is confidential, proprietary or trade secret information of CGM. For the avoidance of doubt, CGM product roadmaps, product development plans, pre-release products or product information, sales and marketing plans, and research and development activities, constitute Confidential Information whether or not designated as “Confidential” or “Proprietary.”

3.2. Exceptions to Confidential Information. User shall have no obligation with respect to information which (i) was rightfully in possession of or known to User without any obligation of confidentiality prior to receiving from CGM; (ii) is, or subsequently becomes, legally and publicly available without breach of this Agreement; (iii) is rightfully obtained by User from a source other than CGM without any obligation of confidentiality; or (iv) is developed by or for the User without use of the Confidential Information and such independent development can be shown by documentary evidence. Further, User may disclose Confidential Information pursuant to a valid order issued by a court or government agency, provided that User provides to CGM: (a) prior written notice of such obligation; and (b) a reasonable opportunity to oppose such disclosure or obtain a protective order.

3.3. User’s Obligation Regarding Confidential Information Received from CGM. User may only use Confidential Information in furtherance of the Purpose and shall not disclose the Confidential Information to any third party; provided, however, User may disclose Confidential Information to other members and employees of User’s medical practice pursuant to the terms of this Agreement, where applicable and in furtherance of the Purpose. User shall safeguard CGM’s Confidential Information with the same degree of care, but not less than reasonable care, as it uses to protect its own confidential or proprietary information.

3.4. CGM Ownership of Confidential Information. CGM retains all right, title and interest to the Confidential Information. No license to any existing or future intellectual property right is either granted or implied by the disclosure of Confidential Information. User may not reverse-engineer, decompile, or disassemble, modify or copy (except for making a single back-up copy) any software disclosed under this Agreement or in connection with the usability testing. User shall not remove, overprint, deface or change any notice of confidentiality, copyright, trademark, logo, legend or other notice on or related to Confidential Information, whether originals or copies.

3.5. Return or Destruction of Confidential Information. Upon written demand, User shall: (i) cease using the Confidential Information, (ii) return the Confidential Information and all copies, notes or extracts thereof to CGM within seven (7) calendar days of receipt of demand and/or destroy same, at the election of CGM; and (iii) certify in writing that User has complied with the obligations set forth in this paragraph.

3.6. Disclaimer. ALL CONFIDENTIAL INFORMATION PROVIDED BY CGM TO USER IS PROVIDED "AS IS." CGM shall not be liable for the accuracy or completeness of the Confidential Information, and there are no express or implied representations or warranties by CGM with respect to the infringement of any intellectual property rights, or any right of privacy, or any rights of third persons.

4. User Information. User acknowledges and agrees that: (i) this Agreement does not protect disclosures made by User to CGM; and (ii) CGM does not wish to receive confidential, proprietary or trade secret information from User in connection with Usability Test Sessions. In the event that User wishes to disclose confidential, proprietary or trade secret information to CGM outside the scope of Usability Test Sessions or otherwise, the parties will execute a separate non-disclosure agreement detailing with specificity the information to be disclosed and the purpose(s) therefor.

5. Feedback. By providing any comments, suggestions, improvements or any other information or materials in connection with the Usability Test Sessions (collectively “Feedback”), User grants to CGM (including its sublicensees and assigns) a non-exclusive, irrevocable, worldwide, perpetual, royalty-free license, under all of User’s intellectual property rights, to use, display, copy, edit, create derivative works, market, sell, import, and distribute (including through resellers or multiple tiers of distribution) such Feedback. CGM may disclose and sublicense Feedback to third parties for any purpose. Any use of Feedback by CGM is in its sole discretion. User warrants and represents that it has all necessary rights to disclose Feedback or any other information or materials in connection with Usability Test Sessions.

6. Press Release/Disclosure. User shall not issue any press release or public disclosure regarding this Agreement or Usability Test Sessions without the prior written consent of CGM. User authorizes CGM to disclose User's name and practice information to other members of Usability Test Sessions and other third parties at CGM’s sole discretion.

7. Agreement to Participate in Usability Testing. User agrees to participate in Usability Testing conducted and recorded by CGM. User understands that participation in this Usability Testing is voluntary and agrees to immediately raise any concerns or areas of discomfort during the session with the testing administrator. User understands that she or he can leave a Usability Testing Session at any time; however, if User chooses to do so, User will be compensated only for the time he or she actually participated in the Usability Testing Session.

8. Recording Release. User agrees to participate in audio, video, and/or digital recording during the Usability Testing Sessions. User understands and consents to the use and release of any such recordings by CGM. User understands that the information and recording is for research and certification purposes only, and CGM agrees that User’s name and image will not be used for any other purpose without written authorization from User. User relinquishes any rights to the recording and understands the recording may be copied and used by CGM without further permission.

9. General.

9.1. Each party acknowledges that monetary remedies may be inadequate to protect Confidential Information and that CGM may seek injunctive relief in the event of any threatened or actual breach of any of the obligations hereunder.

9.2. This Agreement does not create a joint venture, employment relationship, agency, or partnership between the parties, which are independent contractors.

9.3. User may not assign this Agreement.

9.4. If any term of this Agreement shall be held to be illegal or unenforceable, such provision shall be modified to the minimum extent necessary so as to make it valid and enforceable, or severed if such modification is impossible, and as so modified the entire Agreement shall remain in full force and effect.

9.5. This Agreement shall be construed in accordance with the laws of the Commonwealth of Massachusetts, excluding its conflict of laws rules. Any legal or equitable action by or against Company arising out of or related in any way to this Agreement shall be brought solely in the federal or state courts located in Suffolk County, Massachusetts, the parties each expressly consenting to (and waiving any such challenge or objection to) such sole and exclusive personal jurisdiction and venue.

9.6. This Agreement is the entire agreement of the parties pertaining to the subject matter of this Agreement and may be modified only by a writing signed by both parties. This Agreement supersedes any and all prior oral discussions and/or written correspondence or agreements between the parties with respect thereto, all of which are excluded. The failure of a party to enforce its rights in the case of any breach of this Agreement shall not be construed to constitute a waiver of its rights with respect to any subsequent breach.

9.7. Except as set forth below, any notice required or permitted to be given by either party under this Agreement shall be in writing and will be effective and deemed given: (a) when delivered personally; (b) when sent by confirmed facsimile or e-mail (followed by the actual document by first class mail/overnight delivery service); (c) three days after having been sent by registered or certified mail, return receipt requested, postage prepaid; or (d) one day after deposit with a commercial overnight delivery service specifying next day delivery (or two days for international courier packages specifying two-day delivery), with written verification of receipt. To be effective, any notice to CGM hereunder must be addressed as follows: Legal Department, CompuGroup Medical, Inc., 125 High Street, 8th Floor, Boston, MA 02110, E-mail [email protected], Fax +1 617-507-5886.

CompuGroup Medical, Inc.

By: ______________________________________

Print Name: _______________________________

Title: ____________________________________

Date: ____________________________________

Company:_______________________________

By: _____________________________________

Print Name: _____________________________

Title: __________________________________

Date: __________________________________


APPENDIX 3 – ADDENDUM TO USABILITY TEST USER AGREEMENT

This Addendum is made by and between CompuGroup Medical, Inc. ("CGM") and the undersigned person or entity executing this Addendum ("User"), as of ________________________, 2014, to the Usability Test User Agreement with an effective date of November ___, 2013 (the "Agreement") between CGM and User. This Addendum is attached and incorporated by reference, as if fully set forth in such Agreement. Except as modified by this Addendum, the parties hereby reaffirm all terms, covenants and conditions contained in the Agreement, which shall remain in full force and effect.

A. The reference in Section 1 of the Agreement to "February 28, 2014" as the end of the term of the Agreement is hereby replaced with "March 31, 2014" as the new end of the term of the Agreement.

EXECUTED BY THE AUTHORIZED SIGNATURES BELOW OF:

Company ("User"): ________________________

CompuGroup Medical, Inc.

Signature: _______________________________ Signature: __________________________________

Printed Name: ____________________________ Printed Name: _______________________________

Title/Position: ____________________________

Date: ___________________________________

Title/Position: _______________________________

Date: _______________________________________


APPENDIX 4 – USABILITY TESTING MODERATOR INTRODUCTION

Thank you for participating in this study. Our session today will last approximately 90 minutes. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not do anything more than asked. If you get lost or have difficulty, we cannot answer any questions or help you with the task within the system itself. We ask that you offer your honest opinions regarding the usability of these functions, but please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely.

The product version you will be using today is still in the development stages, so some of the data may not make sense. During the tasks you will be performing in each session, we will ask you to make decisions that may contradict how you would normally practice. Please follow the tasks as we have outlined them, because we have designed them to test the usability of specific workflows.

We are recording the audio and video of our session today. Your voice, actions and number of clicks will be recorded. All of the information that you provide will be kept confidential, and your name will not be associated with your comments at any time.

We will be utilizing TeamViewer today for the session, which will allow you to watch a brief training of the Clinical areas you will be performing tasks in; once the training is completed, I will ask you to review the task you will be performing before we begin, and then you will take control of the session so that you can complete each task.

Do you have any questions or concerns? Are you ready to begin?


APPENDIX 5 – EXAMPLE MODERATOR’S GUIDE (MEDICATION LIST)

Summative Testing Process for §170.314(a)(6) Medication List

Pre-Entered Data:
Patient Record: James Whitcomb
DOB: 05/29/1953
Previously Entered Medication:

Simvastatin 20 mg tablet by mouth once daily.

[Moderator] During these tasks we will ask you to make decisions that may differ from how you would practice. Please follow the tasks as outlined, as they are designed to test the usability of specific workflows. Please take a few moments to review Task A and let me know when you are ready. Please don't begin any task until I ask you to begin.

PAUSE and wait for the participant to say they are ready.

TASK A: ENTER A MEDICATION
Patient: James Whitcomb
Start Point: Navigate to James Whitcomb's Facesheet – Medication Pane

[Moderator] Patient James Whitcomb is in the clinic today, and during your visit with him he gives you information about medications that have been prescribed previously or elsewhere that need to be added to his medication list. Enter the following active medication information:

Synthroid 100 mcg, 1 tablet daily
Count: 30
Duration: None
Refills: None
Start Date: 10/1/13
End Date: None

Please Begin


PARTICIPANT SAYS STOP

[Moderator] Thank you for completing Task A. On a scale of 1 to 5, one being simple and 5 being difficult, how would you rate this task?

Optimal Path:

START: Start Facesheet Medication Pane

Right Click Add Medication to History

Enter Synthroid

Click “Search”

Select: Synthroid 100 mcg

Enter Dosage: 1 Tab

Enter Interval: QD

Enter: Count 30

Enter Start Date: 10/1/13

Click “Done” *Verify the sig, should match what the provider is supposed to enter

STOP

[Moderator] Please take a few moments to review Task B and let me know when you are ready, and we will begin.

PAUSE and wait for the participant to say they are ready.

TASK B: CHANGE MEDICATION STRENGTH/DOSAGE
Patient: James Whitcomb
Start Point: Navigate to James Whitcomb's Facesheet – Medication Pane

[Moderator] James Whitcomb has told you that his Synthroid dosage has changed. Modify the Synthroid 100 mcg tablet to Synthroid 125 mcg. Only the dosage has changed; the instructions remain the same. Make the change in the medication list, making sure that Synthroid 125 mcg tablet remains on the "Current Medication List".


Please Begin

PARTICIPANT SAYS STOP

[Moderator] "Thank you for completing Task B." "On a scale of 1 to 5, one being simple and 5 being difficult, how would you rate this task?"

Optimal Path:

START: Start Facesheet Medication Pane

Right Click Synthroid 100 mcg

Select “Modify Medication Hx”

Leave “ Discontinue Date” default of today’s date – Click “OK”

Select the dosage for Synthroid 125 mcg

Click Done *Verify the sig, should match what the provider is supposed to enter

STOP

[Moderator] Please take a few moments to review Task C and let me know when you are ready, and we will begin.

PAUSE and wait for the participant to say they are ready.

TASK C: DISCONTINUE A MEDICATION
Patient: James Whitcomb
Start Point: Navigate to James Whitcomb's Facesheet – Medication Pane

[Moderator] James Whitcomb tells you he has completed taking Simvastatin 20 mg on November 1, 2013. Discontinue Simvastatin 20 mg. The patient completed this medication on November 1, 2013.

Please Begin

PARTICIPANT SAYS STOP


[Moderator] Thank you for completing Task C. On a scale of 1 to 5, one being simple and 5 being difficult, how would you rate this task?

Optimal Path:

START: Start Facesheet Medication Pane

Right Click Simvastatin 20 mg

Select Discontinue Medication

Change date to November 1, 2013

Click OK

STOP

[Moderator] Please take a few moments to review Task D and let me know when you are ready, and we will begin.

PAUSE and wait for the participant to say they are ready.

TASK D: VIEW THE PATIENT’S MEDICATION HISTORY
Patient: James Whitcomb
Start Point: Navigate to James Whitcomb's Facesheet – Medication Pane

[Moderator] View James Whitcomb's Medication History.

Please Begin

PARTICIPANT SAYS STOP

[Moderator] Thank you for completing Task D. On a scale of 1 to 5, one being simple and 5 being difficult, how would you rate this task?

Optimal Path:

START: Start Facesheet Medication Pane

Click Medication List Drop Down


Choose “All Medications”

STOP

[Moderator] Please take a few moments to review Task E and let me know when you are ready, and we will begin.

PAUSE and wait for the participant to say they are ready.

TASK E: MARK THAT PATIENT IS CURRENTLY NOT TAKING ANY MEDICATIONS
Patient: James Whitcomb
Start Point: Navigate to James Whitcomb's Facesheet – Medication Pane, Current Medications

[Moderator] James Whitcomb has informed you that he is no longer taking any medications. Mark that change in his record.

Please Begin

PARTICIPANT SAYS STOP

[Moderator] Thank you for completing Task E. On a scale of 1 to 5, one being simple and 5 being difficult, how would you rate this task?

Optimal Path:

START: Start Facesheet Medication Pane, Current Medications

Right Click Synthroid 125mcg

Choose Discontinue

End Date: Leave as current (today’s date – default)

Choose “Ok”

Right Click in Medications pane

Select “Not taking any medications”

STOP


APPENDIX 6 – CLOSING QUESTIONNAIRE

Please answer the questions below:

1) What was your overall impression of this system?

2) What aspects of the system did you like the least?

3) Were there any features that you were surprised to see?

4) What features did you expect to encounter, but did not see? That is, is there anything that is missing in this application?

5) Compare this system to other systems that you have used.

6) Would you recommend this system to your colleagues based on the functionality that you saw today?


APPENDIX 7 – SYSTEM USABILITY SCALE QUESTIONNAIRE (SUS)

Scale: 1 = Strongly Disagree, 5 = Strongly Agree

1. I think that I would like to use this system frequently. 1 2 3 4 5

2. I found the system unnecessarily complex. 1 2 3 4 5

3. I thought the system was easy to use. 1 2 3 4 5

4. I think that I would need the support of a technical person to be able to use this system. 1 2 3 4 5

5. I found the various functions in this system were well integrated. 1 2 3 4 5

6. I thought there was too much inconsistency in this system. 1 2 3 4 5

7. I would imagine that most people would learn to use this system very quickly. 1 2 3 4 5

8. I found the system very cumbersome to use. 1 2 3 4 5

9. I felt very confident using the system. 1 2 3 4 5

10. I needed to learn a lot of things before I could get going with this system. 1 2 3 4 5

11. Please provide any additional feedback on the back of the questionnaire.


Amendment: User-Centered Design Process §170.314(g)(3) Safety-enhanced design

The below user-centered design process was used in the development of CGM CLINICAL™ 8.2:

NISTIR 7741: NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records https://www.nist.gov/manuscript-publication-seach.cfm?pub_id=907313

The UCD method was applied to the following certification criteria:

§170.314(a)(1) Computerized provider order entry

§170.314(a)(2) Drug-drug, drug-allergy interaction checks

§170.314(a)(6) Medication list

§170.314(a)(7) Medication allergy list

§170.314(a)(8) Clinical decision support

§170.314(b)(3) Electronic prescribing

§170.314(b)(4) Clinical information reconciliation


Amendment: Demographics §170.314(g)(3) Safety-enhanced design

Participants

Part ID: 1
Gender: Male
Age Range: 40 to 59
Education: MD, PhD
Occupation/role: Doctor of Medicine – Gastroenterology
Professional Experience: 16 years in position
Computer Experience: Uses computer 40+ hours a week for reading news, shopping/banking, digital photos, video, word processing; typically uses Windows computer
Product Experience: Uses product daily, 40 hours per week – used for 12 years
Assistive Technology Needs: None

Part ID: 2
Gender: Male
Age Range: 60 to 74
Education: MD
Occupation/role: Doctor of Medicine – Neurology
Professional Experience: 31 years in position
Computer Experience: Uses computer 40+ hours a week for reading news, shopping/banking, digital photos, video, word processing; typically uses Windows computer at work, Mac at home
Product Experience: Uses product daily – has used for 13 years
Assistive Technology Needs: None

Part ID: 4
Gender: Male
Age Range: 60 to 74
Education: MD
Occupation/role: Doctor of Medicine – Primary Care/GP
Professional Experience: 33 years in position
Computer Experience: Uses computer 40+ hours a week for banking, shopping, composing, reading news, research; typically uses Windows and Mac computers
Product Experience: Uses product daily – has used for over 10 years
Assistive Technology Needs: None

Part ID: 5
Gender: Male
Age Range: 60 to 74
Education: MD
Occupation/role: Doctor of Medicine – Cardiology
Professional Experience: 31 years in position
Computer Experience: Uses computer 21 to 30 hours a week for research, reading news, shopping/banking, digital pictures, programming/word processing; typically uses Windows computers
Product Experience: Uses product daily – has used for 10 years
Assistive Technology Needs: None

Part ID: 6
Gender: Female
Age Range: 40 to 59
Education: MD
Occupation/role: Doctor of Medicine – Pain Management
Professional Experience: 17 years in position
Computer Experience: Uses computer 21 to 30 hours a week for reading news, shopping/banking, digital photos, video, word processing/Excel; typically uses Windows and Mac computers
Product Experience: Uses product 5-7 days a week – has used for about 5 years
Assistive Technology Needs: None



Appendix B: Quality Management System


Quality Management System Attestation Form-EHR-37-V02


For reporting information related to testing of 170.314(g)(4).

Vendor and Product Information

Vendor Name CompuGroup Medical

Product Name CGM CLINICAL™

Product Version 8.2

Quality Management System

Type of Quality Management System (QMS) used in the development, testing, implementation, and maintenance of EHR product.

Based on Industry Standard (for example ISO9001, IEC 62304, ISO 13485, etc.). Standard:

A modified or “home-grown” QMS.

No QMS was used.

Was one QMS used for all certification criteria or were multiple QMS applied?

One QMS used.

Multiple QMS used.

Description or documentation of QMS applied to each criteria: 14-03-03 CGM Clinical QMS. This document was previously provided with the vendor questionnaire test packet.

Not Applicable.

Statement of Compliance

I, the undersigned, attest that the statements in this document are complete and accurate.

Vendor Signature by an Authorized Representative

Liza Patchen

Date 4/4/14



Appendix C: Privacy and Security


Privacy and Security Attestation Form-EHR-36-V02


Vendor and Product Information

Vendor Name CompuGroup Medical

Product Name CGM CLINICAL™

Product Version 8.2

Privacy and Security

170.314(d)(2) Auditable events and tamper-resistance

Not Applicable (did not test to this criteria)

Audit Log:

Cannot be disabled by any user.

Audit Log can be disabled.

The EHR enforces that the audit log is enabled by default when initially configured

Audit Log Status Indicator:

Cannot be disabled by any user.

Audit Log Status can be disabled

The EHR enforces a default audit log status. Identify the default setting (enabled or disabled):

There is no Audit Log Status Indicator because the Audit Log cannot be disabled.

Encryption Status Indicator (encryption of health information locally on end user device):

Cannot be disabled by any user.

Encryption Status Indicator can be disabled

The EHR enforces a default encryption status. Identify the default setting (enabled or disabled):

There is no Encryption Status Indicator because the EHR does not allow health information to be stored locally on end user devices.

Identify the submitted documentation that describes the inability of the EHR to allow users to disable the audit logs, the audit log status, and/or the encryption status: This is described in the file named "14-03-04 Amendment Vendor Questionnaire CGM Clinical Auditable Events and Tamper-Resistance_V4", which was submitted with the CGM Clinical vendor questionnaire packet. The document title is "Amendment Vendor Questionnaire section §170.314(d)(2) Auditable events and tamper-resistance page 20-Encryption and Audit log." We use SHA-1 key size 256.


Identify the submitted documentation that describes the method(s) by which the EHR protects 1) recording of actions related to electronic health information, 2) recording of audit log status, and 3) recording of encryption status from being changed, overwritten, or deleted by the EHR technology: Demonstrated during testing of D2. Please see the previously submitted vendor questionnaire called "CompuGroup Medical EHR 2014 Vendor Questionnaire-CGM Clinical" on pages 21-23. The answers to this section outline this process.

Identify the submitted documentation that describes the method(s) by which the EHR technology detects whether the audit log has been altered: During testing, the audit log demonstration displayed a "no tamper evident" label for audit purposes. Tampering is detected by comparing hash values, as described in the "CompuGroup Medical EHR 2014 Vendor Questionnaire-CGM Clinical" on page 23. The answers to this section outline this process.
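To make the hash-comparison approach described above concrete, the following is a minimal illustrative sketch in Python of detecting an altered audit log entry by recomputing and comparing digests. It is not the vendor's implementation; the entry format and function names are hypothetical, and SHA-1 is used only because it is the algorithm cited on this form (a stronger algorithm such as SHA-256 would normally be preferred).

import hashlib

def entry_digest(entry: str) -> str:
    # Digest of one audit log entry; SHA-1 mirrors the algorithm cited on this form.
    return hashlib.sha1(entry.encode("utf-8")).hexdigest()

def log_is_intact(entries: list, stored_digests: list) -> bool:
    # The log is treated as unaltered only if every recomputed digest matches
    # the digest recorded when the entry was originally written.
    if len(entries) != len(stored_digests):
        return False
    return all(entry_digest(e) == d for e, d in zip(entries, stored_digests))

# Hypothetical example: altering an entry breaks the comparison.
log = ["2014-04-04T10:00 user=jsmith action=view patient=12345"]
digests = [entry_digest(log[0])]
log[0] = log[0].replace("action=view", "action=delete")
print(log_is_intact(log, digests))  # False -> alteration detected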

170.314(d)(7) End-user device encryption

Storing electronic health information locally on end-user devices (i.e. temp files, cookies, or other types of cache approaches).

Not Applicable (did not test to this criteria)

The EHR does not allow health information to be stored locally on end-user devices.

Identify the submitted documentation that describes the functionality used to prevent health information from being stored locally:

The EHR does allow health information to be stored locally on end user devices.

Identify the FIPS 140-2 approved algorithm used for encryption: SHA-1

Identify the submitted documentation that describes how health information is encrypted when stored locally on end-user devices:

This description can be found in the file "14-03-03 Amendment Vendor Questionnaire section CGM Clinical End-User Encryption_V2" previously provided with the CGM Clinical vendor questionnaire packet. The document title is "Amendment Vendor Questionnaire section §170.314(d)(7) End-User Encryption page 25-"

The EHR enforces default configuration settings that either enforces the encryption of locally stored health information or prevents health information from being stored locally.

Identify the default setting:

170.314(d)(8) Integrity

Not Applicable (did not test to this criteria)

Identify the hashing algorithm used for integrity (SHA-1 or higher): SHA-1

Statement of Compliance


I, the undersigned, attest that the statements in this document are accurate.

Vendor Signature by an Authorized Representative Liza Patchen

Date 4/4/14



Test Results Summary Document History

Version: V1.0
Description of Change: Initial release
Date: November 17, 2017

END OF DOCUMENT