EHR Real World Testing Plan 2025
Current Plan and Results
Professional Imaging, LLC
PI EMR v2 - CHPL ID 15.07.05.2835.PROI.01.01.1.230127
Developer Real World Test Plan URL: https://www.proimagetx.com/ehr-real-world-testing-plan
2025 RWT Plan
This Real World Test Plan is ACB approved; the Plan ID is 20241106pro.
Justification for Real World Testing Approach
The overall approach for our Real World Testing includes extraction and analysis of data regarding ongoing activity and simulated testing where there is no ongoing use of relevant certified functionality.
The PI EMR v2.0 Certified Health IT Module is self-developed. It is not available for sale. It is only used by Professional Imaging LLC and its affiliates. As a self-developer with a very narrow practice scope, resources for implementing RWT must, by necessity, be limited to mitigate negative financial impact.
The care setting in which this module is used is ambulatory interaction with Nursing Facilities. For some of the criteria covered in this Real World Testing plan, testing will be conducted in simulated environments using a pass/fail metric due to the absence of real world use. The affected criteria are those related to Care Coordination, Public Health, Application Programming Interfaces and Electronic Exchange. As there is no ongoing activity in these cases, it is not possible to measure ongoing compliance.
The circumstances for this are as follows:
As yet, no Nursing Facilities have requested use of the interoperability functionality being tested. This is because Nursing Facilities are not required to use an EHR and do not see doing so as a benefit.
No Nursing Facility, patient, patient representative or providing clinician has ever requested access to our API.
We are either not eligible to perform, or unable to perform (due to the unavailability of agencies prepared to receive the data), the activities covered by the criteria related to Transmission to Public Health Agencies.
In the cases mentioned above, our Real World Testing approach measures success by demonstrating conformance and interoperability through simulated testing. Testing will be conducted to confirm and document that each data point transmitted and received is accurate and complete, as required by the relevant certification criteria.
For certified functionality for which we do have ongoing activity, the records of that activity will be analyzed for successful transmission and receipt of the relevant information. Documentation of that analysis will be provided.
The RWT plan may also utilize data transfer with an affiliate practice. The affiliate practice, which provides the same type of specialized service as Professional Imaging, has been selected due to their ongoing relationship with our practice and their availability for this testing.
Where applicable, several certification criteria may be tested simultaneously.
Criteria to be Tested
§ 170.315(b)(1) Transitions of care
§ 170.315(b)(2) Clinical information reconciliation and incorporation
§ 170.315(b)(10) Electronic Health Information Export
§ 170.315(e)(1) View, download, and transmit to 3rd party
§ 170.315(f)(2) Syndromic surveillance
§ 170.315(f)(7) Health care surveys
§ 170.315(g)(9) Application access— all data request
§ 170.315(h)(1) Direct Project
Standards Updates
There have been no standards updates.
Schedule of Key Milestones
1Q-2025: Begin communication with affiliate practice to schedule participation in real-world testing.
2Q-3Q 2025: During the 2nd and 3rd quarter, real world testing will be scheduled and performed. It is expected that a preparatory call will be made with the affiliate practice to prepare them for testing activities. Capture and analysis of data relevant to functionality will be conducted. Test results will be documented and ultimately used to build the test report. If any non-compliances are observed, we will notify the ONC-ACB of the findings and make the necessary changes required. Testing for (b)(10) may also occur in 4Q as the relied upon software vendor sees fit.
4Q-2025: During the last quarter of the year, the CY 2026 real-world test plan will be completed according to ONC and ONC-ACB requirements and expectations. The test plan will be prepared and submitted per the deadline requirements.
February 1, 2026: Document our CY 2025 test results in our RWT Test Report and submit it to our ONC-ACB.
Care Coordination
The following outlines the measure that has been identified to best demonstrate conformance to criteria concerning Care Coordination (170.315(b)(1) and 170.315(b)(2)).
Measure Description
The metric used for testing functionality covered by these criteria is the successful secure transmission, receipt, re-incorporation and data verification of the required transition of care documentation. Direct messaging will be conducted using MaxMD Direct mdEmail v.3.0 third party software.
Measure Justification
The stated metric was selected as it provides full testing of the functionality.
Expected Outcome
The expected outcome is that the data transmitted will be identical to the data received. This will be confirmed by the reconciliation process covered under (b)(2), and the received data will be successfully imported and applied to the target patient record. That record will also be validated using the USCDI v1 R2.1 Validator Tool.
Test Methodology
· As self-developers, our practice does not use CCDA R2.1 documents in a real world setting to send or receive patient health records with outside providers. As such, we will conduct testing using synthetic patient data.
· For the b1 criteria, we will generate both a Continuity of Care Document and a Referral Note CCDA R2.1 document using a representative synthetic patient. Both CCDA xml files will be validated using the USCDI v1 R2.1 Validator Tool (at https://ett.healthit.gov/ett/#/validators/ccdar3); a local pre-check sketch follows this list.
· The b2 criteria will be tested using a representative synthetic patient. First, we will import an initial patient record (CCDA R2.1 xml) file. Next, we will import an updated patient record for the same patient containing new medications, problems and alerts and reconcile (merge) it with the medications, problems and alerts in the initial patient record. To simulate a referral from another practice, we will generate the initial and updated patient records (CCDA R2.1 files) at a practice at one location and then transmit the CCDA files to a 2nd practice at a different location using a secure messaging system. Importing, reconciliation and incorporation of the initial and updated CCDA files will then be performed at the 2nd practice. The resulting patient record will be exported in CCDA R2.1 format and validated using the USCDI v1 R2.1 Validator Tool (at https://ett.healthit.gov/ett/#/validators/ccdar3).
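Where helpful during preparation, a simple local pre-check may be run on each generated CCDA xml file before it is uploaded to the validator. The sketch below is illustrative only and is not part of the certified module; the file name and the specific checks (well-formedness and the presence of a patient name under recordTarget) are assumptions for this example.

```python
# Illustrative pre-check only: confirms a generated C-CDA file is well-formed XML
# and carries a recordTarget patient name before manual upload to the ETT validator.
# The file name "RWT_PT_PATIENT_01_CCD.xml" is a hypothetical example.
import xml.etree.ElementTree as ET

CDA_NS = {"cda": "urn:hl7-org:v3"}  # standard CDA namespace

def precheck_ccda(path: str) -> bool:
    tree = ET.parse(path)  # raises ParseError if the file is not well-formed
    root = tree.getroot()
    if not root.tag.endswith("ClinicalDocument"):
        print(f"{path}: root element is not ClinicalDocument")
        return False
    names = root.findall(
        ".//cda:recordTarget/cda:patientRole/cda:patient/cda:name", CDA_NS)
    if not names:
        print(f"{path}: no patient name found under recordTarget")
        return False
    print(f"{path}: well-formed, patient name present")
    return True

if __name__ == "__main__":
    precheck_ccda("RWT_PT_PATIENT_01_CCD.xml")
```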
As we are using relied upon software, Darena Solutions MeldRx, for compliance with the requirements of 170.315(b)(10) Electronic Health Information Export, the following details the Real World Testing approach for this criterion.
Measure Description
Export USCDI v1 clinical data for a population of patients for use in a different health information technology product or a third-party system. The metric used for testing this functionality is the successful export of batches of patient data in a straightforward fashion, providing the exported data in the form of valid C-CDA files that conform to the HL7 standards described in the HL7 Implementation Guide for CDA® Release 2: Consolidated CDA Templates for Clinical Notes.
Measure Justification
We will test the §170.315(b)(10) criteria independently. The RWT will be witnessed via a Zoom session with the participants using a production environment and real patient data. Upon completion, we will observe the successful conformance of the certified technology: a patient/authorized representative can be authenticated and can request portions of their PHI using an application of their choice outside of the EHR’s domain.
Expected Outcome
• Date and time ranges are configurable via the UI (a date-range check is sketched after this list)
• Targeted Practices are configurable via the UI
• Patients exported are configurable via the UI
• Logging in as a Vendor Admin will allow access to the export functionality
• Logging in as a non-Vendor Admin will not allow access to the export functionality
• Use the Edge Test Tool to check validity of output file
• Data was available for the entered date and time range
• The export summary contained data only within that date and time range
• Export summary was created and completed successfully
• Saving to a preferred location is allowed
• Visually confirming the export after save is performed and successful
• Prepare RWT results report with 95% plus success rate
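As an illustration of how the date-range and success-rate outcomes above might be checked, the sketch below filters exported entries against a configured date/time range and computes a simple success rate against the 95% target. The record structure is a hypothetical example and does not describe the MeldRx export format.

```python
# Illustrative sketch only: checks that exported entries fall inside a configured
# date/time range and computes a simple success rate against the 95% target.
# The record structure (a list of dicts with an "effective_time" field) is hypothetical.
from datetime import datetime

def within_range(records, start: datetime, end: datetime):
    """Split exported records into in-range and out-of-range lists."""
    in_range = [r for r in records if start <= r["effective_time"] <= end]
    out_of_range = [r for r in records if not (start <= r["effective_time"] <= end)]
    return in_range, out_of_range

def success_rate(passed: int, total: int) -> float:
    return 100.0 * passed / total if total else 0.0

if __name__ == "__main__":
    export = [
        {"patient": "RWT_PT_PATIENT_01", "effective_time": datetime(2025, 6, 15)},
        {"patient": "RWT_PT_PATIENT_02", "effective_time": datetime(2025, 6, 20)},
    ]
    ok, bad = within_range(export, datetime(2025, 6, 1), datetime(2025, 6, 30))
    rate = success_rate(len(ok), len(export))
    print(f"{len(ok)} in range, {len(bad)} out of range, success rate {rate:.1f}%")
    print("Meets 95% target" if rate >= 95.0 else "Below 95% target")
```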
Patient Engagement
The following outlines the measure that has been identified to best demonstrate conformance to the certification criteria concerning Patient Engagement (170.315(e)(1)).
Measure Description
This use case tracks and counts how patients are given access to their portal accounts over the course of the reporting period. Third party software employed for testing includes MaxMD Direct mdEmail v.3.0 for direct messaging, Meinberg NTP (Version 4.2.8p12) for NTP compliance and time server synchronization, and WebSupergoo ABCpdf (Version 10117) for zipping files requested for download or transmission.
Measure Justification
This use case measure will provide a numeric value to indicate both how often this interoperability feature is being used as well as its compliance to the requirement. An increment to this measure indicates that the EHR can create a new patient portal account and give the patient access to it. A survey can often provide more information on the impact and value of an interoperability element than a standard software test evaluation. The patient portal is intended to support patient engagement with their health records and the ability to transmit their patient data as a C-CDA or human readable copy.
Expected Outcome
We will get reporting values on patient portal access as well as patients’ use of the portal’s interoperability features. Measure #1: Report the number of new patient accounts created over a three (3) month period. The measurement will produce numeric results over a given interval. We will utilize various reports and audit logs to determine our measure count. A successful measure increment indicates compliance with the underlying ONC criteria.
Testing Methodology
We will use analysis of the usage logs from our patient portals to provide total counts of documents added to our portals and total counts of documents viewed and downloaded. At our practice's request the option to transmit patient records has not been enabled. To test the transmission of patient records we will transmit a representative sample of synthetic patient records in CCDA R2.1 format from our EHR to our HISP provider. A representative sample of downloaded and transmitted CCDA R2.1 files will be validated using the USCDI v1 R2.1 Validator Tool (at https://ett.healthit.gov/ett/#/validators/ccdar3).
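A minimal sketch of the kind of audit-log tally used for this measure appears below; the CSV log layout, column names, and file name are assumptions for illustration and do not describe the portal's actual log format.

```python
# Illustrative sketch only: tallies patient-portal audit-log events
# (account_created, document_viewed, document_downloaded) over a reporting window.
# The CSV log format and file name ("portal_audit_log.csv") are assumptions.
import csv
from collections import Counter
from datetime import datetime

def tally_events(log_path: str, start: datetime, end: datetime) -> Counter:
    counts = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):  # expects columns: timestamp, event
            ts = datetime.fromisoformat(row["timestamp"])
            if start <= ts <= end:
                counts[row["event"]] += 1
    return counts

if __name__ == "__main__":
    counts = tally_events("portal_audit_log.csv",
                          datetime(2025, 1, 1), datetime(2025, 9, 30))
    for event in ("account_created", "document_viewed", "document_downloaded"):
        print(f"{event}: {counts[event]}")
```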
Public Health
The following outlines the measure that has been identified to best demonstrate conformance to the certification criteria concerning Transmission to Public Health Agencies (170.315(f)(2) and 170.315(f)(7)).
Measure Description
The metric used for testing functionality covered by these criteria is the successful secure transmission, receipt, verification and processing of the required public health data.
Measure Justification
The stated metric was selected as it provides full testing of the functionality.
Expected Outcome
The expected outcome is that the data transmitted will be identical with that of the data received and the data will be successfully processed by the receiving entity. This will be confirmed by the validation websites.
Test Methodology
f2: Generate NIST HL7v2 Syndromic Surveillance xml files for a representative sample of synthetic patient data based on our actual patient cases. This process may produce as many as 5 synthetic patient files. We will confirm the accuracy and completeness of these test files, then use the NIST HL7v2 Syndromic Surveillance validation website at https://hl7v2-ss-r2-testing.nist.gov/ss-r2/#/home for validation.
f7: Generate an NHCS xml survey file for a representative sample of synthetic patient data based on our actual patient cases. This process may produce as many as 5 synthetic patient files. We will confirm the accuracy and completeness of these test files, then use the NHCS validation website at https://cda-validation.nist.gov/cda-validation/muNHCS12.html to validate the generated xml test files.
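Before uploading the generated f2 and f7 test files to the NIST validation sites, a quick local well-formedness check may be run. The sketch below is illustrative only; the file-name patterns are hypothetical and depend on how the EHR names its exports.

```python
# Illustrative pre-check only: confirms each generated public-health test file
# parses as XML and reports its root element before manual upload to the NIST
# validation sites. The glob patterns are hypothetical.
import glob
import xml.etree.ElementTree as ET

def precheck(pattern: str) -> None:
    for path in sorted(glob.glob(pattern)):
        try:
            root = ET.parse(path).getroot()
            print(f"{path}: well-formed, root element <{root.tag}>")
        except ET.ParseError as err:
            print(f"{path}: NOT well-formed ({err})")

if __name__ == "__main__":
    precheck("ss_f2_*.xml")    # syndromic surveillance test files (f2)
    precheck("nhcs_f7_*.xml")  # health care survey test files (f7)
```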
Application Programming Interfaces
The following outlines the measure that has been identified to best demonstrate conformance to the certification criteria concerning APIs, 170.315(g)(9).
Measure Description
The metric used for testing functionality covered by these criteria is the successful secure retrieval and verification of the applicable data by a simulated patient. Third party software employed for testing includes MaxMD Direct mdEmail v.3.0 for direct messaging.
Measure Justification
The stated metric was selected as it provides full testing of the functionality.
Expected Outcome
The expected outcome is that the data received by the simulated patient will be identical to the data transmitted. This will be confirmed by comparing the two data sets.
Testing Methodology
We will use an in-house developed application named PI_EHR_API_Test that demonstrates use of our certified EHR API.
Testing of g9 will be performed by downloading a CCDA R2.1 document for a synthetic patient. This file will be validated using the USCDI v1 R2.1 Validator Tool (at https://ett.healthit.gov/ett/#/validators/ccdar3).
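As a rough illustration of the type of retrieval PI_EHR_API_Test performs, the sketch below requests a C-CDA document for a synthetic patient over HTTPS using an API key and saves it for upload to the validator. The endpoint URL, header, and parameters are hypothetical placeholders and do not describe the certified API's actual interface.

```python
# Hypothetical illustration only: retrieves a C-CDA document for one synthetic
# patient over HTTPS using an API key, then saves it for validator upload.
# The endpoint URL, header name, and path are placeholders, not the real API.
import urllib.request

API_BASE = "https://example.invalid/pi-emr/api"   # placeholder endpoint
API_KEY = "REPLACE_WITH_ISSUED_KEY"               # placeholder key

def download_ccda(patient_id: str, out_path: str) -> None:
    req = urllib.request.Request(
        f"{API_BASE}/patients/{patient_id}/ccda",
        headers={"Authorization": f"Bearer {API_KEY}"},
    )
    with urllib.request.urlopen(req) as resp, open(out_path, "wb") as out:
        out.write(resp.read())
    print(f"Saved C-CDA for {patient_id} to {out_path}")

if __name__ == "__main__":
    download_ccda("RWT_PT_PATIENT_01", "RWT_PT_PATIENT_01_g9.xml")
```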
Electronic Exchange
The following outlines the measure that has been identified to best demonstrate conformance to the certification criteria concerning Electronic Exchange (170.315(h)(1)).
Measure Description
The metric used for testing functionality covered by these criteria is the successful secure retrieval and verification of the applicable data for a simulated patient. Third party software employed for testing includes MaxMD Direct mdEmail v.3.0 for direct messaging.
Measure Justification
The stated metric was selected as it provides full testing of the functionality.
Expected Outcome
The expected outcome is that the data received by the simulated patient will be identical to the data transmitted. This will be confirmed by comparing the two data sets.
Testing Methodology
We will send two Direct messages to a test Direct address: one message with an attachment and one with no attachment. We will then retrieve the Direct message replies to these messages, again one with an attachment and one with no attachment. This will be done by sending the messages to a HISP provider test address, then verifying that the contents of the received messages shown on the HISP provider's user website match our EHR.
This Real World Testing plan is complete with all required elements, including measures that address all certification criteria and care settings. All information in this plan is up to date and fully addresses the health IT developer Real World Testing requirements.
Authorized Representative Name: Michael Goforth
Authorized Representative Email: mike@proimagetx.com
Authorized Representative Phone: 360-359-3690
Authorized Representative Signature:
Date: November 6, 2024
Professional Imaging, LLC PI EMR v2
2024 Real World Testing Results
CHPL Product Number: 15.07.05.2835.PROI.01.01.1.231027
Plan Report ID: 20231103pro
Developer RWT Plan/Results Page URL: https://www.proimagetx.com/ehr-real-world-testing-plan
Care Setting: Ambulatory Interaction with Nursing Facilities
PI EMR is a Self-Developed EHR
Changes to Original Plan
Summary of Change: Throughout the testing, for criteria requiring validation of CCDA files, the most recently updated validator was used (i.e., the USCDI v3 validator was used instead of the USCDI v1 validator).
Reason: We felt it was important to test the files using the most up-to-date validation method, as it provided the best available test of the system.
Impact: The testing showed all CCDA files successfully validated without errors.

Summary of Change: The date range for sampling of patient portal access records for the Patient Engagement criteria was expanded from 3 months to 9 months.
Reason: We elected to take a larger sample size, as we felt it would give a more complete picture of patient engagement.
Impact: The testing showed there were no failures to access documents.
Summary of Testing Methods and Key Findings
Care Coordination
The following outlines testing methods and findings regarding criteria concerning Care Coordination (170.315(b)(1) and 170.315(b)(2)).
The b1 criteria was tested by generating both a Continuity of Care Document and a Referral Note CCDA R2.1 document using a representative synthetic patient. Both CCDA xml files generated were successfully validated using the ONC C-CDA R2.1 Validator for USCDI v3 Tool (at https://ett.healthit.gov/ett/#/validators/ccdauscidv3#ccdaValdReport).
The b2 criteria was tested using a representative synthetic patient. First, we imported an initial patient record (CCDA R2.1 xml) file. Then we imported an updated patient record for the same patient containing new medications, problems and alerts and reconciled (merged) it with the medications, problems and alerts in the initial patient record. To simulate a referral from another practice, we generated the initial and updated patient records (CCDA R2.1 files) at a practice at one location and then transmitted the CCDA files to a 2nd practice at a different location using a secure messaging system. Importing, reconciliation and incorporation of the initial and updated CCDA files was then performed at the 2nd practice. The resulting patient record was exported in CCDA R2.1 format and validated using the ONC C-CDA R2.1 Validator for USCDI v3 Tool (at https://ett.healthit.gov/ett/#/validators/ccdauscidv3#ccdaValdReport). This test resulted in the data being successfully imported and applied to the target patient with no errors.
Patient Engagement
The following outlines testing methods and results for the certification criteria concerning Patient Engagement (170.315(e)(1)).
We used analysis of the usage logs from our patient portals to provide total counts of documents viewed and downloaded from our patient portal. At our practice's request the option to transmit patient records has not been enabled. To test the transmission of patient records we transmitted a representative sample of synthetic patient records in CCDA R2.1 format from our EHR to our HISP provider. A representative sample of downloaded and transmitted CCDA R2.1 files was successfully validated using the ONC C-CDA R2.1 Validator for USCDI v3 Tool (at https://ett.healthit.gov/ett/#/validators/ccdauscidv3#ccdaValdReport).
We used analysis of use logs for our Patient Portals to provide (e)(1) RWT results for viewing and downloading of patient records. Instead of using new accounts for a 3 month period, we included all new accounts generated in 2024 up to 9/9/2024. Regarding transmission of patient records, we provide printed copies to the Nursing Facilities. This negates the demand for electronic transmission of patient records. As a result, we have used a representative sample of synthetic patient records to show that we do have compliant functionality for transmitting patient records.
The test results for patient records accessed from the patient portal as of 9/9/2024 showed a total of 7,646 documents viewed; no failures were recorded. The test results for downloading patient records from the patient portal as of 9/9/2024 showed that, out of 3,910 documents downloaded, no failures were detected.
The transmit test results using CCDA test documents for two representative synthetic patients were validated using the ONC C-CDA R2.1 Validator for USCDI v3 Tool (at https://ett.healthit.gov/ett/#/validators/ccdauscidv3#ccdaValdReport) and showed no failures.
Public Health
The following outlines the test methods and results for the certification criteria concerning Transmission to Public Health Agencies (170.315(f)(2) and 170.315(f)(7)).
For f2, we generated NIST HL7v2 Syndromic Surveillance xml files for a representative sample of synthetic patient data based on our actual patient cases. This process produced two synthetic patient files. We confirmed the accuracy and completeness of these test files, then validated them using the NIST HL7v2 Syndromic Surveillance validation tool at https://hl7v2-ss-r2-testing.nist.gov/ss-r2/#/home, which showed no errors.
For f7, we generated NHCS xml survey files for a representative sample of synthetic patient data based on our actual patient cases. This process produced two synthetic patient files. We confirmed the accuracy and completeness of these test files, then validated them using the NHCS validation tool at https://cda-validation.nist.gov/cda-validation/muNHCS12.html, which showed no errors.
Application Programming Interfaces
The following outlines the testing methods and results for the certification criteria concerning Application Programming Interfaces (170.315(g)(9)).
We used an in-house developed application named PI_EHR_API_Test that demonstrates use of our certified EHR API.
First, we generated an encrypted API key that allows access to a specific representative synthetic patient (name RWT_PT PATIENT_01, DOB 1960-05-01).
Testing of g9 was performed by downloading a CCDA R2.1 document for the aforementioned synthetic patient. This file was validated using the ONC C-CDA R2.1 Validator for USCDI v3 Tool (at https://ett.healthit.gov/ett/#/validators/ccdauscidv3#ccdaValdReport) and no errors were found.
Electronic Exchange
The following outlines the testing methods and results for the certification criteria concerning Electronic Exchange (170.315(h)(1)).
Using our EHR, we sent a Direct message to a test Direct address containing a zip file attachment. This attachment was downloaded from the inbox at the HISP provider’s website, renamed and sent back in a reply to our EHR to ensure the zip file contents remain unchanged.
The above test zip file was generated by looking up a synthetic test patient, exporting the record in CCDA R2 xml format, and then zipping the file along with a separate file containing a 256-bit SHA-2 (SHA-256) hash digest.
The received zip file was then downloaded by the test user and renamed from RWT_PT_PATIENT_01_Pt_Smy(2022-06-30).zip to RWT_PT_PATIENT_01_Pt_Smy(2022-06-30) - Reply.zip.
The test user then replied to the original Direct message and sent the renamed zip file as an attachment. The PI EMR Messaging function was then used to receive the reply and download the attachment.
The contents of the original zip file sent by PI EMR were electronically compared to the contents of the downloaded attachment in the reply from the test user and were confirmed to contain exactly the same data.
Also, the CCDA R2 xml file in the zip file attachment was successfully validated using the ONC C-CDA R2.1 Validator for USCDI v3 Tool (at https://ett.healthit.gov/ett/#/validators/ccdauscidv3#ccdaValdReport).
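For reference, the content comparison and digest check described above can be illustrated with a short script. The sketch below is illustrative only; it recomputes the SHA-256 digest of the xml member in the reply zip and compares the round-tripped archive contents with the original export, using the file names from this test as examples.

```python
# Illustrative verification sketch only: recomputes the SHA-256 digest of the
# C-CDA xml inside the returned zip and compares the round-tripped contents with
# the original export. File names mirror the test and are used here as examples.
import hashlib
import zipfile

ORIGINAL_ZIP = "RWT_PT_PATIENT_01_Pt_Smy(2022-06-30).zip"
REPLY_ZIP = "RWT_PT_PATIENT_01_Pt_Smy(2022-06-30) - Reply.zip"

def read_members(zip_path: str) -> dict:
    """Return {member name: bytes} for every file in the archive."""
    with zipfile.ZipFile(zip_path) as zf:
        return {name: zf.read(name) for name in zf.namelist()}

original = read_members(ORIGINAL_ZIP)
reply = read_members(REPLY_ZIP)

# 1. Byte-for-byte comparison of the archive contents.
print("Contents identical:", original == reply)

# 2. Recompute the SHA-256 of each xml member for comparison against the
#    digest file that accompanied it in the archive.
for name, data in reply.items():
    if name.lower().endswith(".xml"):
        print(name, "SHA-256:", hashlib.sha256(data).hexdigest())
```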
Relied Upon Software
· MaxMD for b1, b2, g9, and h1
· MaxMD, Meinberg NTP, and WebSupergoo ABCpdf for e1
Key Milestones
Communication with our affiliate practice to schedule participation was conducted as necessary via phone conversation throughout the first 3 quarters of 2024.
Dates testing was performed were as follows:
· (b)(1) and (b)(2) – September 9, 2024
· (e)(1) – September 8/9, 2024
· (f)(2) and (f)(7) – September 8/9, 2024
· (g)(9) – September 9, 2024
· (h)(1) – September 8/9, 2024
Our CY 2025 real-world test plan was completed and submitted to our ONC-ACB on October 14, 2024.
All test documents associated with this report are available upon request.
Overall Summary
The testing methods employed, and the necessary changes from those planned, are specified in the report sections for each category of functionality tested. The reported results of this testing effectively verify that the certified functionality tested complies with the criteria requirements, as no errors were encountered.
Standards Updates
There were no updates to the certified criteria during the period in which testing was conducted.