contact us

If you have questions or comments, feel free to contact us at the address or phone number below.

One of our office staff will be happy to provide any information you might need.

523 N Sam Houston Pkwy Suite 125
Houston, TX 77060
USA

(866) 675-6277

Professional Imaging is the largest provider of consultations for swallowing disorders in the United States.  

Our objective is to provide the most timely and in-depth medical evaluation based on the patient's needs.

We take pride in offering the most comprehensive, patient-centered, on-site evaluations available in the medical community.  

Each Professional Imaging clinic is staffed with a licensed physician, a certified speech-language pathologist, and a driver technician who aids in transporting patients to and from facilities. The physician compiles a complete medical history of each patient through an extensive chart review and interviews with family, staff, or the patient as appropriate. Once a patient's case history is complete, the physician performs a focused, expanded physical exam along with the radiographic MBSS in conjunction with the speech-language pathologist. Upon completion of the consultation, recommendations are made to the primary care physician, facility SLP, and facility nursing staff.

We not only provide efficient delivery of high-quality services but are also dedicated to treating each and every patient with dignity and respect. We continually strive to be the leading provider of specialized dysphagia evaluations.

 

2024 EHR Real World Testing Results

Professional Imaging, LLC, PI EMR v2

 

 

CHPL Product Number: 15.07.05.2835.PROI.01.01.1.231027

Plan Report ID: 20231103pro

Developer RWT Plan/Results Page URL: https://www.proimagetx.com/ehr-real-world-testing-plan

Care Setting: Ambulatory Interaction with Nursing Facilities

PI EMR is a Self-Developed EHR

 

Changes to Original Plan

Change 1
Summary of Change: Throughout the testing, for criteria requiring validation of CCDA files, the most recently updated validator was used (i.e., v3 was used instead of v1).
Reason: We felt it was important to test the files using the most up-to-date validation method, as it provided the best available test of the system.
Impact: The testing showed all CCDA files successfully validated without errors.

Change 2
Summary of Change: The date range for sampling patient portal access records for the Patient Engagement criteria was expanded from 3 months to 9 months.
Reason: We elected to take a larger sample size, as we felt it would give a more complete picture of patient engagement.
Impact: The testing showed there were no failures to access documents.

 

Summary of Testing Methods and Key Findings

 

Care Coordination

The following outlines testing methods and findings for the certification criteria concerning Care Coordination (170.315(b)(1) and 170.315(b)(2)).

 

The b1 criterion was tested by generating both a Continuity of Care document and a Referral Note (CCDA R2.1) for a representative synthetic patient. Both generated CCDA XML files were successfully validated using the ONC C-CDA R2.1 Validator for USCDI v3 Tool (at https://ett.healthit.gov/ett/#/validators/ccdauscidv3#ccdaValdReport).
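Before submitting generated files to the ONC validator, a quick local well-formedness check can catch gross export errors early. A minimal sketch using Python's standard library (the sample document is a hypothetical stub, not an actual PI EMR export; full conformance testing still requires the ONC validator):

```python
import xml.etree.ElementTree as ET

def precheck_ccda(xml_text: str) -> bool:
    """Return True if the text parses as XML and has the CDA root element.

    This is only a sanity check for well-formedness; it does not test
    template conformance.
    """
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return False
    # A C-CDA document's root is ClinicalDocument in the HL7 v3 namespace.
    return root.tag == "{urn:hl7-org:v3}ClinicalDocument"

sample = '<ClinicalDocument xmlns="urn:hl7-org:v3"><title>CCD</title></ClinicalDocument>'
print(precheck_ccda(sample))  # → True
```

Exports that fail even this check need not be uploaded to the validator at all.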

 

The b2 criterion was tested using a representative synthetic patient. First, we imported an initial patient record (a CCDA R2.1 XML file). We then imported an updated patient record for the same patient containing new medications, problems, and alerts, and reconciled (merged) these with the medications, problems, and alerts in the initial record. To simulate a referral from another practice, we generated the initial and updated patient records (CCDA R2.1 files) at a practice in one location and transmitted the CCDA files to a second practice at a different location using a secure messaging system. Importing, reconciliation, and incorporation of the initial and updated CCDA files were then performed at the second practice. The resulting patient record was exported in CCDA R2.1 format and validated using the ONC C-CDA R2.1 Validator for USCDI v3 Tool (at https://ett.healthit.gov/ett/#/validators/ccdauscidv3#ccdaValdReport). The data were successfully imported and applied to the target patient with no errors.
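At its core, the reconciliation step above merges two coded lists while deduplicating on the code. A minimal sketch (the data model and the RxNorm-style codes are hypothetical illustrations, not PI EMR's actual schema):

```python
def reconcile(existing, incoming, key="code"):
    """Merge incoming entries (e.g. medications or problems) into an
    existing list, skipping entries whose code is already present."""
    seen = {entry[key] for entry in existing}
    merged = list(existing)
    for entry in incoming:
        if entry[key] not in seen:
            merged.append(entry)
            seen.add(entry[key])
    return merged

# Hypothetical coded medication lists from the initial and updated records.
initial = [{"code": "197361", "name": "amlodipine 5 mg"}]
updated = [{"code": "197361", "name": "amlodipine 5 mg"},
           {"code": "860975", "name": "metformin 500 mg"}]
print(reconcile(initial, updated))
```

A production reconciliation would also surface conflicts for clinician review rather than merging silently; this sketch shows only the deduplicating merge.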

 

Patient Engagement

The following outlines testing methods and results for the certification criteria concerning Patient Engagement (170.315(e)(1)).

 

We analyzed the usage logs from our patient portal to produce total counts of documents viewed and downloaded. At our practice's request, the option to transmit patient records has not been enabled. To test transmission of patient records, we transmitted a representative sample of synthetic patient records in CCDA R2.1 format from our EHR to our HISP provider. A representative sample of downloaded and transmitted CCDA R2.1 files was successfully validated using the ONC C-CDA R2.1 Validator for USCDI v3 Tool (at https://ett.healthit.gov/ett/#/validators/ccdauscidv3#ccdaValdReport).

 

We analyzed use logs for our patient portals to produce the (e)(1) RWT results for viewing and downloading of patient records. Instead of sampling new accounts over a 3-month period, we included all new accounts created in 2024 through 9/9/2024. Regarding transmission of patient records, we provide printed copies to the nursing facilities, which removes the demand for electronic transmission. We therefore used a representative sample of synthetic patient records to demonstrate that compliant functionality for transmitting patient records is in place.

 

The test results for patient records accessed from the patient portal as of 9/9/2024 showed a total of 7,646 documents viewed, with no failures recorded. The results for downloading patient records from the patient portal as of 9/9/2024 showed that, out of 3,910 documents downloaded, no failures were detected.
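The view/download tallies above come from usage-log analysis; the counting itself is straightforward. A sketch using Python's standard library (the log format is assumed for illustration; PI EMR's actual log schema is not published):

```python
import csv, io

# Hypothetical portal usage log in CSV form.
log = """timestamp,account,action,status
2024-01-05T10:02:00,pt01,view,ok
2024-01-05T10:03:10,pt01,download,ok
2024-02-11T14:22:05,pt02,view,ok
2024-02-11T14:25:40,pt02,download,fail
"""

def tally(log_text):
    """Count events and failures per action (view/download) in the log."""
    counts, failures = {}, {}
    for row in csv.DictReader(io.StringIO(log_text)):
        action = row["action"]
        counts[action] = counts.get(action, 0) + 1
        if row["status"] != "ok":
            failures[action] = failures.get(action, 0) + 1
    return counts, failures

print(tally(log))  # → ({'view': 2, 'download': 2}, {'download': 1})
```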

 

The transmit test results using CCDA test documents for two representative synthetic patients were validated using the ONC C-CDA R2.1 Validator for USCDI v3 Tool (at https://ett.healthit.gov/ett/#/validators/ccdauscidv3#ccdaValdReport) and showed no failures.

 

Public Health

The following outlines the test methods and results for the certification criteria concerning Transmission to Public Health Agencies (170.315(f)(2) and 170.315(f)(7)).

 

For f2, PI EMR generated two NIST HL7v2 Syndromic Surveillance xml files as a representative sample of synthetic patient data based on our actual patient cases. We confirmed the accuracy and completeness of these test files, then validated them using the NIST HL7v2 Syndromic Surveillance validation tool at https://hl7v2-ss-r2-testing.nist.gov/ss-r2/#/home, which showed no errors.
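HL7 v2 messages are built from pipe-delimited segments (MSH, PID, and so on). A minimal sketch of assembling such a message (the segment values are hypothetical synthetic-patient data for illustration, not PI EMR's actual output):

```python
def build_segment(name, *fields):
    """Join an HL7 v2 segment name and its fields with the '|' separator."""
    return "|".join((name,) + fields)

# Hypothetical values; MSH-2 carries the encoding characters ^~\&.
msh = build_segment("MSH", "^~\\&", "PI_EMR", "CLINIC", "SS_RECEIVER",
                    "PHA", "20240909120000", "", "ADT^A04^ADT_A01",
                    "MSG0001", "P", "2.5.1")
pid = build_segment("PID", "1", "", "RWT01^^^MRN", "",
                    "PATIENT_01^RWT_PT", "", "19600501", "M")
# HL7 v2 separates segments with carriage returns.
message = "\r".join([msh, pid])
print(message.split("\r")[0].startswith("MSH|^~\\&|"))  # → True
```

Real submissions must of course conform to the Syndromic Surveillance implementation guide, which the NIST tool checks.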

 

For f7, PI EMR generated two NHCS xml survey files as a representative sample of synthetic patient data based on our actual patient cases. We confirmed the accuracy and completeness of these test files, then validated them using the NHCS validation tool at https://cda-validation.nist.gov/cda-validation/muNHCS12.html, which showed no errors.

 

Application Programming Interfaces

The following outlines the testing methods and results for the certification criteria concerning Application Programming Interfaces (170.315(g)(9)).

 

We used an in-house developed application named PI_EHR_API_Test that demonstrates use of our certified EHR API.

 

First, we generated an encrypted API key that allows access to a specific representative synthetic patient (name: RWT_PT PATIENT_01; DOB: 1960-05-01).

 

Testing of g9 was performed by downloading a CCDA R2.1 document for the aforementioned synthetic patient. This file was validated using the ONC C-CDA R2.1 Validator for USCDI v3 Tool (at https://ett.healthit.gov/ett/#/validators/ccdauscidv3#ccdaValdReport) and no errors were found.
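Retrieving a patient document through a key-protected API generally reduces to an authenticated HTTP GET. A sketch using Python's standard library that builds (but does not send) such a request — the endpoint URL, header scheme, and key are hypothetical, not PI EMR's actual API:

```python
import urllib.request

API_KEY = "hypothetical-encrypted-key"  # placeholder, not a real key
ENDPOINT = "https://ehr.example.com/api/patients/RWT01/ccda"  # hypothetical URL

def build_ccda_request(url: str, api_key: str) -> urllib.request.Request:
    """Build an authenticated GET request for a patient's CCDA document."""
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {api_key}",
                 "Accept": "application/xml"},
        method="GET",
    )

req = build_ccda_request(ENDPOINT, API_KEY)
print(req.get_header("Authorization"))  # → Bearer hypothetical-encrypted-key
```

Sending the request with `urllib.request.urlopen(req)` would return the CCDA payload, which could then be run through the ONC validator as described above.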

 

Electronic Exchange

The following outlines the testing methods and results for the certification criteria concerning Electronic Exchange (170.315(h)(1)).

 

Using our EHR, we sent a Direct message containing a zip file attachment to a test Direct address. The attachment was downloaded from the inbox at the HISP provider's website, renamed, and sent back in a reply to our EHR to verify that the zip file contents remained unchanged.

 

The above test zip file was generated by looking up a synthetic test patient, exporting the record in CCDA R2 xml format, and then zipping the file along with a separate file containing a SHA-256 hash digest.

 

The received zip file was then downloaded by the test user and renamed from

RWT_PT_PATIENT_01_Pt_Smy(2022-06-30).zip   to   RWT_PT_PATIENT_01_Pt_Smy(2022-06-30) - Reply.zip

 

The test user then replied to the original Direct message and sent the renamed zip file as an attachment. The PI EMR Messaging function was then used to receive the reply and download the attachment.

 

The contents of the original zip file sent by PI EMR were electronically compared to the contents of the downloaded attachment in the reply from the test user and were confirmed to contain exactly the same data.
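The zip-plus-digest packaging and the later integrity comparison can be sketched with Python's standard library (the filename and document contents are illustrative, not PI EMR's actual packaging code):

```python
import hashlib, io, zipfile

def package(ccda_xml: bytes, name: str) -> bytes:
    """Zip a CCDA file together with a separate SHA-256 digest file."""
    digest = hashlib.sha256(ccda_xml).hexdigest()
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr(name, ccda_xml)
        zf.writestr(name + ".sha256", digest)
    return buf.getvalue()

def verify(zip_bytes: bytes, name: str) -> bool:
    """Re-hash the CCDA inside the zip and compare to the stored digest."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        data = zf.read(name)
        stored = zf.read(name + ".sha256").decode()
    return hashlib.sha256(data).hexdigest() == stored

ccda = b'<ClinicalDocument xmlns="urn:hl7-org:v3"/>'
blob = package(ccda, "RWT_PT_PATIENT_01.xml")
print(verify(blob, "RWT_PT_PATIENT_01.xml"))  # → True for an intact archive
```

Because the digest travels inside the archive, renaming the zip file (as in the test above) does not affect verification; any change to the CCDA bytes themselves would.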

 

Also, the CCDA R2 xml file in the zip file attachment was successfully validated using the ONC C-CDA R2.1 Validator for USCDI v3 Tool (at https://ett.healthit.gov/ett/#/validators/ccdauscidv3#ccdaValdReport).

 

Relied Upon Software

·         MaxMD for b1, b2, g9, and h1

·         MaxMD, Meinberg NTP, and WebSupergoo ABCpdf for e1

 

Key Milestones

Communication with our affiliate practice to schedule participation was conducted as necessary via phone conversation throughout the first 3 quarters of 2024.

 

Dates testing was performed were as follows:

·         (b)(1) and (b)(2) – September 9, 2024

·         (e)(1) – September 8/9, 2024

·         (f)(2) and (f)(7) – September 8/9, 2024

·         (g)(9) – September 9, 2024

·         (h)(1) – September 8/9, 2024

 

Our CY 2025 real-world test plan was completed and submitted to our ONC-ACB on October 14, 2024.

 

All test documents associated with this report are available upon request.

 

Overall Summary

The testing methods employed, and any changes from the original plan, are specified in the report sections for each category of functionality tested. The reported results verify that the certified functionality tested complies with the criteria requirements, as no errors were encountered.

 

Standards Updates

There were no updates to the certified criteria during the period in which testing was conducted.