Introduction

The goal of the V3 Data Quality Dashboard is to provide vendor representatives with a detailed dashboard using nationally submitted data that allows them to assess:

  • the completeness and
  • the accuracy of their nationally submitted data

V3 Data Quality Dashboard Description

The purpose of the dashboard is to provide vendor representatives with access to detailed summary information about the data they are submitting to NEMSIS and the activities of EMS agencies under their purview. The dashboard provides summary information about both the quality and completeness of data submitted as well as the ability to make comparisons across agencies using a vendor's software. To this end, this dashboard contains two views:

  • Data Quality Report: A listing of different measures of data quality along with the corresponding individual, category and total scores, and
  • Agency Comparison Report: A line graph and table displaying the scores for all agencies that use a vendor's software and submit data using the V3 standard.

Because of the level of detail offered in these dashboards, access is restricted to vendor representatives whose software is used for V3 data submitted nationally.

Global Inclusion Criteria and Filters

The global inclusion criteria define the minimum characteristics each event record must meet in order to be included in any of the calculations for the V3 Data Quality Dashboard. For details on each element, its code, and its range of values, see the NEMSIS Version 3.4 Data Dictionary.

For this dashboard, unlike the version 2 performance dashboard, there are no specific global inclusion criteria. All records submitted to NEMSIS will be a part of this dashboard. However, individual measurements of data quality and completeness may look at specific subsets of the data reported. We also offer several filtering options at the top of the dashboard in order to subset the data used to calculate the scores.

Available filters:

 

  National Element Label              National Element Code
  Agency Number                       eResponse.01
  Primary Role of Unit                eResponse.07
  Software (Creator + Name)           eRecord.02 + eRecord.03
  Software Version                    eRecord.04
  (Unit Notified by Dispatch) Date    eTimes.03
  Type of Service                     eResponse.05
  Submitting State
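
As a rough sketch of how these filters subset the records used in the score calculations, the example below applies a hypothetical agency and date-range filter to a small list of records. The dictionary keys (eResponse_01, eTimes_03) simply mirror the element codes above; they are illustrative assumptions, not the actual NEMSIS submission format, and the filtering shown is not the TAC's implementation.

    from datetime import datetime

    # Hypothetical flattened records keyed by element code; real V3 data is submitted as XML.
    records = [
        {"eResponse_01": "UT-123", "eTimes_03": datetime(2019, 3, 5, 14, 30)},
        {"eResponse_01": "UT-456", "eTimes_03": datetime(2019, 7, 1, 9, 15)},
    ]

    def apply_filters(records, agency=None, start=None, end=None):
        """Keep only records matching the selected dashboard filters."""
        kept = []
        for rec in records:
            if agency is not None and rec.get("eResponse_01") != agency:
                continue
            notified = rec.get("eTimes_03")     # (Unit Notified by Dispatch) Date
            if start is not None and (notified is None or notified < start):
                continue
            if end is not None and (notified is None or notified > end):
                continue
            kept.append(rec)
        return kept

    # Example: one agency's records with a dispatch-notified date in March 2019.
    subset = apply_filters(records, agency="UT-123",
                           start=datetime(2019, 3, 1), end=datetime(2019, 3, 31, 23, 59))
    print(len(subset))   # 1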

 

 

Data Quality Report

The data quality report page of the dashboard provides key metrics related to the quality and quantity of data submitted to NEMSIS for the selected reporting period. It contains six categories of measures, described in the sections below, that are used to assess performance.

Patient Information Section

The first group of metrics is Patient Information. These measures evaluate the completeness of certain values when Incident/Patient Disposition is populated with any of the following values (i.e., values suggesting a patient contact):

  • Patient Dead at Scene-No Resuscitation Attempted (With Transport)
  • Patient Dead at Scene-No Resuscitation Attempted (Without Transport)
  • Patient Dead at Scene-Resuscitation Attempted (With Transport)
  • Patient Dead at Scene-Resuscitation Attempted (Without Transport)
  • Patient Evaluated, No Treatment/Transport Required
  • Patient Refused Evaluation/Care (With Transport)
  • Patient Refused Evaluation/Care (Without Transport)
  • Patient Treated, Released (AMA)
  • Patient Treated, Released (per protocol)
  • Patient Treated, Transferred Care to Another EMS Professional
  • Patient Treated, Transported by EMS
  • Patient Treated, Transported by Law Enforcement
  • Patient Treated, Transported by Private Vehicle

The metrics in this category simply check for completion with valid values, defined for each metric as follows:

  • Patient Age is not 0 and Age Unit is neither a “Not Value” nor NULL
  • Patient Gender is neither a “Not Value” nor NULL
  • Incident Location Type is neither a “Not Value” nor NULL
  • Incident State is neither a “Not Value” nor NULL
  • Incident County is neither a “Not Value” nor NULL
  • Incident Zip Code is greater than 0 and less than 99999

Percentage completion scores fall into color categories for >95% (green), 90%-95% (yellow), and less than 90% (red). Target percentage for these measures is AT LEAST 95%.

The total section completion score is shown to the far right of the section name and description. This score is also colored based on the above scale. To see more information about the denominator used in calculating the overall section score, mouse over the overall score and a tooltip will appear containing the information.
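
The sketch below illustrates, under stated assumptions, how the completeness checks and the color banding described above could be expressed. The field names, the placeholder set of “Not Values,” and the helper functions are all hypothetical; they are meant only to make the pass/fail rules concrete.

    # Placeholder labels standing in for the coded NEMSIS "Not Values" (e.g., Not Applicable, Not Recorded).
    NOT_VALUES = {"Not Applicable", "Not Recorded", "Not Reporting"}

    def is_valid(value):
        """A value counts as complete when it is neither a "Not Value" nor NULL."""
        return value is not None and value not in NOT_VALUES

    def patient_info_checks(rec):
        """One pass/fail flag per Patient Information metric (hypothetical field names)."""
        zip_code = rec.get("incident_zip")
        return {
            "age": is_valid(rec.get("age")) and rec.get("age") != 0 and is_valid(rec.get("age_units")),
            "gender": is_valid(rec.get("gender")),
            "location_type": is_valid(rec.get("incident_location_type")),
            "incident_state": is_valid(rec.get("incident_state")),
            "incident_county": is_valid(rec.get("incident_county")),
            "incident_zip": zip_code is not None and str(zip_code).isdigit() and 0 < int(zip_code) < 99999,
        }

    def color_band(pct):
        """Color coding used throughout the dashboard."""
        if pct > 95:
            return "green"
        if pct >= 90:
            return "yellow"
        return "red"

    # Example: score the Patient Age metric across a set of patient-contact records.
    records = [{"age": 34, "age_units": "Years", "gender": "Female",
                "incident_location_type": "Home", "incident_state": "49",
                "incident_county": "49035", "incident_zip": "84108"}]
    passed = sum(patient_info_checks(r)["age"] for r in records)
    pct = 100.0 * passed / len(records)
    print(pct, color_band(pct))   # 100.0 green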

Cardiac Arrest Section

The Cardiac Arrest metrics evaluate the completeness of certain values when Cardiac Arrest (eArrest.01) has any of the following values: “Yes, Prior to EMS Arrival” or “Yes, After EMS Arrival”.

The measures in this category are considered accurate as follows:

  • At least one valid value (neither a “Not Value” nor NULL) is recorded for Arrest Witnessed By.
  • At least one valid value (neither a “Not Value” nor NULL) is recorded for Resuscitations Attempted.
  • Cardiac Arrest Etiology is neither a “Not Value” nor NULL.
  • At least one valid value (neither a “Not Value” nor NULL) is recorded for Any Return of Spontaneous Circulation.
  • Date and Time of Cardiac Arrest is no earlier than 1/1/1900.
  • Resuscitation Attempted Matches Type of CPR Provided is considered valid when these elements are completed with corresponding values; e.g., if a Resuscitation Attempted value includes “compressions,” then a Type of CPR Provided value that includes “compressions” should also be recorded.

Percentage completion scores fall into color categories for >95% (green), 90%-95% (yellow), and less than 90% (red). Target percentage for these measures is AT LEAST 95%.

The total section score is shown to the far right of the section name and description. This score is also colored based on the above scale. To see more information about the denominator used in calculating the overall section score, mouse over the overall score and a tooltip will appear containing the information.
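
As a small illustration of the last rule in this section, the sketch below checks the consistency between Resuscitation Attempted and Type of CPR Provided. It assumes the values are available as descriptive strings; in submitted V3 data these are coded values, so this is a simplification.

    def compressions_consistent(resuscitation_attempted, cpr_types):
        """If any Resuscitation Attempted value mentions compressions, at least one
        Type of CPR Provided value should mention compressions as well."""
        attempted = any("compressions" in v.lower() for v in resuscitation_attempted)
        provided = any("compressions" in v.lower() for v in cpr_types)
        # Records without attempted compressions impose no requirement.
        return (not attempted) or provided

    # Example: compressions were attempted, but only ventilation is documented under Type of CPR Provided.
    print(compressions_consistent(["Attempted Compressions"], ["Ventilation Only"]))   # False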

Valid System Times Section

The Valid System Times metrics evaluate the completeness of certain values when Incident/Patient Disposition has the following value: “Patient Treated, Transported by EMS”.

The measures in this section are defined and considered accurate as follows:

  • Scene Response Time is defined as the duration between Unit En Route Date/Time and Unit Arrived on Scene Date/Time. Scene Response Time is considered valid when the value is greater than 0 and less than 1440 minutes (24 hours).
  • Scene Time is defined as the duration between Unit Arrived on Scene Date/Time and Unit Left Scene Date/Time. Scene Time is considered valid when the value is greater than 0 and less than 1440 minutes (24 hours).
  • Transport Time is defined as the duration between Unit Left Scene Date/Time and Unit Arrived at Destination Date/Time. Transport Time is considered valid when the value is greater than 0 and less than 1440 minutes (24 hours).

Percentage completion scores fall into color categories for >95% (green), 90%-95% (yellow), and less than 90% (red). Target percentage for these measures is AT LEAST 95%.

The total section score is shown to the far right of the section name and description. This score is also colored based on the above scale. To see more information about the denominator used in calculating the overall section score, mouse over the overall score and a tooltip will appear containing the information.
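
A minimal sketch of the duration rules above, assuming the timestamps have already been parsed into Python datetime objects (submitted V3 data carries these as date/time strings):

    from datetime import datetime

    def valid_interval_minutes(start, end, max_minutes=1440):
        """An interval is valid when it is greater than 0 and less than 1440 minutes (24 hours)."""
        if start is None or end is None:
            return False
        minutes = (end - start).total_seconds() / 60.0
        return 0 < minutes < max_minutes

    # Example: Scene Response Time, from Unit En Route to Unit Arrived on Scene.
    en_route = datetime(2019, 3, 5, 14, 32)
    on_scene = datetime(2019, 3, 5, 14, 41)
    print(valid_interval_minutes(en_route, on_scene))   # True (9 minutes)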

Injury Information Section

The Injury Information metrics evaluate the completeness of certain values when Possible Injury (eSituation.02) is Yes, and certain values when Possible Injury is No.

These measures are considered valid as follows:

  • When Possible Injury is Yes, Cause of Injury should be neither a “Not Value” nor NULL.
  • When Possible Injury is No, Vehicular, Pedestrian or Other Injury Risk Factor should be a “Not Value”, NULL or Pertinent Negative.
  • When Possible Injury is No, Trauma Center Criteria should be a “Not Value”, NULL or Pertinent Negative.

Percentage completion scores fall into color categories for >95% (green), 90%-95% (yellow), and less than 90% (red). Target percentage for these measures is AT LEAST 95%.

The total section score is shown to the far right of the section name and description. This score is also colored based on the above scale. To see more information about the denominator used in calculating the overall section score, mouse over the overall score and a tooltip will appear containing that information.
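
The sketch below shows one way the conditional logic of this section could be written, using hypothetical field names and a placeholder set of “Not Values”; the handling of Pertinent Negative is reduced to a simple companion flag purely for illustration.

    # Placeholder labels standing in for the coded NEMSIS "Not Values".
    NOT_VALUES = {"Not Applicable", "Not Recorded", "Not Reporting"}

    def injury_checks(rec):
        """Apply the Injury Information rules, branching on Possible Injury (hypothetical field names)."""

        def has_pertinent_negative(field):
            # Reduced to a simple companion flag here; V3 carries this as an attribute.
            return rec.get(field + "_pertinent_negative") is not None

        results = {}
        if rec.get("possible_injury") == "Yes":
            cause = rec.get("cause_of_injury")
            results["cause_of_injury"] = cause is not None and cause not in NOT_VALUES
        elif rec.get("possible_injury") == "No":
            for field in ("injury_risk_factor", "trauma_center_criteria"):
                value = rec.get(field)
                results[field] = value is None or value in NOT_VALUES or has_pertinent_negative(field)
        return results

    # Example: no injury reported, and both elements are correctly left as NULL / a "Not Value".
    print(injury_checks({"possible_injury": "No", "injury_risk_factor": None,
                         "trauma_center_criteria": "Not Applicable"}))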

Clinical Times Recorded Section

The Clinical Times Recorded metrics evaluate the completeness of certain values when a particular clinical event (medication given, procedure performed, vital sign recorded, or pre-arrival alert) is recorded.

The measures in this section are defined and considered accurate as follows:

  • When Medication Given is recorded (other than “Not Value” or NULL), Date/Time Medication Administered should have a date/time no earlier than 1/1/1900.
  • When Procedure Performed is recorded (other than “Not Value” or NULL), Date/Time Procedure Performed should have a date/time no earlier than 1/1/1900.
  • When a Vital Sign is recorded (includes eVitals.03, eVitals.06, eVitals.07, eVitals.10, eVitals.12-18, eVitals.23-24, eVitals.26-27, eVitals.29, and eVitals.31-33; see the eVitals section in the V3 Data Dictionary for more information on the types of vital signs) other than “Not Value” or NULL, Date/Time Vital Signs Taken should have a date/time no earlier than 1/1/1900.
  • When Pre-Arrival Alert or Activation is recorded (other than “Not Value” or NULL), Date/Time of Pre-Arrival Alert or Activation should have a date/time no earlier than 1/1/1900.

Percentage completion scores fall into color categories for >95% (green), 90%-95% (yellow), and less than 90% (red). Target percentage for these measures is AT LEAST 95%.

The total section score is shown to the far right of the section name and description. This score is also colored based on the above scale. To see more information about the denominator used in calculating the overall section score, mouse over the overall score and a tooltip will appear containing the information.
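
A brief sketch of the date sanity check applied throughout this section, with a placeholder set of “Not Values” and the 1/1/1900 floor from the rules above:

    from datetime import datetime

    EARLIEST_ALLOWED = datetime(1900, 1, 1)
    NOT_VALUES = {"Not Applicable", "Not Recorded", "Not Reporting"}   # placeholder labels

    def clinical_time_valid(event_value, event_time):
        """When a clinical event (medication, procedure, vital sign, or pre-arrival alert)
        is actually recorded, its date/time must be present and no earlier than 1/1/1900."""
        recorded = event_value is not None and event_value not in NOT_VALUES
        if not recorded:
            return True                      # nothing recorded, so no time is required
        return event_time is not None and event_time >= EARLIEST_ALLOWED

    # Example: a medication was given, but the administration time defaulted to an invalid date.
    print(clinical_time_valid("Epinephrine", datetime(1899, 12, 31)))   # False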

Other Incident Information Section

The Other Incident Information section is a catch-all group for metrics that did not fit into any of the above sections.

The measures in this category are defined and considered accurate as follows:

  • When Mass Casualty Incident is ‘Yes’, Number of Patients is neither a “Not Value” nor NULL.
  • When Other Associated Symptoms is recorded (other than “Not Value” or NULL), a valid value for Primary Symptom is recorded.
  • When Provider’s Secondary Impression is recorded (other than “Not Value” or NULL), a valid value for Provider’s Primary Impression is recorded.
  • When Medication Given is recorded (other than “Not Value” or NULL), a valid value for Role/Type of Person Administering Medication (other than “Not Value” or NULL) is recorded.
  • When Procedure Performed is recorded (other than “Not Value” or NULL), a valid value for Role/Type of Person Performing the Procedure (other than “Not Value” or NULL) is recorded.

Percentage scores fall into color categories for >95% (green), 90%-95% (yellow), and less than 90% (red). Target percentage for these measures is AT LEAST 95%.

The total section score is shown to the far right of the section name and description. This score is also colored based on the above scale. To see more information about the denominator used in calculating the overall section score, mouse over the overall score and a tooltip will appear containing the information.
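
To tie the section scores together, the sketch below shows one way an overall section score and its denominator (the figure surfaced in the tooltip) could be computed: the denominator is the number of records meeting the section's inclusion criteria, and the numerator is the number of those records passing the section's checks. This is only an illustration of the scoring idea, not the TAC's actual implementation.

    def section_score(records, include, checks):
        """Overall score for a section.

        include: predicate selecting the records that meet the section's inclusion criteria.
        checks:  list of per-record pass/fail functions, one per metric in the section.
        """
        eligible = [r for r in records if include(r)]      # the denominator shown in the tooltip
        if not eligible:
            return None, 0
        passed = sum(all(check(r) for check in checks) for r in eligible)
        return 100.0 * passed / len(eligible), len(eligible)

    # Toy example using the first rule above: Number of Patients must be present when
    # Mass Casualty Incident is "Yes".
    records = [{"mci": "Yes", "number_of_patients": 12},
               {"mci": "Yes", "number_of_patients": None}]
    score, denominator = section_score(
        records,
        include=lambda r: r.get("mci") == "Yes",
        checks=[lambda r: r.get("number_of_patients") is not None],
    )
    print(score, denominator)   # 50.0 2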

Agency Comparison Section

The agency comparison page of the dashboard provides vendor representatives with a way to compare agency data quality for the selected reporting period. It is composed of two views: a graph view, which gives a visual representation of how the agencies using a vendor's software compare, and a table view, which gives a detailed breakdown of overall scores as well as individual section scores.

Overall Score Graph Section

The overall score graph provides a way to visualize how the different agencies using a vendor's software compare to one another with respect to the data quality measures described above.

Each dash represents a single agency. Information about the agency, such as the agency number and score breakdown, can be found by holding your cursor over a dash to bring up the tooltip.

Clicking on a dash will take you back to the data quality report, with the dashboard filtered on the agency selected on the agency comparison page.

Overall Score Ranking Section

The overall score ranking table gives the full breakdown of each agency's overall and category scores for the measures described above. The scores in the table are color coded with the same scheme used elsewhere in the dashboard: >95% (green), 90%-95% (yellow), and <90% (red).

When this view is opened, the agencies are sorted from highest to lowest based on their overall score. However, agencies can be sorted by any of the category scores by holding your cursor over the category name and clicking the sort icon.

Clicking on an agency’s number in this table will take you back to the data quality report, with the dashboard filtered on the agency selected on the agency comparison page.

Tableau Toolbar

All Tableau dashboards share the same toolbar at the bottom of the view. Note that while most dashboards will have the same selections, some items may differ based on user permissions.

Undo, Redo, Revert, Refresh and Pause

These selections are found on the left side of the Tableau toolbar and can be used to help navigate the dashboard.

 

Undo will remove the last selection made; if you have used Undo to remove a selection, you can then use Redo to return to that selection. Revert returns the dashboard to its original view. Refresh will reload the data on the dashboard, subject to any selections made. Pause will stop the dashboard from refreshing as you make selections.

As an example, if you want to make a complete set of selections in a dashboard's filters, you could use these buttons to speed up the process. Normally, every time a change is made, the dashboard will refresh. If you select “Pause,” however, the dashboard will remain the same no matter how many selections you make. Once you have made all your selections, you can select “Refresh” to reload the dashboard based on those selections.

Subscribe, Custom Views, Edit, Share, and Download

These selections are found on the right side of the Tableau toolbar and can be used to interact with the dashboard in external ways.

 

The Subscribe button allows users to subscribe to a daily or weekly email that will show a static image of the dashboard as well as a link to the view.

You can select either a particular view (tab) or all the views (tabs) available in the dashboard. Under “Email Subscriptions” you can choose the frequency of emails, and in “Subject Line” you can assign the subject for the emails. By default, the subscription will be assigned to the email address associated with your NEMSIS account.

Custom Views allows users to save any particular set of filters or selections for quick access later. A user can create a view, name it, and save it either for their own personal use or for use by anyone with access to this dashboard. Previously saved views will appear in the Custom Views tab.

Edit will open the dashboard in the web editor for Tableau. Note that this option is not available on all dashboards and will require some knowledge of how to use Tableau to build and edit dashboards.

Share provides a couple of options for sharing the dashboard with others: an embed code and a regular HTML link. While you can use the link to share the dashboard with others, keep in mind that many of the NEMSIS dashboards require a username and password, so only those with credentials will be able to access them.

The last option is Download. Clicking on the download button opens up several options to download either the workbook, the data within it, or a static image of the view.

Selecting any one of these file types will produce a download of the requested file. Remember that not all options will be available for all dashboards and views.

Contact Info

If you have any questions, comments, concerns, or suggestions regarding this or other reports made available by the NEMSIS TAC, please contact us or leave a comment below!

Contact information is also available on our website.

Kevin White, BS
University of Utah School of Medicine
NEMSIS Technical Assistance Center
Phone: (801) 213-3408
Fax: (801) 581-8686
Email: kevin.white@hsc.utah.edu

Laurel Baeder, BA, MStat
University of Utah School of Medicine
NEMSIS Technical Assistance Center
Phone: (801) 587-7367
Fax: (801) 581-8686
Email: laurel.baeder@hsc.utah.edu



