2016
DOI: 10.1111/acem.12988
Implementing Data Definition Consistency for Emergency Department Operations Benchmarking and Research

Abstract: Objectives: The objective was to obtain a commitment to adopt a common set of definitions for emergency department (ED) demographic, clinical process, and performance metrics among the ED Benchmarking Alliance (EDBA), ED Operations Study Group (EDOSG), and Academy of Academic Administrators of Emergency Medicine (AAAEM) by 2017. Methods: A retrospective cross-sectional analysis of available data from three ED operations benchmarking organizations supported a negotiation to use a set of common metrics with ident…

Cited by 7 publications (5 citation statements) · References 19 publications
“…The EDBA survey is administered annually to EDBA member institutions and former member institutions that submitted survey responses in prior years. EDBA membership and survey development details are available in previous publications and the EDBA website [7, 8, 10, 11]. Within the survey instrument (Additional file 2: Table S2), respondents reported clinical operations metrics attributable to the ED site level, labeling each ED as academic-affiliated (defined as participating in training EM and other residents) or not academic-affiliated.…”
Section: Methods (mentioning)
confidence: 99%
“…Since no single ED data comparison effort exists to appropriately compare the clinical operations of academic versus non-academic EDs, we sought to investigate differences between these two groups by combining data from the AAAEM/AACEM and EDBA performance surveys. Both sources are characterized by high response rates, include comprehensive clinical operations metrics, have significant overlap in their definition of clinical metrics [8], and their methodologies allow for the accurate establishment of academic and non-academic cohorts, respectively [1, 8].…”
Section: Introduction (mentioning)
confidence: 99%
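The excerpt above describes pooling two site-level surveys into academic and non-academic cohorts. Below is a minimal, hypothetical Python sketch of that kind of merge; the column names (ed_id, academic, door_to_provider_min) and the sample values are illustrative assumptions, not the actual AAAEM/AACEM or EDBA survey fields.

```python
# Hypothetical sketch of combining two benchmarking surveys into
# academic vs. non-academic cohorts. All names and values are illustrative.
import pandas as pd

# Each survey is assumed to contribute one row per ED site.
aaaem = pd.DataFrame({
    "ed_id": ["A1", "A2"],
    "door_to_provider_min": [28, 35],
})
edba = pd.DataFrame({
    "ed_id": ["B1", "B2", "B3"],
    "academic": [False, True, False],
    "door_to_provider_min": [22, 31, 26],
})

# AAAEM/AACEM respondents are academic by construction; EDBA sites
# self-label, so keep only its non-academic rows to avoid overlap.
academic = aaaem.assign(academic=True)
non_academic = edba[~edba["academic"]]

combined = pd.concat([academic, non_academic], ignore_index=True)
print(combined.groupby("academic")["door_to_provider_min"].median())
```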
“…We asked each site to provide demographic information for their facility, percentage of patients with ESI 1 or 2, CMI,1,2 annual ED volume,3–5 academic status,6 inpatient admission rate,6,7 percentage of patients admitted under observation status, percentage of Medicare patients, patients-seen-per-attending-hour per day, and the aggregated percentage of billed charts falling into each ambulatory payment classification (APC) coding level category.8–10 Each site PI agreed to accurate data collection and reporting to the EDOSG. Most frequently, site PIs are medical or clinical operations directors.…”
Section: Methods (mentioning)
confidence: 99%
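Since this excerpt enumerates the site-level fields collected, here is a hedged sketch of what a per-site record might look like in code. The field names, types, and the tolerance in the validation check are assumptions for illustration, not the EDOSG's actual schema.

```python
# Illustrative site-level record mirroring the fields listed in the excerpt.
# Names and types are assumptions, not the EDOSG data dictionary.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class SiteRecord:
    site_id: str
    pct_esi_1_or_2: float            # percentage of patients with ESI 1 or 2
    case_mix_index: float            # CMI
    annual_ed_volume: int
    academic: bool
    inpatient_admission_rate: float
    pct_observation_status: float    # percent admitted under observation status
    pct_medicare: float
    patients_per_attending_hour: float
    apc_level_pct: Dict[int, float] = field(default_factory=dict)  # APC level -> % of billed charts

    def validate(self) -> None:
        # The APC level percentages should account for all billed charts;
        # the 0.5-point tolerance here is an arbitrary assumption.
        if abs(sum(self.apc_level_pct.values()) - 100.0) > 0.5:
            raise ValueError("APC level percentages should sum to ~100%")
```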
“…ED LOS is defined as the time from ED arrival to ED departure.18 Change in EF is calculated as the difference between the last EF measured prior to the patient's STEMI and the first EF documented after hospital discharge. Hospital LOS is the time from hospital admission to hospital discharge.…”
Section: Methods (mentioning)
confidence: 99%
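The three definitions quoted above translate directly into simple timestamp and measurement arithmetic. The following Python sketch encodes them; the function and variable names are illustrative, and the unit choices (minutes for ED LOS, hours for hospital LOS) are assumptions rather than anything the excerpt specifies.

```python
# Minimal sketch of the three metric definitions from the excerpt.
from datetime import datetime

def ed_los_minutes(ed_arrival: datetime, ed_departure: datetime) -> float:
    """ED LOS: time from ED arrival to ED departure."""
    return (ed_departure - ed_arrival).total_seconds() / 60

def hospital_los_hours(admission: datetime, discharge: datetime) -> float:
    """Hospital LOS: time from hospital admission to hospital discharge."""
    return (discharge - admission).total_seconds() / 3600

def change_in_ef(last_ef_before_stemi: float, first_ef_after_discharge: float) -> float:
    """Change in EF: first EF documented after hospital discharge minus the
    last EF measured before the patient's STEMI (percentage points)."""
    return first_ef_after_discharge - last_ef_before_stemi
```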