Ten years ago, James Adams, MD, kicked off a conference aimed at pinning down elusive quality-of-care measures in emergency medicine by challenging the longstanding assumption that such a task couldn’t be done. A short time later, in a special issue of Academic Emergency Medicine, “Assuring Quality in Emergency Care,” he predicted a future of metrics, noting that the very idea might cause “frustration, resentment and opposition,” but also could lead to “problem-solving, creativity and success.” (2011;17.)
The article, also authored by Michelle Biros, MD, forecast a time when the notion wouldn’t seem radical but rather an essential focus of ED operations. (Acad Emerg Med 2002;9:1067.) It looks like that day has arrived.
Three efforts in the past year, one led by the Intermountain Institute for Health Care Delivery Research, another by the Emergency Nurses Association (ENA), and a third by Urgent Matters sponsored by the Robert Wood Johnson Foundation, offer a prescriptive means for assessing quality of care in emergency medicine, suggesting how it can be quantified and offering ways in which it can be more precisely defined.
The Intermountain summit meeting group, as it has been informally called, identified metrics that can be universally applied and potentially used by the Centers for Medicare & Medicaid Services (CMS). The metrics formulated by the group are specific to ED operations, and are measures that can be “broadly applied, widely adopted,” explained Steven Davidson, MD, an emergency physician who is the chief medical informatics officer at Maimonides Medical Center in Brooklyn and a member of the group.
Back when Dr. Adams spoke presciently about the likely coming of such “operational metrics,” Dr. Davidson and several of his colleagues were on hand, too, helping frame that discussion by coining the term “community expectations,” which underscored the idea that EDs would need to meet such expectations to build credibility and trust. (Acad Emerg Med 2002;9:1085.)
New metrics by the Intermountain group provide definitions supporting that community concept, notably time-stamp protocols that track patients at different points during the ED stay. Will they find their way to the Joint Commission, which has suggested that time-sensitive measures can be markers for quality of care? “I believe we might see that,” said James Augustine, MD, an emergency physician who was part of the Intermountain summit and who last year was named the chair of the Joint Commission’s Hospital Professional and Technical Advisory Committee.
Dr. Augustine, who also is part of the ED Benchmarking Alliance (EDBA) that provided much of the data, said he believes the recommendations by the Intermountain summit meeting group will have far-reaching effects. The performance measures, updated only last year from those established at the inaugural EDBA meeting in 2006, take into account new models for receiving and sorting patients to ED care. These new intake models include triage, rapid medical screening, team triage, and physician-in-triage.
Yet evidence of looming problems persists, including increases in lengths of stay, lower quality care, and worse outcomes linked to ED crowding.
The Intermountain group, which invited experts representing EDs from across the country, ED staffing groups, leaders of professional societies, and representatives of regulatory agencies to a conference last year that was published as a reference, appears close to completing its quest for common metrics, though room remains to settle on the particular standards on which they would be based. “Just about everyone in the world was invited,” Dr. Davidson said.
The group looked at current ambiguous definitions, such as decision-to-admit time, and suggested revisions that call for implementing time-stamps instead. These would include arrival time, emergency medical services off-load time, provider-contact time, and the time it takes for data such as test results to arrive after being ordered, which would be referred to as a “laboratory interval.” (Ann Emerg Med 2011;58:33.)
Establishing that kind of standard terminology was a priority at the meeting. “It is imperative that further regulatory requirements use parameters developed by experts from within the specialty who understand its practice,” was one conclusion. Another was that it is important to create definitions aligned with performance measures from CMS.
Around the same time, the Emergency Nurses Association issued a set of standard definitions as well, and they were approved as a consensus statement by the American College of Emergency Physicians. They are similar to those put forth by the emergency physician group at the summit meeting, but that report provides a more detailed breakdown of discrete time measures compared with ENA’s.
Both provide similar definitions for ED arrival time, ED “offload” time, and ED length-of-stay. But the Intermountain summit group and the EDBA report add definitions for discrete activities during a patient’s stay in the ED, such as provider contact time or ED consultation interval. Other metrics are specifically quantifiable as well, such as the number of patients who leave the ED before being seen, before complete treatment, or against medical advice. Complaint ratios can be calculated and units of emergency service measured, such as looking at the rates of specific imaging studies per 100 ED visits.
Some of the definitions by ENA seem relatively unchanged from previous versions, and AnnMarie Papa, DNP, RN, the president of ENA, explained that the group wanted to avoid “reinventing the wheel.”
On the other hand, nine leading associations for emergency medicine, among them ACEP, the National Association of EMS Physicians, and the American Academy of Emergency Medicine, support ENA’s development of ED metrics, and subsequently, the benchmarks to underpin them. “CMS is well aware of this,” she added. “I am hoping our document can help them.”
In contrast, the Urgent Matters Learning Network, which includes six hospitals engaged in best practice strategies to see how much difference that can make in outcome, is an “improvement collaborative,” Dr. Davidson explained. The network has measures that may seem similar, he said, but they are aimed at enhancing quality of care, which complements the summit group’s efforts.
Such improvements are crucially needed because differences in outcomes among EDs have become a matter of public record. Why does a typical Medicare patient have the best chance of surviving emergency hospitalization in Phoenix, Milwaukee, or Cincinnati? A 2011 study by the consumer-oriented HealthGrades online information service, which included data from 5,000 hospitals nationwide, found that medical centers in these cities were top performers in providing care, giving patients the best opportunity for survival. Patients treated at one of these EDs are nearly 40 percent less likely to die during emergency hospitalization, according to the Denver-based HealthGrades survey.
Sepsis and pneumonia represent treatment groups where the most lives could be saved by closing the quality chasm, the study showed. Diabetic acidosis, COPD, pneumonia, and acute myocardial infarction also were evaluated to determine the variability of care and how much it affected morbidity and mortality. Measures for those conditions weren’t as definitive, but geographic trends in care were seen among them, too.
Standardization of ED care, like that being done in the Urgent Matters project, isn’t just good for patients; it can have a dramatic effect on the bottom line as well. When North Mississippi Medical Center in Tupelo instituted metrics for wound care, protocol usage became transparent and could be evaluated for outliers. One result: the number of similar wound care products was substantially reduced, and clinical champions of the effort were identified who helped support the move to standardization. Healing rates went up, and supply expenses dropped by $300,000. (Healthc Financ Manage 2011;65:70.)
Until such advances can be implemented widely, the negative outlook for nonprofit hospitals is destined to worsen, with EDs one reason for the decline, according to Moody’s Investors Service. Patient volumes for elective procedures will decrease this year, and hospitals will take on more uninsured care. (Moody’s Not-for-Profit Healthcare Sector Report, Feb. 3, 2011.) Historic lows in health care spending are expected to continue, in fact, until the Patient Protection and Affordable Care Act begins in 2014. (Health Aff 2011;30:1594.)
Findings from a working group on ED management of acute heart failure suggest a similar outcome. A group of emergency physicians and cardiologists convened by the National Heart, Lung, and Blood Institute found not only that better methods of early detection and monitoring of acute heart failure are needed, but also that time-critical interventions and quicker diagnostic confirmations reduce morbidity and save money. (J Am Coll Cardiol 2010;56:343.)
Use of standardized techniques as a mark of excellence is nothing new. It started at nearly the same time as recorded medicine. In dressing a wound, for example, one medieval text advised that the cut be “bandaged over completely so that the poultice cannot be removed from the place in which it should be,” and then called for “renewal,” a daily check to see that the wrap remained that way until healed. (A Medieval Surgical Pharmacopeia and Formulary. New York: Xlibris Corp.; 1999.) When such advisories weren’t carefully followed and therapeutic failure occurred, the reputation of the practitioner was damaged as a result.
This form of metric underwent a renaissance — literally. Measures of natural derivatives from that era — lists of plant ingredients, for instance, and how to combine them — became incorporated into health books for home use. (Prospecting for Drugs in Ancient and Medieval European Texts: A Scientific Approach. Amsterdam: Harwood Publishing; 1996.)
Some of these antiquated metrics were largely lost to time. An attempt 40 years ago to obtain approval from the U.S. Food and Drug Administration for clinical use of cantharidin, an ancient wart remedy made from a beetle secretion that induces antiviral blisters, was allegedly thwarted by the fact that old-time protocols were never refined from their basic origins and carefully reapplied. Thirty years later, a pair of dermatologists wrote what they called a “blistering defense” of the centuries-old practice, inciting new interest. (Arch Dermatol 2001;137:1357.) Currently, the chemical may be used on warts resistant to other treatments, though the validation needed to make beetle juice a first-line choice is still missing.
Conversely, some tests and treatments are so skillfully marketed that they work their way into standard practice despite inadequate evidence of their effectiveness or an imperfect understanding of their risks, according to Arthur Kellermann, MD, an emergency physician and the director of RAND Health in Santa Monica, CA. (Evidence-Based Emergency Medicine. Oxford: Wiley-Blackwell; 2008.)
“No one can predict which test or treatment in routine use today will join Ewald tubes, corticosteroids for head trauma, intravenous aminophylline, and military antishock trousers in the dustbin of emergency medicine history,” he pointed out. No one, that is, except scientists who have studied it. Finding and analyzing available evidence on any clinical question yields the best answers, he stated. — Anne Scheck
• Read the initial EDBA metrics publication by Shari Welch, MD, James Augustine, MD, Carlos Camargo, MD, et al, at http://bit.ly/EDsummit.
• A more thorough listing of new definitions is available in the Emergency Department Operations Dictionary: Results of the Second Performance Measures and Benchmarking Summit, an abstract of which is available at http://bit.ly/EDdictionary.
• The five articles in the Academic Emergency Medicine series, “Assuring Quality in Emergency Care,” are free at http://bit.ly/EMquality.
• Read the HealthGrades 2011 Emergency Medicine Study at http://bit.ly/US-EDs.
• Comments about this article? Write to EMN at firstname.lastname@example.org.