It needs to be stated up front that the hospital industry is not happy with the current quality measurements. However, the vast majority of these measures were derived from, or had extensive input from, the health-care industry. The industry criticizes these measures as not being accurate. Of course, the exception is when a facility scores No. 1; then there seems to be no limit to marketing the result.
A new star ranking system was introduced by the Centers for Medicare & Medicaid Services (CMS) in July, representing a composite scoring system of 64 quality measurements. This system has been applauded by consumers but criticized by the industry and many in Congress. However, the measurements have been subjected to rigorous scientific review and risk adjustment, along with extensive pre-release comment and review by the industry.
The largest area of concern appears to be the use of patient safety indicators (PSIs), which measure quality and are drawn from a facility’s billing submissions. The industry says they are not accurate. However, they are derived from patient chart audits by coders, a methodology similar to that used by the Agency for Healthcare Research and Quality, whose quality measurements are highly touted by the industry. The hospital bill is just the submission form.
Scores on accreditation from The Joint Commission are also touted as the measurement of choice for quality. However, this accreditation organization was forged out of the health-care industry and all too often gives a gold star of approval. The accuracy of its accreditation process has also been brought into question by a recent CMS report, which found for fiscal year 2014 (latest data available) a 42 percent “disparity rate” for The Joint Commission surveys. In other words, in 42 percent of cases the surveys did not match CMS’s own findings on compliance. By law, the compliance rate needs to be 80 percent or greater.
Thus, this year we are not presenting data derived from accreditation processes or a multitude of different CMS metrics. Instead, we are focusing on the following four quality ranking systems:
▪ The CMS, whose star system is published on Hospital Compare and is a composite of 64 quality measurements.
▪ The Leapfrog Group, a business-purchasing alliance to which many Fortune 500 businesses belong, which primarily measures safety in health-care delivery. Its home page stresses that hospitalized patients have a 1 in 25 chance of acquiring a new infection.
▪ Consumer Reports, a consumer-oriented nonprofit and publisher, which primarily measures safety in health-care delivery using a “composite of five key measures of patient safety: readmissions, complications, communication, overuse of CT scans and infections.”
▪ U.S. News & World Report, which uses a variety of measures, including reputation, to determine quality. This score reflects not only safety but also the ability to handle complex problems. Thus, larger hospitals and educational institutions often do better with this ranking system. The University of Louisville exemplifies this dichotomy (see Table).
We have also included resources on hospital complaint investigation (inspection) reports performed by the Centers for Medicare and Medicaid Services.
So how did the hospitals do? In the Lexington region, hospitals overall have improved and are doing rather well. St. Joseph and St. Joseph East both showed marked improvement.
Overall, these safety-ranking systems measure different aspects of care and will give different results. However, if a hospital performs in the lower tier of several systems, one needs to take notice. This appears to have happened with several facilities in Louisville, which had dismal results.
The University of Louisville continues to struggle and shares with Jewish/St. Mary’s Healthcare a “D” rating from the Leapfrog Group, along with a low Consumer Reports safety score. They were also awarded only one and two stars (out of five), respectively, by CMS. Concerns regarding U of L were mentioned in our last report on hospital quality and were a harbinger for the string of stories in The Courier-Journal regarding unsafe conditions at the institution.
This is a testament to the accuracy and usefulness of these quality-ranking systems.
Another observation is that smaller, non-academic institutions often do better on safety scores than universities and major medical centers. This should be expected. Major teaching hospitals deliver a different quality of services. Resident and medical education makes for a more thorough analysis of complex problems but also makes it easier to mess up simple things, since those may be delegated to less experienced individuals.
Long sleepless hours and draconian cutbacks in the funding of universities in Kentucky make matters worse. Residents are too often up all night and their teachers are juggling research, teaching, patient care and generating funds for university support.
After the recent national election, I suspect our current health care may shift from a highly government-regulated to a consumer-driven system. However, the latter cannot take place unless the publicly available hospital data on finance and quality are improved, verified and made timely.
Only then can a consumer make truly informed choices.
Kevin Kavanagh of Somerset is a retired physician and board chairman of Health Watch USA.
How Kentucky hospitals rate on quality, safety measures
Hospital performance data as of Dec. 1, 2016 (1)
Hospital Investigative Reports 2014 to 2015: Number of Inspections (Total Violations) (6)
▪ Baptist Health, Lexington: No. 3 (a tie) (8)
▪ St. Joseph Hospital, Lexington
▪ St. Joseph East
▪ University of Kentucky: No. 1 (8)
▪ Pikeville Medical Center
▪ St. Joseph - Mt. Sterling
▪ St. Elizabeth Florence
▪ St. Elizabeth Med. Ctr. (Edgewood) (*): No. 3 (a tie) (7)
▪ St. Elizabeth Fort Thomas
▪ Norton Hospital Healthcare (**): No. 5 (6)
▪ Norton Audubon Hospital
▪ Norton Woman’s and Children’s
▪ Norton Brownsboro Hospital
▪ Jewish/St. Mary’s Healthcare (***)
▪ Baptist Health, Louisville: No. 2 (9)
▪ University of Louisville Hospital
(1) Most data are more than 9 months old, some well over 1 year.
(2) The Leapfrog Hospital Safety Grade (sm) scores hospitals on data related to patient safety.
(3) Consumer Reports, Hospital Safety Data - Higher is better - Scale 1 to 100
(4) CMS Composite Star Rating Score. Scale is 1 to 5 stars. These data are from Hospital Compare. https://www.medicare.gov/hospitalcompare/search.html?/
(5) Five meet high U.S. News standards and are ranked in the state.
(6) Hospital Inspection (Investigative) Reports listings may be incomplete.
(*) Listed As: St. Elizabeth Healthcare - St. Elizabeth Medical Center North - St. Elizabeth Edgewood
(**) Listed as Norton Hospitals, or Norton Healthcare
(***) Listed as Jewish Hospital, Jewish Hospital & St. Mary’s Healthcare or St. Mary’s & Elizabeth Hospital;
Consumer Reports combines Jewish and St. Mary’s Healthcare; CMS combines the data from Jewish and St. Mary’s Healthcare;
The Leapfrog Group (sm) surveys the hospitals separately and reports a Safety Score for both Jewish and St. Mary’s & Elizabeth Hospital
U.S. News & World Report’s Best Hospitals; Copyright © 2016 U.S. News & World Report, L.P. Data reprinted with permission from U.S. News.