The nation’s third-largest nonprofit healthcare system realized double-digit improvement in several key quality and safety measures in just 12 months by starting with its data.
Catholic Health Initiatives (CHI), based in Englewood, CO, has 100 hospitals operating in 17 states and has seen significant improvements in several quality measures. For example, catheter-associated urinary tract infections (CAUTIs) were reduced 38% in 2016 and an additional 19% in 2017, while pressure ulcers were reduced 22% in 2016 and 13% in 2017. Postoperative hip fractures fell 41% and 33% in those respective years.
Those results came after CHI leaders realized the problem was not a lack of data, but what they were doing — or, rather, not doing — with it, says Jim Reichert, MD, PhD, CHI’s vice president of analytics and transformation.
“We were bringing in a lot of data from our different facilities across the country into our data warehouse, but we really weren’t putting the data together in a way that was useful for quality improvement,” Reichert says. “The markets would come forward each quarter and share their own quality and safety performance measures with national, but national couldn’t compare one hospital with another or see if any one hospital was improving on a particular opportunity over time.”
An additional problem was that the national office would create new quality improvement goals every year, but local facilities would balk at some of them, saying they were already performing well on those measures and shouldn’t waste resources on marginal improvements.
“So we set up standard reporting across the enterprise, making it transparent so that everyone could see everyone else’s results,” Reichert says. “We also realized that we needed risk-adjusted data and standard definitions for the metrics that would be used for all of the measures. We could create percentile ranks that in near-real time could let the facilities know how they were performing on their mortality measures, infections, patient experience, and patient safety events.”
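The percentile-rank idea Reichert describes can be pictured with a short sketch. Everything below is hypothetical for illustration: the facility names, the CAUTI rates, and the simple rank formula are all invented, and a real system would first apply the risk adjustment and standard definitions he mentions.

```python
# Hypothetical illustration of ranking facilities on a quality measure.
# Facility names and CAUTI rates are invented; a real system would use
# risk-adjusted rates computed under standard metric definitions.

def percentile_rank(value, population, lower_is_better=True):
    """Percent of facilities this facility performs at least as well as."""
    if lower_is_better:
        better_or_equal = sum(1 for v in population if value <= v)
    else:
        better_or_equal = sum(1 for v in population if value >= v)
    return round(100 * better_or_equal / len(population))

# Invented CAUTI rates per 1,000 catheter days for four facilities.
cauti_rates = {"Facility A": 1.2, "Facility B": 0.8,
               "Facility C": 2.1, "Facility D": 0.8}

ranks = {name: percentile_rank(rate, list(cauti_rates.values()))
         for name, rate in cauti_rates.items()}
```

With figures like these refreshed on each data load, every facility can see in near-real time where it stands against its peers on the same measure.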
At around the same time, CHI also was implementing a strategy for focusing on “Living Our Mission” measures, nine areas that the healthcare system would use to measure success among all its facilities. Those nine areas are service to the poor and vulnerable, employee engagement, physician satisfaction, quality, patient experience, safety, growth, transformation, and operating earnings before interest, depreciation, and amortization.
“These were nine measures from the board down that every facility in our organization would be measured against,” Reichert explains.
Of those nine, the analytics group was tasked with three: quality, safety, and patient experience. Those are composite measures, so quality consisted of mortality and hospital-acquired infections; safety included the Patient Safety and Adverse Events Composite known as PSI 90; and patient experience was derived from Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) scores.
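A composite measure of this kind can be thought of as a weighted roll-up of its component scores. The sketch below is a simplified illustration only: the component names, the 0-100 scale, and the equal weighting are assumptions, and the actual PSI 90 and HCAHPS methodologies are considerably more involved.

```python
# Hypothetical composite measure: a weighted average of component scores,
# each assumed to be pre-normalized to a 0-100 scale (higher is better).
# Component names and equal default weights are invented for illustration.

def composite_score(components, weights=None):
    """Weighted average of named component scores."""
    if weights is None:
        weights = {name: 1.0 for name in components}
    total_weight = sum(weights[name] for name in components)
    return sum(components[name] * weights[name]
               for name in components) / total_weight

# Invented component scores for a single facility's quality composite.
quality = composite_score({"mortality": 72.0,
                           "hospital_acquired_infections": 64.0})
```

Reporting one composite number per domain is what lets a board-level scorecard track quality, safety, and patient experience side by side across every facility.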
“We would give monthly reports on how each facility was doing with each measure and how that mapped out to the division, and we also distributed a quarterly report that the board would use,” Reichert says. “The goal was to align the board with senior-level management and the facilities to actually get alignment around these measures of interest and make steady, gradual improvements in care.”
CHI also committed to a longer focus on quality improvement with a three-year plan rather than introducing new goals each year.
“I think that was significant because there are always early adopters who move the needle quite soon, the middling adopters, and the later adopters who are slower to change,” Reichert says. “The three-year change allowed everyone to get on the bus and make movements in the right direction before we turned the tables on them and gave them a whole new set of priorities.”
One challenge involved transparency. Everyone involved supported the idea of transparency, but achieving that was difficult when there was little agreement over the sources and meaning of data, Reichert says.
“There are people in each facility or market who are the analytic folks, the data domain experts for their market, so if you give a report to the leaders in that market they are going to take it to their own experts and ask if it is correct. We had to do a top-down and a bottom-up approach where we establish relationships with those folks,” Reichert explains. “We provided a ton of education and communication to build those relationships between national analytic resources and the local markets so we could get on the same page.”
Reichert says a key component of quality improvement for a large organization is having a single source of truth for quality data, and CHI relied on several vendors, including SAS in Cary, NC, to provide that basis from which to work. It also is important to use benchmarks from outside the organization rather than assuming your own healthcare system is large enough to serve as its own reference point for quality measures.
Reichert also advises quality improvement leaders to modernize their data management as much as possible.
“A lot of organizations still do a significant amount of quality work in Excel and manual spreadsheets, and the sooner you get out of that the better. Centralizing and automating the data processing will make it more likely for you to move forward with quality improvement because you’re not using so many resources for just data wrangling and data prep,” Reichert says.
“You can put those resources toward care improvement, which is where it really needs to be.”
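Reichert's point about moving off manual spreadsheets amounts to centralizing the repetitive cleanup in code so it runs the same way every month. A minimal sketch of that kind of wrangling, assuming per-facility CSV extracts whose layout and field names are entirely invented here:

```python
# Hypothetical: consolidate per-facility CSV extracts into one cleaned
# dataset -- the wrangling otherwise repeated by hand in Excel each month.
import csv
import io

def load_facility_csv(text, facility):
    """Parse one facility's extract, normalizing names and types."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        rows.append({
            "facility": facility,
            "measure": row["Measure"].strip().lower(),  # unify spelling
            "rate": float(row["Rate"]),                 # enforce numeric type
        })
    return rows

# Invented extracts with the inconsistent formatting typical of manual files.
extract_a = "Measure,Rate\nCAUTI ,1.2\nPressure Ulcer,0.9\n"
extract_b = "Measure,Rate\ncauti,0.8\n"

combined = (load_facility_csv(extract_a, "Facility A")
            + load_facility_csv(extract_b, "Facility B"))
```

Once steps like these are automated, the analysts who used to rebuild spreadsheets can spend their time on the care improvement work itself.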
Financial Disclosure: Author Greg Freeman, Editor Jesse Saffron, Editor Jill Drachenberg, Nurse Planner Amy M. Johnson, Editorial Group Manager Terrey L. Hatcher, and Consulting Editor Patrice Spath report no consultant, stockholder, speaker’s bureau, research, or other financial relationships with companies having ties to this field of study.