Value-based care models are becoming increasingly complex. The shift requires more sophisticated data, analytics and payment structures, and that complexity is increasing the analytics burden -- particularly for payers.
Now, payers have to integrate electronic health records and claims data, determine data governance, aggregate data from disparate silos and pinpoint reliable metrics to measure progress.
They also need to be able to report this data in ways that foster trust between organizations in value-based contracts. Only in this way can they promote the success of value-based care, not only for themselves but also for providers and the federal government.
But according to Dr. Emad Rizk, CEO of healthcare analytics company Cotiviti -- which has a business stake in the healthcare data situation -- many payers are finding this burden difficult to navigate. Most lack the processes and general know-how to predict the cost of care, create new quality metrics and overcome interoperability challenges. These strategies are crucial to the implementation of value-based care.
QUALITY AND OUTCOMES
Rizk said that two important questions are rarely asked, but should be: What is the risk burden of the individual? And what is the risk outcome?
By way of example, Rizk shared the story of his mother, 81, who recently experienced spinal issues and underwent a fusion of her vertebrae.
"If you did this with someone who is 40 years old, and they're ambulating and everything is fine, it's different than if you do it with someone who is 82 and has diabetes," said Rizk. "You have to bring in data that's identified retrospectively. What is the disease burden of the person you're intervening in? You have to look at retrospective claims, clinical data. You have to look at pharmaceutical data to make sure you understand the pharmaceutical burden he or she has, and then you have to put together what you want the outcome to be.
"A spinal surgeon just wants to do the surgery and be done with it," he said. "But that's not it. You still want the person to have controlled hypertension, controlled pain, no infections, and you have to manage that and they're all in different databases."
So how does one bring those disparate databases together? It takes a single patient identifier, in which a patient's every visit and every surgical procedure, for years, is tied to that person in a longitudinal database. That's particularly important given the siloed nature of most healthcare data. A hospital likely won't have access to retrospective claims. Payers will have retrospective claims, but they won't have clinical outcomes.
Rizk's prescription for this dilemma is to put all of the data into a data lake, a repository that has the requisite patient identifiers. Every time the patient goes to a doctor, the results and outcomes "hydrate" the data lake.
It takes an investment in data and technology to make it happen, but Rizk said it's necessary.
"This is the only way you're ever going to get true value-based care," he said.
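The data-lake approach Rizk describes can be illustrated with a minimal sketch. This is not any vendor's actual implementation; the identifier format, source names and functions below are hypothetical, and a real system would sit on a proper data platform with privacy controls. The point is simply how a single patient identifier lets records from siloed sources accumulate into one longitudinal history:

```python
from collections import defaultdict
from datetime import date

# Hypothetical data lake: records from disparate silos (claims,
# clinical, pharmacy) are keyed by a single patient identifier
# so they accumulate into one longitudinal record.
data_lake = defaultdict(list)

def hydrate(patient_id, source, event_date, detail):
    """Add a record from any source system to the patient's history."""
    data_lake[patient_id].append({
        "source": source,     # e.g. "claims", "clinical", "pharmacy"
        "date": event_date,
        "detail": detail,
    })

def longitudinal_record(patient_id):
    """Return the patient's full history in chronological order."""
    return sorted(data_lake[patient_id], key=lambda r: r["date"])

# Every encounter, regardless of which silo produced it, hydrates the lake.
hydrate("PT-0001", "pharmacy", date(2023, 2, 20), "metformin refill")
hydrate("PT-0001", "claims", date(2023, 3, 1), "spinal fusion claim")
hydrate("PT-0001", "clinical", date(2023, 3, 15), "post-op follow-up")

history = longitudinal_record("PT-0001")
print([r["source"] for r in history])  # → ['pharmacy', 'claims', 'clinical']
```

Because every record carries the same identifier, a payer's retrospective claims and a hospital's clinical outcomes land in the same timeline rather than in separate databases -- which is the interoperability gap Rizk is pointing at.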
Quality measures are based largely on utilization, which typically examines whether the patient has been readmitted within 30 days of leaving the hospital. Rizk isn't a fan of this mindset, contending it's not a true quality metric.
"As a patient develops an infection, and goes and gets antibiotics and goes back into the hospital, I don't believe that's a quality metric," said Rizk. "All of that stuff should be less than 1%. What I believe about quality is that if you have a patient who needs a new knee replacement, how much are you managing their sugar when you have them on steroids, because surgery raises your sugar?
"They don't manage that because they don't know if the patient is a diabetic or not," he said. "Or they tell a patient prior to a procedure, 'Stop all your medicine.' Well if you have thyroid disease, you could go into some sort of toxic hyperthyroidism. People right now don't have the standard information, post-hospitalization."
There are other, similar hypothetical situations -- take CT scans, for example. There's a relatively high percentage of people who are allergic to the dye used in CT scans and could have a negative reaction. A provider armed with this information could give a patient an antihistamine prior to giving them the dye. This becomes more important as a patient ages; older patients tend to have more comorbidities.
Luckily, Rizk feels that healthcare has gotten over the "data hump" over the past three to four years. The industry, in his view, is now on the other side of the hill and finding the road a bit easier, thanks to factors such as near-universal physician use of digitized EHRs and broader adoption of data technology generally.
"I believe we're at this pivotal point where every single area in the delivery system has been automated, and been technologically enabled," said Rizk. "The next level is to create this interoperability. This is the one area where the government has tried to help, but not pushed it as much as they should. You have three or four companies that, between them, have 80 percent of the market. We should be able to force them, from a regulatory perspective, to share data. It's not going to hurt their model at all. It's going to help patients."