By Jennifer Bresnick, Health IT Analytics | March 4, 2019

Healthcare stakeholders are broadly on board with artificial intelligence, but the industry will have to overcome its data aggregation challenges in order to succeed.

It hasn’t taken very long for artificial intelligence to become the most popular topic of conversation in the world of healthcare data analytics.

In what seems like an instant, AI has started to support incredible advances in imaging analytics, clinical decision support, operational efficiencies, and patient engagement.

Algorithms can now identify anomalies, make suggestions, and solve problems at rates that are comparable to human accuracy – in some cases, sophisticated deep learning tools and neural networks are surpassing the abilities of mere mortal clinicians.

At the annual HIMSS Conference and Exhibition, AI appears to have moved from uncertain novelty to absolute necessity in the blink of an eye.

HIMSS serves as one of the best opportunities to take the industry’s technological temperature, and AI has definitely caught fire over the past twelve months.

The 2018 edition of the conference saw EHR vendors and big data experts introducing AI to a skeptical audience still coping with the unintended negative impacts of previous so-called innovations on their decision-making processes.

Vendors tried every trick in the book to allay fears of robot doctors, mass layoffs, and the mechanization of the patient-provider relationship, positioning AI as the engine that will eliminate waste and drive continuous improvement rather than being the harbinger of professional doom.

Since then, major solutions providers have joined regulatory agencies in promoting AI as both a competitive advantage and an inevitability, urging organizations to accept machine learning as a fact of life.

They have largely succeeded in this task.

At HIMSS18, a vendor could attract a very large, if cautiously curious, crowd to its booth just by saying a new product included machine learning.

In 2019, attendees were over their fascination with fancy math.

Instead, they wanted proof that an investment in AI could measurably move them along the analytics maturity curve from retrospective descriptive analytics into the realm of prescriptive insights: the ability to forestall future events by using data to suggest actionable interventions in a timely manner.

The shift in public opinion is a positive development for vendors who are eager to help providers get to grips with machine learning.

But while the battle over the perception of AI may be winding down, the industry has another big problem on its hands before AI can become operational: accessing the enormous volumes of clean, complete, timely data required to train, validate, and deploy AI for use in the real-world environment.

Creating a fluid, accessible data aggregation environment in which AI can flourish means overhauling everything from basic infrastructure design to the business case for sharing information.

Organizations may be more willing to take on the challenge as AI starts to prove its value, but the industry still has a lot of work to do before provider groups can consistently access the prescriptive insights that are critical for achieving their long-term goals.

PUTTING TOGETHER THE BIG DATA PUZZLE PIECES

Healthcare organizations have entered a pivotal year for artificial intelligence and advanced analytics, says Mark Morsch, Vice President of Technology at Optum, which works with both payers and providers to generate data-driven insights.

“We do see 2019 as a significant year for adoption and for the continuing maturity of AI,” he said.  “It’s progressing very quickly, especially in the realms of deep learning and natural language processing (NLP).”

“I certainly see more organizations moving from the diagnostic and descriptive level up to the more predictive and prescriptive levels – that is very exciting. People are starting to appreciate the challenges and the opportunities of putting AI into practice.”

At EHR giant Epic Systems, Seth Hain, Director of Analytics and Machine Learning, has been observing a similar progression.

“The hype is dying down and folks are really starting to move beyond just getting their arms around AI,” he told HealthITAnalytics.com.  “Many, many of our customers are starting to build the teams and processes that integrate machine learning into the standard way they do business.”

Seth Hain, Director of AI/ML at Epic Systems (Source: Xtelligent Media)

“It’s been nice to see that shift and it’s great to be able to start getting deeper into optimizing clinical workflows and gaining operational efficiencies with machine learning as part of the toolset.”

Advances in computing power are allowing data scientists to refine their approaches and expand upon the well-established foundations of machine learning, added Bharat Rao, PhD, Principal in KPMG’s Advisory Services practice.

“AI has been competent at tasks like imaging analytics for more than 20 years, and it has been doing that very well,” said Rao, who holds a doctorate in machine learning.

“The data science isn’t really new – it’s important to know that we do have experience with the technology and we trust it to work.  The technology isn’t really the problem.”

“The problem has been healthcare’s readiness to do this, and we are certainly seeing that change.  The industry has a fairly good grasp on the descriptive side of things, but the sweeping vision for AI is to unlock the value in unstructured data and merge it with structured information to get predictive instead of retrospective.  The problem is getting all the data in one place to make it happen.”

Data silos are everywhere in healthcare, and breaking down the barriers to data access while still adhering to privacy and security principles is an immense challenge.

“The technology isn’t really the problem…the problem is getting all the data in one place to make it happen.”

Interoperability has been a major theme at HIMSS conventions for more than half a decade, although in recent years, interest has turned away from the plumbing and more towards what organizations can do with their data.

But the topic popped back up to the surface in 2019 as CMS released its sweeping roadmap for combating purposeful information blocking and encouraging broader access to data for patient engagement, population health management, and quality improvement.

Vendors were generally in favor of the new rules of the road, not least because they understand that an open data exchange environment is vital for the success of AI.

David Dimond, CTO at Dell EMC Global Healthcare (Source: Xtelligent Media)

The ability to build longitudinal patient records without gaps is crucial for creating AI models that are accurate and actionable, pointed out David Dimond, Chief Technology Officer and Distinguished Engineer at Dell EMC’s Global Healthcare Business.

“Analyzing a particular set of data with AI isn’t that useful unless you can build a health record around it and develop a longitudinal view of that individual’s health,” Dimond said. “How do you get a comprehensive picture of a person when you don’t have all the data in one place?  How useful will your AI be if you’ve got holes in your record?”

“Data aggregation is going to have to be a top priority for organizations right now,” he stressed.

“There is simply too much data in too many different places for us to employ AI at scale, and organizations don’t have the strategies that they will need to bring all that data together for the type of work they are trying to accomplish.”

ADDRESSING THE COMPLEXITY OF A FRAGMENTED DATA WORLD

Consternation over data fragmentation in healthcare is nothing new.  Organizations have known for many years that they need to improve their storage and warehousing architecture if they want to succeed in the big data environment.

In 2017, a survey from Accenture found that more than 80 percent of healthcare executives had started working on creating centralized data platforms to underpin their artificial intelligence plans.

Eighty-two percent believed that a good leader in the modern health IT landscape will be defined by how well they architect seamless, interoperable environments for conducting large-scale analytics.

But as with most health IT initiatives, solving the problem is easier said than done.  Everything appears to be working against the average health system.

Bharat Rao, PhD, Principal at KPMG (Source: Xtelligent Media)

Lingering legacy systems, regrettable EHR implementation decisions, a lack of time and money to tackle data integrity from the bottom up, and the proliferation of potential data sources are all compounding the difficulty of developing those silo-free ecosystems.

“We have dozens of different clinical-grade and consumer-grade devices and apps and tools producing data that could be very valuable, but many of these devices send their data to different places,” said Dimond.

“Sometimes you know where the data is, but most of the time you don’t, because it’s now living in its own unknown, proprietary cloud somewhere.”

“We keep talking about how great it will be for consumer-grade devices and Internet of Things devices to contribute to our knowledge of what happens outside of the clinic, but we can’t access those insights if we don’t know where the data is or what we need to do in order to get at it.”

Even when data is known and theoretically accessible, the very nature of the beast can make it appear unsuitable for the type of advanced analytics most organizations are aiming for.

“Sometimes you know where the data is, but most of the time you don’t.”

“The majority of data in medical records is unstructured, especially if you’re looking at faxes and the output of chart retrieval, which can be pretty messy,” said Morsch from Optum.  “We call it the ‘paper towel roll’ – it’s just this huge ream of data that might include every document ever produced for a patient.”

“There are layers of machine learning technology that need to be employed just to get the data into a decent shape for secondary use.  First you have to structure it, so you have to run optical character recognition (OCR) algorithms to turn an image of a PDF, for example, into computable text.  Then you have to use NLP to extract meaningful elements from that or identify patterns you might be looking for.”

It can be challenging for organizations to do that first-layer work on unstructured data by themselves, which is preventing many providers from progressing any further with their AI initiatives, he continued.
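
To make that first layer concrete, here is a minimal sketch of the kind of two-step pipeline Morsch describes, assuming pytesseract for OCR and spaCy for entity extraction; both libraries, and the file name, are illustrative stand-ins rather than Optum's actual stack.

```python
# A minimal sketch of the two-layer pipeline described above: OCR to turn
# a scanned chart image into computable text, then NLP to pull candidate
# entities out of that text. pytesseract and spaCy are illustrative
# stand-ins for whatever OCR/NLP stack an organization actually uses.
from PIL import Image          # pip install pillow
import pytesseract             # pip install pytesseract (requires Tesseract)
import spacy                   # pip install spacy; python -m spacy download en_core_web_sm


def ocr_page(image_path: str) -> str:
    """Layer 1: optical character recognition on a scanned chart page."""
    return pytesseract.image_to_string(Image.open(image_path))


def extract_entities(text: str, nlp) -> list[tuple[str, str]]:
    """Layer 2: named-entity extraction over the recognized text."""
    doc = nlp(text)
    return [(ent.text, ent.label_) for ent in doc.ents]


if __name__ == "__main__":
    nlp = spacy.load("en_core_web_sm")  # a clinical NLP model would fit better in practice
    raw_text = ocr_page("chart_page_001.png")  # hypothetical scanned fax page
    for entity, label in extract_entities(raw_text, nlp):
        print(f"{label:12} {entity}")
```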

“The ability to bridge the gaps like that – and then do something meaningful with the results – will certainly be a differentiator for technology providers, and it should be something that customers consider when planning out their projects.”

For Optum, pairing NLP with deep learning is helping users of its Case Advisor software to streamline the process of chart review.

“There is so much data to review that providers have to be very selective in what they choose to look at,” explained Morsch.  “That doesn’t give an organization a comprehensive picture of their clinical mix or any revenue they might be leaving on the table.”

“Deep learning can essentially review every case for you, then it can identify certain cases that may be best for additional review by a human expert. It supports appropriate automation and it can transform the process of reviewing cases for coding and billing.  But first you have to have the data.”

Hain agrees that organizations will likely need a strong technology partner to help them move forward into leveraging AI for measurable results.

“One of the key challenges of thoughtful machine learning is that the skillset isn’t fully there,” he remarked.  “This isn’t a role that organizations have been traditionally equipped to take on.  So using our experience to help them learn how to build out effective teams is something that we are focusing on doing.”

Epic is continuing to build out its library of machine learning models that providers can adopt to meet their specific needs, Hain said, reducing the need to have extensive data science resources in house.

“There is a need to understand the current workflow, identify the opportunities to make it better, and then start applying machine learning to those issues,” he said.

“But in order for them to succeed, they have to put a practical frame around it.  You can’t just have AI for the sake of AI.  When it’s producing measurable good for the end-user, whether that’s a clinician or a patient, it gains acceptance and starts to make a difference downstream.”

“In order…to succeed, [you] have to put a practical frame around it.  You can’t just have AI for the sake of AI.”

Marrying a flexible, practical organizational strategy with appropriately aggregated data and AI algorithms can help organizations take advantage of the fundamental complexity of healthcare instead of being overwhelmed by it, says Larry Burnett, Principal in the Health & Government Solutions arm of KPMG.

“Machine learning is letting us do some very exciting things because of the way it can integrate thousands of variables into a calculation,” he asserted.

Larry Burnett, Principal, KPMG (Source: Xtelligent Media)

“Scheduling, for example, is an excellent application.  There are plenty of patients who stay over the weekend at the hospital simply because they couldn’t get into the catheterization lab on Friday, or couldn’t get an echocardiogram because they were overscheduled.”

“If a physician is late or cancels an appointment, the patient becomes unstable and can’t be moved, the right equipment isn’t available, or there’s a shortage of nurses…any one of those things can lead to a significant delay.”

Machine learning is sophisticated enough to account for any or all of those variables, Burnett said, and it can offer useful suggestions for staffing, scheduling, or reallocating resources.

“Humans can’t optimize for those things as effectively as AI.  Artificial intelligence is very good at looking at all of the factors that could affect care and rapidly identifying the variability and the opportunities to tighten things up.”

“You definitely need all of that data in the right place to make it happen.  But once you have that aggregated information, the possibilities are enormous for using AI to reduce waste and ensure quality outcomes.”
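
As a rough illustration of the kind of model Burnett describes, the sketch below trains a gradient-boosted classifier to score scheduled procedures for delay risk. The features, synthetic data, and thresholds are all hypothetical stand-ins for the thousands of variables a production model would draw from an aggregated record.

```python
# A toy sketch of a delay-risk model for procedure scheduling: a
# gradient-boosted classifier scores each scheduled case so staff can
# flag the riskiest ones for intervention. All data here is synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5_000

# Hypothetical scheduling features: day of week, scheduled hour, cath lab
# utilization, nurse staffing ratio, physician double-booked flag.
X = np.column_stack([
    rng.integers(0, 7, n),        # day of week (0 = Monday)
    rng.integers(6, 20, n),       # scheduled hour of day
    rng.uniform(0, 1, n),         # cath lab utilization that day
    rng.uniform(0.5, 1.5, n),     # nurse staffing ratio
    rng.integers(0, 2, n),        # physician double-booked flag
])
# Synthetic label: delays more likely on a loaded Friday, plus noise.
y = (((X[:, 0] == 4) & (X[:, 2] > 0.7)) | (rng.uniform(0, 1, n) < 0.1)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Score the held-out schedule and surface the highest delay risks.
risk = model.predict_proba(X_test)[:, 1]
print("Top delay-risk scores:", np.sort(risk)[-5:])
```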

DEVELOPING THE TECHNICAL FOUNDATIONS FOR DATA AGGREGATION

For providers to get all their data ducks in a row, they will need to develop an infrastructure strategy that prioritizes aggregation while still protecting privacy and security.

Attendees of HIMSS18 were treated to nearly an entire week of focus on using cloud storage and computing to enable the next generation of data analytics, something that certainly carried through to the 2019 conference.

“I’ve been coming to HIMSS for 20 years, but it wasn’t until last year that I saw real excitement about the cloud for the first time,” said Rao from KPMG.  “This year, the interest was even greater, for several reasons.”

“Firstly, we trust the cloud in our daily lives for so many things, which is finally bleeding over into the healthcare realm.  And secondly, because organizations are truly looking to move into AI and they realize that the cloud is probably the best way to enable the data aggregation they need, they are willing to put their data there.”

Cloud service providers have been aggressively marketing their AI capabilities to the healthcare industry, and are offering more than just storage.

Pre-fabricated machine learning models and toolkits from companies such as Google and Amazon Web Services are enticing organizations to experiment with cloud at an accelerating pace and promoting trust and acceptance among lingering skeptics.

In fact, healthcare’s journey towards embracing the cloud is not all that dissimilar from its process of coming to terms with artificial intelligence.

Both started off relatively slowly while raising red flags for stakeholders around the use, security, and privacy of data, but both are quickly gaining steam to become the accepted norm.

Both cloud and AI demand a certain degree of faith in variables beyond the organization’s direct control.

And both require organizations to commit to a holistic strategy, not just to adopting a technology.  As a result, a cloud-based approach often becomes experts’ go-to suggestion for AI development.

“All your data needs to go somewhere.  Chances are, it’s not all going to go into the same place – that’s just how it is,” said Dimond.  “But you need a business strategy around what you’re doing so that even if data ends up in different locations, there is something to link it together so it’s usable.”

“It’s going to be pivotal for organizations to start considering a multi-cloud strategy if they want to position themselves for AI.  If you build that into your data aggregation and interoperability model from the beginning, you’re going to be in a much stronger place to bring together the volume and variety of data you need to support machine learning models.”

“You need a business strategy so that even if data ends up in different locations, there is something to link it together so it’s usable.”
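
A bare-bones illustration of that linking idea: as long as extracts landing in different clouds carry a shared key, such as a hypothetical enterprise master patient index, they can still be joined into a longitudinal view. The data and field names below are invented for the example.

```python
# A minimal illustration of Dimond's point: extracts that land in two
# different clouds stay usable when every record carries a shared linking
# key (here, a hypothetical enterprise master patient index, "empi_id").
import pandas as pd

# Hypothetical extract from cloud A (EHR encounters).
encounters = pd.DataFrame({
    "empi_id": ["P001", "P002", "P003"],
    "encounter_date": ["2019-02-01", "2019-02-03", "2019-02-07"],
    "diagnosis_code": ["I50.9", "E11.9", "J44.1"],
})

# Hypothetical extract from cloud B (remote monitoring devices).
device_readings = pd.DataFrame({
    "empi_id": ["P001", "P001", "P003"],
    "reading_date": ["2019-02-02", "2019-02-05", "2019-02-08"],
    "weight_kg": [92.1, 93.4, 78.0],
})

# The shared key is what makes a longitudinal, cross-cloud view possible.
longitudinal = encounters.merge(device_readings, on="empi_id", how="left")
print(longitudinal)
```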

While organizations can certainly be successful with aggregating their data on premise, Rao also believes that cloud is simply going to become the default for AI-enabled organizations in the near future.

“Anyone that wants to become a truly AI-driven organization is going to have to integrate cloud,” he stated.

“First of all, it’s more secure, despite what people think.  I’d rather trust a company that is putting billions of dollars into their security protocols than trust myself, that is for certain.”

“Secondly, capabilities are advancing far quicker than I could manage on my own.  Even if I could build a data center that is the equal of Amazon or Google or Microsoft, it would be outdated in six months.  In two years, it would be a full generation behind.  It is not my mission to keep up with that incredible pace of change – I don’t have the budget or the time for it.  They do.”

Cloud offers the ability for individual organizations to aggregate their data in a single location – or in more than one location easily linked together – but it also creates the opportunity for multiple organizations to collaborate on developing and validating machine learning models.

Mark Morsch, VP of Technology at Optum (Source: Xtelligent Media)

Academic consortia and industry partnerships are proliferating as organizations seek to unlock the value in multifaceted datasets from complex populations, with Microsoft, Google, and Amazon all releasing FHIR-based tools, APIs, and data sharing platforms to support collaboration across disparate systems in health and life sciences.
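
At the API level, that kind of FHIR-based collaboration boils down to standard RESTful searches against a FHIR server. The sketch below queries a public HAPI FHIR test server; the endpoint, patient identifier, and code are purely illustrative.

```python
# A small sketch of what "FHIR-based" data sharing looks like at the API
# level: a standard RESTful search against a FHIR R4 server. The base URL
# below is a public HAPI FHIR test server, used purely for illustration.
import requests

FHIR_BASE = "http://hapi.fhir.org/baseR4"  # illustrative public test server


def search_observations(patient_id: str, loinc_code: str) -> list[dict]:
    """Fetch Observation resources for one patient and one LOINC code."""
    resp = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": patient_id, "code": loinc_code, "_count": 10},
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()  # a FHIR Bundle resource
    return [entry["resource"] for entry in bundle.get("entry", [])]


if __name__ == "__main__":
    # "example" is a placeholder patient id; 8867-4 is the LOINC code for heart rate.
    for obs in search_observations("example", "8867-4"):
        print(obs.get("effectiveDateTime"), obs.get("valueQuantity", {}))
```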

For Epic Systems, aggregation in a single repository takes the form of the Epic-hosted Cosmos databank, part of the EHR company’s overarching vision for using machine learning to support evidence-based practice across the care continuum.

“Scale is very important for artificial intelligence,” said Hain.  “You do need a certain amount of volume to make sure you are capturing a balanced dataset and including enough elements to feed a model.”

“But you also have to make sure you are using the right data that is targeted and tailored to your own population.  You can’t do either of those if your data isn’t accessible and centralized.  There are plenty of choices about how to do that, but having access to the right data is obviously critical.”

In conjunction with the renewed pressure from CMS to share data openly and appropriately, cloud is becoming more attractive to organizations that want to meet regulatory mandates and prepare themselves for an AI-first approach to care delivery.

“The benefits certainly outweigh the risks,” observed Burnett.  “There are all kinds of opportunities out there for organizations that can aggregate their data and create the right environment for machine learning.  Cloud is an important tool in that process, and it’s something for organizations to strongly consider as they develop their roadmaps.”

Implementing the technical foundations for artificial intelligence, whether cloud-based or on premise, will be vital for allowing healthcare to continue its journey towards prescriptive insights and a higher level of analytics maturity.

“The possibilities of what we can do with machine learning are nearly endless,” said Morsch.  “And the timing is just right, because the amount of data we have right now for making decisions is simply too immense for any human to process, and it’s only going to keep growing.”

“It’s very exciting that we have AI to help providers with that.  It’s exciting, but it’s also a bit daunting.  I’m looking forward to seeing how the industry rises to the challenges of AI and supports these brilliant people as they deliver care.”