AIP.IQ demonstrates AI and ML capabilities in predictive health analytics hackathon
November 10, 2021
Federal agencies increasingly need to deliver analytical insights at speed and at scale. By adopting solutions that can process data faster and with fewer resources, agencies can make the most of their data and overcome the widening data science talent gap.
Consider federal health agencies: while we’ve seen promise and progress, there is still much to be done to translate large medical datasets into actionable, lifesaving insights. Institutional restrictions, such as privacy requirements and data silos, can make connecting the dots difficult. More broadly, though, this data is complex, and agencies may not have the skills, tools, or bandwidth to fully index and explore it. For example, electronic health records often suffer from issues, including inconsistent demographics for the same patient identifier and different versions of the same diagnosis (ICD) codes, that slow the path to insight.
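Record-level inconsistencies like these are straightforward to surface programmatically. The sketch below uses pandas with a hypothetical EHR extract and illustrative column names (patient_id, birth_year, dx_code); it is not the Explorys schema or AIP.IQ tooling, just an illustration of the two checks mentioned: conflicting demographics and mixed ICD code versions.

```python
import pandas as pd

# Hypothetical EHR extract; column names and values are illustrative,
# not drawn from the Explorys schema.
records = pd.DataFrame({
    "patient_id": ["p1", "p1", "p2", "p2"],
    "birth_year": [1970, 1971, 1985, 1985],          # p1 has conflicting demographics
    "dx_code":    ["401.9", "I10", "272.4", "E78.5"],  # mixed ICD-9 / ICD-10 codes
})

# Flag patient identifiers whose demographic fields disagree across rows.
conflicts = (
    records.groupby("patient_id")["birth_year"]
    .nunique()
    .loc[lambda n: n > 1]
)
print("Patients with inconsistent demographics:", list(conflicts.index))

# Normalize diagnosis codes to a single vocabulary via a crosswalk table.
# A real pipeline would load a full ICD-9-to-ICD-10 mapping (e.g., the GEMs).
icd9_to_icd10 = {"401.9": "I10", "272.4": "E78.5"}
records["dx_code_icd10"] = records["dx_code"].replace(icd9_to_icd10)
```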
With this in mind, our Applied Intelligence Discovery Lab hosted the Eagle Challenge hackathon to show how a platform-based approach, combined with human ingenuity, can make it easier to ingest, process, exploit, and relate big data, no matter the subject matter.
In this instance, we analyzed data for pressing healthcare issues, including drug overdose and chronic conditions such as high blood pressure, high cholesterol, and high blood sugar. Teams of artificial intelligence (AI) and machine learning (ML) experts took an iterative, experimental approach to analyzing the data with Accenture’s AIP.IQ platform. Working part-time over the course of only eight weeks, the teams identified key trends to help treat and predict these conditions. They left the hackathon with reusable models that can be immediately adapted and applied to a range of federal use cases.
The hackathon demonstrates a key opportunity for federal agencies: data analytics platforms like AIP.IQ can combine various data sources and feeds with rich data management and analytics tools from a variety of leading vendors in a FedRAMP-authorized, cloud-based environment. Because the platform works as an integrated tool set with predefined data templates, process mapping, and specialized algorithms, analysts can spend less time on basic configuration and more time testing and refining multiple hypotheses.
In the case of federal health agencies, this means fewer hours spent on data analysis and more spent using data-driven insights to deliver on the core of the mission: providing high-quality patient care.
The Eagle Challenge event tapped into IBM Explorys Electronic Health Record (EHR) data: de-identified, longitudinal, patient-level data for over 53 million unique patients. The hackathon broadened the aperture on the Explorys dataset, with experts using a range of tools and processes within AIP.IQ.
One team analyzing the risk of drug overdose built a scalable, adaptable Extract, Transform, and Load (ETL) process for the Explorys data and used it to develop a random forest model that identifies at-risk patients with 77% accuracy. They also built an accelerated failure time (AFT) model that can run against all available covariates, and the outcomes of its risk factor analyses can feed future feature extraction.
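Neither the team’s features nor their ETL code are public, but the modeling step they describe maps onto standard tooling. Below is a minimal sketch with synthetic stand-in covariates showing how such a random forest classifier might be trained and scored in scikit-learn; an AFT survival model (for example, lifelines’ WeibullAFTFitter) could then be fit on time-to-event data with the same covariates.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for features an ETL process might extract; the
# actual Explorys covariates are not public.
rng = np.random.default_rng(0)
X = pd.DataFrame({
    "age": rng.integers(18, 90, 1000),
    "num_opioid_rx": rng.poisson(2, 1000),
    "num_er_visits": rng.poisson(1, 1000),
})
y = (X["num_opioid_rx"] + rng.normal(0, 1, 1000) > 3).astype(int)  # toy label

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))

# An AFT model (e.g., lifelines' WeibullAFTFitter) could be fit on
# time-to-overdose data with these covariates; its fitted coefficients
# would then suggest risk factors to engineer as future features.
```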
Another team added dimensionality to their analysis by looking at what other drugs were being used by those who fell victim to opioid abuse, finding that combinations of non-opioid drugs may serve as proxy indicators of potential drug abuse. They developed a general risk model with 75.8% accuracy and a heroin-specific model with over 99% accuracy for indicators of opioid risk.
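One way to make drug combinations learnable is to expand each patient’s medication history into binary indicator features. A minimal sketch, with made-up drug names and a toy outcome label rather than the team’s actual feature set:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical medication histories, one row per patient; drug names
# and the outcome label are illustrative only.
histories = pd.DataFrame({
    "patient_id": ["p1", "p2", "p3", "p4"],
    "drugs": [["gabapentin", "benzodiazepine"],
              ["statin"],
              ["benzodiazepine", "muscle_relaxant"],
              ["statin", "gabapentin"]],
    "opioid_abuse_dx": [1, 0, 1, 0],
})

# One-hot encode each patient's non-opioid drug list so that drug
# combinations become learnable features.
X = histories["drugs"].str.join("|").str.get_dummies()
y = histories["opioid_abuse_dx"]

model = LogisticRegression().fit(X, y)
print(dict(zip(X.columns, model.coef_[0].round(2))))  # per-drug weights
```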
Finally, a third team dove into indicators for cardiac arrest diagnoses, developing a novel time series analysis that indicates a correlation between monthly state unemployment and per-capita heart attack diagnoses. They did so by creating an ETL pipeline capable of collating and analyzing data from the Census Bureau, the Bureau of Labor Statistics, and the Explorys dataset. The same framework can scale to additional economic factors and diagnosis codes.
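The joining step in such a pipeline is conceptually simple: align monthly series by state, normalize diagnosis counts by population, and correlate. A toy sketch with fabricated numbers (the real inputs would be BLS unemployment tables, Census population estimates, and Explorys diagnosis counts):

```python
import pandas as pd

# Fabricated monthly series for one state, for illustration only.
months = pd.period_range("2020-01", periods=3, freq="M")
unemployment = pd.DataFrame({
    "state": ["OH"] * 3, "month": months,
    "unemployment_rate": [4.1, 5.6, 16.8],
})
diagnoses = pd.DataFrame({
    "state": ["OH"] * 3, "month": months,
    "mi_dx_count": [310, 342, 401], "population": [11_700_000] * 3,
})

# Align the series by state and month, then normalize by population.
merged = unemployment.merge(diagnoses, on=["state", "month"])
merged["mi_per_capita"] = merged["mi_dx_count"] / merged["population"]

# Per-state correlation between unemployment and per-capita diagnoses;
# additional economic series or diagnosis codes slot in the same way.
print(merged.groupby("state")[["unemployment_rate", "mi_per_capita"]]
      .corr().round(2))
```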
Large-scale data analysis has the potential to transform the federal government, and the shift to the cloud has made this a practical possibility.
Agencies stand to gain valuable operational insights, not just in healthcare but across a wide range of federal mission areas. We hosted a separate hackathon recently that looked at mobility data — leveraging AI to analyze use cases including transportation, the spread of disease, and national parks management. Another upcoming event will focus on refugee data, tapping a range of demographics to further understand this complex population.
By applying data science at scale to real-world problems in AIP.IQ, our hackathons show the art of the possible while creating reusable assets: models that can be applied across a range of immediate use cases without added technical debt.
Our hackathons invite teams to engage with data in new ways. However, this innovative mindset can and should extend beyond special events to become the day-to-day. As agencies collect increasing quantities of data, understanding how to process and analyze that data more effectively will be essential to meeting mission needs.
Ultimately, the Eagle Challenge is evidence of the benefits of a platform-based approach to big data analytics, and the outcomes that can be generated when we apply AI and machine learning solutions to data at scale.