By Alex MacLeod, Director of Healthcare Commercial Initiatives at InterSystems
Every time I visit a new healthcare provider, I’m reminded of the importance of data. We must constantly confirm details or fill in gaps in our health history, and I’ve lost track of how many HIPAA forms I’ve signed over the years. In these moments, we don’t often think about how the abundance of healthcare data helps ensure we receive the best care. The emergence of advanced digital technologies in healthcare has created a tidal wave of big data, generated and collected from countless sources such as electronic health records (EHRs), provider and payer records, lab results, research, clinical trials, surveys, apps, and medical wearables.
While big data isn’t a new concept, over the past five to ten years it has significantly influenced how we approach interoperability in healthcare IT. A common way to understand the dimensions of big data is through the three V’s: volume, velocity, and variety. Each represents an opportunity for artificial intelligence (AI) to shape the way we interact with and share data in the health industry. By taking a deeper look at these three characteristics, we can better understand and appreciate what must happen on the back end to accurately process and unify healthcare information for all.
Volume: More Accurate Models
One thing the healthcare community does not lack is data. And when it comes to building models that inform decision-making, the more of the right kind of data (i.e., clean and well-curated) that is available, the better. Large data volumes allow machine learning to build more accurate models and recognize trends. With more data to train and validate those models, users can create dedicated models for specific audiences. The abundance of data also helps segment observations and can lead to more action-driven analytics. Across industries, organizations are realizing the wealth of data to which they have access. In healthcare IT, for example, the digitization of patient records and innovations in tracking devices have caused the amount of available data to swell, expanding the ways data and analytics can be applied to patient care and operational processes.
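The link between data volume and model accuracy can be illustrated with a small, self-contained sketch. Nothing below comes from InterSystems tooling or any clinical system; it trains a deliberately simple nearest-mean classifier on synthetic two-class data and measures accuracy on a fixed test set as the training set grows:

```python
import random

def make_data(n, rng):
    """Synthetic two-class data: class 0 ~ N(0, 1), class 1 ~ N(2, 1),
    with labels alternated so both classes are always represented."""
    data = []
    for i in range(n):
        label = i % 2
        x = rng.gauss(2.0 if label else 0.0, 1.0)
        data.append((x, label))
    return data

def nearest_mean_classifier(train):
    """Fit one mean per class; classify a point by the closer mean."""
    means = {}
    for cls in (0, 1):
        vals = [x for x, y in train if y == cls]
        means[cls] = sum(vals) / len(vals)
    return lambda x: min(means, key=lambda c: abs(x - means[c]))

def learning_curve(train_sizes, n_test=2000, seed=42):
    """Accuracy on a held-out test set for each training-set size."""
    rng = random.Random(seed)
    test = make_data(n_test, rng)
    accs = []
    for n in train_sizes:
        predict = nearest_mean_classifier(make_data(n, rng))
        accs.append(sum(predict(x) == y for x, y in test) / n_test)
    return accs
```

On data like this, the per-class means estimated from larger training sets sit closer to the true means, so accuracy at the larger sizes approaches the best achievable rate for these overlapping distributions; the point is only that the model’s estimates improve as volume grows.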
Velocity: Real-Time Inferencing
Velocity is the difference between running a model in real time and performing analysis after the fact. High-velocity streaming data creates an opportunity for real-time inferencing, which is critical for clinicians to gain actionable insights and make more accurate care decisions. Running models across health IT systems in real time also enables reinforcement learning, where models interpret their environment and learn through trial and error. But keeping up with the fast pace of health data requires a free flow of data and the proper infrastructure to support it. AI is first needed to access data that has remained static and siloed for years, whether in disparate networks or incompatible formats. Once that data is brought to the forefront, AI can create a faster data pipeline to give healthcare workers the edge they need.
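As a toy illustration of real-time inferencing on streaming data (my own sketch, not a pattern taken from any product), the monitor below keeps a sliding window of recent vitals readings and flags any new reading that deviates sharply from the window’s statistics the moment it arrives, rather than in a batch report later:

```python
import math
from collections import deque

class VitalsMonitor:
    """Flags a reading that sits far outside the recent sliding window."""

    def __init__(self, window=20, threshold=3.0):
        self.window = deque(maxlen=window)   # recent readings only
        self.threshold = threshold           # how many std-devs counts as anomalous

    def observe(self, value):
        """Score a new reading against recent history; return True if anomalous."""
        anomalous = False
        if len(self.window) >= 5:            # wait for a minimal baseline
            mean = sum(self.window) / len(self.window)
            var = sum((v - mean) ** 2 for v in self.window) / len(self.window)
            std = math.sqrt(var)
            anomalous = std > 0 and abs(value - mean) > self.threshold * std
        self.window.append(value)            # append after scoring
        return anomalous
```

Scoring each reading before appending it means an anomalous value cannot contaminate its own baseline, which is the kind of detail that matters when inference runs inside the stream instead of after it.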
Variety: Cross-Comparing Data Types
As health data becomes more easily shareable, a new problem arises: the sheer variety of data types. Data now comes in hundreds of formats, from plain text and natural language to audio and video. This makes it challenging to develop algorithms that can leverage these datasets effectively and translate them across different systems. But machine learning presents a solution: deep learning is one family of techniques well suited to these kinds of content. While clinicians have long struggled with having no control over how data is initially entered, machine learning can assist with processing a variety of unstructured data formats, such as clinical tests, diagnostics, lab results, and patient records. As a result, the data is transformed into a unified hub of information that enables better care outcomes.
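One common way to handle this variety is an adapter layer that maps each incoming format onto a single unified record shape. The sketch below is purely illustrative (the field names and the toy free-text pattern are my assumptions, not a real EHR schema or an InterSystems API), but it shows the idea of normalizing a JSON payload, a CSV lab row, and a fragment of a clinical note into one structure:

```python
import csv
import io
import json
import re
from dataclasses import dataclass

@dataclass
class UnifiedObservation:
    """A single, format-agnostic clinical observation (hypothetical schema)."""
    patient_id: str
    code: str
    value: float
    unit: str

def from_ehr_json(payload: str) -> UnifiedObservation:
    """Adapter for a JSON export (field names are illustrative)."""
    doc = json.loads(payload)
    return UnifiedObservation(doc["subject"], doc["code"],
                              float(doc["value"]), doc["unit"])

def from_lab_csv(row: str) -> UnifiedObservation:
    """Adapter for a one-line CSV lab result: patient,code,value,unit."""
    pid, code, value, unit = next(csv.reader(io.StringIO(row)))
    return UnifiedObservation(pid, code, float(value), unit)

def from_clinical_note(text: str) -> UnifiedObservation:
    """Crude free-text extraction; real systems would use NLP models."""
    m = re.search(r"patient (\S+) (\S+) of ([\d.]+) (\S+)", text)
    if m is None:
        raise ValueError("could not parse note")
    pid, code, value, unit = m.groups()
    return UnifiedObservation(pid, code, float(value), unit)
```

Each adapter funnels a different source into the same `UnifiedObservation`, which is what lets downstream analytics treat the variety as one dataset.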
Approaching healthcare IT with these three dimensions of big data in mind requires forging a place for AI and machine learning in the industry. By using these technologies to translate full sets of structured and unstructured data, we can move beyond the rigid, siloed systems that have prevented unity across health records, care collaboration, and treatment. Integrating these technologies as a pillar of the health ecosystem results in more interoperable data systems and allows data to move seamlessly across the care continuum to support positive patient outcomes.
The three V’s have always been a helpful framework for data management, but they’re now even more impactful due to the influence they’ll have on AI development in healthcare. As we continue to vet our data through these three dimensions, we unlock the ability to extract even more value from the data, ultimately empowering our solutions to deliver on the promise of health-driven AI.
About The Author
Alex MacLeod is the Director of Healthcare Commercial Initiatives at InterSystems, where she leads the company’s healthcare data management capabilities. She has been an integral part of the InterSystems team for over twenty years; starting as a hands-on technical intern, she worked her way up to a director-level leadership position at the company’s Boston headquarters. Moving into the healthcare business unit has allowed her to focus on her commitment to improving patient care for all while advancing healthcare into the digital age.
Originally from Germany, she holds a degree from the Technische Universität Darmstadt and several certifications as a HealthShare Technical Specialist.