The secret to success in health IT implementations

Jul 22, 2021
6 MIN. READ
Advances in technology mean there are more ways than ever to support the healthcare delivery ecosystem and health science researchers. But to deliver mission impact and insights most effectively, technologists and data scientists also need to understand the science, the data, and the applicable use cases.

Monitoring disease exposure. Accelerating scientific research. Improving healthcare delivery. As federal health agencies seek to take advantage of modern technologies to deliver on their missions, they have the tools and data to make dramatic improvements. But health IT implementations can get snagged or derailed in a number of ways—with potentially dire consequences for public health and safety. 

ICF’s public sector chief technology officer, Kyle Tuberson, sat down with ICF’s senior vice president and federal health IT expert, Karen Holloway, to discuss what makes for a successful engagement. This interview has been edited and condensed for clarity. 

Kyle Tuberson: First off, can you share your perspective on the tech evolution in the healthcare industry?

Karen Holloway: Health IT has come a long way in the 25 years since I joined the industry. At the beginning of my career, engagements focused principally on moving away from paper-based systems: digitizing records via optical character recognition (OCR), implementing electronic medical records (EMRs), and automating data ingestion from lab systems and medical devices. Today, we’re leveraging big data pipelines, AI, and machine learning to integrate health data from thousands of healthcare systems for surveillance and analysis.

In the field of biomedical, life science, and health science research, we’ve moved beyond basic data management technologies for data curation, metadata, and the data life cycle to genomic data visualization and personalized medical therapeutics.

 

Kyle Tuberson: What do you see as one of the biggest health IT challenges today?

Karen Holloway: Siloed data. We got here through the impetus of the HITECH Act, as multiple implementations and vendor products began to proliferate. At the time, we placed a great deal of focus on terminology services, semantic interoperability, data standardization, and interoperability.

However, many barriers to siloed data have been reduced through Office of the National Coordinator for Health Information Technology (ONC) leadership, the advent of health information exchanges (HIEs), Regional Health Information Organizations (RHIOs), The Institute of Healthcare Executives and Suppliers (IHES), and the market’s commitment to data sharing and interoperability. For example, healthcare providers now use electronic health records (EHRs) to send orders to pharmacies. Lab results can be obtained electronically and displayed within a patient record, and referrals can be sent electronically to external provider networks. Although we can point to many successes, the COVID-19 pandemic has demonstrated that we still have a long way to go to integrate a national healthcare ecosystem. [Ed. For more on this topic, explore our public health enterprise podcast series.]

Kyle Tuberson: What is the state of federal health IT today?

Karen Holloway: COVID-19 has certainly changed everything. For U.S. federal health agencies, the stakes are incredibly high as they work to stop further spread of the virus and better prepare the nation for future pandemics. There has been a discernible shift to unlocking data to generate valuable insights. 

Today, researchers can manage healthcare data far more easily thanks to large data pipelines, data lakes, automation tools, and technologies such as machine learning and artificial intelligence, predictive analytics, and cloud computing. With those tools, researchers can aggregate datasets and leverage graphics processing units (GPUs) to run models and experiments across studies, research disciplines, and data types. Data sharing is also now possible through adherence to standards like Fast Healthcare Interoperability Resources (FHIR) and the FAIR data principles endorsed by the National Institutes of Health (NIH).
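To make the FHIR point concrete: FHIR resources are standardized JSON (or XML) documents, so exchanging a lab result comes down to agreeing on field names and codes. Here is a minimal sketch in Python; the field names follow the FHIR R4 specification, but the patient and values below are invented for illustration.

```python
import json

# A minimal FHIR R4 Observation (a lab result) as JSON.
# Field names follow the FHIR specification; the values are invented.
observation_json = """
{
  "resourceType": "Observation",
  "status": "final",
  "code": {"coding": [{"system": "http://loinc.org", "code": "718-7",
                       "display": "Hemoglobin [Mass/volume] in Blood"}]},
  "subject": {"reference": "Patient/example-123"},
  "valueQuantity": {"value": 13.5, "unit": "g/dL"}
}
"""

obs = json.loads(observation_json)

def summarize_observation(obs: dict) -> str:
    """Pull the display name, value, and unit out of a FHIR Observation."""
    display = obs["code"]["coding"][0]["display"]
    qty = obs["valueQuantity"]
    return f'{display}: {qty["value"]} {qty["unit"]} ({obs["status"]})'

print(summarize_observation(obs))  # Hemoglobin [Mass/volume] in Blood: 13.5 g/dL (final)
```

Because every system reads and writes the same structure, and codes come from shared terminologies like LOINC, a result produced by one lab system can be displayed or analyzed by any other FHIR-aware system.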

Kyle Tuberson: You mentioned biomedical data. Can you elaborate on how bioinformatics plays a role in health IT?

Karen Holloway: Bioinformatics, the marriage of biology and computer science, has played a significant part in leveraging technologies, tools, and processes to advance the health research agenda. At its core, the bioinformatics field has always been devoted to bringing science and IT together to make sense of biological data. With the advent of gene sequencing and the sheer volume of genetic data it produces, the importance of bioinformatics has only grown. Collecting, storing, and analyzing that much genetic data to extract meaningful findings remains an ongoing challenge. But with bioinformatics tools, it’s possible to use molecular profiles to identify complex and rare genetic disorders and target potential therapies.
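As a toy illustration of the kind of sequence-level computation bioinformatics is built on, the sketch below computes GC content, a basic genomic summary statistic. Real pipelines rely on specialized tools such as aligners and variant callers; this function is only meant to show the flavor of the work.

```python
# A toy bioinformatics computation: GC content of a DNA sequence.
# GC content (the fraction of bases that are G or C) is one of the
# simplest genomic summary statistics; the sequence below is invented.

def gc_content(seq: str) -> float:
    """Return the fraction of bases in `seq` that are G or C."""
    seq = seq.upper()
    gc = sum(1 for base in seq if base in "GC")
    return gc / len(seq)

print(round(gc_content("ATGCGCGTTA"), 2))  # 0.5
```

Scale this idea up to billions of bases per genome, across thousands of samples, and the storage and compute challenges Karen describes follow immediately.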

Technology advances and the contribution of bioinformatics have led to the discovery of more effective diagnostic methods, treatment guidelines, personalized medicine, vaccine and therapeutics development, better clinical quality, and patient care. The evolution of those technologies and standards also enables public health disease surveillance through the exchange of data from clinical records obtained from provider networks, hospital systems, and state and local health departments.

Kyle Tuberson: How can we ensure successful health IT implementation?

Karen Holloway: While technologies have evolved considerably, specialized knowledge in healthcare or health science remains essential. Subject matter and domain experience are crucial to a successful team. Technologists and data scientists must have knowledge of clinical or research user needs, typical workflows, terminology, what the data means, the relationship of data entities, and an understanding of the context of problems and hypotheses. It’s that domain fluency plus the programming, modeling, and data visualization skills that create strong, mission-focused IT teams.


Kyle Tuberson: Can you share an experience in which you identified subject-matter expertise to be as critical as technological advancements? 

Karen Holloway: Early in my career as a developer, I relied heavily on the resident nurses within my project teams to understand terminology, workflow, and clinical informatics. Those insights are critical to achieving successful EMR and lab system implementations. 

I recall a time when—prior to joining ICF—I was responsible for implementing big data and machine learning for a healthcare predictive analytics system. I quickly realized how frustrating and inefficient it is to have a team without subject-matter expertise managing the IT implementation. Our team of pure data scientists had never worked with healthcare data or in a clinical setting. We struggled with understanding basic use cases and failed to develop meaningful algorithms or derive patterns and insights that resonated with our clients. Eventually, we pulled in consultants to help us fill our subject-matter expertise gap, but it’s a lesson I still carry with me today. Since then, I have ensured that any team I build—or clients I support—will have a multi-disciplinary structure and comprise technologists and data scientists with a bioinformatics background, supported by subject-matter and domain experts.

Kyle Tuberson: Any final takeaways?

Karen Holloway: Part of our approach at ICF is to create integrated, multi-disciplinary teams. We have thousands of subject-matter experts in health sciences who work alongside our digital and technology experts to create holistic solutions. Together, they support health research, bioinformatics, data analytics, scientific application development, and big data pipelines. 

The combination of these skills puts us at the forefront of the fight against COVID-19, working alongside our federal agency partners. One of the projects I’m most proud of is BioSense, the digital platform at the epicenter of the National Syndromic Surveillance Program’s (NSSP) operations. BioSense is the Centers for Disease Control and Prevention’s (CDC) cloud-based public health surveillance system that enables local, state, and national health officials to monitor and respond to disease, addiction, and hazardous conditions. The BioSense platform, which ICF completely re-architected in 2014, makes it simple and efficient for healthcare providers to support and collaborate with the government on public health initiatives. It also makes it easy for public health officials to spot emerging health trends and epidemics, including the current COVID-19 pandemic, and then make impactful, data-driven decisions.
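To give a sense of what "spotting emerging health trends" means computationally, here is a deliberately simplified sketch that flags days when visit counts spike above a rolling baseline. BioSense's actual detection methods are far more sophisticated; the function and the data below are invented purely for illustration.

```python
import statistics

# Simplified syndromic-surveillance idea: flag days whose count exceeds
# mean + threshold * stdev of the preceding `window` days.
# This is an illustrative toy, not BioSense's actual algorithm.

def flag_anomalies(counts, window=7, threshold=2.0):
    """Return indices of days whose count spikes above the rolling baseline."""
    flagged = []
    for i in range(window, len(counts)):
        baseline = counts[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline) or 1.0  # guard against flat baselines
        if counts[i] > mean + threshold * stdev:
            flagged.append(i)
    return flagged

# Invented daily emergency-department visit counts; day 8 spikes.
daily_visits = [20, 22, 19, 21, 20, 23, 21, 22, 48, 21]
print(flag_anomalies(daily_visits))  # [8]
```

The operational systems layer on far more (seasonality adjustment, geographic clustering, syndrome classification of chief-complaint text), but the core loop is the same: compare today's signal against an expected baseline and alert when it deviates.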

From strengthening the CDC’s public health surveillance system to quickly developing a portal to help the National Cancer Institute (NCI) understand how genetics affect COVID-19 symptoms, we combine domain experts with technologists to empower agencies to achieve their mission objectives with speed and accuracy.

Meet the authors
  1. Karen Holloway, Senior Vice President, Federal Strategic Initiatives

    Karen brings more than 25 years of expertise providing strategy, business transformation, and technology solutions for commercial and public sector clients in the healthcare, environmental, and energy sectors.
