Artificial Intelligence (AI) and HIPAA Compliance

March 27, 2024

By Kenneth E. Rhea, M.D., FASHRM, LAMMICO Physician Consultant



Since OpenAI introduced ChatGPT (Generative Pre-trained Transformer) in November 2022, the prominence of AI and machine learning foundation models has increased dramatically across all industries, affecting operations and generating projections of both extraordinary benefits and serious concerns. Healthcare has been no exception. Discussions of AI and medical applications show that the technology is already being incorporated into apps and used by millions of people.[1]

Before ChatGPT, AI was already prevalent in our lives. Streaming providers recommend TV shows and movies based on your preferences and your search and watch history. Cars can “self-drive” with very little human control. Speech recognition and speech-to-text technology allow for accurate dictation and hands-free command of our devices without ever picking up a remote or a phone. And, of course, chatbots have gotten steadily better at resembling human texting.

AI is currently used in medicine for numerous applications. Isabel, one of several clinical decision support tools in current use, has long helped clinicians improve diagnostic quality and deliver more cost-effective care. Other applications are relevant to compliance with the HIPAA privacy and security regulations. For instance:

  • AI-powered systems can enhance data security by implementing encryption, access controls, and monitoring mechanisms to safeguard patient information. These systems can detect anomalies in data-access patterns and identify potential breaches. Threat detection is one primary compliance goal that generative AI can support.[2]
  • Privacy and security regulations are built around the protection of individually identifiable health information, which, when held or transmitted by a covered entity, is protected health information (PHI). Anonymization and de-identification are therefore necessary for some uses and disclosures, and AI algorithms can assist in this process (see the brief sketch after this list).
  • The HIPAA Privacy and Security Rules require training the medical workforce on the privacy and security of PHI. AI-driven platforms can provide (and perhaps even create) ongoing training and education for healthcare staff about HIPAA regulations, ensuring they stay updated and informed about compliance requirements, e.g., security reminders.
  • Control of access to medical information within medical systems is critical to compliance. AI can streamline and automate auditing processes and maintain comprehensive audit trails that track data access, modifications, and disclosures to assist with accurate compliance and related reporting, e.g., breach response processes.

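To make the de-identification point above more concrete, the short Python sketch below shows a purely rule-based pass over a hypothetical patient record: direct identifier fields are dropped, and a few pattern-matched identifiers (phone numbers, email addresses, dates) are scrubbed from free-text notes. The field names and regular expressions are assumptions made for illustration; this is not a compliant de-identification pipeline and does not cover all eighteen HIPAA Safe Harbor identifiers.

    import re

    # Simplified patterns for a few Safe Harbor identifiers that commonly appear
    # in free text. These expressions are assumptions made for the example.
    REDACTION_PATTERNS = {
        "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
        "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
        "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    }

    def redact_free_text(text):
        """Replace pattern-matched identifiers in narrative text with placeholders."""
        for label, pattern in REDACTION_PATTERNS.items():
            text = pattern.sub(f"[{label} REMOVED]", text)
        return text

    def deidentify_record(record):
        """Drop direct identifier fields and scrub free-text notes in a simple record."""
        direct_identifiers = {"name", "mrn", "address", "phone", "email"}
        cleaned = {key: value for key, value in record.items() if key not in direct_identifiers}
        if "notes" in cleaned:
            cleaned["notes"] = redact_free_text(cleaned["notes"])
        return cleaned

    sample = {
        "name": "Jane Doe",
        "mrn": "123456",
        "phone": "555-867-5309",
        "notes": "Patient called 555-867-5309 on 03/27/2024 to reschedule.",
        "diagnosis": "Type 2 diabetes",
    }
    print(deidentify_record(sample))

In practice, AI-assisted de-identification tools use trained language models rather than fixed patterns to find identifiers in clinical narrative, but their output must still be validated against the Safe Harbor or Expert Determination standards before any disclosure.
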
While AI technologies can significantly assist healthcare providers in complying with HIPAA regulations, they should be implemented consistently and used cautiously. Responsible and ethical use of AI in healthcare is crucial to ensuring that patient privacy and data security are maintained at all times. Human oversight and interpretation remain essential to validate AI-generated insights and decisions in any healthcare setting. Remember that compliance with HIPAA regulations remains the responsibility of the covered entity provider, and no amount of AI software assistance changes that fact.


[1] Byron J. AI and HIPAA Privacy Concerns. AIHC. https://aihc-assn.org/ai-and-hipaa-privacy-concerns/. August 3, 2023.

[2] Burshan C. The Role Generative AI Can Play in Threat Detection. InfoRisk Today. https://www.inforisktoday.com/role-generative-ai-play-in-threat-detection-a-22889. August 22, 2023.

