
HIPAA Considerations for AI Tool Use in Healthcare Research

Attorney Adam Greene of Davis Wright Tremaine on Compliance Concerns

The potential use cases for generative AI technology in healthcare appear limitless, but they're weighted with an array of potential privacy, security and HIPAA regulatory issues, says privacy attorney Adam Greene of the law firm Davis Wright Tremaine.

"At this point, it's just a matter of imagination of what AI can do with healthcare," Greene said in an interview with Information Security Media Group during the 2023 Healthcare Information Management and Systems Society Global Health Conference and Exhibition in Chicago.

AI poses potential privacy, security and HIPAA compliance issues that must be considered as the tools are embraced and further developed for healthcare sector use, he said.

The healthcare industry has already used some forms of AI for years, such as helping radiologists detect tumors and other anomalies. "ChatGPT now, I think, is able to pass the medical exam to become a doctor," he quipped.

One privacy issue is the tremendous volume of data, including patients' protected health information, needed to build AI tools.

"Ideally, you can use de-identified information, but sometimes that doesn't get the job done - and that's where navigating the privacy laws can get challenging."

There is also a lack of clarity about when developing AI might qualify as health care operations under HIPAA. "That might depend upon whether it is primarily focused on improving the covered entity that is providing the protected health information being used."

Depending upon the circumstances, using PHI to develop AI may or may not be considered research under HIPAA.

In any case, the use of PHI for AI efforts raises a number of HIPAA concerns, including whether individuals' authorization must be obtained, Greene said. Using PHI for AI research purposes "is not a 'get out of jail free' card" for compliance obligations, he said.

In the interview, Greene also discussed:

  • Patient consent considerations and data interoperability issues involving the use of PHI for AI research and development purposes;
  • The HIPAA "safe harbor" and "expert determination" methods of de-identifying patient information and how they would relate to AI research (a rough illustration of the safe harbor concept follows this list);
  • The Department of Health and Human Services' Office for Civil Rights' proposed changes to the HIPAA Privacy Rule - and what might happen next on the regulatory landscape.
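
To make the de-identification discussion a bit more concrete, the short Python sketch below shows, in spirit, what the safe harbor approach involves: stripping direct identifiers from a record before it is used for AI development. It is a minimal, hypothetical illustration, not a compliant implementation; the field names are assumptions, and the actual Safe Harbor method requires removing all 18 identifier categories and having no actual knowledge that the remaining data could identify an individual.

# Minimal, hypothetical sketch of the Safe Harbor idea: drop direct
# identifiers from a record before it feeds AI development.
# Field names are illustrative assumptions; the real rule covers 18
# identifier categories.

DIRECT_IDENTIFIERS = {
    "name", "street_address", "phone", "email",
    "ssn", "medical_record_number", "date_of_birth",
}

def strip_direct_identifiers(record: dict) -> dict:
    """Drop direct-identifier fields and bucket ages over 89 as '90+',
    echoing Safe Harbor's treatment of ages."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if isinstance(cleaned.get("age"), int) and cleaned["age"] > 89:
        cleaned["age"] = "90+"
    return cleaned

# Example: only the age bucket and diagnosis code survive.
patient = {"name": "Jane Doe", "ssn": "000-00-0000", "age": 93, "diagnosis_code": "E11.9"}
print(strip_direct_identifiers(patient))  # {'age': '90+', 'diagnosis_code': 'E11.9'}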

Greene specializes in health information privacy and security laws, including applying those laws to new technologies, such as artificial intelligence and machine learning. He formerly served as senior health information technology and privacy specialist at the HHS OCR, where he played a significant role in administering and enforcing HIPAA privacy, security and breach notification rules.



