The Legal and Regulatory Landscape Evolving with AI in Life Sciences

June 1, 2023

In the last ten years, more data and greater computing power have led to a boom in AI-related patent applications, with life and medical sciences emerging as a top application field. In 2021, more than 100 applications submitted to the US Food and Drug Administration contained artificial intelligence and machine learning (AI/ML) components.

According to Brigid Bondoc, Partner at Morrison Foerster in Washington, DC, the FDA shares the industry's optimism that "AI/ML can help bring safe, effective, and high-quality treatments to patients faster," to quote Patrizia Cavazzoni, M.D., director of the Center for Drug Evaluation and Research (CDER) at the FDA.

How the agency evaluates and manages emerging AI/ML technologies in drug and biologic applications was addressed in the panel "Artificial Intelligence in Life Sciences: The Evolving Legal and Regulatory Landscape," presented by Morrison Foerster (MoFo) on May 11, 2023, at the Association for Corporate Counsel (ACC) live CLE event "2023 Life Sciences Conference: Future-Ready Resilience: Preparing to Face Challenges in the Uncertain Future."

The panel, consisting of Bondoc; Anna Yuan, Senior Associate at MoFo; Wendy Chow, counsel to MoFo; and Lauren Wu, Head of Privacy, Sr. Dir. of Legal for Regulatory and Compliance, US Privacy Officer, and Global DPO at Evidation Health, discussed evolving FDA policies and regulation of AI, the data privacy implications, and protecting AI technology in life sciences.

FDA Regulation

Bondoc presented the big picture of the FDA's AI regulation: the agency actively monitors AI/ML software in medical devices and in clinical development, and it may expand oversight to other FDA-regulated activities, such as automation and learning in medical devices, efficiency in diagnostic and therapeutic development, regulatory assessment, and post-market surveillance.

The 21st Century Cures Act excludes five types of software from the FDA's medical device definition and regulation, including administrative support, wellness, and electronic patient records software. Still, the act allows the FDA to "claw back any exception if it identifies risks associated with the software as a medical device," said Wu. With AI technologies layered onto wellness and electronic records software, "the waters get muddier," commented Wu. For example, a pedometer that also monitors sleep and blood pressure may become more than a wellness device and move into the regulated medical device space.

The FDA appears flexible and careful in developing an AI/ML regulatory framework, balancing the need to facilitate innovation against the need to protect public health. For example, Bondoc pointed to the agency's draft guidance on Predetermined Change Control Plans (PCCPs) for AI/ML-enabled device software functions, which attempts to address a significant issue facing the emerging technology.

Historically, changes to a cleared or approved medical device that could significantly affect its safety or effectiveness required submitting a new premarket notification (510(k)). Yet that notification program is "fundamentally at odds with the use of AI/ML in medical devices that facilitate continual improvements and modifications to the device based on data gathered while in use," said Bondoc. The FDA aims to develop a regulatory approach tailored to AI/ML-enabled devices, one that allows rapid modifications in response to new data while ensuring safety and effectiveness.

The FDA sees ethical, privacy, and security issues in the use of AI/ML in drug development. The agency is also concerned that a lack of transparency in algorithms can lead to the amplification of errors or preexisting biases. "Although recognizing the problem, the potential regulatory solutions have yet to be provided," said Bondoc. Still, the agency aims to prevent algorithmic discrimination while advancing the use of AI/ML techniques. It recently released a discussion paper, "Using Artificial Intelligence and Machine Learning in the Development of Drug and Biological Products," and, to further address AI in drug manufacturing, a second paper, "Artificial Intelligence in Drug Manufacturing."

Protecting AI Technology in Life Sciences

Yuan noted an eight-fold increase in AI patents from 2017 to 2020, citing WIPO Technology Trends 2019: Artificial Intelligence and Mondaq, and observed a shift in the types of inventions being patented. In the past, patents were more theoretical, such as novel AI algorithms for speech recognition; today, more patents cover concrete, practical applications of machine learning in drug discovery and medical diagnostics.

Patent applications include novel, practical applications of AI algorithms, drugs developed with AI, and computer hardware configurations and optimizations that facilitate AI training and inference. The patents fall into two buckets: one, applications of known AI to specific fields and sectors; and two, new AI models and algorithms. Yuan said the first bucket is generally the more valuable to patent, yielding broader and more detectable claims, whereas the second tends to yield narrow, mathematically oriented claims.

Privacy and AI

The use of AI/ML in life sciences involves data from individuals, even protected health information (PHI). At some point, whether at the training-input, inference, or output stage, the data use may run afoul of state privacy laws, implicate new legislation regulating AI and automated decision-making, or trigger federal law, including the Health Insurance Portability and Accountability Act (HIPAA) or the Federal Trade Commission Act's prohibition on unfair and deceptive practices, such as the sale and use of racially biased programs.

Chow also detailed the states with comprehensive privacy laws, including California, Virginia, and Tennessee, and noted numerous bills pending in Connecticut, Indiana, Iowa, Montana, and Washington. State legislatures complement these privacy laws with laws regulating AI and automated decision-making, such as the New York AI Bias Law. Still, Wu put privacy legislation into context: "It's not necessarily the driver for AI; it's a stakeholder."

Eric Elting, TLS Director, Global Legal and Patent Business Development