Germany: Artificial Intelligence (AI) in medical devices: The better the product, the harder the conformity assessment?
Whether in medication management or tumor diagnostics, artificially intelligent software has become indispensable in modern medicine. If the software makes diagnostic and/or therapeutic suggestions, it usually qualifies as a medical device. The manufacturer of a medical device is subject to an extensive set of obligations under Regulation (EU) 2017/745 (Medical Devices Regulation, "MDR"). Such software typically falls into risk class IIa or higher, so a Notified Body must be involved in the conformity assessment. If the medical device is artificially intelligent, the conformity assessment can become challenging.
A legal definition of "artificial intelligence" does not (yet) exist. The definition contained in the draft EU AI Regulation (Artificial Intelligence Act, "AIA") is so unwieldy that it offers little insight. It defines AI as "software that is developed with one or more of the techniques and approaches listed in Annex I and can, for a given set of human-defined objectives, generate outputs such as content, predictions, recommendations, or decisions influencing the environments they interact with." In practice, a distinction continues to be drawn between static AI, which has completed its training and operates in a fixed, learned state; dynamic AI, which continues to learn in the field; and black box AI, which does not explain how it arrives at a result.
The German Notified Bodies Alliance (Interessengemeinschaft der Benannten Stellen für Medizinprodukte in Deutschland, "IG-NB") currently takes the view that dynamic AI is, in principle, not certifiable. As a rule, certification can only be considered for static AI, since its learned state does not change. For static black box AI, a decision would have to be made on a case-by-case basis.
The background to this is that, in the case of significant changes after the conformity assessment of its product, the manufacturer is obliged to inform the Notified Body, which then decides whether a new conformity assessment is required or whether the change can simply be approved and a supplement to the respective EU certificate issued.
For dynamic AI that continues to learn in the field (i.e., after the conformity assessment), this obviously creates a problem: in theory, change notifications would be required constantly, and recertification might be needed again and again. Clearly, this is not practicable. On the other hand, field data has enormous potential to improve the software. To ensure that innovation is not hindered by legal uncertainty, there will be no way around adapting the MDR and improving its coordination with the AIA.