Mr Müller-Quade, why is it important to certify Artificial Intelligence?
Jörn Müller-Quade: How AI systems arrive at their decisions is often incomprehensible even to experts; such systems are therefore referred to as black boxes. Customers cannot judge on their own whether the use of an AI system is safe in a particular context. This is where a certificate can provide orientation and give conscientious manufacturers an advantage on the market.
What distinguishes the certification of Artificial Intelligence from the certification of other IT systems?
Jörn Müller-Quade: Decisions made by AI systems are often hard to comprehend, especially in the case of learning systems, and certification becomes much more difficult when the system cannot be understood. In some cases, one will probably have to combine well-understood protection mechanisms with AI. In addition, AI systems can learn, i.e., change dynamically, which is why a one-time, static certification is not sufficient. Certification must become an open process.
How can certification ensure the quality of AI systems without inhibiting innovation?
Jörn Müller-Quade: Since certification can be complex and time-consuming, only AI systems with increased criticality should be certified. If, for example, an algorithm that is supposed to suggest pieces of music to me gets it wrong, that is hardly a disaster, and certification is not necessary. Certification is more necessary for autonomous driving or for AI systems in medical technology.