The use of facial recognition software will be further restricted, and compliance with the rules will be monitored more closely. Applications that currently lack a legal basis but are considered desirable must be given one as soon as possible; otherwise, these applications will have to be shut down.
A majority in the House of Representatives yesterday supported a motion by GroenLinks-PvdA MP Barbara Kathmann to tighten enforcement. First, the government must map the use of facial recognition software by government bodies and businesses. Where a legal basis is lacking, as far as the House is concerned, that use must end. According to Kathmann, many municipalities and tobacconists use this software without any legal basis. “The GDPR rules say it is not allowed, but it happens anyway,” she explains.
She is calling for a permit requirement for the use of facial recognition in public spaces, and the government must quickly investigate whether this is feasible. According to the State Secretary, the Ministry of Justice and Security is already working on this, but Kathmann notes that this work only concerns the police. In addition to the left-wing parties, Kathmann won the support of NSC and BBB; VVD, CDA and PVV voted against. State Secretary Alexandra van Huffelen (Digitalization) had advised against the motion, arguing that such software already falls under the GDPR privacy legislation and the Police Data Act.
Data ethics
The motion arose from an earlier parliamentary debate on the use of algorithms and data ethics. During that debate, the idea was raised of examining how artificial intelligence (AI) can accelerate and improve government work processes. D66 MP Hanneke van der Werf noted that the business community is already deploying AI at full scale, while the government is lagging behind in digitization. The House is therefore asking the government to develop guidance for municipalities and other authorities on safe AI applications that can reduce their workload.
There was also support for a motion by Dogukan Ergin (Denk) to provide information in schools about digital systems and their risks. According to him, there is little knowledge about algorithms in the education sector, which exposes students to an unnecessary risk of discrimination.