In response to the scourge of deepfakes, Denmark is preparing to take a radical measure: this Thursday, the government announced its intention to amend copyright law to ensure that everyone has an exclusive right to the use of their body, their appearance, and their voice.
Since the popularity of deepfakes exploded with the rise of generative AI, these fabricated multimedia contents imitating real people have regularly made headlines for the wrong reasons. Everyone now knows that beyond simple humorous diversions, they can become weapons of identity theft, or even of mass disinformation.
The Danish government has decided that these practices represent an increasingly serious risk, and that it was high time to ensure that the law could protect victims. The country's Ministry of Culture is therefore preparing to introduce an amendment to copyright law to achieve this.
"With this bill, we send a clear message: everyone has a right to their own body, their voice and their face," said Jakob Engel-Schmidt, Danish Minister of Culture, in an interview with The Guardian. "Human beings can be copied digitally and used for all kinds of purposes, and I am not ready to accept this," he added.
And he is apparently not the only one. Still according to the British outlet, a large majority of Danish MPs, regardless of political affiliation, have expressed support for this amendment, which will be studied this summer before being officially submitted in the fall.
These changes, if they are actually approved, will allow Danish citizens to report any multimedia content that imitates their appearance or voice without explicit consent, and to demand its removal from online platforms.
What about image generation services?
In parallel, the amendment would also cover "realistic, digitally generated imitations" of artists' work. This would provide a way to punish plagiarism via generative AI, with potential compensation for victims.
Currently, however, it is not known whether this provision would also apply to image generation services such as those offered by OpenAI and others. That is a harder problem to tackle. As a reminder, the AI models underlying most of these services were trained by scraping millions of images from all corners of the web, including works by artists who never gave their consent. And while AI models are now very good at learning, making them "forget" data remains exceedingly difficult.
Should all AI companies therefore be forced to retrain their models from scratch to exclude this protected content? Is that even feasible, economically or legally? It seems highly improbable as things stand. In this specific case, it will therefore be necessary to count on the cooperation of the companies themselves. Some have already put safeguards in place, such as blocking certain problematic prompts ("generate an image in the style of…"). But clearly, much work remains on this front.
The importance of common legislation
Be that as it may, should the operators of these platforms prove reluctant to cooperate, the Danish government intends to do everything in its power to compel them. The Minister notably mentions "severe fines", but also direct requests to the European Commission. The latter could then use levers such as the Digital Services Act (DSA) to force offenders to comply.
But for such a mechanism to truly reach its full potential, it would ideally need to be generalized. To date, however, no equivalent legislative framework has been adopted elsewhere in Europe. Jakob Engel-Schmidt therefore hopes that other countries will follow Denmark's example, so that the whole continent can wage a coordinated fight against this systemic scourge.
It remains to be seen whether the rest of the European community will follow the Nordic country's lead, starting with states like Germany and Italy, which are also examining anti-deepfake measures.