
Self-regulation, transparency and surveillance: the issues to be resolved in the AI Act


Thirty-four trade associations, authors and artists from across the cultural world are asking the Italian Government, with one voice, to change its position on the European artificial intelligence regulation. The appeal comes ahead of the trilogue of 6 December, whose agenda includes the delicate negotiations to approve the Artificial Intelligence (AI) Act, the European regulation intended to establish a legal framework for the development of artificial intelligence, including generative AI. «We strongly ask the Italian Government to support balanced regulation which, by guaranteeing the transparency of sources, favors the development of artificial intelligence technologies while protecting and promoting original human creativity and all the cultural content of our country». Similar appeals have been signed by French and German associations. The goal: that the three European countries that have so far opposed more stringent legislation change their position on the European regulation.

What are the issues at the 6 December meeting?

Under discussion on Wednesday will therefore be the rules to apply to foundation models, the AI systems capable of carrying out a wide variety of tasks thanks to the use of unstructured data; generative artificial intelligence applications such as ChatGPT and Midjourney fall into this category. Last week, in a joint document, Italy, France and Germany expressed their opposition to introducing "untested rules" for the most powerful AI models, such as GPT-4, the basis of the ChatGPT chatbot, into the regulation. On this point, one of the thorniest of the negotiations between the European institutions, the three large EU countries suggest instead taking the path of self-regulation, through codes of conduct for AI developers, so as to avoid weighing companies down with excessive administrative burdens that would stifle innovation in a sector crucial for the future. This is the issue on which views diverge. At the forefront is France, with Emmanuel Macron, who in recent days urged "non-punitive regulation to preserve innovation".


Who will save creativity (and creatives) from the giants of artificial intelligence?

The reasons for the appeal

We know that protected works, voices and images are used without the consent of the rights holders to generate new content. Some of these uses may infringe not only copyright but also the moral and personality rights of authors, and jeopardize their personal and professional reputation. There is also the risk that authors, artists and cultural and creative businesses will see their original work replaced, forcing them to compete with digital replicas that would enjoy obvious advantages in various respects, with serious economic consequences as well. For these reasons, the signatories of the appeal ask that the AI Act guarantee absolute priority to the maximum transparency of the sources used to train the algorithms, for the benefit of the creatives and industries they represent and, more generally, for European society. «The obligations envisaged – the appeal reads – should apply to developers and operators of upstream and downstream generative AI systems and models, with particular reference to the obligation to preserve and make publicly available sufficiently detailed information on the sources, contents and works used for training, so as to enable parties with a legitimate interest to determine whether and how their rights have been infringed and to take action. These obligations must at least extend to all systems made available in the EU or generating outputs used in the EU, whether commercial or non-commercial, and must lead to a presumption of use in case of non-compliance, allowing rights holders to exercise their prerogatives, including for the granting of licenses. It is crucial to recognize that none of the protections based on legal instruments already existing in European Union legislation has any chance of working unless rigorous and specific transparency rules are imposed on generative AI developers».

The position of the European Parliament

In the search for a delicate balance between progress and the protection of human rights, it falls to the European Parliament to put its foot down. "We are not willing to accept light self-regulation for the most powerful models," Brando Benifei, head of the delegation of PD MEPs in the European Parliament and rapporteur of the AI Act, told ANSA, while opening up to the possibility of limiting the scope of this specific regulation to general-purpose models. Codes of conduct are not sufficient: just think, Benifei explains, of the OpenAI affair, which "also showed all the instability of the governance of companies developing powerful models, which entail a systemic risk". It is therefore imperative to introduce "clear" and "sanctionable" obligations, says the PD MEP, recalling that in the proposal from Rome, Paris and Berlin "there is no incentive to respect the self-imposed rules".


The issue of surveillance and biometric recognition

Another point of discussion is surveillance. The European Parliament has called for a ban on the use of AI to analyze and identify people and to perform biometric recognition in real time. Some Member States have asked for exceptions, and the Council's position seems more inclined to authorize forms of surveillance in certain situations. However, the possibility of a compromise seems real.
