- From: Jerven Bolleman <me@jerven.eu>
- Date: Tue, 13 Sep 2022 13:04:00 +0200
- To: Dan Brickley <danbri@danbri.org>
- Cc: paoladimaio10@googlemail.com, ProjectParadigm-ICT-Program <metadataportals@yahoo.com>, Public-cogai <public-cogai@w3.org>, W3C AIKR CG <public-aikr@w3.org>, public-lod <public-lod@w3.org>
Dear All,

Before wondering what to do, I suggest actually taking the time to read the proposed law. Find the specific parts you dislike, and then write to the members of the European Parliament you think would be willing to propose an amendment.

Personally, I read through it (in the Dutch version) and do not feel there is anything to be done. The chance that a company could successfully sue an open-source developer is rather low, considering this applies to high-risk use of AI in products, or things that are products. "High risk" is defined in Article 6, paragraph 2 and Annex 3; the number of open-source projects that could fall under this category is, IMHO, approximately zero.

This comes with the caveat that open source used in such projects should, again in my opinion, fall under the law. But as the vast majority of open source is provided "as is" with a disclaimer of warranty in one way or another, anyone integrating it for use in a high-risk AI must develop the technical documentation etc. to comply with the law. The risk therefore properly sits on the integration side, not on the open-source developer side.

For example: you are a developer (Alice) of a little program to help teach people English. You make a little AI for scoring tests. You are not concerned, because this is not high risk.

Continuing: you are a developer (Bart) of a software system that tests whether people's level of English is sufficient for them to be allowed into university. You are in the high-risk AI category (Annex 3, items a, b). Bart uses the little AI open sourced above by Alice. Bart needs to be able to explain it and provide the documentation demanded. If he cannot do this with the little AI module (written by Alice), then he (Bart) is not allowed to use it for his project. If Bart does use it in the high-risk AI project anyway, the legal risks are extremely likely to be fully in Bart's court. Yes, Bart could sue Alice, but the chances of success are minimal under all reasonable standards. Could this be painful for Alice? Yes.
But being sued by random Barts who have no standing is a risk we all already have, and this does not really raise it, IMHO.

Best regards,
Jerven

On Tue, Sep 13, 2022 at 10:11 AM Dan Brickley <danbri@danbri.org> wrote:
>
> Is there any response from actual Open Source AI projects? Or collaborative data initiatives like Wikidata?
>
> Speaking of large tech companies - including my employer, Google - many have focussed very heavily on Open Source, and have significant investment in the Open Source ecosystem being healthy.
>
> Dan
>
> On Tue, 13 Sep 2022 at 01:13, Paola Di Maio <paola.dimaio@gmail.com> wrote:
>>
>> Do we have a plan?
>>
>> On Tue, Sep 13, 2022 at 12:13 AM ProjectParadigm-ICT-Program <metadataportals@yahoo.com> wrote:
>>>
>>> The EU AI Act could spell disaster for open source development and put AI development in the hands of large companies in the corporate sector. This could also create problems for any technologies using ontologies, semantic web technologies, knowledge graphs, predictive algorithms, or knowledge representation, the use of which could be construed as artificial intelligence.
>>>
>>> https://www.theregister.com/2022/09/11/in_brief_ai/
>>>
>>> Milton Ponson
>>> GSM: +297 747 8280
>>> PO Box 1154, Oranjestad
>>> Aruba, Dutch Caribbean
>>> Project Paradigm: Bringing the ICT tools for sustainable development to all stakeholders worldwide through collaborative research on applied mathematics, advanced modeling, software and standards development

--
Jerven Bolleman
me@jerven.eu
Received on Tuesday, 13 September 2022 11:32:17 UTC