
Protect My Art by Binh Khang Nguyen
Monitor websites for AI training protection tags (noai, noimageai) and help protect artists' work
No users yet
Extension Metadata
About this extension
A very basic extension that:
- Checks the webpage's robots meta tags for "noai" and "noimageai"
- Visually indicates whether the webpage withdraws consent for AI training
This way, creators can check whether the pages hosting their uploaded artwork withdraw consent for AI training. Creators who wish to opt out of training are recommended to add the following line of code to the HTML head of, e.g., their portfolio or blog:
<meta name="robots" content="noai, noimageai">
Or, if they are uploading to a platform that offers the option, to enable these tags on the webpages of their artworks.
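For illustration, the check described above amounts to a few lines of DOM code. The following is a minimal TypeScript sketch, not the extension's actual source; the function name is made up:

// Minimal sketch of the described check, not the extension's actual code.
// Reads every robots meta tag and looks for "noai" / "noimageai" directives.
function pageWithdrawsAiConsent(doc: Document): boolean {
  // The "i" flag makes the attribute value match case-insensitively.
  const metas = doc.querySelectorAll<HTMLMetaElement>('meta[name="robots" i]');
  for (const meta of metas) {
    // content is a comma-separated directive list, e.g. "noai, noimageai"
    const directives = meta.content.split(",").map((d) => d.trim().toLowerCase());
    if (directives.includes("noai") || directives.includes("noimageai")) {
      return true;
    }
  }
  return false;
}

A content script running this on page load could then report the result, for example by messaging the background script so it can update the toolbar icon.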
I made this extension with the aim of helping "noai" tags become the industry standard for withdrawing consent.
Generative AI models use crawlers to scavenge for new images to add to their training datasets, often against the creators' will. In response, some artists use adversarial perturbation ("poisoner") tools such as Nightshade to deter crawlers from violating consent, increasing training costs for companies. With opposing interests and no formal collaboration, AI training costs will keep rising and artists will be forced to run resource-intensive software to poison their art, a lose-lose scenario.
Making these simple meta tags the standard for consent will give companies the means to work with artists, and no excuse not to. Websites such as DeviantArt and ArtStation have already made attempts, and more awareness is needed to realise this vision. For now, these tags are not yet widely recognised, so I still recommend that every artist who publishes work publicly use the above-mentioned poisoner tools. With collective effort, we will create enough obstacles to encourage collaboration.
GitHub: https://github.com/SirSaltySalmon/protect-my-art/tree/main
Rated 0 (1 user)
Permissions and data
Required permissions:
- Access your data for all websites
More information
Release notes for 1.1.2
Updated logic to fix a bug involving restricted pages (where no content script is injected).
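For context on the fix: browsers refuse to inject content scripts into restricted pages (about: pages, the add-ons store, and similar), so messaging such a tab fails. The following is a hypothetical TypeScript sketch of one way a background script might guard against that; the message name is made up and this is not the extension's actual code:

import browser from "webextension-polyfill";

// Ask the tab's content script whether the page withdraws consent.
// Returns null when no content script is present (restricted page).
async function queryTabConsent(tabId: number): Promise<boolean | null> {
  try {
    // "check-consent" is a hypothetical message the content script answers.
    return (await browser.tabs.sendMessage(tabId, { type: "check-consent" })) as boolean;
  } catch {
    // "Receiving end does not exist": no content script was injected,
    // so treat the page as unknown rather than letting the error propagate.
    return null;
  }
}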