
Protect My Art by Binh Khang Nguyen
Monitor websites for AI training protection tags (noai, noimageai) and help protect artists' work
No users
Extension Metadata
About this extension
A very basic extension that:
- Checks the webpage's robots meta tags for "noai" and "noimageai"
- Visually indicates whether the webpage withdraws consent for AI training (a minimal sketch of the check appears below)
This way, creators can verify that the pages hosting their uploaded artwork actually carry the consent withdrawal. Creators who wish to opt out of training are recommended to add the following line to the HTML <head> of, for example, their portfolio or blog:
<meta name="robots" content="noai, noimageai">
Or, if they are uploading to a platform that offers the option, to enable these tags on the webpages of their artworks.
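For reference, here is a minimal sketch of how a content script could perform this check. The helper name pageWithdrawsConsent and the message shape are assumptions for illustration, not the extension's actual source (see the GitHub link below):

// Content script sketch: scan the robots meta tags for "noai"/"noimageai".
function pageWithdrawsConsent() {
  // Match the name attribute case-insensitively.
  const metas = document.querySelectorAll('meta[name="robots" i]');
  for (const meta of metas) {
    // The content attribute is a comma-separated directive list,
    // e.g. "noai, noimageai".
    const directives = (meta.getAttribute("content") || "")
      .toLowerCase()
      .split(",")
      .map((d) => d.trim());
    if (directives.includes("noai") || directives.includes("noimageai")) {
      return true;
    }
  }
  return false;
}

// Report the result so the background script can update the toolbar icon,
// which is one way to "visually indicate" the page's status.
browser.runtime.sendMessage({ withdrawsConsent: pageWithdrawsConsent() });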
I made this extension with the aim of helping "noai" tags become the industry standard for withdrawing consent.
Companies training generative AI models use web crawlers to scavenge for new images to add to their training datasets, often against the creators' will. In response, some creators use adversarial-perturbation ("poisoner") tools such as Nightshade to deter crawlers from violating consent, which raises training costs for AI companies. With opposing interests and no formal collaboration, training costs will keep rising and artists will be forced to run resource-intensive software to poison their art: a lose-lose scenario.
Making these simple meta tags the standard for consent will give companies the means to work with artists, and no excuse not to. Websites such as DeviantArt and ArtStation have already made attempts, and more awareness is needed to realise this vision. For now, these tags are not yet widely recognised, so I still recommend that every artist who publishes work publicly use the poisoner tools mentioned above. With collective effort, we can create enough obstacles to encourage collaboration.
GitHub: https://github.com/SirSaltySalmon/protect-my-art/tree/main
Rated 0 by 1 reviewer
Permissions and data
Required permissions:
- Access your data for all websites
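This permission is broad because the extension has to read the robots meta tags of every page you visit. As a rough illustration, a content script with that reach is typically declared in the manifest as sketched below; the manifest version and the content.js filename are assumptions, not the extension's actual configuration:

{
  "manifest_version": 2,
  "name": "Protect My Art",
  "version": "1.1.2",
  "content_scripts": [
    {
      "matches": ["<all_urls>"],
      "js": ["content.js"]
    }
  ]
}

The "<all_urls>" match pattern is what surfaces as "Access your data for all websites" in the permissions prompt.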
More information
- Version: 1.1.2
- Size: 23.07 KB
- Last updated: 1 day ago (Sep 14, 2025)
- Related categories
- License: MIT License
- Version history
Release notes for version 1.1.2
Updated logic to fix a bug involving restricted pages (where no content script is injected).
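Restricted pages (about: pages, Mozilla domains, and the like) never receive a content script, so messaging them fails. Below is a minimal sketch of how a background script might handle that case, assuming a hypothetical "query-status" message that the content script answers; the actual fix in 1.1.2 may differ.

// Background script sketch: ask the tab's content script for its status.
// On restricted pages there is no receiving end, so sendMessage rejects;
// fall back to a neutral "unknown" state instead of showing stale results.
async function queryTabStatus(tabId) {
  try {
    return await browser.tabs.sendMessage(tabId, { type: "query-status" });
  } catch (e) {
    // No content script was injected into this page.
    return { withdrawsConsent: null };
  }
}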
More extensions by Binh Khang Nguyen