
The U.S. Senate has unveiled another AI protection bill, the latest in a series of similar initiatives, this time aimed at safeguarding the work of artists and other creatives.
Introduced as the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act), the new legislation would require more precise authentication of digital content and make removing or tampering with watermarks illegal, The Verge reported, under new AI standards developed by the National Institute of Standards and Technology (NIST).
The bill specifically requires generative AI developers to add content provenance information (identification data embedded within digital content, like watermarks) to their outputs, or to allow individuals to attach such information themselves. More standardized access to such information could make it easier to detect synthetic, AI-generated content like deepfakes and curb the use of data and other IP without consent. The bill would also authorize the Federal Trade Commission (FTC) and state attorneys general to enforce the new regulations.
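For illustration only, the sketch below shows one simple way machine-readable provenance data can be embedded in an image: storing a plain-text record in a PNG's metadata using Python's Pillow library. This is a hypothetical example, not the NIST standard the bill calls for; production provenance schemes such as C2PA use cryptographically signed manifests rather than an unsigned text field, and the function names here are invented for the sketch.

```python
# Minimal sketch: attach and read a simple provenance record in PNG metadata.
# Assumes Pillow is installed; the "provenance" key and record fields are
# illustrative choices, not part of any standard.
import json
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def attach_provenance(src_path: str, dst_path: str, generator: str) -> None:
    """Copy an image and embed a plain-text provenance record in its metadata."""
    record = {"generator": generator, "synthetic": True}
    meta = PngInfo()
    meta.add_text("provenance", json.dumps(record))
    with Image.open(src_path) as img:
        img.save(dst_path, pnginfo=meta)

def read_provenance(path: str) -> dict:
    """Read the provenance record back out of a PNG, if one is present."""
    with Image.open(path) as img:
        raw = img.text.get("provenance", "{}")
    return json.loads(raw)
```

A real-world scheme would sign the record so that removal or tampering, which the COPIED Act would make illegal, is detectable rather than silent.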
A regulatory pathway such as this could effectively help artists, musicians, and even journalists keep their original works out of the data sets used to train AI models — a growing public accessibility issue that's only been exacerbated by recent collaborations between AI giants like OpenAI and media companies. Organizations like artist union SAG-AFTRA, the Recording Industry Association of America, the News/Media Alliance, and Artist Rights Alliance have come out in favor of the legislation.
"We need a fully transparent and accountable supply chain for generative Artificial Intelligence and the content it creates in order to protect everyone’s basic right to control the use of their face, voice, and persona," said SAG-AFTRA national executive director Duncan Crabtree-Ireland.
Should it pass, the bill would make it easier for such creatives and media owners to set terms for content use, and provide a legal pathway should their work be used without consent or attribution.