Fake AI images sexualizing Taylor Swift spread to X, formerly known as Twitter, from a Telegram group dedicated to sharing "abusive images of women," 404 Media reported.

These images began circulating online this week, quickly sparking mass outrage that may finally force a mainstream reckoning with harms caused by the spread of non-consensual deepfake pornography.

At least one member of the Telegram group claimed to be the source of some of the Swift images, posting in the channel that they didn't know if they "should feel flattered or upset that some of these Twitter stolen pics are my gen."

While it's still unknown how many AI tools were used to generate the flood of harmful images, 404 Media confirmed that some members of the Telegram group used Microsoft's free text-to-image AI generator, Designer.

According to 404 Media, the images were not created by training an AI model on Taylor Swift's images but by hacking tools like Designer to override safeguards designed to stop them from generating images of celebrities.

Members of the group shared strategies for subverting these safeguards by avoiding prompts using "Taylor Swift" and instead using keywords like "Taylor 'singer' Swift." They were then able to generate sexualized images by using keywords describing "objects, colors, and compositions that clearly look like sexual acts," rather than attempting to use sexual terms, 404 Media reported.

404 Media and Ars were not able to replicate outputs based on recommendations in the Telegram group, and it's possible that Microsoft has already updated the tool to stop users from abusing Designer. However, images of Swift can still be generated using the recommended keyword hack.

So far, Microsoft has not verified that the images were created using any of its AI tools, but the company is taking steps to strengthen prompt filters to prevent future misuse in the meantime.

A Microsoft spokesperson told Ars that the tech giant is "investigating these reports" and has "taken appropriate action to prevent the misuse of our tools." The spokesperson also noted that Microsoft's Code of Conduct prohibits the use of Microsoft tools "for the creation of adult or non-consensual intimate content, and any repeated attempts to produce content that goes against our policies may result in loss of access to the service."

"We have teams working on the development of guardrails and other safety systems in line with our responsible AI principles, including content filtering, operational monitoring, and abuse detection to mitigate misuse of the system and help create a safer environment for users," Microsoft's spokesperson said.

Some members of the Telegram channel appeared amused to see the images spread, not just on social media but also on sites featuring celebrity nudes and stolen adult content, 404 Media reported. But others scolded members for sharing the images outside the group and risking the channel being shut down.
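The keyword workaround 404 Media describes reflects a well-known weakness of blocklist-style prompt screening: a filter that matches literal strings can be sidestepped by trivially rewriting the blocked phrase. The sketch below is purely illustrative; it is not Microsoft's actual filtering code, and the `naive_filter` function and `BLOCKED_NAMES` set are hypothetical names invented for demonstration.

```python
# Illustrative sketch of a naive blocklist prompt filter and why it fails.
# Hypothetical code for demonstration only -- not Microsoft's implementation.

BLOCKED_NAMES = {"taylor swift"}  # hypothetical blocklist entry


def naive_filter(prompt: str) -> bool:
    """Return True if the prompt should be blocked.

    Matches only literal substrings, so any rewrite of the
    blocked phrase slips through.
    """
    lowered = prompt.lower()
    return any(name in lowered for name in BLOCKED_NAMES)


# The exact phrase is caught...
assert naive_filter("portrait of Taylor Swift")

# ...but inserting a word between first and last name evades the literal
# match, mirroring the "Taylor 'singer' Swift" trick described above.
assert not naive_filter("portrait of Taylor 'singer' Swift")
```

The same limitation applies to the group's second trick: prompts describing "objects, colors, and compositions" rather than sexual terms give a keyword blocklist nothing to match. More robust moderation pipelines typically classify the generated image itself in addition to screening prompt text.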