Mon. May 6th, 2024

Apple has removed apps from the App Store that claimed to be able to create nonconsensual nude imagery, a move that signals Apple is now more willing to tackle this hazardous app category.


Generative AI's ability to create images from prompts has become a genuinely useful tool in photography and design. However, the technology has also been misused to create deepfakes and nonconsensual pornography.

Despite the danger, Apple had been remarkably hands-off; prior to this removal, it had done little to address what could become a major problem.

