The trending Lensa app - currently the top photo app in the Apple and Google Play stores - generates artistic edits based on user-uploaded reference photos, but its machine-learning technology appears to be creating unintentional nudes of its users.

"Ok so I put my hottest 20 pics into lensa instead of just the first 20 selfies I could find & it came back with a bunch of ai-generated nudes," one user wrote on Twitter. "To be clear, NONE of the photos I submitted included nudity, which the app specifically prohibits!"

That sentiment was echoed by dozens of others, mostly women, saying the app had automatically generated sexualized or outright nude photos of them, despite avoiding not-safe-for-work reference photos in their uploads.

While Lensa parent company Prisma Labs' CEO and co-founder Andrey Usoltsev told TechCrunch such images "can't be produced accidentally" by the app, he said it could be provoked to create nude images through "intentional misconduct," such as uploading nudes against the terms of service (which prohibit uploading content that is "obscene, pornographic, indecent, lewd, suggestive" or otherwise sexualized).

The industry has already started taking a stand. A security firm called Sensity says it recently discovered a network of deepfake bots on the chat app Telegram creating non-consensual nude images. The issue is that Lensa AI makes this perverted pursuit much easier to do, and to do at scale. Basically, anyone who can download a picture and cut and paste a face can hack the system into making non-consensual nudes.

Usually, software that has this capability is banned. GitHub and other tech services have already banned the sharing of code that can do this, and Google banned deepfakes from using its machine-learning platform. So, no need to worry then, because that'll fix the problem, surely… Still, the UK is already expecting an increase in troublesome AI-generated pics and is proposing laws to make the sharing of non-consensual nudes illegal.

This also creates an issue with the Lensa AI listing on the App Store. The Apple guidelines state that "apps containing pornographic material, defined by Webster's Dictionary as 'explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings', will be rejected", and that apps containing user-generated content that is frequently pornographic will also be rejected.

So, right now, on my iPhone, I have an app that just generated NSFW content for me, and it's available on the App Store with a 4+ age restriction. Granted, I gave it some NSFW images first, but this creates an "I'll show you mine, and you show me yours" moral, and possibly legal, dilemma. Especially as the app is creating new images that could potentially put people in prison if shared.

What Lensa AI has done is unwittingly create an app that is the embodiment of everything we deem morally dubious about AI. While innocent in nature, Lensa AI has created an app that can be used for immoral and potentially illegal purposes that differ from its intended use, and currently, Lensa AI has no control over how you use it.