Are your social media streams currently packed with portraits of your friends and acquaintances rendered in various artistic styles? It seems like just about everyone is hopping on the Lensa bandwagon right now, using the photo editing app’s “Magic Avatar” add-on to produce fantasy portraits for a mere $3.99.
Users simply upload 10 to 20 photos of their faces, and the app’s AI-powered image-generating tool does the rest. It may seem like harmless fun, but some artists say the AI is stealing their work. Even worse, the app has generated sexualized images of minors, and users are unwittingly signing away the rights to their own images.
Fantasy Portraits Ignite Ethical Concerns
Artificial intelligence often learns by scraping the internet for content. Lensa’s Magic Avatar function runs on Stable Diffusion, a text-to-image model trained to recognize patterns using LAION-5B, an online database of billions of images scraped from across the web, including sites where real artists upload their work, like DeviantArt, Behance, and ArtStation.
Some artists are recognizing their distinctive styles, and even the remnants of their signatures, in the AI portraits. Kim Leutwyler, a Sydney-based artist, told The Guardian she recognized almost every portrait she has ever shared on the internet in the Lensa results.
In an interview with The Daily Beast, Spanish illustrator Amy Stelladia says she learned her art was being used by AI apps after discovering a website called haveibeentrained.com, which allows artists to search the LAION dataset for their work. The tool also provides an easy way for artists to request that their art be removed from the AI-training network.
For artists, the problem is twofold. First, the AI is flooding the market with cheap art that exploits and devalues human creativity. “They are meant to compete with our own work, using pieces and conscious decisions made by artists but purged from all that context and meaning,” Stelladia says. “It just feels wrong to use people’s life work without consent, to build something that can take work opportunities away.”
Second, AI photo apps like Lensa are making bank while the human artists they draw from barely scrape by. By some estimates, Lensa is drawing in $1 million USD per day, and the artists seeing their work in its results aren’t even getting credited, let alone compensated.
Portrait Results Are More Than a Little Problematic
Some users have found that even when they submit source images that are fully clothed and show no skin at all, the results are often semi-nude and sexually suggestive. At MIT Technology Review, Melissa Heikkilä writes that out of 100 avatars she generated, 16 were topless and another 14 were barely clad. Heikkilä notes that she and another colleague of Asian descent seemed to get more images like these than their white colleagues who tried the app.
Lensa’s Magic Avatar has also generated nude art based on photographs of children, lightened Black users’ skin, and made users significantly thinner than they appear in the photos they uploaded.
AI training data like LAION is full of racist stereotypes, pornography, and even explicit images of rape. Of course, AI training data reflects the biases and prejudices of human artists and other content producers, but there doesn’t appear to be any attempt on the app creators’ part to moderate that data. While Stable Diffusion offers a filter on its dataset to limit graphic results, Lensa doesn’t appear to use it.
Lensa also doesn’t seem to enforce its policies prohibiting nudity and minors, and it doesn’t prevent users from uploading source images of people other than themselves. That makes it easy for anyone to exploit Magic Avatar and produce sexualized images of anyone they want, including children.
Signing Away the Rights to Your Own Image
Though the images are supposedly deleted automatically within 24 hours of being processed, Lensa uses them to train its AI to produce more portraits. Users aren’t compensated when the company uses their images for that purpose. The app thus becomes just another way to give corporations more control over our identities, and that doesn’t seem worth the four bucks.