First we had deepfakes, which could glue someone’s face onto someone else’s body. Then we had This Person Does Not Exist, a website that conjured a new person every time you refreshed the page. Then we had Generated Photos, a commercial stock photography site built entirely from AI-generated humans.
Generating realistic-looking people has been one of the biggest challenges in visual AI, but researchers are mastering the technique quickly. The latest example: Generated Photos—which currently does $15,000 a month in revenue selling a library of AI-generated stock models, according to the company—has released an update that not only generates an AI-built human on demand but also lets you position it. Through easily tunable controls, you can make a person frown, look to the left, or wear glasses. Almost like a photographer, you can use the website’s UI to nudge your subject into the exact pose you want.
“I’m thinking [of our platform as] Photoshop that edits the scene, not pixels,” says Ivan Braun, the founder of Generated Photos. “Instead, users should use the higher-level commands, pretty much as you would guide a human painter. I imagine this interaction: Could this guy be less excited? Give him a cellphone in one hand. Not this one—a modern one.”
No doubt, creating people from thin air is a stomach-churning proposition. With deepfakes, we saw rampant abuse, as celebrities and everyday people alike had their faces stolen and stuck into pornography. But Generated Photos doesn’t let you copy someone else. In fact, the company is trying to guard against some of the less savory aspects of identity theft online. A few months ago, Generated Photos released an Anonymizer tool that could scan your picture and create an intentionally unrecognizable copy for you to use on social media to avoid surveillance. This second face might have similar hair and skin color, but not your unique smile or eyes. The copy is something akin to a cousin rather than a twin. (Braun also says he works with police agencies when they reach out about how to identify fake photos.)
So how does the new tool work? You begin with a random face. You can select the sex (male or female). You can change the head pose by dragging a matrix in the direction you want the person to look. Then you can select all sorts of other options just by checking boxes and pulling sliders. You can change their skin and hair color. You can make people disgusted or sad, add reading glasses or makeup. You can even make them older or younger.
Once you’re done, you can buy your creation in a higher-resolution format to use however you’d like (including commercial use). The custom faces start at $9, with discounts given for volume orders.
Variety is important to Braun’s customers, who are typically looking for anonymous, tweakable faces—which can be useful to startups that want to put a human face on a chatbot, or even advertise a dating site, without enlisting an actual person. Braun claims to have a customer in a major social media company, which uses Generated Photos to help train an AI to spot fake photos. He also points to a food company distributing products in Asia that, he says, generated a face to skirt esoteric trademark packaging laws.
Other customers are technologists who need to generate new faces to train their own visual AIs. Still others are academic researchers, who purchase faces to be used in studies. To control for cultural bias, or to isolate a variable such as hairline, inventing a few humans with particular traits gives a researcher more control than using regular photography. Braun shared dozens of testimonials from customers across academia, and the use cases vary from teaching a course on eyewitness testimony in the criminal justice system to training a visual AI to recognize faces that are wearing a mask incorrectly during COVID-19.
Indeed, Braun says he built these customization features into the platform because customers requested it on day one. When it launched in 2019, the site offered a database of 10,000 models from various ethnicities, but they all appeared to be young—around age 25—with an attractive smile, looking at the camera head-on.
“One of the [customer] requests was, ‘For god’s sake, could they stop smiling for a second?'” says Braun.
So, after six months of work, which included two months retraining the AI at the heart of the service, the team developed the product you see here. Trying it out for myself, I can say it’s remarkable when it works. You feel like a god, making your own human. It’s rare to be able to control an opaque algorithm with such ease or fidelity.
But it is definitely buggier, with more visual aberrations than the site’s earlier library. That’s because the tool is being pushed much harder than it was before, with more complicated head angles and hairstyles. “It’s always a balance between the buggy and uniform,” says Braun. He also notes that, on the human face, noses are relatively simple to simulate, because most noses look alike. When it comes to something like hair, however, standards go out the window.
One hair issue I observed: while the Black natural hair movement is in full swing, the Black people I generated had hair that was flat-ironed and treated with a curling iron. Afros, locs, and braids are nowhere to be seen. When I asked if that was an oversight, and whether white-dominant datasets that trained the AI could be impacting Black representation, Braun insisted this wasn’t the case.
“Ironically, it’s the result of the opposite request,” he says. “Black people wrote to us complaining about generating stereotypic hair. We [should] have all varieties of hair types, they said, including the straight ones. Overreacting to this request could be the issue.”
In any case, the updated Face Generator is worth checking out, if only as a peek into the future of AI tools. Five years ago, the possibility of generating a realistic-looking human seemed like sci-fi. Now, not only can you do that, you can ask them to pose for you, too.
Next, Generated Photos wants to allow you to put its faces on full human bodies, which will both widen its addressable market for stock photography and push the war on what’s real one more step into confusion.