Have you ever noticed how selfies can make even the most delicate button nose look like Adrien Brody’s or Gérard Depardieu’s? A group of scientists from Princeton and Adobe Research has invented a solution.
Their photo-editing tool enables users to decide how far away the camera is from their face–after the photo has already been taken. That means the software can take your selfie–the one where your nose looks huge–and subtly redistribute the pixels so that the photo looks like it was taken from farther away. Goodbye, big nose (you can test it out at the project’s website).
Taking portraits from farther away is a technique often used by professional photographers, and it tends to make photos look more, well, professional. A 2007 study showed that the subjects of photos taken from about 2 feet away are generally considered to be more approachable, while subjects of those taken from about 13 feet away come across as smarter and more impressive.
These aren’t the camera’s distortions–according to Ohad Fried, the Princeton researcher who led the project, noses really do seem bigger when you’re physically closer to someone. But personal-space norms make us comfortable with certain people at certain distances: We’re more used to seeing a significant other up close and strangers from a few feet away. Fried called this amount of space a “canonical distance.” Because a selfie is taken at arm’s length, it’s usually closer than the canonical distance.
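The geometry behind this is simple pinhole-camera perspective: the nose tip sits a couple of centimeters closer to the lens than the rest of the face, so up close it gets magnified disproportionately. A minimal sketch (not the researchers' code; the distances and face measurements are illustrative assumptions) makes the effect concrete:

```python
# Hypothetical pinhole-camera sketch: why a nose looks larger relative
# to the face when the camera is close. All measurements are in meters
# and are rough illustrative values, not from the paper.

def projected_width(true_width, depth, focal=1.0):
    """Width of an object on the image plane under a pinhole model."""
    return focal * true_width / depth

def nose_to_face_ratio(cam_dist, nose_protrusion=0.02,
                       nose_width=0.035, face_width=0.15):
    # The nose tip sits nose_protrusion meters closer to the camera
    # than the plane of the cheeks, so it is magnified slightly more.
    nose = projected_width(nose_width, cam_dist - nose_protrusion)
    face = projected_width(face_width, cam_dist)
    return nose / face

selfie = nose_to_face_ratio(0.3)    # arm's-length selfie, ~30 cm
portrait = nose_to_face_ratio(1.5)  # portrait distance, ~1.5 m

# In the close shot the nose takes up a visibly larger fraction of the
# face; from portrait distance the ratio is close to the true 0.035/0.15.
print(selfie, portrait)
```

At 30 centimeters the nose's share of the face width is inflated by several percent over its true proportion; at a meter and a half the inflation nearly vanishes, which is the look the tool simulates.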
But manipulating photos to correct for that skewed perception is no easy task. “We as people are really sensitive and attuned to faces,” he says. “So everything, whether you want to improve photos, edit photos, anything where you have a face, it’s a really hard task. The slightest change will really pop up if it’s not natural.”
To create the editing method, Fried and his team combined a model for generating 3D human heads with a program from Carnegie Mellon researchers that identifies key points on a picture of a face. They then wrote an algorithm that lifts those 2D key points into 3D, effectively building a model of the person’s head. From there, the program adjusted the 3D head to match how a face looks when photographed from the desired distance, and warped the pixels of the flat image to agree.
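The final warping step can be sketched in a few lines. Assuming the landmark detection and 3D fit have already produced head-centered 3D points (the landmarks below are toy values, and this is a simplified stand-in for the paper's method, not its actual code), each landmark is projected from both the observed and the desired camera distance, and the difference drives the warp:

```python
# Hypothetical sketch of the warping step: project fitted 3D landmarks
# from two camera distances and compute the 2D displacement field that
# would be interpolated across the photo's pixels.
import numpy as np

def project(points_3d, cam_dist, focal=1.0):
    """Pinhole projection of head-centred (x, y, z) points; the camera
    sits on the z-axis, and z > 0 protrudes toward the camera."""
    x, y, z = points_3d[:, 0], points_3d[:, 1], points_3d[:, 2]
    depth = cam_dist - z
    return np.stack([focal * x / depth, focal * y / depth], axis=1)

def warp_field(points_3d, observed_dist, target_dist):
    """Per-landmark 2D displacement so the photo looks as if it were
    shot from target_dist instead of observed_dist."""
    src = project(points_3d, observed_dist)
    dst = project(points_3d, target_dist)
    # Re-scale so overall face size matches the original photo; only
    # the perspective-driven change in shape remains.
    scale = target_dist / observed_dist
    return dst * scale - src

# Toy landmarks (meters): nose tip protrudes, ears sit behind the face plane.
landmarks = np.array([[0.00, 0.0,  0.02],   # nose tip
                      [0.07, 0.0, -0.03],   # right ear
                      [-0.07, 0.0, -0.03]]) # left ear

field = warp_field(landmarks, observed_dist=0.3, target_dist=1.5)
```

Moving the virtual camera back leaves the nose tip roughly in place while the ears spread outward, which is exactly the "smaller nose" effect: the features at the face's edges regain the relative size the close-up shot had stolen from them.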
The editing tool isn’t specifically for reducing nose size in selfies–that’s just a nice side effect for some. Instead, Fried hopes to provide amateur photographers with more options, enabling them to change their facial features in a natural way that simulates what the photo would look like if it was simply taken from a different distance. “Many people would like the selfie effect because they prefer themselves with larger noses,” Fried said. “The point is to empower users, to get all these different effects without having to move the camera.”
Fried, along with the Princeton professor Adam Finkelstein and Eli Shechtman and Dan Goldman of Adobe Research (though Goldman is now at Google), presented the research at this year’s Siggraph conference. The team built a demo for people to try out as well, and the tool would no doubt make for a popular app–but Fried isn’t interested in pursuing that for the time being.
The next step along this line of inquiry is a digitally re-created version of the iPhone’s Live Photos feature, in which the phone’s camera captures a second and a half of video before and after the picture is taken.
“You’ve seen the Harry Potter movies?” Fried says. “That’s the dream. If we can do more, a slight rotation of the head, breathing a bit–if we can add these minor movements, we could bring life to still photographs.”
But for now, bulbous-nosed selfie-takers can look a little more normal.