Whenever I open Instagram these days, it feels like I’m being watched. The minute I follow a new person or double tap a landscape or cat pic, the photo-sharing app—to which I’ve long been addicted—picks up on it and starts showing me remarkably similar images. It’s almost creepy.
As it turns out, Instagram’s algorithms are indeed keeping an eye on me, busily drawing a complex map of my likes, follows, and other in-app behavior, and that of the people I follow as well. As the app learns, its Explore tab gets better at recommending photos and videos to me. I can’t help but take the bait. Next thing you know, I’m barreling down some new visual rabbit hole and following six more people.
The Explore tab is essentially one giant recommendation engine. But the ever-evolving methodology Instagram uses to sort through one of the world’s largest networks of photographs, comments, and likes is far more complex than your standard “if you like that, you’ll like this” logic. And, surprisingly, Instagram says that despite the image recognition capabilities of parent company Facebook, there’s no machine vision involved.
Explore, the section of Instagram designated by a little magnifying glass, is where the app suggests images and videos you might appreciate, and allows you to search for more by tag, username, or place. This infinite and constantly refreshing grid of imagery does its best to reflect your interests. Mine shows a blend of musical instruments like drums and synthesizers, various animals, hazy film photography, tattoos, weird psychedelic art, food, beach scenery, and the occasional meme. You might see political rallies, yoga poses, makeup tutorials, shirtless Justin Bieber pics, corgis, vintage cars, and whatever it is you happen to be into—even if you didn’t know that you were.
For either of us, a few of the suggestions may seem oddly off-base. But for the most part, they’re increasingly likely to draw us in. And a week or two from now—depending on how much you swipe and tap your way through Instagram—this page could look totally different. With tens of millions of images being shared on Instagram in a typical day, how does the Explore tab figure out what to show us?
The process is twofold. First, the algorithms embark on a content quest known within the company as “sourcing.” To machete its way through Instagram’s massive jungle of image uploads, the system blends signals from your own behavior—which pictures you’ve liked, who you follow, what kinds of images you share with others—with data about the people you follow, such as what types of posts they like, to narrow millions of images down to just a few hundred. It even analyzes the activity of people who follow the same accounts you do. Aided by Instagram’s machine-learning prowess, the process gradually gets more sophisticated and, for many users, eerily spot-on.
“You base [the predictions] off an action, and then you do stuff around that action,” says Layla Amjadi, a product manager at Instagram who works on Explore. “Following is an explicit action. Liking is another explicit action. You have all these different permutations of what you can look at.”
For example, if you and I both happen to follow six of the same people, perhaps you’ll like a photo posted by a seventh account that I follow, but that you’ve never heard of. Instagram’s algorithm will test this theory by slipping one of their posts into your Explore page. This is why if you scroll through Explore enough, you may occasionally see a familiar face—say, a friend of a friend, or somebody you recognize from around town. Instagram is mining the complex, multilayered social web between users and trying to learn from how those people interact with one another.
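That co-follow logic can be sketched in a few lines of Python. To be clear, this is a toy illustration of the general idea, not Instagram's actual system; the `follows` mapping and the overlap-based scoring are stand-ins of my own:

```python
from collections import Counter

def cofollow_candidates(user, follows):
    """Suggest accounts followed by users whose follow lists overlap
    with `user`'s own. A toy collaborative-filtering sketch: the more
    follows another user shares with you, the more weight their other
    follows get as suggestions."""
    my_follows = follows[user]
    scores = Counter()
    for other, their_follows in follows.items():
        if other == user:
            continue
        overlap = len(my_follows & their_follows)
        if overlap == 0:
            continue
        # Accounts the similar user follows that you don't yet follow
        for account in their_follows - my_follows - {user}:
            scores[account] += overlap
    return [account for account, _ in scores.most_common()]
```

In the article's example, if you and I share six follows, an account only I follow would surface as a candidate for you, weighted by how much our follow lists overlap.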
Instagram can also leverage the mountains of data and social links its users generate on Facebook to suggest photos, people, and ads. Remarkably, however, the Explore algorithm does not use Facebook’s image recognition technology to understand the contents of images. Instead, the recommendation algorithms take clues from things like hashtags, captions, and the overarching theme of the account itself (which it gleans from the account description and user behavior around it). An Instagram account dedicated to food, cats, or guitars, for instance, is going to be filled with photos of those particular things.
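A minimal sketch of that text-based approach might infer an account's theme from the hashtags in its recent captions—no pixels involved. This is a simplification I'm supplying for illustration, not Instagram's method:

```python
from collections import Counter
import re

def infer_account_topics(captions, top_n=3):
    """Guess an account's theme from the hashtags in its captions,
    standing in for the text signals (hashtags, captions, account
    description) described above. No image recognition involved."""
    tags = Counter()
    for caption in captions:
        tags.update(tag.lower() for tag in re.findall(r"#(\w+)", caption))
    return [tag for tag, _ in tags.most_common(top_n)]
```

An account that tags most of its posts `#cats` reads as a cat account, no matter what the photos actually contain.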
The system then tries to rank the images in terms of how likely they are to be relevant to you—and thus, how likely you are to take a new action, such as liking the photo or following the account. These predictions are made using yet another analysis of your past behavior on Instagram—and then used to assign a numerical score to each image so the system can prioritize which images to show you. By this point, Instagram’s sea of millions of videos and photos is narrowed down to just a few dozen and personalized just for you. The app also makes an effort to declutter things further by blocking certain content; Instagram recently started “shadowbanning” overly spammy accounts, preventing their photos from appearing on the Explore tab.
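The ranking step described above—score every candidate, filter out blocked accounts, keep the top handful—could look something like this. The signal names and weights here are hypothetical placeholders, not anything Instagram has disclosed:

```python
def rank_candidates(candidates, weights, blocked_accounts, top_k=24):
    """Score sourced posts and keep only the best few. Each post carries
    a dict of engagement signals (hypothetical names); `weights` stands
    in for a learned model of how predictive each signal is."""
    scored = []
    for post in candidates:
        if post["account"] in blocked_accounts:  # e.g. shadowbanned spam
            continue
        score = sum(weights.get(sig, 0.0) * val
                    for sig, val in post["signals"].items())
        scored.append((score, post))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [post for _, post in scored[:top_k]]
```

The weighted sum is the simplest possible stand-in for the numerical score the article mentions; the point is the funnel shape: millions of posts in, a few dozen ranked posts out.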
One of the challenges of the Explore tab is striking a balance between personalization and serendipity.
“You may want a good amount of cat content, but don’t want the entire experience to be cats,” Amjadi says. “It’s not just replaying the same stuff to people, but also helping them broaden their horizons in a way that they might find interesting, too.”
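One simple way to enforce that balance—again, my own sketch, not Instagram's—is a quota pass over the ranked list that caps how many posts of any one topic get through:

```python
def diversify(ranked_posts, max_per_topic=3):
    """Limit how many posts of any single topic survive, so a cat-heavy
    like history doesn't turn the whole Explore grid into cats.
    Posts are assumed to arrive already ranked by relevance."""
    counts = {}
    result = []
    for post in ranked_posts:
        topic = post["topic"]
        if counts.get(topic, 0) < max_per_topic:
            result.append(post)
            counts[topic] = counts.get(topic, 0) + 1
    return result
```

Lower-ranked posts from underrepresented topics end up surfacing ahead of yet another top-ranked cat photo, which is roughly the "broaden their horizons" effect Amjadi describes.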
Since Instagram’s 700 million users are spread out across the globe, the Explore page often surfaces things that may appeal to you, but also offers a peek into another culture elsewhere. Maybe you’re into garage rock, but you’ve never heard of this up-and-coming band from Australia that just played a gig in Sydney last night. Check them out. Or maybe you’re into makeup how-to’s, but have never seen eyeliner done quite the way that this teenager in São Paulo did it on her Instagram feed this morning.
The Explore tab has come a long way since Instagram first launched in 2010. Originally called Popular, this part of the app began as a showcase of the network’s most popular images at any given time. Not surprisingly, the experience got boring pretty fast. So the company began personalizing it in 2014, two years after Facebook purchased it. Explore, as it was now called, worked: In one early A/B experiment, the addition of personalization showed a 400% increase in engagement. The following year, the company added videos and trending tags, improved the search functionality, and put more of an emphasis on viewing images by location. Since then, it has swapped features in and out and tweaked details, all the while quietly improving the personalization in the background.
“It’s always iterative,” says Amjadi. “People’s behavior changes. The ecosystem changes. The content being produced changes. Everything is changing, so you can’t just do this and let it sit.”