Want to freak out a bunch of Twitter users? Easy: Just give them the sense that Twitter is about to change in some fundamental way, without providing much in the way of concrete details.
That's what BuzzFeed's Alex Kantrowitz did last Friday when he reported that the service was going to start algorithmically reordering tweets as soon as next week. The prospect inspired everyone from random users to William Shatner to tweet little cries of alarm, accompanied by the bleak hashtag #RIPTwitter.
Twitter CEO Jack Dorsey responded by attempting to calm the masses without denying that change was on its way.
Well, "next week" is here, and so is change. Twitter is rolling out a revised version of the timeline that indeed shuffles around some tweets into an order that isn't purely reverse-chronological—but it doesn't blow away the old format in the manner that had some users writing obituaries for the service.
I spoke with Michelle Haq, a Twitter product manager in charge of the timeline, about what's new. Without further ado, some questions and answers:
It's going to start showing some tweets at the very top of your stream out of sequence—interesting ones that were tweeted since you last checked in, while you weren't perusing the service. "We know this is a problem," says Haq. "Leaving Twitter means missing great stuff."
By attempting to push up high-value tweets, Twitter is also addressing a common complaint about the service: Almost a decade into its existence, it can still be a challenge for newcomers to figure it out. These changes, Haq says, will make Twitter "more accessible to a wider variety of people."
In a relatively rare example of a major consumer web service letting its users tiptoe their way into something new, this feature is starting out as an option you can turn on in Twitter's iOS, Android, and web incarnations. (It's called "Show me the best tweets first" in settings.)
In the "coming weeks," Twitter will turn on the feature for users by default, and put a notification in the timeline when it does, Haq says. But even then, you'll be able to turn it off again.
Of course, Twitter's expectation is that most people will like the timeline tweak—or at least not hate it—once they're exposed to it. "We have the opt-out because we also prioritize user control," Haq says. "But we do encourage people to give it a chance."
Isn't This An Awful Lot Like The "While You Were Away" Feature That The Service Launched More Than A Year Ago?
Absolutely. This is less of a sea change than a version 2.0 of "While You Were Away," which showed some tweets at the top of your feed that would have otherwise come and gone. And it's driven by the same principle: Unless you read your Twitter stream in its entirety, you probably miss out on things you would have liked.
Haq gave me an example—a playful tech-support exchange between Edward Snowden and Dorsey from last October, which even someone who followed both of them could have overlooked.
"The tweets we surface are the ones we think will matter," Haq says. The service is attempting to cater to the interests of each user: If you follow a lot of football players, for instance, you'd be more likely to see tweets about football. But even then, "they're tweets you would have seen otherwise—we're just organizing them to put them at the top," Haq says. (This would seem to negate my own personal fear about my timeline being overrun with Kardashians or other topics I haven't personally curated into my Twitterworld.)
There are a number of pretty obvious signals Twitter can use to determine whether a particular tweet is notable. A tweet with copious retweets, likes, and replies, for instance, is theoretically more engaging than one whose existence goes unacknowledged by other users. Beyond that, "we're not talking about the technical details" of the algorithm involved, Haq says.
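Twitter hasn't disclosed how it actually weighs those signals, but the idea of scoring a tweet by its public engagement counts can be sketched in a few lines. Everything below—the weights, the field names, the formula itself—is an illustrative assumption, not Twitter's real algorithm:

```python
# Purely hypothetical sketch of engagement-based tweet scoring.
# Twitter has not published its algorithm; the weights here are
# invented for illustration only.

def engagement_score(retweets: int, likes: int, replies: int) -> float:
    """Toy score: weight each public signal and sum them."""
    return 2.0 * retweets + 1.0 * likes + 1.5 * replies

tweets = [
    {"text": "quiet tweet", "retweets": 0, "likes": 1, "replies": 0},
    {"text": "popular tweet", "retweets": 40, "likes": 120, "replies": 15},
]

# Rank candidate tweets by the toy score, highest first.
ranked = sorted(
    tweets,
    key=lambda t: engagement_score(t["retweets"], t["likes"], t["replies"]),
    reverse=True,
)
print(ranked[0]["text"])  # the more-engaged tweet ranks first
```

The real system presumably folds in many more signals—who you follow, what you've engaged with before—which is exactly the part Twitter declines to detail.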
"The main value proposition is to make sure you never miss anything important," Haq says. "It will show up differently for different people." People who follow vast quantities of extremely prolific Twitter users will get a different experience from someone who follows just a handful of sporadic tweeters. But as a ballpark estimate, she says that a typical user might see a dozen tweet highlights at a time. They'll be in chronological order themselves, just placed above the full-blown standard reverse-chronological timeline.
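The assembly Haq describes—a handful of highlights on top, the classic stream underneath—can be sketched as pseudocode-style Python. This is a reading of the article's description, not Twitter's implementation; the scoring field and the cap of a dozen highlights are assumptions drawn from the text above:

```python
# Hypothetical sketch of a "best tweets first" timeline, based only
# on the behavior described in this article. Field names ("score",
# "time", "id") are invented for illustration.
from typing import Dict, List

def build_timeline(tweets: List[Dict], n_highlights: int = 12) -> List[Dict]:
    # Pick the most "engaging" tweets as highlights...
    by_score = sorted(tweets, key=lambda t: t["score"], reverse=True)
    highlight_ids = {t["id"] for t in by_score[:n_highlights]}
    # ...order the highlights chronologically among themselves...
    highlights = sorted(
        (t for t in tweets if t["id"] in highlight_ids),
        key=lambda t: t["time"],
    )
    # ...and keep everything else in the classic reverse-chronological order.
    rest = sorted(
        (t for t in tweets if t["id"] not in highlight_ids),
        key=lambda t: t["time"],
        reverse=True,
    )
    return highlights + rest

tweets = [
    {"id": 1, "time": 1, "score": 5},
    {"id": 2, "time": 2, "score": 90},
    {"id": 3, "time": 3, "score": 80},
    {"id": 4, "time": 4, "score": 1},
]
timeline = build_timeline(tweets, n_highlights=2)
print([t["id"] for t in timeline])  # → [2, 3, 4, 1]
```

Note that refreshing, per Haq, would discard the highlights block entirely and start the classic timeline at the top—in this sketch, that's just returning `rest` alone.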
That's right, it's still there. And if you refresh the stream, the highlights will disappear altogether, and the classic timeline will start at the top, Haq says.
Only if you examine their time stamps carefully. They won't be cordoned off in a way that makes their repositioning glaringly obvious, Haq told me.
Naw. Twitter likes to call itself "the pulse of the planet," and as far as I can tell, everyone responsible for its fate believes that preserving the real-time flavor is essential, now and forever. That doesn't mean that some degree of out-of-sequence curation is unacceptable, though. After all, retweets are out-of-sequence tweets, and I don't know of a single Twitter user who'd contend that they're a violation of the service's spirit.
"We're going to be listening very carefully to users as we announce and launch this and continue to iterate and make it better and better as we go," Haq says. "This doesn't change the essence of Twitter as the best way to find out what's happening in your world."
On a scale of one to 10, with one being no change and 10 being Twitter becoming indistinguishable from Facebook? This strikes me as about a 2.35. Maybe a bit more than that if it's a precedent allowing Twitter to fiddle with the timeline in more radical ways at a later date. But even then, the real precedent was when the company introduced "While You Were Away" back in January 2015.
Heck, no. Anyone who's completely shocked by the move hasn't been paying attention. As long ago as September 2014, Twitter execs were floating the idea that the timeline's organization should include an algorithmic element. A few months later, "While You Were Away" launched. Last July, Dorsey said that the company "continue[d] to question" the reverse-chronological format. And in December, it became clear that some users were seeing an experimental version of the timeline that fiddled with the order of tweets.
A few reasons spring to mind:
Historically, Twitter has been slow to evolve. Its users aren't accustomed to it being subject to constant revision—and then revision to the revision—of the sort that's been a staple at Facebook since the beginning. So when change does come to Twitter, or even the mere possibility of change, the effect can be more jarring.
Twitter is especially personal. I like Facebook, but I'm keenly aware that I'm one of 1.5 billion people in a giant gated community owned and operated by someone else. Twitter, with its tiny set of features—140-character messages, photos, very short videos—feels more like it's mine. And if you feel like something is yours, you're naturally more skeptical about it evolving.
It's not obvious that computers can tell which tweets matter. I know which tweets matter most to me, and much of the time, I care for reasons that can't be expressed in terms of keywords or quantity of retweets. Twitter's use of the phrase "best tweets" in describing the new feature may be overselling it; it's possible that "potentially worthwhile tweets" would be more accurate.
People associate algorithmic curation with an attempt to clone Facebook. One of the best things about Twitter is that it's strikingly different from Facebook in most respects that matter. And Facebook's algorithmic curation has never felt the least bit magical. But whatever you think of it, today's rejiggering of tweets is not going to result in a service that feels like a Facebook knockoff. It's still going to be Twitter.
Clearly not. "The fact of the matter is that improving Twitter means changing Twitter," Haq says. With any luck, most of the people who were so worked up over the weekend will conclude that today's tweaks aren't so catastrophic after all. And maybe future rumors of change—10,000 character posts, anyone?—won't lead quite so many folks to reflexively declare the end of Twitter as they knew it.