Why so many wireless carriers seem to have “America’s best network”

The independent companies that test networks use different approaches—and come to different conclusions.

[Photo: Nilanjan Paul/Pexels; milindri/iStock]

By Rob Pegoraro

The results are in, and they show Verizon Wireless runs America’s best mobile network. Except for the results that saw Verizon and T-Mobile tied for first place. Or the earlier results that crowned AT&T.

In recent weeks, three key studies have reached conflicting conclusions about the top four nationwide carriers, which include Sprint. But reports by RootMetrics, Opensignal, and PCMag that, respectively, gave top honors to Verizon, T-Mobile, and AT&T aren’t wrong—not if you understand how they were put together.

The big schism in network testing is over how it’s done. In drive testing, vehicles roam the country with phones that monitor network performance. Crowdsourcing, by contrast, relies on free apps that people install on their own phones to run similar tests.

RootMetrics relies on the first approach, supplemented by walking tests; its latest report touts 3,915,800 tests over 236,764 miles driven. PCMag does only in-car testing at 12 to 20 locations in 30 cities, for some 60,000 tests total. Opensignal’s crowdsourced app data, meanwhile, yielded 5,632,817,140 measurements for its latest U.S. study.

Drive testing promises a more scientific approach, without uncertainty over the location or volume of tests.

“My belief is that systematic network testing of the kind that RootMetrics performs is the most authoritative,” says Wave7 Research lead analyst Jeff Moore. He adds that Root, unlike its rivals, tests voice and messaging performance in addition to data.

But budget and time realities limit how broad drive testing can be. “We hit 30 major metro areas and draw conclusions based on those, but if you ask me to talk about Des Moines, I’m going to have to guess based on nearby locations,” says PCMag lead mobile analyst Sascha Segan.

You also can’t send cars or researchers into people’s homes or offices, while crowdsourced testing can cover those spaces and uncover in-building reception issues. That’s why RootMetrics also feeds crowdsourced data from its mobile apps into its coverage maps, notes product and benchmarking head Kevin Hasley.

Crowdsourcing can be skewed by people who only open a test app when they’re angry (or thrilled) with their network, so Opensignal’s apps also capture data automatically, at regular intervals, without user intervention.

“Automatically initiated testing avoids this issue and still means we test across the complete range of smartphones used, at any time, across all of the locations where users spend their time,” wrote Ian Fogg, Opensignal’s vice president of analysis, in an email forwarded by a PR rep.
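To see how that self-selection bias plays out, here’s a minimal Python sketch. Every number in it is invented for illustration; this is not Opensignal’s actual methodology, just the general statistical idea of user-initiated versus automatic sampling:

```python
import random

random.seed(42)

# Hypothetical: every network session has a true quality score near 70/100.
sessions = [random.gauss(70, 15) for _ in range(100_000)]

def user_runs_test(quality):
    """Users mostly open a speed-test app when service feels unusually bad."""
    if quality < 50:               # frustrated user
        return random.random() < 0.30
    if quality > 90:               # delighted user showing off
        return random.random() < 0.10
    return random.random() < 0.01  # ordinary session, rarely tested

# Self-selected sample: only sessions where the user chose to run a test.
manual_sample = [q for q in sessions if user_runs_test(q)]

# Automatic sample of the same size: every session equally likely.
auto_sample = random.sample(sessions, len(manual_sample))

def mean(xs):
    return sum(xs) / len(xs)

print(f"true average quality:       {mean(sessions):.1f}")
print(f"user-initiated sample mean: {mean(manual_sample):.1f}")  # skews low
print(f"automatic sample mean:      {mean(auto_sample):.1f}")    # near truth
```

With these made-up probabilities, the self-selected sample reads several points below the network’s true average, while the automatic sample lands right on it.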


Fogg’s “complete range” phrase brings up a subtle difference between drive-test and crowdsourced data: the phones used.

Both RootMetrics and PCMag stick to a single model that they think best supports the four carriers’ different frequencies. But they didn’t choose the same device: While PCMag used Samsung’s current Galaxy S10, Root stuck with the older S9.

In Root’s tests, the S10 “didn’t perform as well as the S9,” says Hasley, adding that later firmware updates fixed those issues and would let the company switch to the S10 for its next tests.

Rather than focusing on one phone, crowdsourced tests draw data from a wide array of models. “If you want to keep an older phone and switch carriers, or get a refurbished phone, then crowdsourced testing may give you a more accurate view,” says Susan Welsh de Grimaldo, director of service provider strategies at Strategy Analytics, in an email. But in many cases, the standards your particular phone supports can constrain performance. “AT&T’s performance this year was particularly dependent on some new modem technologies, so phones such as the iPhone 7 might not see that performance,” warns PCMag’s Segan. “And a lot of T-Mobile’s newer rural coverage is dependent on Band 71 support, so if you don’t have a B71-capable phone, it’s a whole different map.”

All of this can get complex enough that it’s tempting to fall back on a single metric: download speed. But reliability matters more. When Qualcomm does its own network drive tests, its key metrics are “coverage and capacity,” said Kirti Gupta, vice president of technology and economic strategy, at an event in Washington. In that light, the composite scores of RootMetrics and PCMag, and Opensignal’s LTE-availability metric, are the figures to focus on.
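As a rough illustration of why a composite can crown a different winner than a raw speed leaderboard, here’s a short sketch. The carrier names, metrics, and weights are all hypothetical; none of the testers publishes a formula this simple:

```python
# Hypothetical carriers: one fast but spotty, one slower but steadier.
carriers = {
    #            (median download Mbps, LTE availability %, call-success %)
    "FastNet":   (53.3, 88.0, 96.5),
    "SteadyNet": (36.3, 97.5, 99.3),
}

def composite(speed_mbps, availability_pct, call_success_pct):
    # Normalize speed to 0-100 (treat 60 Mbps as full marks), then weight
    # the reliability metrics more heavily than raw speed.
    speed_score = min(speed_mbps / 60 * 100, 100)
    return 0.15 * speed_score + 0.50 * availability_pct + 0.35 * call_success_pct

for name, metrics in carriers.items():
    print(f"{name}: composite score {composite(*metrics):.1f}")
# FastNet wins on raw speed, but SteadyNet wins the composite (92.6 vs 91.1).
```

Under these invented weights, the slower but more dependable network comes out on top, which is exactly the trade-off the composite rankings are designed to capture.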

“Reliability and consistency of speed is more important to a good overall experience than top download speeds,” says de Grimaldo. So while wireless carriers may like to brag about their download ratings, the reality is simpler: If you don’t have to think too much about your wireless service, then you had the right idea when you picked yours.
