Verizon gets bullish on 5G (perhaps prematurely)

Verizon was one of the first operators to make a big commitment to LTE, so it makes sense that it would come out as an early backer of LTE’s successor 5G. Today Verizon announced it would begin field trials of honest-to-god 5G networks in 2016, much earlier than any other global operator.

There’s only one problem: No one knows what the heck 5G is yet.

Image Credit: Verizon

The standards bodies that define these things have just started the process of determining what 5G will be: what it will accomplish, the kind of radio technologies it will use, and the spectrum it will traverse. The entire process could take several more years to finish, which would put Verizon’s 5G network trials well ahead of an actual 5G definition. (Check out my earlier post on the ongoing saga of 5G for more details.)

So what is Verizon getting at exactly? Most likely Verizon is signaling its intentions to join the growing research effort surrounding 5G, committing its resources to testing out candidate technologies, frequencies and use cases for our next generation networks. The real commercial networks that Verizon customers would actually connect to won’t emerge until years later. No matter how aggressive individual operators want to be, they’re still limited by the same standards and product development timelines all operators face.

That said, Verizon has been known to light fires under standards makers in the past. Despite being confined to a single country, Verizon is one of the world’s most powerful operators. It wasn’t the first to launch LTE – it wasn’t even the first in the U.S. – but it was the first to launch it on a large scale. That early backing helped bring LTE to market years earlier than was originally expected.

The mobile industry is targeting 2020 for 5G’s debut, but an increasing number of network makers and operators are making bold claims about delivering 5G networks years sooner. That could just be posturing, but who knows? If a large portion of the mobile industry really wants 5G sooner rather than later, they might just have the clout to force the issue.


Are Rio de Janeiro’s mobile networks ready for the Olympics?

Last month OpenSignal released its State of Mobile Networks for Brazil, but given the 2016 Summer Olympics are less than a year away, we thought it a good idea to drill down into that data to see how the host city, Rio de Janeiro, performed specifically. Next summer, hundreds of thousands of people will descend on Rio, and you can bet they’ll want to text, Tweet and Instagram their experience to the world.

The good news is that Rio’s mobile networks are in decent shape. The country’s five major operators generally provided better 3G and 4G coverage and speeds in Brazil’s second largest city than they offered to the country as a whole.

3G/4G Coverage


Let’s start with coverage. While there are still some places in Rio where OpenSignal’s app could only detect a 2G signal, the city’s mobile data infrastructure is widespread. The country’s fourth largest operator, Oi, scored the lowest, providing a 3G or better connection only 71 percent of the time. All of the other operators posted coverage numbers over 80 percent, with Nextel able to provide a 3G mobile data link a whopping 96 percent of the time.

LTE Coverage


There are still some sizable gaps in Rio’s LTE coverage, though it’s important to remember that 4G is still relatively young in South America. While our nationwide report found that no operator could provide an LTE signal more than half the time, several of those operators offered considerably better coverage in Rio itself. Vivo customers with LTE phones were able to connect to the 4G network 66 percent of the time, while Claro customers’ connection rate stood at 60 percent. Nextel’s LTE network, which only went online in June, is already beating out Oi’s and TIM’s in terms of coverage.

LTE Download Speeds


LTE may not be available everywhere in the city, but where it is, speeds are definitely impressive. Claro, Oi and Vivo averaged download speeds over 17 Mbps, which places their networks among the fastest in the world — at least for the time being. As more Brazilians trade out their 3G smartphones for LTE versions, we’ll likely see average 4G data rates fall as more devices compete for the same capacity. Nextel was the big exception in this category. Though it’s built considerable 4G coverage in Rio, its total data capacity appears to be severely limited. It averaged just 2.8 Mbps in 4G download tests.

3G Download Speeds


When mobile users fell out of LTE coverage onto the 3G network, speeds dropped off dramatically, though four of the five operators were still able to supply connections of 1 Mbps or better.

Time With No Signal


Even in places where 3G wasn’t available, Brazil’s operators were able to provide some kind of signal in most cases. The amount of time mobile users spent without any network connection whatsoever was 3 percent or lower for every operator, and Vivo performed particularly well in this category. Its customers were at a loss for a signal only 1.6 percent of the time.
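To see how these numbers combine in practice, here’s a rough illustration (figures approximate, loosely based on the Vivo results above) of the blended everyday speed a user would see when time on each network tier is weighted by its average speed:

```python
# Illustrative only: approximate time-on-network shares and average speeds
# for a Vivo-like operator, drawn loosely from the charts above.
time_share = {"lte": 0.66, "3g": 0.32, "none": 0.02}   # fraction of time on each tier
speed_mbps = {"lte": 17.0, "3g": 1.5, "none": 0.0}     # assumed average download speeds

# Expected blended speed = sum of (time share x average speed) per tier
expected = sum(time_share[t] * speed_mbps[t] for t in time_share)
print(f"Expected blended speed: {expected:.1f} Mbps")
```

The takeaway: strong LTE averages dominate the everyday experience only when LTE coverage is high, which is why coverage and speed have to be read together.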
Brazil’s operators still have another year to prepare before Olympic athletes and fans hit Rio en masse – and they’ll probably need the time. Though Rio’s mobile infrastructure seems in good shape today, nothing can quite bomb a network like playing host to a major global sporting event. In particular you can expect to see Brazil’s fast 4G speeds take a hit as both more Brazilians and an influx of tourists load up the country’s new LTE networks.

That said, there’s plenty Brazil’s operators can do to bolster their networks in the next 11 months. They can add more LTE coverage and bandwidth by upgrading more towers and adding more 4G capacity to existing towers. They could surgically add capacity to key Olympic venues and neighborhoods using small cells and Wi-Fi. And when the Olympic events actually kick off, they’ll likely deploy temporary towers (cells on wheels) throughout the city to handle the additional traffic. U.K. operators seemed to cope quite well with the onslaught during the 2012 Olympics in London. Let’s see if Brazil’s can do the same.

For this analysis, OpenSignal extracted all tests conducted by users of our Android and iPhone apps within the city limits of Rio de Janeiro, which encompass all 32 Olympic competition venues and the four principal neighborhoods where the games will be focused. In all, we took 13.3 million measurements from 7,353 different smartphones over the three-month period between May 1 and July 31.


Vodafone ads claiming its voice coverage is “unbeatable” are banned

Vodafone has drawn the ire of the U.K.’s advertising overseers for a marketing campaign that claims the operator is “unbeatable at connecting your calls.” The Advertising Standards Authority on Wednesday ruled that Vodafone had to stop running the web and print ads because it couldn’t substantiate its sweeping boast of having the best call coverage in the country.

Vodafone didn’t just base its claim on wishful thinking. It hired a consultant to test its networks in 26 towns and cities, but the ASA’s beef was with the methodology used to pick those locations. The tests covered the West Midlands, the North West of England, West Yorkshire and London, leaving out Wales, Northern Ireland, southeast, northeast and southwest England, as well as most of Scotland. Basically, the ASA accused Vodafone of cherry-picking its test locations to guarantee favorable results.

Vodafone backed up its unbeatable call claims in London, but what about the rest of the U.K.? Photo Credit: Flickr user Bharat Rawail

This is perhaps a touchy subject for OpenSignal to weigh in on, considering we’re in the business of providing network metrics, which many operators use to back up their own ad claims. We think our crowdsourced testing regimen provides a far more extensive data sample and methodology than any one-off study, but this does raise an interesting issue about applying data analytics to mobile networking.

Broad studies produce general results, which can give us an idea of how any given mobile network performs. But networks are complex creatures: call and signal quality can vary not just by city but also by individual cell and time of day. A network that pumps out a multi-megabit connection or super-crisp calls at your place of work might slow to a dial-up crawl or drop calls a mile over or an hour later. Averages are just that: averages.

Averages can be very useful when comparing networks, of course. The problem is when you use selective data to compute them. I’m sure Vodafone’s network handles calls as promised in the 26 locations it tested. It’s just the other wide swathes of England, Scotland, Wales and Northern Ireland that you should wonder about.
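A toy example (all numbers entirely invented, not Vodafone’s actual figures) shows just how much a selective sample can flatter an average:

```python
# Hypothetical call-success rates by region, in percent. Testing only the
# strongest regions yields a much rosier "national" average than sampling all.
success_by_region = {
    "London": 99.1,
    "West Midlands": 98.7,
    "North West England": 98.2,
    "West Yorkshire": 98.5,
    "Wales": 91.0,
    "Scotland": 90.5,
    "Northern Ireland": 89.8,
}

# Regions chosen for the marketing study (the strong ones)
tested = ["London", "West Midlands", "North West England", "West Yorkshire"]

tested_avg = sum(success_by_region[r] for r in tested) / len(tested)
true_avg = sum(success_by_region.values()) / len(success_by_region)

print(f"Average over tested regions: {tested_avg:.1f}%")
print(f"Average over all regions:    {true_avg:.1f}%")
```

With these made-up numbers, the cherry-picked sample reports roughly 98.6 percent while the all-region figure sits around 95 percent, a gap big enough to turn "pretty good" into "unbeatable."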

Methodology aside, it’s worth noting that regulators and watchdogs are taking a harder stance on operators’ marketing claims. In the U.S., the Federal Communications Commission hit AT&T with a $100 million fine for its long history of offering “unlimited” plans that carry clear restrictions: if an unlimited customer hits a pre-defined data limit in a month, AT&T slows down their speeds. This practice, called throttling, isn’t unique to AT&T. In our tests, OpenSignal has found plenty of evidence of operators scaling back customer speeds – some are more upfront about it than others – so we may well see regulators start cracking down on throttling around the world.


A wealth of virtual choices: Global MVNOs now number 1000

Eight years ago you would have been forgiven for thinking the MVNO was a dying trend. Much-hyped virtual operators like Disney Mobile, ESPN Mobile, Helio and Amp’d Mobile were dropping like flies. But despite those very public failures, MVNOs kept chugging along quietly. According to a recent report from GSMA Intelligence, there are now 1,017 MVNOs in the world, making them a force to be reckoned with.

An MVNO (which stands for Mobile Virtual Network Operator) is a mobile service provider without a network of its own. MVNOs buy data, voice and SMS capacity from established operators, which is why MVNOs far outnumber the actual mobile networks in the world.

FierceWireless got its hands on the full GSMA Intelligence report and published some of its most interesting findings:

  • The number of MVNOs has grown 70 percent since 2010, but it’s been a rocky path for many of them. As of 2015, 210 MVNOs had either shut down or been folded into another operator.
  • The MVNO phenomenon is still very much a Western trend. Nine of the 10 countries with the most MVNOs were in Europe or North America. Germany had the most (129), followed by the U.S. (108) and the U.K. (76). Japan was the most MVNO-friendly Asian country with 23.
  • The developing world, however, is starting to see more virtual operators. 30 countries in developing regions now have at least one MVNO, up from 13 countries in 2010. The countries with the most are Poland (23), Russia (16) and Malaysia (13).

What’s interesting about this new breed of MVNOs is that they’re very different from their failed predecessors of the 2000s. Instead of trying to build on a brand or lifestyle, they’re focusing on price or services, offering plans and options their big network-owning counterparts can’t or won’t match. For instance, FreedomPop has been shaking things up in the U.S. with a bare-bones data and voice plan it gives away for free. Other companies have different motives: Google’s new Project Fi intends to explore new data pricing models and encourage more consumers to connect to Wi-Fi. (For a more detailed look at the different types of MVNOs, check out my earlier blog post.)

But making an MVNO work is still a tough business, as GSMA Intelligence and Fierce point out. Profit margins are notoriously low, as most of the revenue an MVNO takes in goes to paying for network access. There’s also no guarantee that consumers will take to whatever hot new idea an MVNO comes up with. Samba Mobile tried to fund a free 3G service in the U.K. through advertising, but it shut its doors in 2014.

OpenSignal is taking in a lot of interesting data about the world’s MVNOs from our crowdsourced app users, so look out for future blog posts exploring some of the more interesting trends in this market.


Making sense of the alphabet soup of mobile networking

If you’ve ever gotten confused by the dizzying array of acronyms for different types of mobile technologies, don’t worry, you’re not the only one. You practically have to be a telecom engineer to keep track of all the different kinds and generations of mobile networks, most of which are still live and humming in the market today.

I thought it would be useful to post a primer on all of these different mobile technologies as well as put them in some historical context. Let’s start at the beginning, in the days of shoulder pads, velour jump suits and brick phones:

Generation 1: AMPS

The original cellphone: The Motorola DynaTAC (Photo credit: Flickr user Mark Wahl)

AMPS (Advanced Mobile Phone System) was the old analog standard that kicked off the mobile age – though it persisted well into the last decade – and nothing is more emblematic of that era than the Motorola DynaTAC, a phone that weighed almost two pounds and cost $4,000 when it was first offered to the public in 1983.

Generation 2: GSM and CDMA

In the 1990s, 2G ushered in digital mobile communications, but 2G also marked the beginning of the wireless technology wars that divided the world into two camps.

Most of the world adopted GSM (it originally stood for Groupe Spécial Mobile, but the name was retroactively changed to Global System for Mobile Communications), which was based on a radio interface called TDMA (Time Division Multiple Access). Meanwhile, multiple operators in the Americas and Asia adopted a competing standard called cdmaOne, which used a Qualcomm-developed technology called CDMA (Code Division Multiple Access).

That split basically sent the mobile industry down two different paths, and those paths wouldn’t intersect again until the 4G age. The schism is still evident today in the U.S., where phones from Verizon and Sprint (CDMA) remain largely incompatible with the networks of AT&T and T-Mobile (GSM), and vice versa.

In the second generation we also saw the first data services. On the GSM side we saw two network enhancements: GPRS (General Packet Radio Service) and the slightly faster EDGE (Enhanced Data rates for GSM Evolution). Meanwhile, cdmaOne gave way to a new family of technologies called CDMA2000. The first of those, CDMA 1xRTT (One Times Radio Transmission Technology – the CDMA camp loves its technical acronyms), introduced internet connectivity to counter GPRS and EDGE, but none of these technologies provided real-world speeds better than a dial-up modem.

Network Acronym Chart


Generation 3: UMTS and EV-DO

In the first half of the 2000s, we saw a new wave of networks emerge, designed specifically to tackle mobile data. The advent of 3G also saw both mobile camps agree on CDMA as the optimal radio interface for these new data networks. There was only one problem: neither faction would settle on a single CDMA-based standard. The result was that the GSM and CDMA camps continued to build different networks, and the technology wars dragged on for another decade.

The GSM community adopted UMTS (Universal Mobile Telecommunication System), which was based on a variant of Qualcomm’s technology called Wideband-CDMA and today still handles the lion’s share of global mobile internet traffic. Meanwhile CDMA operators continued along the CDMA2000 development path, producing a competing 3G technology called CDMA 1X EV-DO (Evolution-Data Optimized).

Both of these data technologies greatly improved upon their pokey 2G counterparts, pushing theoretical speeds into the megabit range, but neither was a big success initially. Apart from BlackBerry messaging addicts and Palm and Symbian’s early advocates, consumers couldn’t find many compelling reasons to have an internet connection on their phones. The mobile data revolution did come, but not until the first iPhone (which wasn’t even a 3G phone) was introduced in 2007.

The original iPhone (Photo Credit: Flickr user Philip Rood)

Right about that time, we also saw new advances in 3G networking. GSM operators began rolling out HSPA (High-Speed Packet Access) upgrades to their UMTS networks. HSPA is an umbrella term for two technologies, HSDPA and HSUPA, with the “D” and the “U” standing for downlink and uplink respectively. HSDPA pushed download rates into the multi-megabit range, while HSUPA improved on UMTS’s generally sluggish upload speeds.

Later versions of HSPA were known as HSPA+, and they took advantage of new modulation schemes, smart antennas and expanded frequency channels to achieve theoretical megabit speeds in the double digits. The fastest HSPA+ networks approach 50 Mbps in throughput, and the industry probably would have pushed those speeds even faster if LTE hadn’t come along.

While the GSM community had risen to the opportunity created by the smartphone revolution, the CDMA camp faced a bit of a crisis. Its answer to HSPA was EV-DO Revision A (Rev. A for short), but it only boosted maximum download speeds to 3.1 Mbps. Qualcomm introduced further EV-DO iterations such as Revision B, which used channel-stacking techniques to increase bandwidth, but the CDMA camp largely refused to adopt them. By 2008, CDMA2000’s evolution had come to a halt. That’s the main reason Verizon and Sprint were so keen to move to 4G while the GSM community in Europe was content to ride out HSPA+.

Generation 4: LTE

4G is a rather controversial term, since historically it’s been used more as a marketing gimmick than as any real indicator of technology evolution. AT&T and T-Mobile earned a lot of derision when they started referring to their HSPA networks as 4G in the late 2000s, but the first operator to use the term was actually Sprint, back in 2008 when it launched the first mobile WiMAX network in Baltimore.

WiMAX (Worldwide Interoperability for Microwave Access) was supposed to be the tech community’s play to challenge the mobile industry’s – in particular Qualcomm’s – dominance of mobile networking technology by replacing CDMA with a new radio interface called OFDMA (Orthogonal Frequency Division Multiple Access). WiMAX had backing from ISPs and mobile operators around the world as well as Silicon Valley heavyweights like Intel and Google, and it was available long before either the CDMA or GSM camp had a commercially viable alternative.

WiMAX, however, could never get its act together, but the pressure it applied on the mobile industry had the unforeseen result of ending the long schism between the CDMA and GSM camps. Instead of adopting WiMAX, CDMA operators like Verizon abandoned CDMA’s version of 4G, called EV-DO Revision C, and embraced the standard put forward by the GSM community, LTE (Long-Term Evolution). LTE utilizes OFDMA as well – though Qualcomm’s technology dominance continues in a post-CDMA world – but it’s a technology much friendlier to the operator powers that be.

In 2009, TeliaSonera launched the first commercial LTE networks in Stockholm and Oslo, followed by large-scale rollouts in the U.S., South Korea and Japan. WiMAX was effectively dead, and today LTE continues its march around the world. Most of the former WiMAX players have committed to LTE as well, but since their spectrum isn’t compatible with the networks being deployed by most of the mobile industry, they’ve adopted a variant called TD-LTE (Time Division LTE).

With LTE firmly entrenched, operators have started upgrading their networks to support a family of new technologies called LTE-Advanced. These networks support a host of new capabilities, but the biggest is a feature called carrier aggregation, which bonds two or more LTE transmissions together. The result is much faster speeds, like the 300 Mbps network Everything Everywhere has launched in London.
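To make the carrier-aggregation arithmetic concrete, here’s a hedged sketch using the common rule of thumb of roughly 150 Mbps peak per 20 MHz LTE carrier (a Category 4-era figure with 2x2 MIMO – an approximation, not any operator’s exact numbers):

```python
# Rule-of-thumb peak rate per MHz of LTE spectrum (~150 Mbps / 20 MHz).
# Assumed figure for illustration, not a guaranteed real-world speed.
PEAK_MBPS_PER_MHZ = 7.5

def aggregated_peak_mbps(carrier_bandwidths_mhz):
    """Approximate peak downlink rate when bonding several LTE carriers:
    carrier aggregation roughly sums the component carriers' peak rates."""
    return sum(bw * PEAK_MBPS_PER_MHZ for bw in carrier_bandwidths_mhz)

# Two bonded 20 MHz carriers are consistent with a 300 Mbps-class network
print(aggregated_peak_mbps([20, 20]))      # → 300.0
print(aggregated_peak_mbps([20, 15, 10]))  # three-carrier aggregation
```

Real peak rates also depend on modulation, MIMO configuration and device category, but the additive intuition is the heart of why bonding carriers multiplies headline speeds.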

Generation 5 and beyond…

5G is a term you see popping up occasionally, but the first 5G networks are still years away from seeing the light of day. The mobile industry still hasn’t settled on a 5G standard, but it looks like 5G will be about much more than boosting speed. Researchers are investigating networks that operate at extremely low power and deliver data at very low cost, in addition to radio technologies that could boost bandwidth into the gigabit range (for a closer look at what 5G may or may not be, check out this earlier blog post).

Whatever the final 5G standard looks like, though, I’m sure we’ll see a host of new confusing acronyms to go along with it.

So that ends our historical tour of arcane mobile industry nomenclature. Did I miss anything? If I did, tell us in the comments below.


Facebook: 1 billion people served (in a single day)

Facebook hit quite the milestone this week. On Monday, more than 1 billion people connected to the social network in a single 24-hour period, CEO and founder Mark Zuckerberg reported on his profile page.

Photo Courtesy of Facebook

At first glance 1 billion may not seem like an impressive number. Facebook passed the 1 billion registered user mark in 2012 and has since grown its active accounts to 1.5 billion. But on any service or app there’s always a wide gap between registered users and people who use it actively. For instance, practically everyone I know my age or younger has registered a Twitter handle, but only my friends in the tech industry tweet or check their Twitter streams daily.

That makes Facebook’s feat quite impressive. Two-thirds of its global user base logged into or interacted with Facebook in a single day, even if only to “Like” a friend’s post or view a photo. One billion people represent one-seventh of the world’s population – that’s a whole lot of social networking.

Facebook is quickly becoming a network that could rival the great telecom institutions of the world: global landline and mobile networks and even the Internet itself. In fact, since Facebook depends on internet connectivity to function, its growth ultimately is limited by the reach of the Internet. That’s why we’ve seen Facebook and its enigmatic founder talking so much of late about bringing the Internet to peoples and regions of the world with little or no access today. Connecting the world’s 7 billion people to Facebook requires that all 7 billion have an Internet connection.


What the critics are saying about Google’s Project Fi

Google’s Project Fi has started making its way into the hands of U.S. consumers, and over the last two months we’ve seen reviews from many tech media outlets. We thought it would be interesting to take the temperature of those reviews, to see how the tech world is reacting to Google’s freshman attempt at becoming a mobile operator.

It’s fair to say that most of the commentary is positive. Reviewers in general are encouraged by the service itself and Google’s metered pricing, which essentially charges you only for the data you use at simple, reasonable rates. But it’s also clear Google has some kinks to work out in Project Fi, especially if you’re an avid Google Voice user.

A Nexus 6 phone, currently the only device that can fully connect to the Project Fi network (Photo Credit: Flickr user Chris F)

First off, I should explain how Project Fi differs from the usual operator services out there. Google is operating as what’s known as a mobile virtual network operator (MVNO). That means it doesn’t actually own any mobile networks or spectrum. Instead it buys voice minutes and data capacity off of someone else’s network (for more details, check out my earlier blog post explaining MVNOs).

Most MVNOs, however, have a single contract with a single operator in any particular country, meaning their service is only as good as their partner’s network. What’s unique about Project Fi is that Google is teaming up with two operators, T-Mobile US and Sprint, and it’s leaning extensively on public Wi-Fi hotspots. The idea is that Project Fi will deliver the best data or call experience available in any given moment or place by automatically selecting the best network connection available.

Though reviewers found that speeds on Project Fi weren’t any more impressive than what they’d get from a regular operator, many, like Android Central’s Andrew Martonik, were impressed with the coverage and consistency Project Fi’s multi-network setup produced. Wrote Martonik:

“The auto-switching networks has turned out to be absolutely great so far in our testing. We’ve been able to get great speeds in the denser parts of the city where T-Mobile has historically done better than Sprint, and in a more rural area where our T-Mobile phones didn’t have better than EDGE service we actually had Sprint LTE on our Project Fi Nexus 6.”

In fact, most reviewers fawned over the multi-network aspect of Project Fi, claiming it handed off calls and data sessions seamlessly between cellular networks and Wi-Fi. Some even described the capability as “magic.” But Canadian software developer Nicholas Armstrong performed a deep dive into the network selection mechanics of his Fi Nexus 6 and found that the service wasn’t quite as flexible as reviewers made it out to be. Instead of flitting between networks like a bandwidth hungry butterfly, Project Fi has some clear rules and limitations on when and how it can move between Wi-Fi, Sprint and T-Mobile connections.

In his blog, Armstrong wrote that Project Fi basically functions like a phone with dual SIM cards, except that it can only access one network at a time: it has to disconnect from either Sprint’s or T-Mobile’s network to reconnect to the other. The result is that during an active call or data session the Nexus 6 has to remain on a single network, so it can’t pass that call or session between Sprint and T-Mobile even if a better connection is available. While Project Fi can definitely pass a call from Wi-Fi to cellular, Armstrong says, it can’t do the reverse, and even the Wi-Fi-to-cellular hand-off comes with a 2-to-5-second hiccup.
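The behavior Armstrong documents boils down to a few simple rules. Here’s a loose Python sketch of that decision logic – my own reading of his findings, not Google’s actual implementation:

```python
def next_network(current, in_session, wifi_ok, tmobile_ok, sprint_ok):
    """Pick the network for the next moment, per the observed hand-off rules.

    Rules (as reported): one radio connection at a time; during an active
    call or data session the device may fall from Wi-Fi to cellular but
    never hops between carriers or climbs back up to Wi-Fi; when idle it
    is free to reattach, preferring Wi-Fi.
    """
    if in_session:
        if current == "wifi" and not wifi_ok:
            # Wi-Fi -> cellular hand-off is allowed mid-session
            return "tmobile" if tmobile_ok else ("sprint" if sprint_ok else None)
        # Otherwise the session is pinned to its current network
        # (if that carrier's signal drops, the call simply drops too)
        return current
    # Idle: reselect freely, Wi-Fi first, then whichever carrier is usable
    if wifi_ok:
        return "wifi"
    if tmobile_ok:
        return "tmobile"
    if sprint_ok:
        return "sprint"
    return None
```

For example, a phone mid-call on Sprint stays on Sprint even if T-Mobile’s signal is better, while an idle phone will happily jump to Wi-Fi the moment a usable hotspot appears.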

This isn’t a knock on Google. These kinds of connection transfers are very difficult to pull off, made even harder by the fact that Sprint and T-Mobile use completely different network technologies (T-Mobile uses GSM while Sprint is on CDMA; both have LTE networks, though Sprint uses a different LTE variant). Armstrong’s tests just show that Google and the mobile industry in general have a lot of work to do before we have devices that can flip between networks on a whim.

The other feature reviewers were particularly optimistic about was Google’s pricing plan, a simple meter of $1 for every 100 MB with no buckets or rollover to consider. If you use 900 MB in a month, you’re charged $9. That’s hardly a mind-blowing concept, but it’s one the major operators have been loath to adopt. CNET’s Lynn La writes:

“I did find Project Fi’s pricing structure, an aspect of other wireless networks that many find to be frustrating, to be extremely user friendly. Its transparency is reassuring, and the fact that you don’t have to worry about overshooting or letting your monthly data allotment go to waste is a relief.”
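The metered model reviewers praise is simple enough to express in code. A minimal sketch of the plan as reported ($1 per 100 MB, billed pro rata – not an official billing formula):

```python
def fi_data_charge(megabytes_used):
    """Data charge in dollars: $1 for every 100 MB, prorated,
    with no buckets, overage tiers or rollover to worry about."""
    return megabytes_used / 100.0

print(fi_data_charge(900))  # 900 MB in a month → 9.0 dollars
```

Contrast that with typical bucket plans, where 900 MB of usage might cost you a 1 GB plan’s full price one month and trigger overage fees the next.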

Some reviewers, however, did find one big glaring fault with the service – one you’ll only notice if you have an active Google Voice number. Voice is Google’s attempt at creating a unified number and service that can be shared across multiple devices. It has a lot of fans, despite the fact that Google has been mucking around with it in recent years as it folds Voice into Hangouts.

If you have no Google Voice account, writes The Verge’s Dieter Bohn, you’ll move onto Project Fi with ease. Google will simply assign you a new Voice number, which will work like any other mobile number. If you are a Google Voice user, though, Bohn says, “prepare for confusion.” His review continues:

“It gets worse, I’m afraid. Precisely what happens when you port your number from Voice to Fi (which are kind of the same thing — but not really!) is clear as mud. Many attempts have been made to quash the Fear, Uncertainty, and Doubt surrounding these issues. While the various explainers you can find on the web are technically accurate, they are also emotionally unsatisfying. Witness! You won’t lose your Google Voice number, and it will still do most of the stuff it did before, but you may have to wend your way back to the 2011-era Google Voice site to manage it. Your texts no longer forward via SMS but they’re available in the Hangouts App. You can’t call people from Google Voice on the web but you can from Hangouts. Oh, and on Android there’s a Hangouts dialer app you can use, sometimes, just because.”

There’s an easy answer, right? Just don’t connect your Google Voice number to your Project Fi phone. Nice try. The Washington Post’s Brian Fung writes:

“If Google detects that you have a Google Voice number when you finally register with Project Fi, it prompts you to assign that number to your Project Fi phone. If you’d rather not and ask to use a different number instead, Google will take away your Google Voice number — “no getting it back,” Google’s registration page says.”

If you’re not turned off by the Google Voice issues and are ready to give Fi a try, you’re in for a wait. Project Fi is still in beta, and like other Google beta services, you have to get an invite to join. You can request an invite on the Fi site, but it may take weeks or even months to hear back. But even if you don’t plan on becoming a Project Fi customer, writes Kevin Tofel at ZDNet, it might be worth signing up for the $30 welcome kit just for the extra goodies it comes with.

I should also mention that Project Fi customers have started downloading the OpenSignal app (Thank you!), which means we’re starting to collect information on how Project Fi’s unique network performs. We’re still in the process of compiling that data, but in the next few weeks we should have some analysis to share.


Wi-Fi in the sky is getting a speed boost

The friendly skies can seem awfully unfriendly when you’re crammed into a bulkhead seat trying to view email on a god-awful Wi-Fi connection. Internet access at 40,000 feet is still an expensive and painfully slow experience on most airlines, but the biggest provider of inflight Wi-Fi in the U.S. may soon deliver a much faster in-plane internet experience – though not necessarily a cheaper one.

Gogo just received permission from the U.S. Federal Aviation Administration to roll out a new mile-high wireless network that takes advantage of faster satellite connections.

The new service is called 2Ku, and if it works as planned it will mean a connection of up to 70 Mbps to Delta and Virgin Atlantic aircraft in the coming year. That may seem like a lot, but keep in mind this is a shared connection, just like your home or office broadband link. That 70 Mbps of capacity will be divvied up among all of the passengers connecting to the in-plane network.
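As a rough back-of-the-envelope illustration (passenger counts invented, and assuming a naive even split rather than any real traffic-shaping scheme), here’s how that shared 70 Mbps divides up as more passengers log on:

```python
# Advertised 2Ku link capacity to the aircraft, per the announcement
LINK_MBPS = 70.0

def per_passenger_mbps(connected_passengers):
    """Even-split approximation of each connected passenger's share
    of the aircraft's shared downlink."""
    return LINK_MBPS / connected_passengers

for n in (10, 50, 100):
    print(f"{n:3d} users online -> ~{per_passenger_mbps(n):.1f} Mbps each")
```

In practice contention is burstier than an even split, but the point stands: a headline number like 70 Mbps shrinks fast on a full flight.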

Diagram of Gogo’s 2Ku inflight Wi-Fi system

Still, compared to what Gogo offers today, that will be a considerable improvement. Most Gogo-fueled flights in North America connect to what is essentially a 3G network – it’s just pointed at the sky instead of the ground – where speeds to the aircraft max out at around 10 Mbps. Gogo also has satellite connections on international flights which top out at 30 to 40 Mbps, but passengers not only have to share that capacity with each other but with other nearby flights in the sky, so as the airspace becomes more crowded, speeds creep downwards.

OpenSignal has tracked Gogo’s performance through our crowdsourced network, though our data is limited considering the scarcity of optimal test conditions (the OpenSignal app has to run its test while a user is both on a flight with his or her phone turned on and logged in to the Gogo network). Of the 100 valid download tests OpenSignal logged in the last few years, speeds averaged 998 kbps.

The 2Ku service will not only make more extensive use of Gogo’s existing satellite network, but it will use a new type of antenna that steers itself to face those satellites in geostationary orbit. That more resilient link will nearly double available download speeds over its current satellite service and deliver roughly seven times the speed of its current ground-to-air network. You’re still not going to be streaming Netflix movies on a crowded flight (Gogo actually prohibits streaming), but that email attachment will pop up a lot faster.

It’s not just Gogo trying to turbocharge its networks. Many inflight Wi-Fi providers are tapping new antenna technologies or, in some cases, linking to newer-generation satellites to boost their broadband speeds. With internet access on planes becoming faster and more consistent, you would expect it to become more accessible to more passengers, right? Well, that won’t necessarily be the case.

I recently spoke with satellite broadband analyst Tim Farrar, who tracks the inflight Wi-Fi sector closely. He expects internet pricing on planes to only increase on the major airlines, because those airlines target business travellers who aren’t footing the Wi-Fi bill themselves; their companies are. Gogo wouldn’t make any more money by lowering prices and encouraging leisure travellers to connect, Farrar said. It would just see a lot more traffic on its already congested networks, potentially angering those bread-and-butter business customers.

JetBlue launches FlyFi. Photo Credit: Anthony Quintano

In fact, the business model of inflight Wi-Fi is a bit topsy-turvy. The best inflight internet in the U.S. today is offered by JetBlue – which connects to a new super-satellite run by ViaSat – and for the most part that service is completely free to passengers. While airlines like American and United view Wi-Fi as a money-making enterprise – just like checked bag fees – JetBlue sees it as a loss-leading amenity. It’s footing its passengers’ internet bills to lure more customers onto its flights.

Posted in Wifi

State of Mobile Networks Report, Brazil

Today OpenSignal releases the first in a new series of reports: country-level network performance reports, which will be published for different countries on a monthly basis. The first country showcased is Brazil, which has a fast-growing telecommunications sector and a significant number of OpenSignal users.

So what do we show in these reports? Each mobile network in the country is evaluated on five metrics: 3G/4G Coverage, 4G Coverage, Time with No Signal, 3G Download Speed, and 4G Download Speed.

What are these metrics? If you want to know more about how we calculate the results, take a look at our methodology pages, one explaining the concept of “time coverage” and one giving an overview of OpenSignal.
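As a rough intuition for the “time coverage” idea, here is a simplified sketch: treat each periodic signal reading from a device as a sample, and compute the fraction of samples in which the device was on a given network type. This is an illustrative simplification only — the sample data and the `time_coverage` helper are hypothetical, and OpenSignal’s actual methodology (described on the pages linked above) involves more careful per-user aggregation:

```python
from typing import List, Tuple

def time_coverage(readings: List[str], networks: Tuple[str, ...]) -> float:
    """Fraction of signal readings in which the device was on one of
    the given network types (e.g. 3G or 4G)."""
    if not readings:
        return 0.0
    on_network = sum(1 for r in readings if r in networks)
    return on_network / len(readings)

# Hourly samples from one hypothetical device over part of a day:
samples = ["4G", "4G", "3G", "none", "4G", "3G", "none", "4G"]
print(time_coverage(samples, ("3G", "4G")))  # 0.75 -> 75% 3G/4G time coverage
```

Measuring coverage as a proportion of time, rather than of geographic area, reflects where users actually spend their days rather than how much of the map is painted in.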

We also give a sense of how each network’s performance changed over the months covered by the report, via our historical graphs. Follow-up reports for each country will be published every six months, so we’ll get to see how the networks in Brazil have changed in our February 2016 report. But that’s looking too far ahead – read the August 2015 report on Brazil now! As always, if you have any comments, please leave them below or in the forums. Are our findings consistent with your experience or research? We’d love to hear from you!

Posted in Other, Reports

Android Fragmentation 2015

The world Android lives in is a confusing jungle. Today we released our annual report on Android fragmentation, which takes a look at the complex variety of devices that developers have to build for. This complexity is both good and bad: if you can imagine your dream phone, then someone, somewhere, has probably built it. The downside, however, is that the apps you install may not be optimized for your device’s screen size or features.

One thing we noticed in building this year’s report is the dramatic changes that have occurred in the ecosystem over the past few years: a huge increase in observed devices, a great proliferation of manufacturers and (our favourite topic) the rise of embedded physical and virtual sensors. We spotted a few trends that we felt were too general to fit even under our broad ‘Fragmentation’ umbrella – so we decided to cover them in this post. Screens are bigger, CPUs have more cores, and mobile devices now contain more RAM than the average desktop did not too long ago.

Over the past few years, and over ten million OpenSignal downloads, we have seen the Android device landscape both fragment and evolve. Devices are bigger and more powerful, and this has changed the way people use them, helping to make the web increasingly mobile-first. Interestingly, you can see a slight tail-off in NFC growth: a technology that was supposed to be revolutionary never completely took off, with only around 30% of observed Android devices being NFC-capable. Screens have steadily got bigger, a trend that does not appear to be slowing down (our largest ever screen size, the Slate 21, is recorded in the main report) – in a few years it will be instructive to see similar graphs (perhaps produced by the data teams at Levi’s or The Gap) showing how this technological transformation has influenced 21st-century pocket design.

Posted in Reports