In this blog post, we’re taking a deep dive into the very core of OpenSignal: explaining the metrics we use, what they mean and how each one captures the real-world mobile network experience as users see it.
How do we collect the data in the first place?
We collect and analyze more than 3 billion measurements every day, from more than 100 million smartphones across the world. We collect data every day of the week, at all hours and in all the places people live, work and travel: no simulations, no predictions, no idealized testing conditions. Our data comes from actual smartphone users and we report users’ actual network experience, whether they are indoors or out, bustling in a busy city or trekking in the countryside.
We collect the vast majority of our data via automated tests that run in the background, enabling us to report on users’ real-world mobile experience at the largest scale and frequency in the industry. These automated tests are run at random points in time and therefore represent the typical experience available to a user at any given moment.
Availability – Adding time to the equation
Our 4G availability metric shows the proportion of time users with a 4G device and subscription have an LTE connection. When we report an average 4G availability of 75%, that means our LTE users were, on average, connected to LTE services on their network 75% of the time.
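As a sketch, the calculation behind an availability figure like the one above can be illustrated in a few lines of Python. The function and sample readings are hypothetical stand-ins for OpenSignal's actual pipeline:

```python
def availability_4g(samples):
    """4G availability: the share of time samples in which the device had an
    LTE connection. `samples` is a list of network-type readings taken at
    random moments (a simplified stand-in for automated background tests)."""
    lte = sum(1 for tech in samples if tech == "LTE")
    return 100.0 * lte / len(samples)

# hypothetical readings: three LTE samples out of four
readings = ["LTE", "LTE", "HSPA", "LTE"]
print(availability_4g(readings))  # → 75.0
```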
Availability is not a measure of coverage or the geographic extent of a network. It won’t tell you whether you are likely to get coverage if you plan to visit a far-flung region that is off the beaten track. Instead, it measures the proportion of time people have a network connection in the places they most commonly frequent, something traditional coverage metrics often miss. Looking at when users have an LTE connection, rather than where, gives us a more precise reflection of the true user experience.
We also keep track of the instances that leave mobile users most frustrated: when there is no signal to connect to at all. The most common dead zones users struggle with occur indoors. As most of our availability data is collected indoors (not surprisingly, since that’s where users spend most of their time), we’re particularly adept at detecting those areas of zero coverage.
Download Speed – The real speeds users get
When it comes to average download connection speeds, we report publicly on three variations: 3G, 4G and overall speed. Measured in Mbps, our 3G and 4G speed metrics reflect the average download speeds our users experience on each type of network over the course of the data collection period.
Overall download speed represents the typical everyday speeds a user experiences across an operator’s mobile data networks. It is calculated as a weighted average of the individual 3G and 4G speeds, based on the proportion of time typical users spend connected to each. If users cannot connect to an LTE network due to low availability, their overall speeds will also drop, as they will spend more time on 3G connections (which can be four to five times slower than LTE).
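The weighted average described above can be sketched as follows; the function name and example figures are illustrative, not OpenSignal's actual values:

```python
def overall_speed(speed_3g, speed_4g, availability_4g):
    """Time-weighted average of 3G and 4G download speeds (Mbps).
    `availability_4g` is the fraction of time spent on an LTE connection."""
    return availability_4g * speed_4g + (1 - availability_4g) * speed_3g

# e.g. 30 Mbps on 4G, 6 Mbps on 3G, 75% 4G availability
print(overall_speed(6, 30, 0.75))  # → 24.0
```

Note how low 4G availability drags the overall figure toward the slower 3G speed, which is exactly the effect described above.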
Our upload speed metric measures the average upload speeds for each operator on LTE connections. Just as with our download speed metrics, 4G upload speed is measured in Mbps and reflects the average uplink speeds our users experience over the course of the data collection period.
Typically upload speeds are slower than download speeds, as current mobile broadband technologies focus resources on providing the best possible download speed for users consuming content on their devices. As mobile internet trends move away from downloading content to creating content and supporting real-time communications services, upload speeds are becoming more vital and new technologies are emerging that boost upstream capacity.
Latency – Quantifying the lag
Latency refers to the delay users experience as data makes a round trip through the network. If the latency of your network is high, you’ll experience a lot more lag time. We measure our latency metrics in milliseconds for both 3G and 4G connections – the lower the latency value, the more responsive the network.
Both our download speed and latency tests are run against content delivery networks (CDNs) that host the most popular internet destinations (not against dedicated testing servers, where requests might follow a different route from normal traffic or be prioritized in a way that isn’t representative), helping us accurately capture the typical user experience.
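A round-trip latency measurement of the kind described here can be sketched like this; the `probe` callable is a stand-in for a real request to a CDN edge node:

```python
import time

def measure_latency_ms(probe):
    """Time one round trip in milliseconds. `probe` is any callable that
    contacts a server and returns once the reply arrives; here we pass a
    simulated probe rather than a real network request."""
    start = time.perf_counter()
    probe()
    return (time.perf_counter() - start) * 1000.0

# simulated ~20 ms round trip standing in for a request to a CDN
latency = measure_latency_ms(lambda: time.sleep(0.02))
```

The lower the returned value, the more responsive the connection, which is why latency leaderboards rank smaller numbers higher.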
Time on Wifi
Our time-on-Wifi metric quantifies how much time users spend connected to Wifi networks as opposed to cellular data connections. Time on Wifi doesn’t measure the amount of data consumed over Wifi; rather, it shows how often users’ devices are actively connected to a Wifi network.
Peak Speed – The fastest experienced speed
Peak speed is an average of the fastest speeds OpenSignal users experience on a network. For this metric, we set aside technical and congestion limitations and focus on the most optimized connections. This is different from best-case speeds measured in idealized conditions, which users themselves can never achieve.
To calculate peak speed, we only examine data from devices that have conducted multiple automated speed tests in a three-month period. We extract the fastest speed test from those devices and then toss out the bottom 95% of the results, leaving us with only the top 5% of the fastest speeds we’ve collected from our user community. The average of that top 5% is our average peak speed metric. For a more practical grasp of peak speed, read our analysis on peak speeds in the U.K. and India.
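The top-5% calculation described above can be sketched in Python; the function and sample data are illustrative only:

```python
def peak_speed(fastest_per_device):
    """Average of the top 5% of per-device fastest speed tests (Mbps).
    Each entry is the single fastest result one device recorded over the
    collection period; a simplified sketch of the method described above."""
    ranked = sorted(fastest_per_device, reverse=True)
    top = ranked[:max(1, round(len(ranked) * 0.05))]
    return sum(top) / len(top)

# 100 devices whose fastest tests were 1..100 Mbps: the top 5% is 96..100
print(peak_speed(range(1, 101)))  # → 98.0
```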
Video Experience – Measuring real-world mobile video streams
This metric quantifies the quality of experience for mobile video by measuring real-world video streams from end-user devices. The quality of experience is derived using an International Telecommunication Union (ITU)-based approach, built on detailed studies that relate technical parameters (including picture quality, video loading time and stall rate) to the video experience perceived by real people. To calculate video experience, we directly measure video streams from end-user devices and use this ITU approach to quantify the overall video experience for each operator on a scale from 0 to 100. The videos tested include a mixture of resolutions and are streamed directly from the world’s largest video content providers.
The following scale can be used to relate the Video Experience scores to the actual experience users receive:
- 75–100 Excellent: Very consistent experience across all users, video streaming providers and resolutions tested, with fast loading times and almost non-existent stalling.
- 65–75 Very Good: Generally fast loading times and only occasional stalling, but the experience may be somewhat inconsistent across users and/or video providers and resolutions.
- 55–65 Good: Less consistent experience, even from the same video streaming provider and particularly at higher resolutions, with noticeably slower loading times and stalling that is not uncommon.
- 40–55 Fair: Not a good experience, either for higher-resolution videos (very slow loading times and prolonged stalling) or for some video streaming providers, though the experience on lower-resolution videos from some providers may be sufficient.
- 0–40 Poor: Not a good experience, even for lower-resolution videos across all providers. Very slow loading times and frequent stalling are common.
With video being the single largest category of traffic carried on mobile networks and consumption expected to grow to keep pace with consumer demand, this is an extremely relevant metric. It is another key element in OpenSignal’s ongoing mission to measure the actual consumer experience on mobile networks.
Metrics in the pipeline
We are constantly evaluating new ways to measure users’ true mobile experience and frequently report on our findings as these metrics progress through the development stage. In time, we aim to include these in our reports as standard. Here are a few:
Coverage Experience — The new way to look at coverage
Coverage experience is OpenSignal’s unique take on coverage. It contrasts with our availability metric in that instead of measuring when a user has a network signal, coverage experience is based on where a user has a signal. In short, coverage experience is the percentage of places where users’ devices have a network connection. We measure it as a percentage, so if an operator has a 4G coverage experience score of 75%, then our 4G users were able to connect to an LTE signal in 75% of all the locations they visited during the designated test period.
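As an illustration, here is one way such a location-based share could be computed; the tile size, data layout and function name are assumptions for the sketch, not OpenSignal's actual methodology:

```python
def coverage_experience(samples, tile=0.01):
    """Share (%) of visited location tiles with at least one LTE reading.
    `samples` is a list of (lat, lon, has_lte) tuples; bucketing readings
    into fixed-size tiles is an illustrative assumption."""
    visited, covered = set(), set()
    for lat, lon, has_lte in samples:
        key = (round(lat / tile), round(lon / tile))
        visited.add(key)          # every tile the user passed through
        if has_lte:
            covered.add(key)      # tiles where LTE was seen at least once
    return 100.0 * len(covered) / len(visited)

# four distinct tiles visited, LTE seen in three of them
obs = [(0.0, 0.0, True), (0.0, 0.05, True), (0.05, 0.0, True), (0.05, 0.05, False)]
print(coverage_experience(obs))  # → 75.0
```

Unlike availability, which weights by time, this counts each visited place once, which is what makes it a where metric rather than a when metric.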
Coverage experience is a far more reliable indicator of coverage than the traditional coverage metrics used by the industry. Traditional coverage metrics are based on mathematical models that try to predict how signals will propagate. This involves making a large number of assumptions about terrain, buildings, weather and many other factors, so errors can compound and the end results may be quite inaccurate. Our coverage experience metric is based on measurements from real devices owned by millions of real users in all of the places they actually visit and spend their time. We are not reporting on a simulated measure of the coverage people experience, as the industry has done for decades, but directly measuring it.
We’re constantly working on new, better and more meaningful ways to measure users’ real-world mobile experience. Stay tuned to our blog for the latest updates, or sign up to receive our newsletter.
Have a question or feedback? Leave a comment below.
Feel like checking the numbers for yourself? Download our OpenSignal or Meteor app. Both are available on Android and iOS!