Over the past few weeks we at OpenSignal, along with Which?, the consumer advocacy group, have been putting out reports on the state of mobile coverage in the UK. This began in November, when we released a report looking at overall coverage in the UK, followed by individual reports for the 12 regions into which we and Which? subdivided the UK, released in two batches over the last two weeks.
Traditionally it has been extremely hard for mobile users to compare the performance of network operators, and therefore to gain accurate information on what level of service they can expect. The coverage maps published by operators often do not reflect typical user experience, as they are based on tests carried out by the network in public areas, with coverage beyond roads modelled from the limited tests run. We covered our concerns with the drive-testing methodology, including drive tests run using consumer phones rather than dedicated test equipment, in our response to the recent Ofcom coverage report.
Our reports on mobile coverage (and the coverage maps published on this website and hosted on Which?) are an attempt to help customers better understand the differences between mobile networks in terms of the actual service they provide, so that customers can make better choices based on localised coverage. Improving consumer choice is a key objective of OpenSignal, which made partnering with Which? an obvious choice. The data for these reports is gathered directly from almost 40,000 UK users of the OpenSignal app, which runs in the background recording the locations where users have access to the different network types (no signal at all, 2G, 3G, 4G); our speed data comes from background speed tests and from tests run actively by users. All of our data therefore comes from real-world customers of the UK networks, meaning that we have an unparalleled perspective on coverage as it is experienced by actual consumers.
After the second batch of local reports, EE responded publicly with criticism of our methodology, something no UK operator had previously done (indeed, Vodafone said they welcomed the findings). EE claimed that a sample of 2 million would be required to fully understand mobile coverage in the UK – a surprising claim, considering that earlier in the year they published a survey on the behaviour of 4G users [pdf] that was based on just 1,000 4G users.
By rejecting our findings, EE are rejecting the directly measured experiences of almost 10,000 of their own customers; users who have traditionally had little avenue for complaint, and few resources to draw on when determining whether their own individual experience is typical. At OpenSignal we want our reports to reflect the experience of these consumers, and to give them a suitable platform to ensure that their experience is heard publicly. These are the very people that networks should increasingly be listening to. Of course, having more users always allows for better information – but we are very confident that our sample is representative and statistically significant. To reject our findings on grounds of ‘sample size’ will only mislead consumers, the very people that Which? and OpenSignal are trying to help by making better coverage information more easily accessible.
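As a rough illustration of why a sample in the tens of thousands is already informative (this is a generic back-of-the-envelope calculation, not OpenSignal's actual statistical methodology, and the 75% figure below is purely hypothetical), the margin of error for an estimated proportion shrinks with the square root of the sample size:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% normal-approximation margin of error for a proportion p
    estimated from n independent samples."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical example: a metric estimated at 75%, at three sample sizes.
for n in (1_000, 10_000, 2_000_000):
    print(f"n = {n:>9,}: +/- {margin_of_error(0.75, n) * 100:.2f} percentage points")
```

Under these simplified assumptions, 10,000 users already pin such an estimate down to within roughly one percentage point; going from 10,000 to 2 million narrows the interval further, but by far less than the 200-fold increase in sample size might suggest.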
Our ‘time on’ metrics are not an attempt to measure total UK coverage geographically; instead, they survey the true experience of mobile users in the UK, based on data collected 24/7 from real consumer devices. Mobile operators’ own network testing, including tests run for them by third parties, cannot typically reach people’s homes or offices, the places where people spend the most time. Indeed, networks including EE, Vodafone and O2 have partnered with companies that crowdsource mobile coverage data, validating our methodology as an important tool for gaining a complete picture of the end-user experience. Our data is used by networks worldwide, and our UK report was cited by Ofcom in their most recent mobile coverage report [pdf] as an alternative to coverage data self-reported by the operators themselves. Independent data is crucial to a healthy mobile network ecosystem, as it allows for a level of regulatory and consumer oversight that is not possible when networks self-report coverage based on their own testing methods.
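To make the idea of a ‘time on’ metric concrete, here is a minimal sketch of how a share of time per network state could be computed from periodic background readings. The data format, state names, and interval-attribution rule are all illustrative assumptions, not a description of the OpenSignal app's internals:

```python
from collections import defaultdict

# Hypothetical readings: (timestamp in seconds, network state) pairs
# recorded periodically by a background process on a consumer device.
readings = [
    (0,    "4G"),
    (600,  "4G"),
    (1200, "3G"),
    (1800, "no_signal"),
    (2400, "3G"),
    (3000, "4G"),
]

def time_on(readings):
    """Share of observed time spent in each network state, attributing
    each interval between consecutive readings to the state recorded
    at the start of that interval."""
    totals = defaultdict(float)
    for (t0, state), (t1, _) in zip(readings, readings[1:]):
        totals[state] += t1 - t0
    span = readings[-1][0] - readings[0][0]
    return {state: seconds / span for state, seconds in totals.items()}

print(time_on(readings))  # e.g. {'4G': 0.4, '3G': 0.4, 'no_signal': 0.2}
```

The key property of a metric like this is that it weights coverage by where users actually spend their time, rather than by geographic area.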
We do not test the network in terms of theoretical performance, and we do not model it based on tests done in pre-determined locations; instead, we directly measure the proportion of time that real users have access to the different levels of mobile network provision. There is no agenda to our publication of results: we are merely recording what customers themselves experience, in order to provide better information to a market that has historically been marked by an information imbalance between consumer and provider – a market failure that leads to inefficient purchasing decisions. In light of recent announcements from the DCMS about possibly enforcing national roaming in the public interest, it would appear that neither the public nor the government are happy with enduringly patchy coverage, making it increasingly important that all mobile operators listen to their customers.