Notes From the Field
Azimuth’s “Notes from the Field” section explores actual customer scenarios in which Azimuth’s real-world test solutions have been used to solve notable wireless deployment issues.
Device benchmarking (or regression testing) is something that every company in the mobile ecosystem does – whether it’s an operator, chipset vendor, handset or infrastructure vendor. Subscribers, operators and device OEMs might all want to compare the performance of one handset against other competing handsets, compare a handset against a reference handset (golden UE), or compare different versions of the same handset. But given the explosion in the number of mobile platform versions (such as Android Honeycomb, Ice Cream Sandwich, etc.), chipsets, OEMs and technology combinations (LTE/HSPA/UMTS, LTE/HSPA+/UMTS, GSM, etc.), the total number of devices that need to be tested has become unmanageable.
Given the number of devices that need to be tested and the range of tests that need to be run on each of them, most lab testing has been simplified to focus on catching major issues, with the expectation that the remaining issues will be uncovered through drive testing. As a result, drive testing has become even more complicated, and it is often during this phase that critical issues are caught. Unfortunately, because drive testing occurs late in the product R&D cycle, it significantly increases the cost and time to fix issues, thus prolonging time to market.
In this edition of Azimuth’s “Notes from the Field,” we’ll explore how Field-to-Lab testing allows customers to address this problem through multiple avenues. The inherent capabilities of Field-to-Lab to profile, visualize and recreate field conditions provide some unique advantages. When combined with the power of TestBuilder Automation, it can automate the entire multi-UE test bed, allowing efficient and effective testing of a broad set of devices in a wide range of environments.
More specifically, by using Field-to-Lab’s environmental profiling, test and validation teams can develop environmental profiles that represent typical deployment/use conditions and thus allow them to test in a manageable set of real-world conditions. This actually addresses two problems: it allows customers to test devices in real-world conditions (something not typically done today) and allows customers to do real-world testing with a manageable set of profiles. This is significant because there are thousands of real-world profiles. Testing against all profiles would be time prohibitive, so the key is to make this list manageable, which is what Field-to-Lab does.
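As a rough illustration of how thousands of field measurements can be reduced to a manageable set of representative profiles, the sketch below clusters synthetic drive-test samples (RSRP, SINR) with a simple k-means pass and keeps only the cluster centers. All of the data and the choice of clustering are hypothetical; Azimuth's actual profiling in AzMapper may work quite differently.

```python
import random
from statistics import mean

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means over tuples of floats (illustrative, not production)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each sample to the nearest centroid (squared distance)
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # recompute centroids; keep the old one if a cluster went empty
        centroids = [tuple(mean(coord) for coord in zip(*cl)) if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids

# Synthetic drive-test samples: (RSRP in dBm, SINR in dB),
# drawn from three invented coverage regimes.
rng = random.Random(1)
samples = ([(rng.gauss(-75, 3), rng.gauss(20, 2)) for _ in range(100)] +   # near cell
           [(rng.gauss(-95, 3), rng.gauss(8, 2)) for _ in range(100)] +    # mid cell
           [(rng.gauss(-115, 3), rng.gauss(-2, 2)) for _ in range(100)])   # cell edge

profiles = kmeans(samples, k=3)
for p in sorted(profiles):
    print(f"profile: RSRP {p[0]:.1f} dBm, SINR {p[1]:.1f} dB")
```

Each printed centroid stands in for one "characteristic profile" that a test team could then replay in the lab instead of testing against every raw drive-test trace.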
The automation environment, which consists of Test Builder’s device automation and benchmarking modules, enables test teams to control multiple handsets (different chipsets, different OEMs, etc.) and run a wide range of tests on them. Test teams could run the same set of tests on multiple handsets to see how performance varies across UEs, or run different tests under the same conditions; the former shows how different UEs perform under identical conditions, while the latter corresponds to a scenario such as multiple users riding in the same vehicle through similar RF conditions but performing different operations. Test Builder comes with basic automation support and can be augmented with modules to control common devices and device platforms, commonly used test tools such as iPerf, and commonly used diagnostic tools such as QXDM.
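The run-every-test-on-every-device pattern described above can be sketched as a small test matrix. Test Builder's real automation API is proprietary, so the device records, the stub test functions, and `run_matrix` below are all hypothetical stand-ins; in practice the per-device hooks would wrap real tools such as adb, iPerf, or QXDM.

```python
def ping_test(device):
    # Stand-in for a real latency measurement against a live network.
    return {"latency_ms": 40 + 5 * device["modem_rev"]}

def throughput_test(device):
    # Stand-in for an iPerf-style downlink throughput run.
    return {"mbps": 100 - 10 * device["modem_rev"]}

# Hypothetical device inventory, including a reference handset ("golden UE").
DEVICES = [
    {"name": "UE-A", "chipset": "vendor1", "modem_rev": 1},
    {"name": "UE-B", "chipset": "vendor2", "modem_rev": 2},
    {"name": "golden-UE", "chipset": "vendor1", "modem_rev": 0},
]
TESTS = {"ping": ping_test, "throughput": throughput_test}

def run_matrix(devices, tests):
    """Run every test on every device and collect results for benchmarking."""
    results = {}
    for dev in devices:
        for name, test in tests.items():
            results[(dev["name"], name)] = test(dev)
    return results

results = run_matrix(DEVICES, TESTS)
for (dev, test), metrics in sorted(results.items()):
    print(f"{dev:10s} {test:12s} {metrics}")
```

Comparing each row against the golden-UE row is the benchmarking use case; swapping in different test sets per device would model the "multiple users in one vehicle" scenario.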
A series of tests can be executed one after another using Azimuth’s Test Scheduler, facilitating overnight, over-the-weekend, and 24×7 testing and significantly reducing the time and effort required to test multiple devices.
An operator based in Japan, for example, was able to take a handful of characteristic profiles (identified using AzMapper) and test all of its devices, or a device against a reference device. This enabled the operator to characterize the performance of multiple UEs across multiple dimensions (e.g., data rates, call reliability) in a range of real-world conditions that approximated many of its actual deployment environments.
The device certification lab manager said, “With Field-to-Lab, I am now able to test devices more comprehensively in the lab and catch more issues in the lab, significantly reducing the number of issues I catch in the field.”
The Field-to-Lab approach can be used by anyone in the ecosystem, whether an operator, an OEM, or another player. It can be applied not only as part of testing, but also as part of early R&D or interoperability testing (IOT) to improve product quality.
In this edition of Azimuth’s “Notes from the Field,” we’ll explore how real-world testing solved significant network latency issues in an urban environment with a large number of smartphone users. Many customers were complaining about video stalling and increased latency when accessing data in multiple locations within the city and its immediate suburbs.
The relevant sectors had been live on the network for quite some time and served a mature market with a strong customer base. In an initial attempt to address the problem, the operator providing service to these customers completed drive testing and logging of field data, and the handset vendor also began drive testing devices in order to replicate the problem. As a result, both the service provider and the handset vendor identified settings related to fast dormancy (the mechanism through which a device transitions to a standby state after a period of inactivity) as the potential cause of the increased latency. Fast dormancy saves battery life by moving an inactive device quickly into standby, but at the expense of increased network latency and signaling overhead.
Next, using the existing drive test logs, Azimuth helped the operator to reliably and repeatedly recreate the issue in the lab. The operator was then able to use Test Builder’s automation capability to control the device while monitoring battery consumption and network latency. This enabled the operator to run a series of tests to evaluate different network settings and characterize the multi-dimensional relationship between the settings used, battery power consumption, and latency. With this information, the operator was able to work with the infrastructure suppliers, together with the OEMs whose devices showed the most prominent delay, to develop settings that struck an effective trade-off among latency, network overhead, and device battery consumption.
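To make the kind of trade-off study described above concrete, the toy model below sweeps a fast-dormancy inactivity timer over a fixed traffic trace: whenever an idle gap outlasts the timer, the radio drops to a low-power state (saving energy) and the next request pays a reconnection penalty (adding latency). Every number here — the power draws, the reconnect latency, and the gap trace — is invented for illustration and is not the operator's actual data.

```python
# Assumed (illustrative) radio parameters.
CONNECTED_POWER_MW = 800    # radio power while connected
IDLE_POWER_MW = 10          # radio power while dormant
RECONNECT_LATENCY_MS = 300  # extra latency to re-establish the connection

# Inter-request idle gaps (seconds) from a hypothetical usage trace.
GAPS = [0.5, 12, 0.8, 30, 2, 45, 1, 20, 3, 60]

def evaluate(timer_s):
    """Return (avg added latency in ms, radio energy in J) for one timer setting."""
    extra_latency_ms = 0.0
    energy_mj = 0.0
    for gap in GAPS:
        if gap > timer_s:
            # Radio stays connected for timer_s, then goes dormant for the
            # rest of the gap; the next request pays the reconnect penalty.
            energy_mj += timer_s * CONNECTED_POWER_MW + (gap - timer_s) * IDLE_POWER_MW
            extra_latency_ms += RECONNECT_LATENCY_MS
        else:
            # Gap shorter than the timer: radio never dropped to idle.
            energy_mj += gap * CONNECTED_POWER_MW
    return extra_latency_ms / len(GAPS), energy_mj / 1000.0

for timer in (1, 5, 10, 30):
    lat, energy = evaluate(timer)
    print(f"timer={timer:>2}s  avg added latency={lat:6.1f} ms  radio energy={energy:6.1f} J")
```

Even this crude model shows the shape of the curve the operator had to characterize: short timers minimize battery drain but maximize reconnection latency and signaling load, and the sweep identifies where the knee of that trade-off sits for a given traffic pattern.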
Azimuth Global FTL Product Manager Vivek Vadakkuppattu explained, “This is another example of how FTL can help operators differentiate their offerings. The ability to rapidly evaluate a complex set of tradeoffs without having to return to the field and wait for just the right RF environment is crucial in developing and optimizing the high-performing wireless networks subscribers now expect globally.”