Testing Netflix on Android

When Netflix decided to enter the Android ecosystem, we faced a daunting set of challenges:

1. We wanted to release rapidly (every 6-8 weeks).

2. There were hundreds of Android devices of different shapes, OS versions, capacities, and specifications, all of which needed to play back audio and video.

3. We wanted to keep the team small and happy.

Of course, the seasoned tester in you has to admit that these are the sort of problems you like to wake up to every day and solve. Doing it with a group of fellow software engineers who are passionate about quality is what made overcoming those challenges even more fun. 

Release rapidly

You probably guessed that automation had to play a role in this solution. However, automating scenarios on a phone or tablet is complicated when the core functionality of your application is native video playback but the user interface is HTML5, living inside the application's web view.

Verifying an app that uses an embedded web view to serve as its presentation platform was challenging in part due to the dearth of tools available. We considered Selenium, Android Native Driver, and the Android Instrumentation Framework. Unfortunately, we could not use Selenium or the Android Native Driver, because the bulk of our user interactions occur on the HTML5 front end. As a result, we decided to build a slightly modified solution.

Our modified test framework heavily leverages a piece of our product code that bridges JavaScript and native code through a proxy interface. Though we were able to drive some behavior by sending commands through the bridge, we needed an automation hook in order to report state back to the automation framework. Since our HTML5 UI makes no use of the document title, we decided to use the title element as our hook: we rely on the onReceivedTitle notification as a way to communicate back to our Java code whenever some JavaScript is executed in the HTML5 UI. Through this approach, we were able to execute a variety of tasks by injecting JavaScript into the web view, performing the appropriate DOM inspection task, and then reporting the result through the title property.
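To make that concrete, here is a minimal sketch of how such a title-based hook can be wired up on Android. It is illustrative, not our actual framework code; the class name, result queue, and DOM selector are all hypothetical:

```java
import android.webkit.WebChromeClient;
import android.webkit.WebView;

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Hypothetical automation hook: inject JavaScript into the web view and
// read results back through the page title.
public class WebViewAutomationHook {
    private final WebView webView;
    private final BlockingQueue<String> results = new LinkedBlockingQueue<>();

    public WebViewAutomationHook(WebView webView) {
        this.webView = webView;
        // onReceivedTitle fires whenever document.title changes, giving
        // native code a callback channel out of the HTML5 UI.
        webView.setWebChromeClient(new WebChromeClient() {
            @Override
            public void onReceivedTitle(WebView view, String title) {
                results.offer(title);
            }
        });
    }

    // Injects JavaScript that inspects the DOM and reports via the title.
    public String runDomCheck(String selector) throws InterruptedException {
        String script = "javascript:(function(){"
                + "var el = document.querySelector('" + selector + "');"
                + "document.title = el ? 'FOUND' : 'NOT_FOUND';"
                + "})()";
        webView.loadUrl(script);                  // executes inside the web view
        return results.poll(5, TimeUnit.SECONDS); // wait for onReceivedTitle
    }
}
```

A test could then assert, for example, that runDomCheck("#play-button") returns FOUND after navigating to a title's detail page.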

With this solution in place, we are able to automate all our key scenarios such as login, browsing the movie catalog, searching, and controlling movie playback.

While we automate the testing of playback, the subjective analysis of quality is still left to the tester. By adding testability to our software, automation can catch buffering and other streaming issues, but at the end of the day we need a tester to verify things like seamless resolution switching and HD quality, which are hard to check with automation today and cost-prohibitive to attempt.

We have a continuous integration system that runs our automated smoke tests on a bank of devices for each submit. With the framework in place, we can quickly ascertain build stability across the vast array of makes and models in the Android ecosystem. This quick and inexpensive feedback loop enables a very short release cycle, since the testing overhead for each release stays low.
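As an illustration of what that per-submit loop can look like (a sketch, not our actual CI code; the test package and runner names are hypothetical), a small driver can enumerate the attached devices through adb and fan the smoke suite out to each one:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

// Hypothetical CI driver: run the instrumentation smoke suite on every
// device attached to the build machine, one adb invocation per serial.
public class SmokeRunner {
    public static void main(String[] args) throws Exception {
        for (String serial : attachedDevices()) {
            new ProcessBuilder("adb", "-s", serial, "shell", "am", "instrument",
                    "-w", "com.example.smoke/android.test.InstrumentationTestRunner")
                    .inheritIO().start().waitFor();
        }
    }

    private static List<String> attachedDevices() throws Exception {
        Process p = new ProcessBuilder("adb", "devices").start();
        List<String> serials = new ArrayList<>();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                // adb prints "<serial>\tdevice" for each ready device;
                // the header line and offline devices are skipped.
                if (line.endsWith("\tdevice")) {
                    serials.add(line.split("\t")[0]);
                }
            }
        }
        return serials;
    }
}
```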

Device diversity

To put device diversity in context: we see around 1,000 different device models streaming Netflix on Android every day. We had to figure out how to categorize these devices into buckets so that we could be reasonably sure a release would work properly across them. The devices we choose for our continuous integration system are therefore based on the following criteria:

1. We have at least one device for each playback pipeline architecture we support (the app uses several approaches for video playback on Android, such as hardware decoder, software decoder, OMX-AL, and iOMX).

2. We choose devices with both high-end and low-end processors, as well as devices with different memory capabilities.

3. We have representatives for each major operating system version by make, in addition to devices running custom ROMs (most notably CM7 and CM9).

4. We choose the devices most heavily used by Netflix subscribers.

With this information, we took stock of all the devices we have in house, classified them based on their specs, and figured out the optimal combination of devices to give us maximum coverage. We were able to reduce our daily smoke automation set to around 10 phones and 4 tablets, keeping the rest for the longer release-wide test cycles.

This list gets updated periodically to adjust to changing market conditions. Also note that this is only the phone list; we maintain a separate one for tablets. We test several other phones with automation and a smaller set of high-priority tests, while the devices on this list go through the comprehensive suite of manual and automated testing.

To put it another way, when it comes to watching Netflix, any device outside those ten can be mapped to one of the high-priority devices based on its configuration. This in turn helps us quickly identify the class of problems associated with a given device.
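A sketch of what configuration-based bucketing might look like; the attributes and tiers below are hypothetical, chosen to mirror the selection criteria above:

```java
import java.util.Objects;

// Hypothetical device bucket: two devices that land in the same bucket are
// expected to exhibit the same class of playback problems, so one
// representative device can stand in for the rest.
public final class DeviceBucket {
    enum Pipeline { HARDWARE_DECODER, SOFTWARE_DECODER, OMX_AL, IOMX }
    enum CpuTier { LOW_END, HIGH_END }
    enum MemoryTier { LOW, HIGH }

    final Pipeline pipeline;
    final CpuTier cpu;
    final MemoryTier memory;
    final String osVersion;

    DeviceBucket(Pipeline pipeline, CpuTier cpu, MemoryTier memory, String osVersion) {
        this.pipeline = pipeline;
        this.cpu = cpu;
        this.memory = memory;
        this.osVersion = osVersion;
    }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof DeviceBucket)) return false;
        DeviceBucket b = (DeviceBucket) o;
        return pipeline == b.pipeline && cpu == b.cpu
                && memory == b.memory && osVersion.equals(b.osVersion);
    }

    @Override
    public int hashCode() {
        return Objects.hash(pipeline, cpu, memory, osVersion);
    }
}
```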

Small happy team

We keep our team lean by focusing our full-time employees on building solutions that scale, and automation is a key part of this effort. When we do an international launch, we rely on crowd-sourced testing solutions like uTest to quickly verify network and latency performance in the field. This gives us real-world assurance that all of our backend systems are working as expected. These approaches give our team time to watch their favorite movies and ensure that we have the best mobile streaming video solution in the industry.

Amol Kher
Amol is the Chief Technical Officer at Wello, a startup that connects users with trainers over live video, where he is currently developing the product's mobile app. Previously, he was the engineering manager for the tools and test team within the Netflix mobile organization, responsible for shipping the Netflix app on the iOS, Android, and Apple TV platforms. Prior to Netflix, he was an early engineer on the Chrome for Android team at Google.
