The A&C of SMAC: Analytics and Cloud

In the last issue on testing the SMAC stack, we talked about the social and mobile aspects of testing, and we will refer back to them in this article. In this issue, part 2, we focus on the Analytics and Cloud aspects. The goal of this article is to lay out a simple landscape of analytics and cloud testing.

Understanding the Flow

Let’s walk through a very basic analytics and cloud workflow, looking for the testing points in the analytics and cloud parts of SMAC.

You have an app on your mobile phone. You execute some workflows or use cases. In addition to delivering the app’s functionality, these activities send data back to the cloud. Some examples of the data your app might capture:

  • Which social interactions do you perform, and how many?
  • How many Tweets, Instagram posts, or Yelp reviews do you make?
  • What payment methods do you use for purchases?
  • In what location are you using the app?
  • What is your connection? Wi-Fi, 3G or 4G?

All this data is captured and sent to the data store in the cloud.
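
To make this concrete, here is a minimal sketch in Python of the kind of event an app might assemble and send to a collection endpoint in the cloud. Everything here, the field names, the helper functions, and the endpoint, is hypothetical and for illustration only.

import json
import time
import urllib.request

def build_event(user_id, action, connection, location):
    # Assemble one analytics event from a user action in the app.
    return {
        "user_id": user_id,
        "action": action,          # e.g. "tweet", "instagram_post", "purchase"
        "connection": connection,  # e.g. "wifi", "3g", "4g"
        "location": location,
        "timestamp": time.time(),
    }

def send_event(event, endpoint):
    # POST the event to a (hypothetical) collection endpoint; returns HTTP status.
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

event = build_event("user-123", "purchase", "wifi", "37.77,-122.42")
print(json.dumps(event))
# send_event(event, "https://analytics.example.com/collect")  # hypothetical endpoint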

Any associated IoT (Internet of Things) devices, ranging from garden humidity sensors to heart monitors, are gathering and sometimes streaming large amounts of data.

The data that is captured and stored will be defined by the business side of product development. The business will decide what part of the user’s behaviors is meaningful for analysis.

In the SMAC stack, all this data is stored in the cloud. In reality, it does not have to be stored in the cloud or take advantage of cloud services at all, but for our SMAC discussion, it is.

How the data is stored is a topic big enough for its own examination: weighing the pluses, minuses, and costs of structured data in data warehouses vs. unstructured data in Hadoop is a business decision. The methods and tools to test those options vary greatly and deserve their own article.

The data is analyzed and manipulated by various algorithms, written by data scientists, to capture the particular aspects of users and usage that the business requires.

Let’s look at an example to make this workflow more concrete.

Let’s say my mobile phone has an app that controls a watering device for the garden at my house. An IoT device in my garden measures the soil moisture content. The data from this device, depending on the functionality in the app, might be correlated with streamed data from the national weather service focusing on air temperature and relative humidity. This streaming data might be a giant data set stored and manipulated in the cloud. The resulting analytics are then sent back to the business and the dev team, who pull the data they need to look at user patterns, optimize workflows, and add or remove functionality.
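
As a rough sketch of the correlation step in that example, the snippet below joins soil moisture readings from the garden sensor with the nearest-in-time temperature and humidity readings from a weather feed. The data shapes and values are assumptions made up for illustration; a real pipeline would run at cloud scale.

from bisect import bisect_left

# (timestamp, soil_moisture_percent) from the garden sensor
soil = [(1000, 31.0), (1600, 28.5), (2200, 26.0)]
# (timestamp, air_temp_c, relative_humidity_percent) from the weather feed
weather = [(900, 21.0, 60.0), (1500, 24.0, 52.0), (2100, 26.0, 45.0)]

def nearest_weather(ts, readings):
    # Return the weather reading closest in time to ts.
    times = [r[0] for r in readings]
    i = bisect_left(times, ts)
    candidates = readings[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda r: abs(r[0] - ts))

# Correlate each soil reading with the nearest weather reading.
for ts, moisture in soil:
    _, temp, humidity = nearest_weather(ts, weather)
    print(f"t={ts}: moisture={moisture}% temp={temp}C humidity={humidity}%")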

At what point in this workflow do we need to insert some testing?

Data Testing

Data testing is a well-understood and traditional area of software testing. Typically, data integrity (accuracy and consistency), access and availability, as well as all the defined functional and error handling tests, will be run and automated.
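
As an illustration, here is a minimal sketch of automated data integrity checks, written to run under pytest. The fetch_events() helper is a hypothetical stand-in for a real query against the data store; the shape of the assertions is the point.

def fetch_events():
    # Stand-in for a real query against the cloud data store.
    return [
        {"user_id": "u1", "action": "purchase", "amount": 9.99},
        {"user_id": "u2", "action": "tweet", "amount": None},
    ]

def test_required_fields_present():
    # Integrity: every event must carry the fields the business defined.
    for row in fetch_events():
        assert row.get("user_id"), "every event must carry a user id"
        assert row.get("action"), "every event must carry an action"

def test_purchase_amounts_are_valid():
    # Accuracy: purchases must have a positive amount.
    for row in fetch_events():
        if row["action"] == "purchase":
            assert row["amount"] is not None and row["amount"] > 0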

It is also important to verify and test that the analytics algorithms are working as expected. Note that testing the correct functioning of an algorithm is one set of tests, while validating the data science behind the algorithm is something very different. The business and the data scientists design what the algorithm collects, sorts, and calculates. The business and data-science correctness of that design is not what we are testing; it is very important, but it is not the software testing we do. That doesn’t mean you would never test it. If you are the subject matter expert and the most knowledgeable person in that domain, perhaps you would test the data science, but more commonly you will not.
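
The snippet below illustrates that distinction with a made-up engagement_score() algorithm. The test verifies that the code computes exactly what the data scientists specified; whether weighting payments three times as heavily as posts is good data science is not the test’s concern.

def engagement_score(posts, payments):
    # Hypothetical spec from the data scientists: posts plus 3x payments.
    return posts + 3.0 * payments

def test_engagement_score_matches_spec():
    # Known inputs and outputs taken from the spec; this verifies the code,
    # not the business value of the 3x weighting.
    assert engagement_score(0, 0) == 0.0
    assert engagement_score(10, 0) == 10.0
    assert engagement_score(4, 2) == 10.0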

Analytics

The analytics gathered are very useful information for test teams. A very common data set is click analysis. Click analysis is real usage, not happy-path or modeled workflows; it’s what your users are actually doing. These are real-world scenarios and paths that must be tested, and they are the best candidates for automation since you know they are actual uses. Check these workflows against your test cases for gaps. Depending on the data collected, check that you are testing with real user data, at real peak and low usage times, on the correct devices, and with the right connectivity. The analytics should either validate your test coverage or give you the data to fix it, and hopefully reveal gaps, error handling to be tested, and boundary cases.
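
One simple way to act on click analytics is a set difference between the paths users actually take and the paths your test suite covers. The sketch below uses made-up data to show the idea; in practice the observed paths would come from your analytics export.

# Paths users actually take, from click analytics (illustrative data).
observed_paths = {
    ("home", "search", "buy"),
    ("home", "profile", "settings"),
    ("home", "search", "error"),
}
# Paths exercised by the current test suite (illustrative data).
tested_paths = {
    ("home", "search", "buy"),
    ("home", "profile", "settings"),
}

# Any observed path not in the tested set is a coverage gap.
gaps = observed_paths - tested_paths
for path in sorted(gaps):
    print("untested real-user path:", " -> ".join(path))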

Cloud/Data Warehouse/Hadoop

As I mentioned above, how the data is stored, whether in more traditional relational databases in data warehouses or in more modern Hadoop clusters, is a business decision, but it will have major implications for how you test.

If you are a consumer of cloud services, in addition to working with the variety of tools and methods for testing the competing infrastructures, you will be testing normal data storage servers for security, performance, load, concurrency, and race conditions.
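
Concurrency and race condition tests in particular benefit from a small, repeatable harness. Below is a minimal Python sketch of the pattern: many threads hammer a shared resource and the test asserts that no updates were lost. In practice the shared resource would be your storage service, not a local counter; this only shows the shape of the test.

import threading

class Counter:
    # Stand-in for a shared resource such as a record in the data store.
    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:  # remove the lock to see the race this test catches
            self.value += 1

def test_concurrent_increments_are_not_lost():
    counter = Counter()
    threads = [
        threading.Thread(target=lambda: [counter.increment() for _ in range(1000)])
        for _ in range(10)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # 10 threads x 1000 increments each must all be recorded.
    assert counter.value == 10 * 1000

test_concurrent_increments_are_not_lost()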

The providers of cloud services will do most, if not all, of the infrastructure testing here, and they have SLAs (service level agreements) promising certain performance and load benchmarks and security attributes they meet.

Summary

Cloud testing, in many cases, is similar to traditional testing. It is important to fully understand the Social, Mobile, Analytics, and Cloud stack to test it effectively. The analytics and cloud aspects are mainly comprised of data, data manipulation, and how the data is stored in the cloud. The testing focuses on analytics algorithm functionality testing and normal data integrity tests, as well as common server tests, most commonly performance and security tests. The infrastructure chosen for the data in the cloud will dictate some significant changes to that testing.

Michael Hackett

Michael is a co-founder of LogiGear Corporation and has over two decades of experience in software engineering in banking, securities, healthcare, and consumer electronics. Michael is a Certified Scrum Master and has co-authored two books on software testing: Testing Applications on the Web: Test Planning for Mobile and Internet-Based Systems (Wiley, 2nd ed. 2003) and Global Software Test Automation (Happy About Publishing, 2006).
He is a founding member of the Board of Advisors at the University of California Berkeley Extension and has taught for the Certificate in Software Quality Engineering and Management at the University of California Santa Cruz Extension. As a member of IEEE, his training courses have brought Silicon Valley testing expertise to over 16 countries. Michael holds a Bachelor of Science in Engineering from Carnegie Mellon University.

