2010 – 2011 LogiGear Global Testing Survey Results – Test Process & SDLC

Process

The objective of this survey and analysis is to gather information on the actual state of the practice in software testing today. The questions originate from software development team assessments I have conducted over the years. A process assessment is an observation of, and questioning about, what you and your team do and how you do it.

In such a long survey, I wanted to keep process questions to a minimum, seek free-form comments and opinions on perceptions of process, and save more survey time for assessing actual practices, strategy, and execution rather than an individual’s plan for a process. The following section of questions deals directly with the development process itself. Results from this section have always been informative for managers, as they deal directly with how things ought to be and how the process itself is perceived; in most cases there is a discrepancy between reality and the best-laid plans! Let’s take a look at the Process portion of the survey.

P-1. How would you describe the effort to release your team’s software product or application?

The process is occasionally difficult, but we reach fair compromises to fix problems 43.40%
The projects are smooth with occasional process or product problems 28.90%
It’s a repeatable, under-control process 19.70%
It’s difficult; problems every time 6.60%
It’s a huge, stressful effort 1.30%

This is an encouraging response! The overwhelming majority, 92%, say their development process is either under control, smooth, or only occasionally difficult. Only 8% state the process is difficult or stressful.

I am surprised at the very positive response teams have for their development process. This is a very common area for teams to complain and express frustration, and current conventional wisdom has teams frustrated with traditional processes. Yet more than half of the survey respondents self-describe as using a development process other than Agile.

P-2. Do you think your SDLC processes are followed effectively?

Yes 52.1%
No 47.9%

Now we see the big disconnect with the first question. This response is a virtual split: just over half think their SDLCs are followed effectively, and the rest do not.

From my experience, what I find in development today is that there are “process docs” detailing how teams should effectively make software, yet these are commonly written outside of development, many times by consultants. Teams regularly disregard these docs and are more comfortable with their own culture or practice, whether expressed explicitly or implied in tribal knowledge.

P-3. Have people been trained in the process?

Yes 74.3%
No 25.7%

These are encouraging results, matching what I expected for internal training. If a team has a process in place, it would be easy for the department to create either a PowerPoint slideshow or a Flash video of the process. Such a tool would make training easy for all employees. The opposite would be a team that has no process and works from tribal knowledge or a “whatever works” ethic, which is problematic for long-term goals. Making your SDLC a training standard is also a great opportunity to question why the team follows certain practices, sharpen the process, and connect reality to the “shoulds.”

P-4. During the testing phase of a project, what percentage of time do you spend executing tests in an average week? Example: 10 hours of testing in a 40-hour week = 25%.

50% 27.30%
51% – 74% 23.40%
Less than 25% of my time is spent on executing test cases 20.80%
26% – 49% 18.20%
More than 75% of my time is spent on executing test cases 10.40%

If there is one big, dark, hidden secret in testing that I have chosen as my cause, it is to expose how much time testers actually spend testing, as opposed to documenting, sitting in meetings, building systems and data, planning, and doing maintenance: all the other things testers need to do as well. The perception of most managers is that testers spend the overwhelming majority of their time executing tests. This is not the case.

Ten percent of respondents say they spend 75% or more of their time testing. Read another way: only 10% of respondents are testing at least 30 hours in a 40-hour week, with 10 hours a week spent on other efforts.

Just over 20% put themselves in the lowest category, admitting that less than 25% of their time is spent testing. This means that, even during a test phase, they spend 10 hours/week or less testing and 30 hours/week or more working on other initiatives.

Two-thirds (66.3%) of respondents spend half their time or less (20 hours/week or less) actually testing during a testing phase. This is common, and it always surprises managers.
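To make the arithmetic behind these figures explicit: the hours quoted above are simply the survey’s percentage buckets applied to the 40-hour week given in the question. A trivial sketch in Python:

```python
WORK_WEEK_HOURS = 40  # the work week assumed in the survey question

# The survey's buckets for "share of the week spent executing tests."
buckets = {
    "More than 75%": 0.75,
    "50%": 0.50,
    "Less than 25%": 0.25,
}

for label, fraction in buckets.items():
    testing = fraction * WORK_WEEK_HOURS
    other = WORK_WEEK_HOURS - testing
    print(f"{label}: {testing:.0f} h testing, {other:.0f} h on everything else")
```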

I do not conclude that any fault lies in this; I know what test teams have to do to get a good testing job planned, executed, and documented. Yet in some situations this is an indication of a problem: perhaps the test team is forced into too much documentation, must attend too many meetings, or gets pulled into supporting customers and cannot make up the lost testing time.

What is most important here is that everyone concerned (managers and directors, project managers, leads, anyone estimating project size) knows that most testers spend half or less of their time testing.

P-5. How much time do you spend documenting (creating and maintaining) test cases during an average product cycle?

Less than 25% of my time is spent on documenting test cases 51.30%
26% – 49% 27.60%
50% 11.80%
51% – 74% 5.30%
More than 75% of my time is spent documenting test cases 2.60%
We do not document our test cases 1.30%

There are many useful pieces of information to glean from this. Few groups spend too much time documenting. This is a great improvement from just a few years ago, when many teams were under tremendous pressure to document every test case. Aside from being largely useless, that pressure led to missed bugs: teams spending so much time documenting were not testing enough, causing testers to overlook bugs.

Some teams were collapsing under the stress of regulatory pressure to prove requirements traceability to auditors while relying on naïve test case design methods or using MS Word for test cases.

The last few years have seen an explosion in better test case management tools, better test design methods such as action-based testing, and Agile methods, where lean manufacturing ideas have teams agreeing that less is more when it comes to test case documentation.

Very few teams reported not documenting their tests. This is a significant improvement over just a few years ago: during the dot-com boom, when a web application might get only a rapid-fire test, there was no time for documenting any tests, no regression testing, and no repeatability. In the next release, you were expected to start from scratch; all business intelligence was left undocumented and kept with a few individuals. Hopefully, those days are gone.

Still, almost 20% of the groups report spending 50% or more of their time during a project documenting. That is pretty high. There has to be an excellent reason those groups are documenting so much; otherwise, this is a problem. If you are managing that time, you must ask yourself: do these testers have enough time to test? Is it a test project or a documentation project?

P-6. If your group receives requirement documents prior to the test planning and test case design process, how would you characterize the adequacy and accuracy of these documents?

Very useful 48.60%
Somewhat useful 48.60%
Useless 2.70%

It is a positive result that only a very few respondents find their requirements useless. It is also encouraging that almost half of the respondents find their requirements very useful! This is another area where test teams habitually complain about the quality of requirements docs.

One takeaway from these results is that requirements docs may be getting better. Perhaps many teams have found a balance in how much information business analysts or marketing teams need to give both developers and test teams for them to do their jobs effectively. That, or test teams have stopped complaining about requirements and make do in other ways.

P-7. What is your view on the quality of the code that is handed to you at the start of testing?

Usually stable/testable, no idea about unit testing, no accompanying documentation of the build 45.90%
Stable, unit tested 21.60%
Stable with build notes 13.50%
Often unstable with accompanying documentation of known problems 6.80%
Often unstable, little/no information on unexpected changes 6.80%
Very stable, unit tested, with build notes explaining bug fixes and changes 5.40%

To highlight: 40% of respondents appraise their builds as stable; 46% appraise their builds as usually stable; and 13% find the quality of code often unstable. This all seems pretty good, but one area in this data is particularly troubling.

Almost half of all respondents get no information from development: testers have no idea about unit testing and no information about what changed in the build. There is no viable reason for this, and it hurts product quality.

Agile development, with TDD and daily scrums, is meant to prevent this problematic lack of information. The Continuous Integration practice, which includes automatically re-running the unit tests and a smoke or build acceptance test on every build, is very effective in speeding up development and delivery.
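To make the build acceptance idea concrete, below is a minimal sketch of the kind of smoke test a CI server could re-run automatically on every build. It is hypothetical, not taken from the survey: the base URL and the pages checked are illustrative assumptions, and a real team would substitute checks for its own application.

```python
# smoke_test.py: a minimal build acceptance ("smoke") test that a CI
# server could run after every build. The URL and pages checked here
# are hypothetical; substitute your own application's entry points.
import sys
import urllib.request

BASE_URL = "http://localhost:8080"  # assumed address of the fresh build

CHECKS = [
    ("/", 200),       # home page responds
    ("/login", 200),  # login page responds
]

def main() -> int:
    failures = 0
    for path, expected in CHECKS:
        url = BASE_URL + path
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                status = resp.status
        except Exception as exc:  # connection refused, timeout, HTTP 4xx/5xx
            print(f"FAIL {url}: {exc}")
            failures += 1
            continue
        if status == expected:
            print(f"PASS {url} -> {status}")
        else:
            print(f"FAIL {url}: expected {expected}, got {status}")
            failures += 1
    # A non-zero exit code tells the CI server to reject the build.
    return 1 if failures else 0

if __name__ == "__main__":
    sys.exit(main())
```

The specific checks matter less than the automation: every build either passes the gate or is rejected with a printed reason, so testers are no longer handed builds, with no accompanying information, that cannot even start.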

The following statements are written-in responses from survey participants.

P-8. Please fill in the blank: My biggest problem with our development process today is:

  • “Not all our BAs are trained in writing effective, clear requirements.”
  • “Lack of Unit testing. Unit Testing is NOT automated, very few developers write test harnesses. It is all manual Unit Testing.”
  • “We have no clue what they are doing.”
  • “We test our changes, but do not test the overall product. Regression testing is our biggest problem.”
  • “It’s a black hole!”
  • “On most projects, there is a lack of collaboration and cooperation between test and development teams (these by and large are not Agile projects, of course!).”
  • “No technical documentation of what has been built.”
  • “They are rude with testing team.”
  • “We need earlier involvement.”
  • “They don’t understand the business or the users well enough.”
  • “Bad communication.”
  • “Bad estimation.”
  • “No timely escalation of probable risks on quality delivery.”
  • “Too many processes are followed by rote.”
  • “Bad scope and requirements management”
  • “They are laying off QA staff and I’m not sure how they are going to adequately test the product.”
  • “Lots of documentation required that does not increase the quality of the product.”

P-9. If you could do anything to make your projects run smoother, what would that be?

  • “Better communication.”
  • “More communication with remote team.”
  • “More testing by development.”
  • “Unit testing to be mandatory & unit test report should be treated as an exit criteria to start the Testing.”
  • “Send bad developers to training.”
  • “Spend more time automating regression test cases.”
  • “Automate our testing.”
  • “More time allowed for test case preparation / documentation.”
  • “Re-Focus on planning and requirements gathering. Also we could stand to enforce creation of unit tests by developers. We rely too heavily on QA Automation to catch everything”
  • “Get buy-in from key people up-front and work to expose icebergs (blockers to success) as early as possible.”
  • “Policy in handling customer requests on change requests. Project management and Sales Team have to follow process so as not to over commit on deliveries.”
  • “Plan to reduce last-minute changes.”
  • “Lighten the documentation.”
  • “Stronger project managers that can lead.”
  • “Better project management, better enforcement of standards for SW development, CM and Testing.”
  • “Integrate ALM tools.”

P-10. If you have a lessons-learned, success, or failure story about your team’s development processes that is interesting or might be helpful to others, please write it below:

  • “We have done a good job on creating a repeatable build process. We release once a week to Production. Where we fail is in integration and regression testing.”
  • “The processes are well-defined. The team’s commitment and unity of the team are critical to the success of the project.”
  • “Don’t develop in a vacuum. The less exposure a team has to the business, user needs, how software supports tasks, etc., the less likelihood of success. Get integrated – get informed – get exposed! At a previous company, I used to drag my developers out to customer sites with me and while they dreaded facing customers, they were ALWAYS extraordinarily energized at the end having learned so much and feeling much more connected and *responsible* for satisfying customers. This tactic was ALWAYS valuable for our team.”
  • “Maintain high morale on the team. Motivate them to learn and develop technical and soft skills.”
Michael Hackett
Michael is a co-founder of LogiGear Corporation and has over two decades of experience in software engineering in banking, securities, healthcare, and consumer electronics. Michael is a Certified Scrum Master and has co-authored two books on software testing: Testing Applications on the Web: Test Planning for Mobile and Internet-Based Systems (Wiley, 2nd ed. 2003) and Global Software Test Automation (Happy About Publishing, 2006). He is a founding member of the Board of Advisors at the University of California Berkeley Extension and has taught for the Certificate in Software Quality Engineering and Management at the University of California Santa Cruz Extension. As a member of IEEE, his training courses have brought Silicon Valley testing expertise to over 16 countries. Michael holds a Bachelor of Science in Engineering from Carnegie Mellon University.
