2010 – 2011 LogiGear Global Testing Survey Results – Tools


T1. What testing-support tools do you use? (Please check all that apply.)

Response percent Response count
Bug tracking/issue tracking/defect tracking 87.70% 64
Source control 54.80% 40
Automation tool interface (to manage and run, not write automated tests) 52.10% 38
Test case manager 50.70% 37
Change request/change management/change control system 47.90% 35
A full ALM (application lifecycle management) suite 19.20% 14

Result analysis: Thirteen percent of respondents do not use a bug tracking tool. This does not surprise me, but it surprises many others that so many test teams do not track their bugs!

About half of the respondents use a test case manager, and roughly the same percentage uses a change management or change control system. Half use an automation tool interface; these tools most commonly contain both manual and automated test cases. Yet only 20% use a full lifecycle ALM tool. A few years ago this number would have been much smaller.

With each passing year, especially as more teams go agile or offshore their work, this number will increase dramatically.

T2. I would describe our bug (bug, issue, defect) tracking as:

Response percent Response count
Effective 37.70% 26
Very effective; has a positive impact on product quality 34.80% 24
Adequate 20.30% 14
A mess 4.30% 3
Poor 2.90% 2
Hurts product quality/has a negative impact on product quality 0% 0

T3. What type of bug tracking tool do you use?

Response percent Response count
Relational database tool (Bugzilla, TrackGear, TeamTrack, Team Test, ClearQuest, Homebuilt web-based or client server database) 68.10% 47
ALM tool that includes defect tracking 18.80% 13
Excel 8.70% 6
Email 2.90% 2
We do not track bugs 1.40% 1

Result analysis: This is a very positive move in our profession. Just a few years ago, the number of teams using Excel to track issues was significantly higher.

Excel is not an adequate issue tracking tool: it is poor at sorting, querying, retrieving old issues from past releases, and controlling and managing access. With so many good commercial and open source tools available, there is no reason to be using such a makeshift system.
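To make the spreadsheet-versus-database point concrete, here is a minimal sketch using Python's built-in sqlite3 module. The table and column names are hypothetical, not from any particular bug tracking product; the point is that retrieving old issues from a past release is a one-line query in a relational store, whereas in a shared spreadsheet it is a manual filtering chore.

```python
# Sketch: why a relational store beats a spreadsheet for defect tracking.
# Schema and data are hypothetical, for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE defects (
        id      INTEGER PRIMARY KEY,
        title   TEXT NOT NULL,
        release TEXT NOT NULL,   -- release in which the bug was found
        status  TEXT NOT NULL    -- e.g. 'open', 'fixed', 'closed'
    )
""")
conn.executemany(
    "INSERT INTO defects (title, release, status) VALUES (?, ?, ?)",
    [
        ("Login fails on Safari", "1.0", "closed"),
        ("Report totals wrong", "1.0", "open"),
        ("Crash on empty input", "2.0", "open"),
    ],
)

# Retrieve still-open issues from a past release with a single query.
rows = conn.execute(
    "SELECT title FROM defects WHERE release = '1.0' AND status = 'open'"
).fetchall()
print([r[0] for r in rows])  # -> ['Report totals wrong']
```

Real tools such as Bugzilla or Jira sit on exactly this kind of queryable store, which is also what makes access control and reporting possible.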
T4. How many bug tracking systems do you use during a regular test project?

Response percent Response count
1 69.60% 48
Combination of tool and Excel and/or email 14.50% 10
2 11.60% 8
More than 2 4.30% 3

Result analysis: The problem of multiple bug tracking tools is common. In this survey, almost 30% of teams use more than one bug tracking tool. Most problematic is that almost 15% use a tool in combination with Excel and/or email. I see this often in my consulting work, and it always causes headaches.

One team will not use another team's tool; developers have a work management tool and refuse to use the bug tracking tool, so the test team has to use two tools; a remote team may not be allowed access to the internal tool, so all their bugs get communicated in Excel and email. This is a management problem, but it also leads to a more insidious problem: it gives the impression that testing is disorganized.
T5. How do you communicate and manage test cases?

Response percent Response count
A relational database/repository focused on test case management (TCM, Silk Central, Rational Test Manager, TA, etc) 41.50% 27
Excel 21.50% 14
ALM tool that includes test case management 20% 13
Word 15.40% 10
We do not track, communicate or manage test cases 1.50% 1

Result analysis: The problem here is that almost 37% of teams are using Microsoft Word or Excel; that is dead data. It is difficult to share, edit, maintain, sort, query, and measure with these programs.

There are so many good test case management tools, some of them open source, that make writing, editing, maintaining, sharing, and measuring test cases so much easier. In my experience, there are very few good reasons not to migrate to an easier and more sophisticated tool set.

There are also easy solutions that link test cases and bug tracking in the same tool. With such tool sets, test teams can graduate to a higher level of management, reporting, and efficiency.
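Under the hood, linking test cases to bugs usually comes down to a simple relationship between two record types. The sketch below, with a hypothetical schema (no specific product's data model is implied), shows how that link immediately enables the kind of reporting question a spreadsheet cannot easily answer: which test cases currently have bugs filed against them?

```python
# Sketch: linking test cases and bugs via a foreign key.
# Schema and data are hypothetical, for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE test_cases (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE bugs (
        id           INTEGER PRIMARY KEY,
        summary      TEXT NOT NULL,
        test_case_id INTEGER REFERENCES test_cases(id)
    );
""")
conn.execute("INSERT INTO test_cases VALUES (1, 'Checkout with coupon')")
conn.execute("INSERT INTO test_cases VALUES (2, 'Search by keyword')")
conn.execute("INSERT INTO bugs VALUES (10, 'Coupon not applied', 1)")

# Report: which test cases have bugs filed against them?
linked = conn.execute("""
    SELECT t.name, b.summary
    FROM test_cases t
    JOIN bugs b ON b.test_case_id = t.id
""").fetchall()
print(linked)  # -> [('Checkout with coupon', 'Coupon not applied')]
```

Integrated tool suites expose this same join as built-in traceability reports, which is the "higher level of management and reporting" referred to above.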

T6. How are the test cases used? (Choose the MOST appropriate.)

Response percent Response count
They are used only for testers to execute tests 34.30% 24
They are used to measure and assess test coverage 30% 21
They are used to assess proper test execution 21.40% 15
They are used to measure and manage project progress 14.30% 10
They are not used during the project 0% 0

Result analysis: For teams that use test cases only for execution, it may be useful to know that other uses, such as measuring coverage and tracking project progress, are very common.

T7. If you use a test case management tool, how is it used?

Response percent Response count
It is used to run our manual and automated tests 57.40% 31
It is used only for manual tests 40.70% 22
It is used to run only automated tests 1.90% 1

T8. If you have experience with success or failure regarding test tool use (ALM, bug tracking, test case management, automation tool interface, other) that is interesting or helpful to others, please write it below: (Comments from practitioners)

  1. “The best thing to do is manage the progress of the tests and see the bugs. You can measure the project’s health.”
  2. “I find Bugzilla reporting and commenting adequate communication most of the time. Its only problem is when immediate problems surface – at that point an email to appropriate parties telling them to look at Bugzilla usually works. So does walking over to the developer and showing them the issue.”
  3. “So far Jira was the best bug tracking tool.”
  4. “If you want people to use a TCM or bug management tool, make sure it has good performance and it’s simple.”
  5. “For a large project or program it is crucial to select a single method of tracking defects and what is considered defects versus ‘issues.’ This can lead to a great deal of confusion where defects identified as issues are not handled and addressed properly. I worked on a large project that various efforts had four different ways of tracking defects and issues. The result was that it was hard to assess the overall quality of the product that was being implemented.”
  6. “Testing should be driven by proven testing methodologies; not by the tool itself.”
  7. “Generating quality reports can be difficult using bug tracking systems.”
  8. “Certain automation tools will not be suitable for certain types of projects.”
  9. “Test case management tools are not integrated with requirements management tools, which is why our test cases are sometimes tested against obsolete functionality.”
  10. “Rally is very useful.”
  11. “Process discipline matters more than any tool.”
  12. “The tool is difficult to use for non-technical team members.”
Michael Hackett
Michael is a co-founder of LogiGear Corporation, and has over two decades of experience in software engineering in banking, securities, healthcare and consumer electronics. Michael is a Certified Scrum Master and has co-authored two books on software testing: Testing Applications on the Web: Test Planning for Mobile and Internet-Based Systems (Wiley, 2nd ed. 2003) and Global Software Test Automation (Happy About Publishing, 2006). He is a founding member of the Board of Advisors at the University of California Berkeley Extension and has taught for the Certificate in Software Quality Engineering and Management at the University of California Santa Cruz Extension. As a member of IEEE, his training courses have brought Silicon Valley testing expertise to over 16 countries. Michael holds a Bachelor of Science in Engineering from Carnegie Mellon University.
