Monday, May 10, 2010

Prioritizing and automating tests

After choosing an automation platform and methodology, one of the greatest challenges QA/automation teams face is deciding what to automate and defining test priorities for automation. At Delver we initially asked QFOs for lists of Test Cases (TCs) to automate. The QFOs were supposed to compile the lists and send them to the Automation Lead, who would review the TCs and allocate them to Automation Engineers (AEs) to automate. To monitor this process, I used to measure the number of automated tests versus the total TCs in the system. The amazing thing about this procedure was that it managed to frustrate everyone equally:
  • The QFOs felt that hunting down TCs in the system for the automation team was a time-consuming burden. It was a recurring task, and the AEs often rejected the selected TCs for low automation ROI.
  • The QFOs also discovered that the regression tests needed for each version were not the ones being automated. When the automation team was assigned TCs to automate, the cases were not correlated to any specific version, just arbitrarily selected from the TC pool.
  • The automation engineers did not manage to increase coverage rates. When comparing the automated cases to the TC pool, despite all the efforts, the impact was slim: coverage was rarely over 35%.
  • We also found that of all the teams in Delver R&D, the QA automation team was the only one that did not feel part of the release cycle. Their output was not correlated to any specific version or content, and they slowly became "those dudes" who came in the morning, wrote a bunch of "automation things", and went home in the evening.
  • I became frustrated myself, seeing my talented automation engineers slowly drifting away from the rest of the QA team, doing "their thing" without being able to relate to what everyone else was doing. Their sense of belonging was slowly waning.
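To make the measurement problem concrete, here is a minimal sketch of the old pool-wide metric. The record structure, field names, and numbers are illustrative assumptions of mine, not Delver's actual schema:

```python
# Sketch of the old coverage metric: automated TCs measured against the
# entire TC pool, regardless of which version the tests belong to.
# Field names ("automated") and the sample numbers are illustrative only.

def pool_coverage(test_cases):
    """Return the fraction of the whole TC pool that is automated."""
    if not test_cases:
        return 0.0
    automated = sum(1 for tc in test_cases if tc["automated"])
    return automated / len(test_cases)

# With a large, ever-growing pool, the ratio stays stubbornly low:
pool = [{"id": i, "automated": i < 300} for i in range(1000)]
print(f"{pool_coverage(pool):.0%}")  # 30% -- under the 35% ceiling we kept hitting
```

Because the denominator grows with every new feature, even a productive automation team looks like it is standing still under this metric.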
So we changed everything. Instead of going after (and annoying the hell out of) the QFOs and demanding lists of TCs, we decided to automate the needed regression tests based on the known content of the upcoming version. At Delver we release a new version every two weeks, so the AEs typically had two weeks to automate the TCs needed for the upcoming version. The list of tests for each version is compiled in advance anyway (we call these lists Test Runs: ad hoc TCs compiled specifically for each version based on its contents), so all we had to do was take the most important TCs from the Test Run and start automating them, moving on to lower-priority TCs once the higher-priority ones were done. This left us in a more favorable position:
  • We no longer need the QFOs to set automation priorities. We know what to automate: the tests we're about to run in two weeks.
  • The QFOs immediately felt the relief. Since we start automating the next version ahead of time, by the time testing starts they have very few regression tests to run manually.
  • The automation team is now measured differently. Instead of measuring automated TCs versus the whole pool of available tests, they are measured by the percentage of automated tests in each version. The numbers are significantly higher, reaching ~70% automation of regression tests.
  • The Automation Engineers now feel part of the group, because their schedules are directly synchronized with everyone else's and their work affects the team within a week or two, not in some obscure future.
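The version-driven flow above can be sketched in a few lines: order the upcoming version's Test Run by priority so AEs automate the most important TCs first, and measure coverage against that run alone. Again, the field names, priority scheme, and sample TC ids are my own illustrative assumptions:

```python
# Sketch of the version-driven flow: automate the upcoming version's Test Run
# in priority order, and measure coverage against that run, not the whole pool.
# Field names, priority values, and TC ids are illustrative assumptions.

def automation_queue(test_run):
    """Order the version's not-yet-automated TCs so AEs take top priority first."""
    return sorted((tc for tc in test_run if not tc["automated"]),
                  key=lambda tc: tc["priority"])  # lower number = higher priority

def version_coverage(test_run):
    """Fraction of this version's regression tests that are automated."""
    if not test_run:
        return 0.0
    return sum(tc["automated"] for tc in test_run) / len(test_run)

run = [
    {"id": "TC-101", "priority": 1, "automated": True},
    {"id": "TC-102", "priority": 1, "automated": False},
    {"id": "TC-205", "priority": 2, "automated": True},
    {"id": "TC-310", "priority": 3, "automated": True},
]
print([tc["id"] for tc in automation_queue(run)])  # ['TC-102'] -- next to automate
print(f"{version_coverage(run):.0%}")              # 75% -- measured per version
```

The key design choice is the denominator: dividing by the version's Test Run instead of the full TC pool is what turns a demoralizing ~35% into a meaningful ~70%.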
Managing a team of developers in the QA group is challenging. The automation engineers' work is dramatically different from classic testing work, and left unchecked, there is a risk they'll drift away from the rest of the group. By finding a method in which the automation efforts directly affect our day-to-day testing, we managed to bring them back into the team and dramatically increase their impact on our daily work.
