Introducing Test Automation for a team
Hello all,
A team I'm working with would like to start doing automation, but they are limited in the amount they can do by an upcoming release date. One of the testers on the team is getting training from another team that is already doing automation. The hope is that he'll learn, practice and then start passing on the knowledge.
There is the possibility to do some work on automation before the release and I am trying to think of the best way to do it.
I was thinking of the following and would like some feedback:
* Have the PO identify tests of highest business value
* Have the development/test team identify tests of highest technical value
* Add them to the backlog as PBIs
* Have them discussed as normal during the refinement session
* Plan during the sprint planning, following normal process from here
* In the retro, update the Definition of Done to say that test automation must be passing, or maybe that automated test coverage should increase by 5% per sprint. (Opinions on this would be welcome too)
The other approach would be to start automating tests for the upcoming stories, but I'm somewhat afraid they might over-test, as they are not used to having automation. I'd like the opportunity to help them understand that selecting automation tests at too low a level may lead to over-testing. Additionally, the team would not be able to automate tests for each of the stories, making it harder to set a common DoD.
Many thanks,
Úna
As an automation test developer in my previous career, the best experience I had was when the entire development team discussed, during the sprint planning meeting, whether a story needed to be automation tested. If it did, the task was added as part of the story's acceptance criteria.
This not only gave the team insights into the value of automation testing (possibly as part of a continuous integration testing repository on Jenkins, for example), but also added substantial gravitas to the role of test-focused engineers on the team.
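For what it's worth, the tests behind such a criterion can be quite small. Here is a minimal sketch, assuming pytest as the framework; the business rule and function names are invented for illustration, but the idea is that the acceptance criterion written into the story maps directly onto a named automated test that the CI job (Jenkins or otherwise) runs on every commit.

```python
# Illustrative code under test; in a real project this would be the team's own module.
def apply_discount(total, customer_is_member):
    """Members get 10% off orders over 100 (made-up business rule for the example)."""
    if customer_is_member and total > 100:
        return round(total * 0.9, 2)
    return total


# Acceptance criterion from the story: "A member ordering over 100 pays 10% less."
def test_member_discount_applied_over_threshold():
    assert apply_discount(150.00, customer_is_member=True) == 135.00


# Acceptance criterion from the story: "Non-members pay the full amount."
def test_no_discount_for_non_members():
    assert apply_discount(150.00, customer_is_member=False) == 150.00
```

Because each test carries the criterion's name, a failing build tells the team exactly which agreed behaviour has regressed.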
It's not a good idea to add tests to a Product Backlog. They are assertions of product quality, not elements of product scope.
It's reasonable, however, to include them in a Sprint Backlog as part of the team's plan of work for that Sprint. The amount of work that is taken on from the Product Backlog may need to be reduced accordingly. Fewer stories will be completed, but the quality of the product will improve. It's also reasonable to assert a minimum level of test coverage in the Definition of Done, increasing the level sprint by sprint.
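As a rough sketch of how that rising bar could be checked by the build itself (assuming a Python stack with coverage.py; the threshold figure is invented and would be raised each sprint), a small gate script in the CI job might look like this:

```python
# Minimal CI gate: fail the build when total coverage falls below the sprint's
# agreed minimum. Assumes coverage.py has already produced a .coverage data
# file, e.g. via "coverage run -m pytest".
import sys
import coverage

SPRINT_MINIMUM_COVERAGE = 40.0  # illustrative figure; bump it sprint by sprint

cov = coverage.Coverage()
cov.load()              # read the .coverage data file from the test run
percent = cov.report()  # prints the report and returns the total coverage %

if percent < SPRINT_MINIMUM_COVERAGE:
    print(f"Coverage {percent:.1f}% is below the DoD minimum of {SPRINT_MINIMUM_COVERAGE}%")
    sys.exit(1)
```

pytest-cov's `--cov-fail-under` option (or `coverage report --fail-under`) achieves the same thing in a single flag; the point is only that the DoD's minimum is something the pipeline enforces rather than a manual check.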
Hi all,
My main challenge is to cover the main flows which were not covered with automated tests over the last few months and will not be touched again for a while (the product has not been released to a customer but has had internal releases). I hope to get to the point where the most valuable tests are in place and the team can focus on the current sprint, adding tests as described.
When you say that the automation tests can be added to the Sprint Backlog, do they have to be related to the current sprint work or can they be items unrelated to the Sprint Goal?
I am unsure how to track the most valuable flows so that they are not forgotten. The team sees some of the areas they have worked on before as more valuable to automate than what is in the upcoming sprints. As such, they would like to automate those first and then get into the flow of bringing automation into the Sprint.
Many thanks,
Úna
> When you say that the automation tests can be added to the Sprint Backlog,
> do they have to be related to the current sprint work or can they be items
> unrelated to the Sprint Goal?
They don't necessarily have to be related to the Sprint Goal, as long as they are related to the Definition of Done.
The team should plan its work into the Sprint Backlog to meet the Goal, in accordance with the quality standards of the DoD.
My advice is for the team to track and prioritize these missing tests in some sort of technical debt register.
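That register can be very lightweight. A minimal sketch, assuming Python is convenient and with the flow names and scores invented for illustration, is just a prioritised list kept under version control and reviewed at refinement:

```python
# A lightweight technical debt register for flows not yet covered by automation.
# The entries and scores are placeholders; the point is to keep the gaps visible
# and ordered by value until they are planned into a Sprint Backlog.
from dataclasses import dataclass

@dataclass
class MissingTest:
    flow: str            # the business flow not yet covered by automation
    business_value: int  # 1 (low) to 5 (high), agreed with the PO
    technical_risk: int  # 1 (low) to 5 (high), agreed with the developers

register = [
    MissingTest("customer sign-up", business_value=5, technical_risk=3),
    MissingTest("invoice export", business_value=4, technical_risk=4),
    MissingTest("password reset", business_value=3, technical_risk=2),
]

# Highest combined score first: the next candidates to pull into a Sprint.
for item in sorted(register, key=lambda t: t.business_value + t.technical_risk, reverse=True):
    print(f"{item.flow}: value={item.business_value}, risk={item.technical_risk}")
```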
Hi Ian,
Many thanks, this gives us a nice way to organise without polluting the backlog.
@Ian, I understand what you mean about not adding tests to the Product Backlog.
I'm just curious - if the team uses an electronic system (Jira/TFS) to manage their backlog items, how would you suggest including this work in the Sprint Backlog without adding anything to the Product Backlog, while of course keeping all their work transparent?
I don't recommend using tools that force a certain way of working onto a team.
In some cases it might be possible to muddle through with a workaround, such as by having a zero-estimate placeholder in a Product Backlog against which Sprint Backlog tasks can be planned.