Best Practices for Agile and Outsourced QA Testing


Outsourcing was one of the most powerful technology trends of the 90s and 2000s.  Its promise was clear: knowledge work done by knowledgeable people at a lower cost than doing the work in the United States.  Many larger companies got the message at the same time and outsourced en masse.  In 2003, McKinsey reported in its article Offshoring: Is it a win-win game?:

Offshoring will allow the US to capture economic value through multiple channels: Reduced costs, new revenues, repatriated earnings, and redeployed labor

Meanwhile, at around the same time, Agile and Scrum were gaining steam, albeit starting from smaller firms and working up the value chain.  This brings us to today where organizations want to become more Agile but still continue to work in a geographically distributed environment.

While Agile purists tend to be against outsourced QA testing, it's a stark reality for many companies.  The decision to outsource may have long predated the decision to transition to Agile.  Given this constraint, it's much more productive to talk about how some companies are getting the best of both worlds.  Here are three best practices for getting the most from your Agile efforts and your outsourced QA testing.


Improve communication and decrease hand-offs

Use video technology to bring the distributed team together.  While email is perfectly fine for filling in gaps and providing additional detail, nothing can replace face-to-face interaction.  Using tools like Google Hangouts or Skype, you can approximate the high-bandwidth communication you get with co-located teams.

A headache associated with geographical distribution is scheduling meetings.  One solution is to alternate the time of the team’s daily standup every sprint.  One sprint, use a time that is convenient for the onshore team (and let the offshore team suffer the late nights/early mornings).  The next sprint, use a time that is convenient for the offshore team (and let the onshore team suffer the late nights/early mornings).

Julian Clayton, VP of Product at FieldLens, a communications platform for the construction industry, remarks, “do not underestimate the need for an internal QA lead. Even with only moderately complex products it can be nearly impossible for an external lead to keep up with the day to day changes in the fast pace of an agile environment.”


Create a clear separation of testing concerns

A challenge with marrying Agile practices and offshore arrangements is that when a single deliverable needs to be developed (onshore) and tested (offshore), you are adding another hand-off before you can deliver a “potentially shippable” product.  With every issue, you add 24 hours of delay.  No good.

The alternative is to create a clear separation of testing concerns.  Engage onshore developers for verification testing via automated unit tests (using an xUnit framework).  Engage offshore testers for validation testing (e.g., performance, integration, and exploratory testing).  This eliminates the wait for initial verification testing while still ensuring that the more robust validation testing happens, handled by your offshore testing experts.
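For illustration only, here is one way to make that split explicit in the test suite itself.  This sketch assumes a Java code base using JUnit 5 (the article only says "an xUnit framework"), and the class and tag names are made up: fast verification tests carry one tag and run on every check-in, while slower validation tests carry another and are owned by the offshore suite.

```java
// Illustrative sketch: separating verification from validation with JUnit 5 tags.
// Invoice is a hypothetical domain class, defined below so the example is self-contained.
import org.junit.jupiter.api.Tag;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

class InvoiceTest {

    @Test
    @Tag("verification")   // fast unit test: run by onshore developers on every check-in
    void addsLineItemTotals() {
        Invoice invoice = new Invoice();
        invoice.addLineItem(40.0);
        invoice.addLineItem(60.0);
        assertEquals(100.0, invoice.total(), 0.001);
    }

    @Test
    @Tag("validation")     // slower end-to-end check: owned by the offshore testing suite
    void totalSurvivesSaveAndReload() {
        // exercise the real persistence layer, external services, etc. here
    }
}

// Minimal hypothetical domain class so the sketch compiles on its own.
class Invoice {
    private double total;
    void addLineItem(double amount) { total += amount; }
    double total() { return total; }
}
```

With Maven's Surefire plugin, for example, the check-in build can run only the tagged verification tests (mvn test -Dgroups=verification) while the offshore team drives the validation group on its own cadence.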


Enforce modern engineering practices

Companies tend to focus on process-based solutions, yet there are many technical practices that enable you to get the most from your offshore partnership.  Two crucial engineering practices are continuous integration and automated unit testing.

Continuous integration is a practice by which code is integrated and a new integrated build is created upon each check-in.  This ensures the system is always stable and any integration-related issues are immediately visible.  Tools such as Jenkins, Bamboo, CircleCI, and Team Foundation Server can help with continuous integration.
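As a rough sketch (not a drop-in config), a minimal Jenkins declarative pipeline for a Maven-based Java project might look like the following; the stage names, commands, and polling trigger are illustrative assumptions, not from the article.

```groovy
// Illustrative Jenkinsfile: build and run the unit tests on every check-in,
// then publish the results where both onshore and offshore teams can see them.
pipeline {
    agent any
    triggers {
        pollSCM('H/5 * * * *')   // poll for new check-ins; an SCM webhook is usually preferable
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B clean compile'
            }
        }
        stage('Unit tests') {
            steps {
                sh 'mvn -B test'
            }
        }
    }
    post {
        always {
            junit 'target/surefire-reports/*.xml'   // publish test results with each build
        }
    }
}
```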

Automated unit testing is a practice whereby developers write code to test the code they have written.  This creates a regression suite that grows along with the code base.  When you combine it with continuous integration, you create an extremely robust “always on” quality framework that runs and reruns every automated unit test every time any code is checked in.

If you want to enable the best practice of “follow the sun” workflow, continuous integration and automated unit testing are must-haves.


Doing more with less

Agile and outsourcing are not as diametrically opposed as they first appear.  Both practices reflect companies’ attempts to do more with less.  While outsourcing is rooted in predictive planning and Agile is rooted in adaptive planning, there are still ways to make them work better together.

Until Next Time,
Stay Agile My Friends


How Agile Testing Works


Unlike traditional waterfall models, where testing was treated as a separate phase with siloed teams and heavyweight processes, agile principles prescribe testing as a practice that evolves with, and involves, every member of a cross-functional development team throughout the life of the project.

Testers bring specialized testing expertise to the team and interact constantly with other team members to ensure the business value the customer wants is delivered early and at frequent intervals, at a sustainable pace. Agile roles largely overlap, so early and continuous testing is the key to maintaining high quality.

Agile software development treats testing as an integral part of development and recommends a whole-team approach to embedding quality into the product.

Both coding and testing are done iteratively and incrementally, building business value until the customer fully accepts the result. Feedback and collaboration between developers and testers is therefore the key to making Agile testing successful.

Agile principles replace the need for a dedicated test manager with short feedback loops between the product owner and team members. Agile testing also encourages developers to contribute to testing, since agile team members are generalizing specialists.

Testers also help developers design and write unit test cases. Other team members may give feedback using a “show and tell” technique before the code is checked in to the repository. In short, developers are encouraged to think more like testers, continually checking their own code for potential errors, and testers are encouraged to think more like developers, tempering their natural destructive tendencies to engage more fully in the creative process itself.

The Definition of Done, also known as the exit criteria, is a checklist we go over before we can call a story “done.” The acceptance criteria of user stories may be refined to tie each story to a testable outcome.

Extreme Programming (XP) provides a set of engineering best practices for building quality into Agile development.  Test-Driven Development (TDD) is a test-first model: start development with a unit test, make it fail, make it pass, and then refactor gradually. These three stages are commonly called red, green, and refactor. TDD can also be extended to acceptance testing.
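As a quick, hypothetical illustration of that loop in JUnit (none of these names come from a real project): the test below is written first and fails because PriceCalculator does not exist yet (red); the simplest implementation makes it pass (green); the code is then cleaned up without changing behavior (refactor).

```java
// Red: write the test first; it fails until PriceCalculator exists and implements the rule.
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

class PriceCalculatorTest {

    @Test
    void appliesTenPercentDiscountFromOneHundredUp() {
        PriceCalculator calculator = new PriceCalculator();
        assertEquals(90.0, calculator.discountedPrice(100.0), 0.001);
        assertEquals(50.0, calculator.discountedPrice(50.0), 0.001);
    }
}

// Green: the simplest implementation that makes the test pass; refactor from here.
class PriceCalculator {
    double discountedPrice(double price) {
        return price >= 100.0 ? price * 0.9 : price;
    }
}
```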

Pair programming is another such practice, in which a tester can pair up with a developer to test and give feedback on a piece of code.

Agile testing strongly recommends test automation at every level. Writing automated acceptance tests that exercise business logic through APIs, hooks, the business-logic layer, and the GUI is called Acceptance Test-Driven Development (ATDD). These tests should run as part of the continuous integration platform. Automating the test cases makes it possible to run a quick regression pass at the end of every sprint to make sure nothing built earlier has broken. It is also advisable to check the automated tests into the repository for future reference and modification.
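As a hedged sketch of what that looks like with Cucumber (the feature, step, and class names below are invented for illustration), the acceptance criterion is written in plain language, and the step definitions wire it to the business-logic layer rather than the GUI.

```gherkin
Feature: Checkout total

  Scenario: Total of a multi-item cart
    Given a cart containing 3 items priced at 20.0 each
    When the customer checks out
    Then the order total is 60.0
```

```java
// Illustrative Cucumber-JVM step definitions that drive the business-logic layer directly.
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import static org.junit.jupiter.api.Assertions.assertEquals;

public class CheckoutSteps {

    private final ShoppingCart cart = new ShoppingCart();  // hypothetical domain class (below)
    private double total;

    @Given("a cart containing {int} items priced at {double} each")
    public void aCartContainingItems(int count, double price) {
        for (int i = 0; i < count; i++) {
            cart.addItem(price);
        }
    }

    @When("the customer checks out")
    public void theCustomerChecksOut() {
        total = cart.checkoutTotal();
    }

    @Then("the order total is {double}")
    public void theOrderTotalIs(double expected) {
        assertEquals(expected, total, 0.001);
    }
}

// Minimal hypothetical domain class so the sketch is self-contained.
class ShoppingCart {
    private double total;
    void addItem(double price) { total += price; }
    double checkoutTotal() { return total; }
}
```

Run as part of the continuous integration build, scenarios like this become the executable regression suite described above.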

Exploratory testing is another facet of Agile testing; it tests at a broader level, uncovering implicit expectations without relying on test documents. It brings out critical and innovative thinking and diverse ideas that make for better software products.

Various tools used for agile testing include:

JUnit, Native Groovy, JMeter, Sonar, Selenium, DbUnit, Cucumber, Crucible, Jenkins CI, Hudson CI


How to manage detailed test plans with Agile


Question:  I have a new scrum team struggling with whether they should be writing more detailed testing scenarios/test plans. The acceptance criteria aren’t broad enough to capture all the different testing scenarios that should be tested before migrating code to production (e.g., regression testing, detailed scenarios, etc.). How do other teams handle/manage these more detailed testing documents?

================================================>

Hi Jayne,

“How do other teams handle/manage these more detailed testing documents?”

Often, we spend a lot of time thinking about testing and testing documentation when we would be better served thinking about QUALITY.

Quality is not something that can be easily added at the end of a development cycle. It is better added as part of the development cycle itself.

Most projects’ testing hierarchy looks like this: mostly manual testing, with some integration testing.

The un-automated testing triangle

You can remove a lot of the overhead associated with needing to create detailed test plan documentation by engaging in aggressive Test Driven Development/Unit Tests and Automated Acceptance Test Driven Development.

When projects save their testing to the end of the development cycle (dev->test), it has a couple of negative side effects:
1. It tends to give developers the mindset that they are not responsible for testing.
2. It defers the quality risk to the end of the development cycle, where it costs the most to fix.

Test Driven Development/Unit Testing forces your developers to think about testability up front, in terms of code as well as design, and makes them responsible for ensuring that the individual units/functions/methods are verified as working as part of the build.

Acceptance Test Driven Development, with a tool such as Cucumber, is a wonderful way to test the software in aggregate, automatically. Automated ATDD tests can be written by ANYBODY (PO, QA, devs, SM, stakeholders, etc.), and the automation of the acceptance test criteria can be decoupled from the code, meaning they can be written in parallel.
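As a sketch of that decoupling (the feature and values below are made up for illustration), a product owner or tester can draft a scenario outline in plain business language before any glue code exists; one outline like this replaces pages of rote “detailed scenario” rows in a test plan.

```gherkin
# Hypothetical example: acceptance criteria captured as an executable scenario outline.
Feature: Password reset

  Scenario Outline: Reset link expiry
    Given a password reset link that was issued <hours_ago> hours ago
    When the user opens the link
    Then the link is <outcome>

    Examples:
      | hours_ago | outcome  |
      | 1         | accepted |
      | 23        | accepted |
      | 25        | rejected |
```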

As a result, an Agile testing hierarchy looks more like this:

The Automated Testing Triangle

With unit tests as a foundation and manual testing at the very top.

If you go down the road of automating unit tests and automating acceptance tests in parallel with the code being developed, you will gain a few things:

  • Quality as part of the process, as part of the design, as a shared responsibility
  • Quality not saved as a step near the end of the development cycle
  • Less of a need for rote test plans, as most of the rote stuff will be automated, meaning that your human testers can do the kind of testing humans are good at: exploratory testing.
  • Less of a need for a large manual regression cycle, since every build and every change can be automatically regressed at both the unit level and the acceptance level multiple times a day.

If you move toward unit test->acceptance test automation then the documentation becomes a lot more lightweight and less of a chore to “manage”.
