Agile Software Testing in a Large-Scale Project


Review: Agile Software Testing in a Large-Scale Project

Talha Majeed COMP 587 Spring 2011

2

Agenda

- Definitions
- Introduction
- Adopting XP and diverging from XP
- Guidelines in four development areas
  - Test design and execution
  - Working with professional testers
  - Activity planning
  - Defect management
- Summary
- Conclusion

3

Definitions

- Agile testing is a software testing practice that follows the principles of the agile manifesto.
- In agile testing, testing is not a separate phase; it is integrated into development.
- QA is not the last line of defense; it is involved from the beginning of the project.

Agile testing focuses on testing iteratively, as often as a stable code base is available, until quality is achieved from the end customer's perspective.

4

Definitions

- Extreme Programming (XP) is a software development methodology intended to improve software quality and responsiveness to changing customer requirements.
- Frequent releases in short development cycles are intended to improve productivity.

The XP approach is that if a little testing can eliminate a few flaws, a lot of testing can eliminate many more.

Unit tests determine whether a given feature works as intended. A programmer writes as many automated tests as they can think of that might "break" the code; if all tests run successfully, then the coding is complete. Acceptance tests verify that the requirements as understood by the programmers satisfy the customer's actual requirements.
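To make the distinction concrete, here is a minimal sketch of an automated unit test in Python; the split_name function and its test cases are hypothetical and only illustrate the "write tests that might break the code" idea.

```python
import unittest

# Hypothetical function under test: splits a full name into (first, last).
def split_name(full_name):
    first, _, last = full_name.strip().partition(" ")
    return first, last

class SplitNameUnitTests(unittest.TestCase):
    """Unit tests: the programmer tries inputs that might 'break' the code."""

    def test_simple_name(self):
        self.assertEqual(split_name("Ada Lovelace"), ("Ada", "Lovelace"))

    def test_surrounding_whitespace(self):
        self.assertEqual(split_name("  Ada Lovelace  "), ("Ada", "Lovelace"))

    def test_single_word(self):
        # Deliberately awkward input; the expected result is part of the specification.
        self.assertEqual(split_name("Ada"), ("Ada", ""))

if __name__ == "__main__":
    unittest.main()
```

If all such tests pass, coding of that feature is considered complete; acceptance tests then check the same feature against the customer's expectations.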

5

Introduction

- Israel Air Force (IAF) enterprise information system
  - Critical to daily operations and information security
  - Highly complex, with extreme quality requirements
- The project was risky due to its size and scope (XP-based method)
- Risk management
  - Metrics were developed
  - Data were collected and analyzed

6

Project overview

- Data collection over four releases (eight months)
  - Quantitative data: used by the project team for project tracking and decision making
  - Qualitative data: recordings of formal reflection meetings and debriefings

7

Conforming to XP

- The team implemented standard XP practices
  - Short releases: working software every two-week iteration
  - Customer feedback
- Goal: achieve measurable testing benefits
  - Short average time to fix defects
  - Short average defect longevity
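As a rough sketch of how these two metrics could be computed from defect records (the record fields and numbers below are made up for illustration, not taken from the project's tracking data):

```python
from datetime import datetime

# Hypothetical defect records: when the defect was opened and closed,
# plus the net hours spent on the fix itself.
defects = [
    {"opened": datetime(2011, 3, 1, 9, 0), "closed": datetime(2011, 3, 1, 15, 0), "fix_hours": 1.5},
    {"opened": datetime(2011, 3, 2, 10, 0), "closed": datetime(2011, 3, 3, 11, 0), "fix_hours": 0.75},
]

# Average net time to fix a defect (effort spent fixing).
avg_fix_hours = sum(d["fix_hours"] for d in defects) / len(defects)

# Average defect longevity: how long a defect stays open, from detection to closure.
avg_longevity_hours = sum(
    (d["closed"] - d["opened"]).total_seconds() / 3600 for d in defects
) / len(defects)

print(f"Average fix time: {avg_fix_hours:.2f} h")
print(f"Average longevity: {avg_longevity_hours:.2f} h")
```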

8

Conforming to XP

- Planning game practice, used to plan project iterations and releases
  - The entire team met, and the customer described and prioritized stories
- Goal: achieve simple design through refactoring activities
- XP's whole-team and sit-together practices
  - Two measurable results
    - Lower defect-management overhead
    - Fewer false defects

9

Diverging from XP

    The project differed from classic XP in three aspects

First aspect

The team was required to produce detailed specifications in addition to code and tests, due to organizational policy.

Second aspect

The project's acceptance testing (user testing) was far too complex for the customer to fully specify.

Third aspect

The team modified test-driven development to focus on the continuous creation of automated acceptance tests rather than unit tests.

A feature was never considered "complete" before all its tests were written and passed.
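A minimal sketch of what such an automated acceptance test might look like, written before the feature is considered complete; the OrderService class and the story it exercises are hypothetical stand-ins for the project's real features.

```python
class OrderService:
    """Toy stand-in for the feature under test."""

    def __init__(self):
        self._orders = {}
        self._next_id = 1

    def place_order(self, item, quantity):
        order_id = self._next_id
        self._next_id += 1
        self._orders[order_id] = {"item": item, "quantity": quantity, "status": "open"}
        return order_id

    def status(self, order_id):
        return self._orders[order_id]["status"]

def test_customer_can_place_an_order():
    # Acceptance criterion phrased from the customer's point of view:
    # placing an order yields an order ID whose status is "open".
    service = OrderService()
    order_id = service.place_order("spare part", quantity=3)
    assert service.status(order_id) == "open"

if __name__ == "__main__":
    test_customer_can_place_an_order()
    print("Acceptance test passed")
```

Because the test exercises the feature's externally visible behavior, it stays valid as the implementation is refactored, and the feature only counts as complete once tests like this pass.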

10

Test design and execution

- Each team member was responsible for testing
- In a traditional project, "everyone is responsible for quality," but in an agile project everyone actually writes tests
- Advantages
  - First, it eliminated the team's reliance on a single tester
  - Second, developers' test-awareness increased
  - Finally, the team members who wrote a feature's specification and code also wrote the test cases for their own work

11

Test design and execution

- Product size = test size
- Progress was measured by defining the project's product-size metric as the number of regression test steps run at each iteration
- This approach was chosen for two reasons
  - First, test size correlates better with complexity than lines of code or specification do

12

Test design and execution

- Second, only features with full regression testing at each iteration are counted toward the delivered product size

Figure 1 plots the product-size metric over time.
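A small sketch of this metric under the stated rule: only features whose full regression suite ran in the iteration contribute their test steps. Feature names and step counts are invented for illustration.

```python
# Hypothetical per-feature data for one iteration.
features = [
    {"name": "flight scheduling", "test_steps": 120, "full_regression_ran": True},
    {"name": "inventory report",  "test_steps": 45,  "full_regression_ran": True},
    {"name": "new alerts screen", "test_steps": 60,  "full_regression_ran": False},  # not counted
]

# Product size = number of regression test steps run for fully regression-tested features.
product_size = sum(f["test_steps"] for f in features if f["full_regression_ran"])
print(f"Product size this iteration: {product_size} regression test steps")
```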

13

Test design and execution

- Untested work = no work (a major shift in moving to agile development)
- Untested work counts as no work, in all channels and under all conditions

Written questionnaire responses after two releases:
- 90% of team members considered acceptance and system tests "highly important"; 10% ranked them "important"
- 33% were interested in taking a leading role in acceptance testing
- 60% were interested in leading system testing

14

Working with professional testers

  Why professional testers?

- They add value through more testing
- Developers can code more features
- Professional testers pose two key challenges for the organization
  - Testing bottlenecks
  - Coordinating the testing work among testers and programmers

15

Working with professional testers

- Easing bottlenecks: developers simply code less and test more

Figure 2 illustrates this by comparing the product-size metric points attained by tests that the entire team ran with those attained by the team's tester alone.

16

Working with professional testers

- Encourage interaction over isolation
- Initially, testers worked alone for two reasons
  - Testers must be independent of programmers to prevent implementation details from affecting their test designs

  - Specifications were kept up to date, so testers could write tests according to the specification
- The testers were later integrated into the project team

17

Activity Planning

- Planning for quality activities involves time allocation
  - Challenges in feature testing, regression testing, and defect repair

18

Activity Planning

Figure 3. Testing and defect repair: (a) net hours devoted to running tests; (b) net hours devoted to fixing defects and maintaining regression tests.

19

Activity planning

- Integrate feature testing and coding
  - Testing and coding time are usually about equal
- Regression testing: divide and conquer
- Allocate bug-fix time globally
  - Estimating the time to fix an individual bug is difficult
  - The total time required to fix defects in forthcoming iterations can be reasonably predicted
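A sketch of the "allocate bug-fix time globally" idea: instead of estimating each fix, reserve roughly the historical average of total fix effort per iteration. The hours below are invented for illustration.

```python
# Net hours spent fixing defects in recent past iterations (illustrative numbers).
fix_hours_per_past_iteration = [14.0, 11.5, 16.0]
expected_fix_hours = sum(fix_hours_per_past_iteration) / len(fix_hours_per_past_iteration)

# Reserve that amount globally for the next iteration; the rest goes to
# coding and feature/regression testing.
iteration_budget_hours = 300.0
feature_work_hours = iteration_budget_hours - expected_fix_hours

print(f"Reserve about {expected_fix_hours:.1f} h for defect fixes")
print(f"Plan about {feature_work_hours:.1f} h for coding and testing new features")
```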

20

Defect management

- Defect management involves two major challenges
  - Managing workflow
  - Selecting and scheduling the defects to fix
- Use a team-centered defect-management approach
  - Anyone can open a defect
  - Anyone can close a defect, after fixing it
  - Anyone who finds a defect also assigns it to someone to fix it
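A minimal sketch of that team-centered workflow; the Defect record and the helper functions are hypothetical, meant only to show that there is no dedicated gatekeeper role:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Defect:
    title: str
    opened_by: str
    assigned_to: Optional[str] = None
    status: str = "open"

def open_defect(title, opened_by, assigned_to):
    # Whoever finds the defect opens it and decides who should fix it.
    return Defect(title=title, opened_by=opened_by, assigned_to=assigned_to)

def close_defect(defect, fixed_by):
    # Whoever fixes the defect closes it; any team member may do so.
    defect.status = f"closed by {fixed_by}"
    return defect

bug = open_defect("report totals off by one", opened_by="tester A", assigned_to="dev B")
close_defect(bug, fixed_by="dev B")
print(bug)
```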

21

Defect management

- Fix defects as soon as possible
  - The average bug-fix time was slightly over an hour
  - Bug fixes requiring more time (a day or more) should be suspended to prevent bottlenecks and to ensure deadlines are met

- Advantages
  - Defects require far less time to fix
  - Working on a clean, highly stable code base makes new development faster
  - It avoids unpleasant customer negotiations over which defects to solve

22

Defect management

Figure 4. Average net time to fix a defect, per iteration.

23

Summary

- Presented definitions of agile testing, XP, unit tests, acceptance tests, etc.

- Described how the project adopted and diverged from XP
- Identified how agile software testing works on a large-scale project
- Showed graphs of real data from the Israel Air Force enterprise information system

- Described the guidelines in four development areas

24

Conclusion

- The project's data and analysis are useful both to practitioners wishing to establish new agile teams and to researchers
- Even on a large-scale project, the team achieved full regression testing at each iteration and developer testing
- There is a need for quality data on large-scale projects, so further investigation of long-term agile projects is highly encouraged