A Model for Estimating Agile Project Schedule Acceleration

University of Southern California
Center for Systems and Software Engineering
Dan Ingold, USC-CSSE
SSCM/COCOMO Forum
17 October 2012
Project Goals
Research Question: Can we quantify the schedule acceleration to be expected from employing agile techniques, given a range of development project characteristics?
• Goal is not to estimate what a team using poor SE/architectural practices & processes can achieve
• Can always cut corners to reduce schedule… for a while, at least
• Goal is to examine what effects the various characteristics of a project using good practices have on achieving schedule compression
COCOMO for Agile Projects?
• COCOMO II calibrated against larger projects
– Larger projects typically optimized to minimize cost
– Agile projects typically optimized to minimize schedule
• Over-estimates schedule for smaller projects
– Estimated schedule varies with the cube root of effort
– Smaller projects vary with the square root of effort
• Optimizes a 27-PM project as 2.45 persons / 11 months (worked through in the sketch below)
– Minimizes communication overhead, optimizes effort
– But… 11 months is too long under competitive pressure
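A minimal sketch of the arithmetic behind the 27-PM example, assuming the commonly cited nominal COCOMO II schedule coefficient of about 3.67 and approximating the nominal schedule exponent as a cube root; the unit coefficient on the square-root line is an illustrative assumption:

    # COCOMO II-style cube-root schedule scaling vs. the square-root scaling that
    # better fits small agile projects. Coefficients as described above (assumptions).
    effort_pm = 27.0                                # baseline effort, person-months

    cocomo_months = 3.67 * effort_pm ** (1 / 3)     # ~11.0 months
    cocomo_staff = effort_pm / cocomo_months        # ~2.45 persons

    sqrt_months = effort_pm ** 0.5                  # ~5.2 months
    sqrt_staff = effort_pm / sqrt_months            # ~5.2 persons

    print(f"cube root:   {cocomo_months:.1f} mo, {cocomo_staff:.2f} persons")
    print(f"square root: {sqrt_months:.1f} mo, {sqrt_staff:.2f} persons")

The cube-root line reproduces the 2.45-person / 11-month optimum above; the square-root line lands near the 5-month, 5.4-person observations cited on the next slide.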
(Re)introducing CORADMO
• Constructive Rapid Application Development Model
• Observations of early-agile projects completing 27-PM projects in 5 months with 5.4 persons, and even 9 persons to complete in 3 months
• Derivative of COCOMO II, introduced ~2000
– Implemented as COCOMO II / COPSEMO post-processor
– Derived six drivers through initial two-round Delphi
• Lacked critical mass of data to calibrate model
New CORADMO Drivers
• SERC RT-34 tasked to study “expediting SE”
– Identified candidate firms and agencies that were successfully compressing project development time
– Conducted series of onsite visits and in-depth interviews
• Derived expanded set of factors common across these entities, good candidates for new drivers
– Product: describes nature of system to be developed
– Process: characterizes the development methodology
– Project: describes execution of the development effort
– People: characterizes capabilities of development staff
– Risk: describes stakeholder willingness to accept risk
General CORADMO Structure
• CORADMO depends on the existence of a good baseline effort estimate; it does not estimate effort
• Estimated duration D is proportional to the square root of estimated effort PM:

D = (∏ Fi) × √PM

• Like COCOMO, CORADMO uses a product of multiplier factors Fi, rated according to project characteristics (see the sketch below)
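A minimal Python sketch of this relation, using the five driver names from the previous slide; the multiplier values are illustrative assumptions, not calibrated CORADMO ratings:

    # CORADMO duration relation: D = (product of driver multipliers Fi) * sqrt(PM).
    # Driver names follow the preceding slide; the values below are illustrative only.
    from math import prod, sqrt

    def coradmo_duration(effort_pm, factors):
        """Estimated duration (months) from a baseline effort estimate (person-months)."""
        return prod(factors.values()) * sqrt(effort_pm)

    factors = {"Product": 1.0, "Process": 0.9, "Project": 1.0, "People": 0.95, "Risk": 1.0}
    print(coradmo_duration(27.0, factors))   # ~4.4 months for a 27-PM baseline

With every factor at 1.0, a 27-PM baseline comes out at roughly 5.2 months, near the 5-month durations observed earlier; factors below 1.0 compress the schedule, factors above 1.0 stretch it.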
Product Factors
• Simplicity: simple products are easier to develop
• Reuse: reuse saves work (or does it?)
• Deferrals: postpone features to fit schedule
• Modeling: working models vs complete documents
• Maturity: fewer technologies needing development
Process Factors
• Concurrency: serial waterfall / concurrent iteration
• Streamlining: does bureaucracy require everything “just so”?
• Tool support: integrated development, continuous integration, automated testing, model-to-code, etc.
Project Factors
• Staff size: more people ≈ higher communication overhead (are factors large enough for big staff?)
• Collaboration: how well does team share data?
• MMPTs: tool support within and across domains
People Factors
• KSAs: how senior is team? How agile is team?
• Single vs multi-domain: how well do team skills cross domain boundaries? (analogous to MMPTs)
• Compatibility: can’t we all just get along?
Risk Acceptance Factors
• How willing are stakeholders to accept risk?
– Tolerance of chaotic, evolving processes
– “We’ve always done it this way.” “The manual says these are the required processes and artifacts.”
– Adaptive, oriented toward product completion
• Many of the accelerated-development teams reviewed had compliant, risk-tolerant customers
Commercial Calibration
• Midwest software development firm using agile
• Supplemented agile with additional SE processes
– Detailed business process analysis
– Delphi estimates of software testing effort
– Risk-based situation audits
– Componentized architectures
– …
• Makes this firm reasonably comparable to the complex aerospace/defense projects from which the CORADMO factors were derived
Commercial Calibration (cont’d)
• Projects varying from 10 KSLOC to 400 KSLOC
• Varying levels of complexity and technology
• Selected rating factors based on reported characteristics of each project and of the firm as a whole
– Product: C++ projects received Low ratings; HTML/VB projects received Very High ratings
– Process: most projects reported as “highly concurrent,” received Very High ratings
– Project: variation in staff sizes results in different ratings
– People: very senior staff rated as Very High
– Risk: consistent, rigorous, and balanced approach yielded Nominal risk ratings
Calibration Discussion
• Acceptable results for a first cut
– One outlier discarded, described as having high requirements churn
– Tends to over-estimate schedule compression
• Outlier suggests Product may require a sub-factor for requirements churn: perhaps “stability”?
• Process within narrow range
• Project within narrow range
• People factor has single rating
– May not extend to wider range or less capable staff
– But… rapid projects often employ most-senior staff
Decision Support Case Study
• Hypothetical company, composite of real affiliates of USC CSSE
• Illustrates use of the CORADMO tool to support a decision to move to a more agile approach
• Evaluate as-is and to-be conditions, as rated by model sub-factors
• Determine potential schedule compression through adoption of the new strategy (see the sketch below)
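A minimal sketch of that as-is / to-be comparison, assuming each driver is reduced to a single multiplier and drivers combine as a product per the structure slide; the values are illustrative, not the case-study ratings:

    # As-is vs. to-be comparison: schedule compression from changed driver multipliers.
    # Driver values here are illustrative assumptions, not case-study or calibrated numbers.
    from math import prod, sqrt

    def duration(effort_pm, drivers):
        return prod(drivers.values()) * sqrt(effort_pm)

    as_is = {"Product": 1.10, "Process": 1.05, "Project": 1.00, "People": 0.95, "Risk": 1.00}
    to_be = {"Product": 1.10, "Process": 0.90, "Project": 1.00, "People": 0.95, "Risk": 1.00}

    compression = 1 - duration(27.0, to_be) / duration(27.0, as_is)
    print(f"schedule compression: {compression:.1%}")   # ~14% for these example values

Because only the driver product changes between the two scenarios, the compression ratio is independent of the baseline effort estimate.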
Case Study—As-is
• Evaluate current state against factors
• Use to inform decision on change to more rapid development
• Overall acceleration factor of current state = 1.01
Case Study—Initial To-Be
• Produce artifacts more concurrently
• Causes reductions in
– Technology maturity
– SE tool support
– General SE KSAs
– Team compatibility
• Expected 5% schedule improvement
• Saw 4% schedule increase
Case Study—Final To-Be
• Restore reduced factors to baseline, by being aware of the potential problems
• Choose to
– Perform more activities concurrently
– Improve bureaucratic internal and external processes
• Schedule improves by 8%
Case Study Shortcomings
• Case study illustrates some problems with using the factors in the model
• We would have expected a more noticeable change in schedule
• The expected improvements and discovered shortfalls are so small as to be lost in the noise
– Suggests either that the factors are too small
– Or that the method of combining sub-factors (in this example, by averaging them) is incorrect (see the sketch below)
• So, more work to be done…
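A minimal sketch of the averaging concern, using three hypothetical sub-factor multipliers for a single driver; the values are illustrative assumptions, not CORADMO ratings:

    # Why averaging sub-factors damps the effect: three sub-factor multipliers that each
    # imply a 10% schedule reduction on their own. Values are illustrative assumptions.
    from math import prod

    sub_factors = [0.9, 0.9, 0.9]

    averaged = sum(sub_factors) / len(sub_factors)   # 0.900 -> 10% reduction for the driver
    multiplied = prod(sub_factors)                   # 0.729 -> ~27% reduction for the driver

    print(f"averaged:   {averaged:.3f}")
    print(f"multiplied: {multiplied:.3f}")

Averaging keeps the driver multiplier near any single sub-factor rating, so several simultaneous changes barely move the estimate; a multiplicative or weighted combination lets them compound, which is one of the open questions on the next slide.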
Further Issues
• How to handle contribution of sub-factors
– Average, additive, multiplicative, preponderance?
– Offsetting: does a Very Low complement a Very High?
• Factors complete and correct?
• Too many factors (too complex)?
• Accuracy, consistency of how factors would be coded by potential users
• Overall range of factor multipliers, 1.63–0.48 (3.4:1), seems consistent with the reported range of schedule compression in agile projects, but…
Next Steps
• Need additional data on wider range of projects that would be characterized as “rapid” or “agile”
• So, looking for volunteers who would be willing to share project performance data
• Conducting workshop here to discuss factors
– Delphi to uncover omitted (or superfluous) factors
– Effect of contravening factors
– Sufficient range of multiplier factors
Questions?