Development Lifecycle Models


CSC 4900: Advanced Software Project
Fall 2009
Dr. Chuck Lillie
Course Outline

Goals:
– Comprehend quality software development methodologies.
– Apply software engineering principles.
– Construct a quality software product.

Objectives:
– By the end of this course the student should be able to:
  – Recognize and define five major software development lifecycle models.
  – Evaluate a Statement of Work (SOW) and determine project requirements.
  – Create a Work Breakdown Structure (WBS) for a software project.
  – Develop a software requirements analysis document.
  – Design and document a software project.
  – Write and execute a software test plan.
  – Implement a programming project using an established software development process.

Evaluation:
– Weekly Progress Reports: 20%
– Programming Project: 40%
– Team Evaluation: 20%
– Final Exam: 20%
Course Outline

Two Parts
– Software Development Process
  – Problem Definition
  – Requirements Analysis
  – Development Lifecycle
  – Program Management
– Major Programming Project
  – Each Team will Select Project
  – Project Definition and Planning
  – Implementation
  – Status Reports
  – Testing
  – Project Presentation
Major Programming Project

Weekly Progress Reports
– Every Wednesday
– Different Team Member

– Statement of Work*
– Requirements Specification*
– Work Breakdown Structure (WBS)*
– Technical Specification Documents (Design Document)*
– Development Schedule and Plan*
– Test Plan*
– Test Report*
– Code
– User's Manual*

* This document will have a peer review report attached
Past Projects

– Budget Planning and Reporting System
  – Move spreadsheets to web
  – Allow update function across all divisions
  – Propagate updates to composite worksheet
– Use web to inform out-of-state students of how to get from point A to point B
  – Include foreign language capability
– Web-based multimedia presentation of entertainment content for children ages 4-8
– Electronic Grade Book for Primary & Secondary School
– Web-based product to manage sale of merchandise
  – Track inventory
  – Determine when to reorder
– Web-based recipe system
– Electronic Message Board
Suggested Project: iPod Touch Application

Write an application for the iPod Touch
– Second Life application
– Other suggestions

What is needed
– Access to Apple desktop computer
  – Needed for iPod development environment
  – Old Main Room 237 and other locations on campus
  – Not in Oxendine
– Knowledge of iPod interfaces (APIs)
– Knowledge of Second Life interfaces (APIs)
– Can Second Life be accessed from iPod?
– On-going projects for last two or three years
iPod Touch Background

Background:
– Link to the Apple iPhone Development page
– Register to be an iPhone developer
– Download the SDK
– Read the getting started documents:
  – Getting started with iPhone
  – Getting started with tools
  – Getting started with data management
Major Decision Point: August 31, 2009

What is needed to write an interface of Second Life to iPod?
– How long will it take?
– What are the resources required?
  – Person hours
  – Tools
  – Knowledge
– What can be accomplished in one semester? Two semesters?

Can this be structured as a phased approach or incremental development project?
– What are the phases or increments?

Any other projects that could support Mass Communications?
Why do Software Engineering


Microsoft Word 1.0 (WinWord)

Directive to team
– Develop best word processor ever
– Within 12 months

Problems
– Unachievable goals
  – Project size of WinWord takes at best 460 days
  – The highest estimate for WinWord was 395 days
  – Actually took 5 years and 660 man-months
– High turnover
  – Four managers in five years
– Took 12 months in stabilization instead of 3 months
Lecture 1
General Development Strategy






– Select/Customize a Process
– Create Best Possible Schedule
– Avoid Classic Mistakes
– Understand Development Fundamentals
– Identify and Manage Risk Factors
– Follow Schedule-Oriented Practices
Software Development Fundamentals – Three Areas

– Management
– Technical
– Quality-Assurance
Management Fundamentals




– Planning
– Estimation and Scheduling
– Tracking
– Measurement
Technical Fundamentals




– Requirements Management
– Design
– Construction
– Software Configuration Management
Quality-Assurance Fundamentals



– Identify Error-Prone Modules
– Design and Conduct Testing
– Technical Reviews
  – Walkthroughs
  – Code Reading
  – Inspections
Software Development Activities
– Problem Definition
– Requirements Analysis
– Implementation Planning
– High-level Design (or Architecture)
– Detailed Design
– Coding and Unit Testing (Debugging)
– Integration and System Testing
– Deployment and Maintenance
– Documentation, Verification, Management
Problem Definition



– A clear statement of the problem
– Defines problem without any reference to solution
– Should be in user's language
  – Not in technical terms
– This is typically in the Statement of Work (SOW)

Failure to define the problem may cause you to solve the wrong problem.
Requirements Analysis





– This is the "what" that is to be solved
– Helps ensure that the user rather than the programmer drives the system's functionality
– Helps avoid arguments
– Minimizes changes to the system after development begins
– Change is inevitable
  – Set up a change-control procedure
  – Use development approaches that accommodate changes

Specifying requirements adequately is a key to project success.
Implementation Planning



– Defines how the product will be implemented
– Establishes development schedule
– Identifies resources required
  – Labor hours and people
  – Other direct costs
  – Estimated budget
– Create Work Breakdown Structure (WBS)

This is the roadmap that will be used throughout the development process. Without a roadmap you don't know where you are going and you won't know that you arrived.
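A WBS with rolled-up estimates can be sketched as a small nested data structure. This is not from the slides; the task names and hour figures are invented for illustration:

```python
# Minimal WBS sketch: each node is (name, hours) for a leaf task,
# or (name, [children]) for a summary task. Hour figures are invented.
def wbs_hours(node):
    """Roll estimated labor hours up from leaf tasks to the project total."""
    name, content = node
    if isinstance(content, list):
        return sum(wbs_hours(child) for child in content)
    return content

project = ("Web-based recipe system", [
    ("Requirements analysis", 80),
    ("Design", [("High-level design", 40), ("Detailed design", 60)]),
    ("Coding and unit testing", 200),
    ("Integration and system testing", 120),
])

print(wbs_hours(project))  # total estimated labor hours: 500
```

Summing the leaves gives the labor-hour input for the budget and schedule estimates mentioned above.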
High-level Design (or Architecture)

– This is how to solve the problem
  – (recall that requirements is the "what" to be solved)
– Defines the overall organization of the system in terms of high-level components and interactions
– All design levels are documented in specification documents that keep track of design decisions
– High-level and detail-level are usually not separated
Detailed Design




– Components are decomposed into lower-level modules
– Precisely defined interactions
– Interfaces are defined
– Constraints are identified

The purpose of the design phase is to specify a particular software system that will meet the stated requirements.
High-Level Design for a Compiler (diagram)

– Lexical Analyzer: source code → tokens
– Parser: tokens → parse tree
– Semantic Analyzer: parse tree → intermediate representation (type checking, using the Symbol Table)
– Intermediate Code Generation and Optimization: intermediate representation → intermediate representation
– Assembly Code Generation: intermediate representation → assembly code
Coding and Unit Testing (Debugging)

– Produces the actual code that will be delivered to the customer
– Typically develop modules that are independently tested

Results in an implemented and tested collection of modules.
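A minimal example of a module with an independent unit test, using Python's standard unittest module; the function under test is hypothetical:

```python
import unittest

def median(values):
    """Module under test (hypothetical): return the median of a non-empty list."""
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    if n % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

class MedianTest(unittest.TestCase):
    """Unit tests exercise the module in isolation, before integration."""
    def test_odd_count(self):
        self.assertEqual(median([3, 1, 2]), 2)
    def test_even_count(self):
        self.assertEqual(median([4, 1, 3, 2]), 2.5)

if __name__ == "__main__":
    unittest.main()
```

Each module carries its own tests like these, so it can be verified before the integration phase combines it with the rest of the system.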
Integration and System Testing



– All the modules that have been developed and tested individually are put together – integrated – and are tested as a whole system
– Integrated and tested progressively (on larger sets of modules)
– Some coding may be necessary to complete the integration

The final stage is "actual" use or alpha testing.
Deployment and Maintenance

– Deployment: defines the physical run-time architecture of the system
  – Set proper system parameters
  – Install applications
– Maintenance: long-term development
  – 60% of total cost of software is maintenance
  – Cost of maintenance:
    – 40% to changes in user's requirements
    – 17% to changes in data formats
    – 12% to emergency fixes
    – 9% to routine debugging
    – 6% to hardware changes
    – 5% to improvements in documentation
    – 4% to improvements in efficiency
    – 7% to other sources
Documentation, Verification, Management

– Common to all the other activities
– Documentation
  – Main result of any activity – each activity produces at least one document
– Verification and Validation
  – Verification: assessment of the internal correctness of the process
  – Validation: how the product responds to the needs of the customer
  – Performed as quality control via reviews, walkthroughs, and inspections
  – Discovery and removal of errors as early as possible
– Management
  – Budget, schedule, resources
Lecture 2
Schedule







– Requirements Analysis: 2/7/2007
– Architectural Design: 2/21/2007
– Detailed Design: 2/21/2007
– Implementation: 4/11/2007
– Testing: 4/20/2007
– Demo: 4/23/2007
– Delivery: 4/27/2007
Development Lifecycle Models









– Waterfall (Pure Waterfall)
– Spiral (Spiral)
– Evolutionary (Evolutionary Prototype)
– Incremental (Staged Delivery)
– Commercial-off-the-shelf (COTS) Integration (COTS)
– Rehost/port
– Reengineering
– Automated Application Generation (AAG)
– Maintenance
Development Lifecycle Models – Features

– Requirements are defined first
– Multiple internal development cycles
– Multiple customer deliveries
– Functional content primary driver
– Process primary driver
Development Lifecycle Models – Selection Criteria

– Does the system have a precedent (i.e., have similar systems been built before)?
– Is the technology understood and stable?
– Are the requirements understood and stable?
– Are suitable COTS products available and ready to be used in end products?
– Is this a large or complex project or product?
– Is the project fully funded at startup?
– Is the project cost or schedule constrained and requirements cannot be reduced?
– Is there a need for engineering subprojects driven by risk identification and mitigation?
– Is the existing system's maintenance cost too high / is there a need to facilitate future system enhancements?
Waterfall

Description
– An orderly sequence of steps from the initial software concept through system testing
– A review at the end of each phase
– Document driven
– Phases are discontinuous
Waterfall

Advantages
– Helps minimize planning overhead
– Works well for projects that are well understood but complex
– Works well when quality requirements dominate cost and schedule
– Works well if you have a technically weak staff

Disadvantages
– Have to fully specify requirements at beginning of project
– Waterfall model isn't flexible
– Generates few visible signs of progress until the very end
– Excessive amount of documentation
Waterfall Model (diagram)

Stakeholders Needs Analysis → System Requirements Analysis → Architecture Design → Detailed Design → Coding and Testing → System Testing
Spiral

Description
– A metamodel that can accommodate any process development model
– A particular model is chosen based on level of risk
– Spiral model is cyclic
– Four stages
  – Objectives identified
  – Alternatives evaluated and risk areas identified
  – Develop and verify next level of product
  – Review and plan for next iteration
Spiral (diagram: http://www.stsc.hill.af.mil/crosstalk/2001/05/boehm.html)
Spiral

Advantages
– Requirements not well understood
– Risks not known
– As costs increase, risks decrease
– Provides management insight

Disadvantages
– Difficult to use for contracted software
– Don't know the outcome at beginning of project
– Only as good at mitigating risk as engineers are at identifying risks
Evolutionary

Description
– Develop system concept as you move through the project
– Develop prototypes including real requirements analysis, real design, and real maintainable code

Advantages
– Manage changing requirements
– Unsure of optimal architecture or algorithms
– Produces steady, visible signs of progress

Disadvantages
– Don't know time required to complete project
– Can become an excuse to do code-and-fix
Evolutionary (diagram)

Initial concept → Design and implement initial prototype → Refine prototype until acceptable → Complete and release prototype
Incremental

Description
– Also known as Staged Delivery
– Deliver software in successive stages throughout the project

Advantages
– Put useful functionality into hands of customer early
– Provides tangible signs of progress

Disadvantages
– Won't work without careful planning
– Determining stage dependencies
Incremental (diagram)

Software concept → Requirements Analysis → Architectural Design, then:
– Stage 1: Detailed design, code, debug, test, and delivery
– Stage 2: Detailed design, code, debug, test, and delivery
– …
– Stage n: Detailed design, code, debug, test, and delivery
Commercial-off-the-shelf (COTS) Integration
– Description
– Advantages
– Disadvantages

Rehost/port
– Description
– Advantages
– Disadvantages

Reengineering
– Description
– Advantages
– Disadvantages

Automated Application Generation (AAG)
– Description
– Advantages
– Disadvantages

Maintenance
– Description
– Advantages
– Disadvantages
Lecture 3
Estimation



– Size Estimation
– Effort Estimation
– Schedule Estimation

Most projects overshoot their estimated schedules by anywhere from 25% to 100%. Without an accurate schedule estimate, there is no foundation for effective planning.
Estimation




– Constructing software is like constructing a house: you can't tell exactly how much it is going to cost until you know exactly what "it" is.
– As with building a house, you can either build your dream house – expense be hanged – or you can build to a budget; if you build to a budget, you have to be very flexible about the product characteristics.
– Whether you build to a budget or not, software development is a process of gradual refinement, so some imprecision is unavoidable. Unlike building a home, in software the only way to refine the product concept and thereby the estimate is to actually build the software.
– Estimates can be refined over the course of a project. Promise your customer that you will provide more refined estimates at each stage.
Estimation Process

– Estimate the size of the product (number of lines of code or function points)
– Estimate the effort (man-months)
  – First need to estimate the size of the program to be built
  – Need accurate size estimates and historical data on past projects
– Estimate the schedule (calendar months)
  – With size and effort estimates, schedule is easy
  – Selling the schedule is HARD

Provide estimates in ranges and periodically refine the ranges to provide increasing precision as the project progresses.
Size Estimation



– Use an algorithmic approach, such as function points, that estimates program size from program features.
– Use size-estimation software that estimates program size from your description of program features (screens, dialogs, files, database tables, etc.).
– Estimate each major piece of the new system as a percentage of the size of a similar piece of an old system; add the pieces to get the total size.

Size estimation should be in terms of lines of code.
Estimation Tips











– Avoid off-the-cuff estimates
– Allow time for the estimate, and plan it
– Use data from previous projects
– Use developer-based estimates
– Estimate by walk-through
– Estimate by categories
– Estimate at a low level of detail
– Don't omit common tasks
– Use software estimation tools
– Use several different estimation techniques, and compare the results
– Change estimation practices as the project progresses
Effort Estimation



– Use estimation software to create an effort estimate directly from the size estimate
– Use your organization's historical data to determine how much effort previous projects of the estimated size have taken
– Use an algorithmic approach such as Barry Boehm's COCOMO model or Putnam and Myers' lifecycle model to convert a lines-of-code estimate into an effort estimate

Effort estimates should be in terms of man-months.
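For reference, the basic COCOMO effort equation has the form Effort = a · (KLOC)^b. This sketch (not from the slides) uses the published coefficients for an "organic" project, i.e. a small team working in a familiar environment; note that it yields a much larger figure than a simple lines-per-month rule of thumb would:

```python
# Basic COCOMO (Boehm): effort in man-months = a * KLOC**b.
# Coefficients below are the published "organic" mode values: a = 2.4, b = 1.05.
def cocomo_effort(kloc, a=2.4, b=1.05):
    """Convert a size estimate in thousands of lines of code to man-months."""
    return a * kloc ** b

print(round(cocomo_effort(65)))  # ~192 man-months for a 65 KLOC project
```

The gap between models like this and rule-of-thumb productivity figures is one reason the slides recommend comparing several estimation techniques.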
Schedule Estimation

– Size estimate: 65,000 lines of code
  – One programmer can produce 1,000 lines of code in one month
  – 65,000 / 1,000 = 65 man-months
– Can get schedule estimate from effort estimate with the following equation:
  Schedule in months = 3.0 × (man-months)^(1/3)
– Example
  – 65 man-months to build project
  – 12 months ≈ 3.0 × 65^(1/3)  (65^(1/3) = 4.0207)
  – 65 man-months / 12 months ≈ 5 or 6 team members

One of the common problems with schedule estimates is that they are usually done so crudely that people pad them to give themselves a margin of error.
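The slide's calculation can be written out as a short sketch:

```python
import math

def schedule_months(man_months):
    """Rule-of-thumb schedule equation from the slide: 3.0 * cube root of effort."""
    return 3.0 * man_months ** (1 / 3)

effort = 65_000 // 1_000          # 65,000 LOC at 1,000 LOC per man-month
months = schedule_months(effort)  # about 12 months
team = effort / months            # about 5.4, i.e. 5 or 6 team members
print(round(months), math.ceil(team))
```

The same two lines give a schedule and team size for any effort estimate, which makes it easy to present the result as a range rather than a single number.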
Lecture 4
Risk Management
Risk Management

– Risk Assessment
  – Risk Identification
  – Risk Analysis
  – Risk Prioritization
– Risk Control
  – Risk Management Planning
  – Risk Resolution
  – Risk Monitoring
Risk Identification

Most Common Schedule Risks
– Feature creep
– Requirements or development gold-plating
– Shortchanged quality
– Overly optimistic schedules
– Inadequate design
– Silver-bullet syndrome
– Research-oriented development
– Weak personnel
– Contractor failure
– Friction between developers and customers
Risk Analysis

For each risk identified, estimate:
– Probability of loss (%)
– Size of loss (weeks or dollars or …)
– Risk exposure (weeks or dollars or …)
Risk Prioritization



– Helps to identify the most important risks
– Plan mitigation
– Assign resources as needed
Risk Control


– Risk management planning
– Risk resolution
  – Avoid the risk
  – Transfer the risk from one part of a system to another
  – Buy information about the risk
  – Eliminate the root cause of the risk
  – Assume the risk
  – Publicize the risk
  – Control the risk
– Risk monitoring
Steps in Risk Management (diagram)

– Risk assessment
  – Risk identification: checklist, decomposition, assumption analysis, decision driver analysis
  – Risk analysis: system dynamics, performance models, cost models, network analysis, decision analysis, quality risk factor analysis
  – Risk prioritization: risk exposure, risk reduction leverage, compound risk reduction
– Risk control
  – Risk management planning: buying information, risk avoidance, risk reduction, risk transfer, development process, risk element planning, risk plan integration
  – Risk resolution: risk mitigation
  – Risk monitoring: risk monitoring and reporting, risk reassessment
Example of risk exposure calculation
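The slide's example table isn't reproduced in the transcript; this sketch shows the standard calculation (risk exposure = probability of loss × size of loss) with invented risks and numbers, sorted highest-exposure-first for prioritization:

```python
# Risk exposure = probability of loss * size of loss (here, in weeks).
# The risks and figures below are invented for illustration.
risks = [
    ("Feature creep",              0.50, 8),
    ("Contractor failure",         0.20, 10),
    ("Overly optimistic schedule", 0.75, 4),
]

exposures = [(name, prob * size) for name, prob, size in risks]
# Risk prioritization: highest exposure first
for name, exposure in sorted(exposures, key=lambda r: r[1], reverse=True):
    print(f"{name}: {exposure:.1f} weeks")
```

Sorting by exposure rather than by probability or size alone is what ties the analysis step to the prioritization step above.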
Lecture 5
Classic Mistakes
Classic Mistakes




– People-related mistakes
– Process-related mistakes
– Product-related mistakes
– Technology-related mistakes
Classic Mistakes – People-Related Mistakes

– Undermined motivation
– Weak personnel
– Uncontrolled problem personnel
– Heroics
– Adding people to a late project
– Noisy, crowded offices
– Friction between developers and customers
– Unrealistic expectations
– Lack of effective project sponsorship
– Lack of stakeholder buy-in
– Politics placed over substance
– Wishful thinking
Classic Mistakes – Process-Related Mistakes

– Overly optimistic schedules
– Insufficient risk management
– Contractor failure
– Insufficient planning
– Abandonment of planning under pressure
– Wasted time during the fuzzy front end
– Shortchanged upstream activities
– Inadequate design
– Shortchanged quality assurance
– Insufficient management controls
– Premature or overly frequent convergence
– Omitting necessary tasks from estimates
– Planning to catch up later
– Code-like-hell programming
Classic Mistakes – Product-Related Mistakes

– Requirements gold-plating
– Feature creep
– Developer gold-plating
– Push-me, pull-me negotiation
– Research-oriented development
Classic Mistakes – Technology-Related Mistakes

– Silver-bullet syndrome
– Overestimated savings from new tools or methods
– Switching tools in the middle of a project
– Lack of automated source code control
Lecture 6
Best Practices
Best Practices

Change Board
– Group that controls changes to software
– Efficacy
  – Potential reduction from nominal schedule: Fair
  – Improvement in progress visibility: Fair
  – Effect on schedule risk: Decreased Risk
  – Chance of first-time success: Very Good
– Major Risk
  – Approving too few or too many changes
Best Practices

Daily Build and Smoke Test
– Product is built every day (compiled, linked, and combined into an executable program) – the product is then tested to see if it "smokes"
– Efficacy
  – Potential reduction from nominal schedule: Good
  – Improvement in progress visibility: Good
  – Effect on schedule risk: Decreased Risk
  – Chance of first-time success: Very Good
– Major Risk
  – Pressure to release interim versions of a product too frequently
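A sketch of what an automated smoke test harness might look like. This is not from the slides; the checks are placeholders for whatever "does the product smoke?" means on a given project:

```python
# Daily-build smoke test sketch: after each build, run a few coarse checks
# and declare the build good or broken. Checks here are placeholders.
def check_starts_up():
    return True   # e.g. launch the product and confirm it reaches the main screen

def check_core_workflow():
    return True   # e.g. exercise one end-to-end scenario

def smoke_test(checks):
    """Run every check; return (passed, names_of_failed_checks)."""
    failures = [c.__name__ for c in checks if not c()]
    return (not failures, failures)

ok, failures = smoke_test([check_starts_up, check_core_workflow])
print("build is good" if ok else f"build smokes: {failures}")
```

Keeping the checks coarse and fast is the point: the smoke test guards the daily build's health, while thorough testing stays in the unit and system test phases.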
Best Practices

Designing for Change
– Identifying likely changes, developing a change plan, and hiding design decisions so that changes do not ripple through a program
– Efficacy
  – Potential reduction from nominal schedule: Fair
  – Improvement in progress visibility: None
  – Effect on schedule risk: Decreased Risk
  – Chance of first-time success: Good
  – Chance of long-term success: Excellent
– Major Risk
  – Overreliance on the use of programming languages to solve design problems rather than on change-oriented design practices
Best Practices

Evolutionary Delivery
– Deliver selected portions of the software earlier than would otherwise be possible
– Efficacy
  – Potential reduction from nominal schedule: Good
  – Improvement in progress visibility: Excellent
  – Effect on schedule risk: Decreased Risk
  – Chance of first-time success: Very Good
  – Chance of long-term success: Excellent
– Major Risk
  – Feature creep, diminished project control, unrealistic schedule and budget expectations, inefficient use of development time by developers
Best Practices

Evolutionary Prototyping
– System developed in increments so that it can readily be modified in response to end-user and customer feedback
– Efficacy
  – Potential reduction from nominal schedule: Excellent
  – Improvement in progress visibility: Excellent
  – Effect on schedule risk: Increased Risk
  – Chance of first-time success: Very Good
  – Chance of long-term success: Excellent
– Major Risk
  – Unrealistic schedule and budget expectations, inefficient use of prototyping time, unrealistic performance expectations, poor design, poor maintainability
Best Practices

Goal Setting
– Use goals to motivate software developers (shorten schedule, decrease risk, maximize visibility)
– Efficacy (one rating per goal: shortened schedule, decreased risk, maximum visibility)
  – Potential reduction from nominal schedule: (Very Good, None, None)
  – Improvement in progress visibility: (None, Good, Excellent)
  – Effect on schedule risk: (Increased Risk, Decreased Risk, Decreased Risk)
  – Chance of first-time success: (Good, Good, Good)
  – Chance of long-term success: (Very Good, Very Good, Very Good)
– Major Risk
  – Significant loss of motivation if goals are changed
Best Practices

Inspections
– Formal technical review
– Efficacy
  – Potential reduction from nominal schedule: Very Good
  – Improvement in progress visibility: Fair
  – Effect on schedule risk: Decreased Risk
  – Chance of first-time success: Good
  – Chance of long-term success: Excellent
– Major Risk
  – None
Best Practices

Joint Application Development (JAD)
– Requirements-definition and user-interface design methodology in which end-users, executives, and developers attend intense off-site meetings to work out a system's details
– Efficacy
  – Potential reduction from nominal schedule: Good
  – Improvement in progress visibility: Fair
  – Effect on schedule risk: Decreased Risk
  – Chance of first-time success: Good
  – Chance of long-term success: Excellent
– Major Risk
  – Unrealistic productivity expectations following the JAD sessions; premature, inaccurate estimates of remaining work following JAD sessions
Backup Slides
Waterfall Model (full system diagram)

– System path: Stakeholders Needs Analysis → System Requirements Analysis → System Architecture Design → System Qualification & Release
– Software path (per Software Item n): Software Requirements Analysis → Software Architectural Design → Software Detailed Design → Software Coding and Testing → Software Integration → Software Qualification Testing → Software/Hardware Component Integration & Qualification
– Hardware path (per Hardware Item n): Hardware Component Requirements → Hardware Design → Hardware Make/Buy Decision → Fabrication/Purchase and Assembly → Hardware Qualification Testing
Schedule-Oriented Practices (diagram)

The set of practices you use on any particular project is drawn from development activities and effective practices (as opposed to ineffective practices), including schedule-oriented practices.
Schedule-Oriented Practices (diagram)

Schedule-oriented practices comprise speed-oriented practices, schedule-risk-oriented practices, and visibility-oriented practices.
Course Outline

– Software development strategy; software development fundamentals (Chapters 1, 2, 4)
– Classic mistakes (Chapter 3)
– Risk management (Chapter 5)
– Core issues; time versus quality (Chapter 6)
– Lifecycle planning: waterfall, spiral, prototype (Chapter 7)
– Estimation and scheduling (Chapters 8, 9)
– Customer-oriented development, motivation, teamwork, team structure (Chapters 10, 11, 12, 13)
– Best Practices
– Midterm Exam (Chapters 1, 2, 4, 5, 7, 8, 9)
– Project
  – SOW, requirements analysis, WBS, technical specifications, test plan
  – Project reports
  – Test results
  – Working demonstration
  – User's manual
– Final Exam
  – Comprehensive