
Project Management Details
• Tracking Project Progress
• Project Estimation
• Project Risk Analysis
• Project Organization
• RUP Project Management Workflow
Chapter 3
Tracking project progress
• Do you understand the customer's problem and needs?
• Can you design a system to solve the customer's problem or satisfy the customer's needs?
• How long will it take you to develop the system?
• How much will it cost to develop the system?
Chapter 3
Project deliverables
• Documents
• Demonstrations of function
• Demonstrations of subsystems
• Demonstrations of accuracy
• Demonstrations of reliability, performance or security
Chapter 3
Milestones and activities
• Activity: takes place over a period of time (see the data-structure sketch after this slide)
• Milestone: completion of an activity -- a particular point in time
• Precursor: event or set of events that must occur in order for an activity to start
• Duration: length of time needed to complete an activity
• Due date: date by which an activity must be completed
Chapter 3
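These definitions map directly onto a small data structure. The sketch below is a minimal illustration only; the class, field names, and the convention of treating a milestone as a zero-duration activity are assumptions for the example, not definitions from the text.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Activity:
    """One project activity (field names are illustrative, not from the text)."""
    name: str
    duration_days: int                    # length of time needed to complete the activity
    precursors: list[str] = field(default_factory=list)  # activities/events that must occur first
    due_date: date | None = None          # date by which the activity must be completed

    def is_milestone(self) -> bool:
        # A milestone marks completion at a particular point in time; a common
        # convention (assumed here) is to model it as a zero-duration activity.
        return self.duration_days == 0

# Example: a design activity that cannot start until the requirements review is complete.
design = Activity("system design", duration_days=15,
                  precursors=["requirements review"], due_date=date(2024, 3, 1))
```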
Slack or float time
Slack time = available time - real time
= latest start time - earliest start time
Chapter 3
Table 3.4. Slack time for project activities.

Activity   Earliest start time   Latest start time   Slack
1.1        1                     13                  12
1.2        1                     1                   0
1.3        16                    16                  0
1.4        26                    26                  0
2.1        36                    36                  0
2.2        51                    51                  0
2.3        71                    83                  12
2.4        81                    93                  12
2.5        91                    103                 12
2.6        99                    111                 12
2.7        104                   119                 15
2.8        104                   116                 12
3.1        71                    71                  0
3.2        83                    83                  0
3.3        98                    98                  0
3.4        107                   107                 0
3.5        107                   107                 0
3.6        118                   118                 0
Finish     124                   124                 0

Chapter 3
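As a worked illustration of the slack formula, the sketch below recomputes the Slack column of Table 3.4 from the earliest and latest start times and flags the zero-slack activities, which lie on the critical path. It is a minimal example, not part of the original material.

```python
# Earliest and latest start times from Table 3.4 (activity: (earliest, latest)).
start_times = {
    "1.1": (1, 13),   "1.2": (1, 1),     "1.3": (16, 16),   "1.4": (26, 26),
    "2.1": (36, 36),  "2.2": (51, 51),   "2.3": (71, 83),   "2.4": (81, 93),
    "2.5": (91, 103), "2.6": (99, 111),  "2.7": (104, 119), "2.8": (104, 116),
    "3.1": (71, 71),  "3.2": (83, 83),   "3.3": (98, 98),   "3.4": (107, 107),
    "3.5": (107, 107), "3.6": (118, 118), "Finish": (124, 124),
}

# Slack time = latest start time - earliest start time.
slack = {a: latest - earliest for a, (earliest, latest) in start_times.items()}

# Activities with zero slack are on the critical path: delaying any of them delays the project.
critical_path = [a for a, s in slack.items() if s == 0]

print(slack["2.7"])    # 15
print(critical_path)   # ['1.2', '1.3', '1.4', '2.1', '2.2', '3.1', ..., 'Finish']
```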
Effort estimation
• Expert judgment
– analogy
– proportion
– Delphi technique
– Wolverton model
• Algorithmic methods: E = (a + b S^c) m(X)  (illustrated in the sketch after this slide)
– Walston and Felix model: E = 5.25 S^0.91
– Bailey and Basili model: E = 5.5 + 0.73 S^1.16
Chapter 3
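The two algorithmic models above are simple closed-form formulas. The sketch below evaluates them for a given size S; the interpretation of S as thousands of lines of code and E as person-months follows the usual convention for these models, and the 100 KLOC example is invented.

```python
def walston_felix(size_kloc: float) -> float:
    """Walston-Felix model: E = 5.25 * S^0.91."""
    return 5.25 * size_kloc ** 0.91

def bailey_basili(size_kloc: float) -> float:
    """Bailey-Basili baseline model: E = 5.5 + 0.73 * S^1.16."""
    return 5.5 + 0.73 * size_kloc ** 1.16

# Compare the two estimates for a hypothetical 100 KLOC system.
for model in (walston_felix, bailey_basili):
    print(model.__name__, round(model(100.0), 1))
```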
Evaluating models
• Mean magnitude of relative error (MMRE)
– mean of the absolute values of (actual - estimate)/actual
– goal: should be 0.25 or less
• PRED(x): fraction of projects for which the estimate is within 100x percent of the actual value (see the sketch after this slide)
– goal: should be 0.75 or greater for x = 0.25 (estimates within 25% of actual)
Chapter 3
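A short sketch of how these two quality measures are computed over a set of completed projects; the actual and estimated effort figures below are made up for illustration.

```python
def mmre(actuals, estimates):
    """Mean magnitude of relative error: mean of |actual - estimate| / actual."""
    errors = [abs(a - e) / a for a, e in zip(actuals, estimates)]
    return sum(errors) / len(errors)

def pred(actuals, estimates, x=0.25):
    """Fraction of projects whose estimate falls within 100*x percent of the actual value."""
    hits = sum(1 for a, e in zip(actuals, estimates) if abs(a - e) / a <= x)
    return hits / len(actuals)

# Hypothetical actual vs. estimated effort (person-months) for five past projects.
actual   = [120, 80, 45, 200, 60]
estimate = [100, 90, 40, 260, 58]
print(round(mmre(actual, estimate), 2))   # goal: 0.25 or less
print(pred(actual, estimate))             # goal: 0.75 or greater for x = 0.25
```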
Table 3.14. Summary of model performance.

Model                             PRED(0.25)   MMRE
Walston-Felix                     0.30         0.48
Basic COCOMO                      0.27         0.60
Intermediate COCOMO               0.63         0.22
Intermediate COCOMO (variation)   0.76         0.19
Bailey-Basili                     0.78         0.18
Pfleeger                          0.50         0.29
SLIM                              0.06-0.24    0.78-1.04
Jensen                            0.06-0.33    0.70-1.01
COPMO                             0.38-0.63    0.23-5.7
General COPMO                     0.78         0.25

Chapter 3
Table 3.6. Wolverton model cost matrix.

                                  Difficulty
Type of software      OE    OM    OH    NE    NM    NH
Control               21    27    30    33    40    49
Input/output          17    24    27    28    35    43
Pre/post processor    16    23    26    28    34    42
Algorithm             15    20    22    25    30    35
Data management       24    31    35    37    46    57
Time-critical         75    75    75    75    75    75

(Difficulty codes: O = old, N = new; E = easy, M = moderate, H = hard.)

Chapter 3
Table 3.7. Walston and Felix model productivity factors.

1. Customer interface complexity
2. User participation in requirements definition
3. Customer-originated program design changes
4. Customer experience with the application area
5. Overall personnel experience
6. Percentage of development programmers who participated in the design of functional specifications
7. Previous experience with the operational computer
8. Previous experience with the programming language
9. Previous experience with applications of similar size and complexity
10. Ratio of average staff size to project duration (people per month)
11. Hardware under concurrent development
12. Access to development computer open under special request
13. Access to development computer closed
14. Classified security environment for computer and at least 25% of programs and data
15. Use of structured programming
16. Use of design and code inspections
17. Use of top-down development
18. Use of a chief programmer team
19. Overall complexity of code
20. Complexity of application processing
21. Complexity of program flow
22. Overall constraints on program’s design
23. Design constraints on the program’s main storage
24. Design constraints on the program’s timing
25. Code for real-time or interactive operation or for execution under severe time constraints
26. Percentage of code for delivery
27. Code classified as nonmathematical application and input/output formatting programs
28. Number of classes of items in the database per 1000 lines of code
29. Number of pages of delivered documentation per 1000 lines of code

Chapter 3
Bailey-Basili technique
• Minimize the standard error of the estimate to produce an equation such as:
  E = 5.5 + 0.73 S^1.16
• Adjust the initial estimate based on the ratio of errors.
  If R is the ratio between the actual effort, E, and the predicted effort, E’, then the
  effort adjustment is defined as
    ERadj = R - 1       if R > 1
          = 1 - 1/R     if R < 1
• Then adjust the initial effort estimate E (see the sketch after this slide):
    Eadj = (1 + ERadj) E     if R > 1
         = E / (1 + ERadj)   if R < 1
Chapter 3
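A small sketch of the adjustment procedure above, applied to one hypothetical past project. The numbers are invented, the rules are transcribed directly from the slide, and R = 1 is treated as "no adjustment", which the slide leaves implicit.

```python
def baseline_effort(size_kloc: float) -> float:
    """Bailey-Basili baseline equation: E = 5.5 + 0.73 * S^1.16."""
    return 5.5 + 0.73 * size_kloc ** 1.16

def adjusted_effort(predicted: float, ratio: float) -> float:
    """Apply the ERadj/Eadj rules from the slide, where ratio = R = actual / predicted
    observed on a similar past project."""
    if ratio > 1:
        er_adj = ratio - 1
        return (1 + er_adj) * predicted
    elif ratio < 1:
        er_adj = 1 - 1 / ratio
        return predicted / (1 + er_adj)
    return predicted  # R == 1: the model was exact, so no adjustment is applied

# Hypothetical: a past 40 KLOC project took 70 person-months against a prediction of 60.
R = 70 / 60
print(round(adjusted_effort(baseline_effort(50.0), R), 1))  # adjusted estimate for a 50 KLOC project
```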
Table 3.8. Bailey-Basili effort modifiers.

Total methodology (METH): tree charts, top-down design, formal documentation,
chief programmer teams, formal training, formal test plans, design formalisms,
code reading, unit development folders

Cumulative complexity (CPLX): customer interface complexity, application complexity,
program flow complexity, internal communication complexity, database complexity,
external communication complexity, customer-initiated program design changes

Cumulative experience (EXP): programmer qualifications, programmer machine experience,
programmer language experience, programmer application experience, team experience

Chapter 3
COCOMO model: stages of
development
• application composition:
– prototyping to resolve high-risk user interface issues
– size estimates in object points
• early design:
– to explore alternative architectures and concepts
– size estimates in function points
• postarchitecture:
– development has begun
– size estimates in lines of code
Chapter 3
TABLE 3.9. Three stages of COCOMO II.

Size
  Stage 1 (Application Composition): Application points
  Stage 2 (Early Design): Function points (FP) and language
  Stage 3 (Post-architecture): FP and language or source lines of code (SLOC)

Reuse
  Stage 1: Implicit in model
  Stage 2: Equivalent SLOC as function of other variables
  Stage 3: Equivalent SLOC as function of other variables

Requirements change
  Stage 1: Implicit in model
  Stage 2: % change expressed as a cost factor
  Stage 3: % change expressed as a cost factor

Maintenance
  Stage 1: Application point annual change traffic
  Stage 2: Function of ACT, software understanding, unfamiliarity
  Stage 3: Function of ACT, software understanding, unfamiliarity

Scale (c) in nominal effort equation
  Stage 1: 1.0
  Stage 2: 0.91 to 1.23, depending on precedentedness, conformity, early architecture,
           risk resolution, team cohesion, and SEI process maturity
  Stage 3: 0.91 to 1.23, depending on precedentedness, conformity, early architecture,
           risk resolution, team cohesion, and SEI process maturity

Product cost drivers
  Stage 1: None
  Stage 2: Complexity, required reusability
  Stage 3: Reliability, database size, documentation needs, required reuse, and
           product complexity

Platform cost drivers
  Stage 1: None
  Stage 2: Platform difficulty
  Stage 3: Execution time constraints, main storage constraints, and virtual machine
           volatility

Personnel cost drivers
  Stage 1: None
  Stage 2: Personnel capability and experience
  Stage 3: Analyst capability, applications experience, programmer capability,
           programmer experience, language and tool experience, and personnel continuity

Project cost drivers
  Stage 1: None
  Stage 2: Required development schedule, development environment
  Stage 3: Use of software tools, required development schedule, and multisite
           development

Chapter 3
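The scale exponent c in the nominal effort equation (effort proportional to S^c, before cost-driver multipliers) determines whether the model predicts economies (c < 1) or diseconomies (c > 1) of scale. The sketch below compares the two ends of the 0.91 to 1.23 range in Table 3.9 for a doubling of size; the coefficient a = 3.0 is an arbitrary placeholder, not a calibrated COCOMO II constant.

```python
def nominal_effort(size: float, a: float, c: float) -> float:
    """Nominal effort E = a * S^c (person-months), before cost-driver adjustment."""
    return a * size ** c

A = 3.0  # placeholder coefficient; real constants come from model calibration
for c in (0.91, 1.23):
    small, large = nominal_effort(50, A, c), nominal_effort(100, A, c)
    # With c < 1 doubling size less than doubles effort; with c > 1 it more than doubles it.
    print(f"c = {c}: effort ratio for 2x size = {large / small:.2f}")
```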
Table 3.10. Application point complexity levels.

For screens:

                       Number and source of data tables
Number of views        Total < 4                Total < 8                  Total 8+
contained              (<2 server, <3 client)   (2-3 server, 3-5 client)   (>3 server, >5 client)
<3                     simple                   simple                     medium
3-7                    simple                   medium                     difficult
8+                     medium                   difficult                  difficult

For reports:

                       Number and source of data tables
Number of sections     Total < 4                Total < 8                  Total 8+
contained              (<2 server, <3 client)   (2-3 server, 3-5 client)   (>3 server, >5 client)
0 or 1                 simple                   simple                     medium
2 or 3                 simple                   medium                     difficult
4+                     medium                   difficult                  difficult

Chapter 3
Table 3.11. Complexity weights for application points.

Object type      Simple   Medium   Difficult
Screen           1        2        3
Report           2        5        8
3GL component    -        -        10
Table 3.12. Productivity estimate calculation.

Developers’ experience and capability   Very low   Low   Nominal   High   Very high
CASE maturity and capability            Very low   Low   Nominal   High   Very high
Productivity factor                     4          7     13        25     50
Table 3.13. Tool use categories.

Category    Meaning
Very low    Edit, code, debug
Low         Simple front-end, back-end CASE, little integration
Nominal     Basic life-cycle tools, moderately integrated
High        Strong, mature life-cycle tools, moderately integrated
Very high   Strong, mature, proactive life-cycle tools, well-integrated with
            processes, methods, reuse
Chapter 3
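Tables 3.11 and 3.12 feed the application-composition estimate: count and classify the screens, reports, and 3GL components, weight them with Table 3.11, and divide by the productivity factor from Table 3.12. The sketch below illustrates that calculation; the object counts are invented, and the reuse adjustment shown (new application points = application points x (100 - %reuse)/100) is the usual COCOMO II step, which these slides do not spell out.

```python
# Complexity weights from Table 3.11.
WEIGHTS = {
    "screen":        {"simple": 1, "medium": 2, "difficult": 3},
    "report":        {"simple": 2, "medium": 5, "difficult": 8},
    "3GL component": {"difficult": 10},
}

def application_points(counts: dict) -> float:
    """Sum of weighted object counts; counts maps (object type, complexity) -> number."""
    return sum(WEIGHTS[obj][cplx] * n for (obj, cplx), n in counts.items())

# Hypothetical system: 5 simple screens, 3 difficult screens, 4 medium reports, 2 3GL components.
counts = {("screen", "simple"): 5, ("screen", "difficult"): 3,
          ("report", "medium"): 4, ("3GL component", "difficult"): 2}

ap = application_points(counts)       # 5*1 + 3*3 + 4*5 + 2*10 = 54
nap = ap * (100 - 20) / 100           # assume 20% reuse (COCOMO II adjustment, not in the slides)
productivity = 13                     # "Nominal" column of Table 3.12
print(round(nap / productivity, 1))   # estimated effort in person-months
```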
Machine learning techniques
• Example: case-based reasoning (see the sketch after this slide)
– user identifies new problem as a case
– system retrieves similar cases from repository
– system reuses knowledge from previous cases
– system suggests solution for new case
• Example: neural network
– cause-effect network “trained” with data from
past history
Chapter 3
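A minimal sketch of the case-based reasoning idea applied to effort estimation: represent past projects as cases, retrieve the most similar ones, and reuse their actual effort as the suggested estimate. The feature set, distance measure, and repository contents here are illustrative assumptions, not a specific published tool.

```python
import math

# Case repository: each past project is a case of (features, actual effort in person-months).
# Features (hypothetical): size in KLOC, team size, number of external interfaces.
repository = [
    ({"kloc": 30, "team": 4, "interfaces": 2}, 45),
    ({"kloc": 80, "team": 9, "interfaces": 6}, 160),
    ({"kloc": 55, "team": 6, "interfaces": 3}, 90),
]

def distance(a: dict, b: dict) -> float:
    """Euclidean distance over the shared numeric features."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def suggest_effort(new_case: dict, k: int = 2) -> float:
    """Retrieve the k most similar past cases and reuse the mean of their efforts."""
    nearest = sorted(repository, key=lambda case: distance(new_case, case[0]))[:k]
    return sum(effort for _, effort in nearest) / k

print(suggest_effort({"kloc": 60, "team": 7, "interfaces": 4}))  # suggested effort for the new case
```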
Project Estimation Links
• http://sunset.usc.edu/research/COCOMOII/index.html
• http://www.starbase.com/products/tools.asp?ID=1900
• http://www-cs.etsu-tn.edu/softeng/
• http://www.construx.com/estimate/home.htm
• http://dec.bournemouth.ac.uk/ESERG/ANGEL/
• http://risex.insead.fr/rise/index.htm
• http://www.brunel.ac.uk/~csstmmh2/exe11.html
• http://www.softengprod.com/
• http://www.itworld.com/Career/2019/ITW-estimation-020901/
• http://www.ifi.uio.no/~best/
Chapter 3
Risk management requirements
• Risk impact: the loss associated with the event
• Risk probability: the likelihood that the event
will occur
• Risk control: the degree to which we can
change the outcome
Risk exposure = (risk probability) x (risk impact)
Chapter 3
Three strategies for risk reduction
• avoiding the risk: change requirements for performance or functionality
• transferring the risk: allocate the risk to other systems, or buy insurance
• assuming the risk: accept and control it

risk leverage = difference in risk exposure divided by cost of reducing the risk
(see the sketch after this slide)
Chapter 3
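A short numeric illustration of risk exposure and risk leverage; the probabilities, losses, and mitigation cost below are invented for the example.

```python
def risk_exposure(probability: float, impact: float) -> float:
    """Risk exposure = (risk probability) x (risk impact, i.e., the loss if it occurs)."""
    return probability * impact

def risk_leverage(exposure_before: float, exposure_after: float, reduction_cost: float) -> float:
    """Difference in risk exposure divided by the cost of reducing the risk."""
    return (exposure_before - exposure_after) / reduction_cost

# Hypothetical risk: 30% chance of a schedule slip costing $200,000.
before = risk_exposure(0.30, 200_000)        # 60,000
# Spending $15,000 on extra testing is assumed to cut the probability to 10%.
after = risk_exposure(0.10, 200_000)         # 20,000
# A leverage above 1 means the expected loss avoided exceeds the cost of the reduction.
print(risk_leverage(before, after, 15_000))
```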
Boehm’s top ten risk items
• Personnel shortfalls
• Unrealistic schedules and budgets
• Developing the wrong functions
• Developing the wrong user interfaces
• Gold-plating
• Continuing stream of requirements changes
• Shortfalls in externally-performed tasks
• Shortfalls in externally-furnished components
• Real-time performance shortfalls
• Straining computer science capabilities
Chapter 3
Project organization
• Depends on
– backgrounds and work styles of team members
– number of people on team
– management styles of customers and
developers
• Examples:
– Chief programmer team
– Egoless approach
Chapter 3
Table 3.5. Comparison of organizational structures.

Highly structured    Loosely structured
High certainty       Uncertainty
Repetition           New techniques or technology
Large projects       Small projects
Chapter 3
Digital Alpha AXP: Enrollment
management model
• Establish an appropriately large shared vision
• Delegate completely and elicit specific commitments from participants
• Inspect vigorously and provide supportive feedback
• Acknowledge every advance and learn as the program progresses
Chapter 3
Lockheed Martin: Accountability
modeling
• Matrix organization
– Each engineer belongs to a functional unit based on type of skill
• Integrated product development team
– Combines people from different functional units into an interdisciplinary work unit
• Each activity tracked using cost estimation, critical path analysis, schedule tracking
– Earned value is a common measure of progress (see the sketch after this slide)
Chapter 3
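Earned value compares the budgeted cost of the work actually performed against both the planned spend and the actual spend to date. The sketch below shows the standard calculation; the dollar figures and the interpretation comments are general earned-value conventions, not specifics from the Lockheed Martin example.

```python
def earned_value_report(planned_value: float, earned_value: float, actual_cost: float) -> dict:
    """Standard earned-value indicators.

    planned_value: budgeted cost of work scheduled to date (BCWS)
    earned_value:  budgeted cost of work actually performed (BCWP)
    actual_cost:   actual cost of work performed (ACWP)
    """
    return {
        "schedule variance": earned_value - planned_value,   # negative: behind schedule
        "cost variance":     earned_value - actual_cost,     # negative: over budget
        "SPI": earned_value / planned_value,                 # schedule performance index
        "CPI": earned_value / actual_cost,                   # cost performance index
    }

# Hypothetical status: $400k of work planned, $350k worth completed, $420k actually spent.
print(earned_value_report(400_000, 350_000, 420_000))
```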
Anchoring milestones
• Objectives: Why is the system being developed?
• Milestones and schedules: What will be done by when?
• Responsibilities: Who is responsible for a function?
• Approach: How will the job be done, technically and managerially?
• Resources: How much of each resource is needed?
• Feasibility: Can this be done, and is there a good business reason for doing it?
Chapter 3
Project plan contents
• project scope
• project schedule
• project team organization
• technical description of system
• project standards and procedures
• quality assurance plan
• configuration management plan
• documentation plan
• data management plan
• resource management plan
• test plan
• training plan
• security plan
• risk management plan
• maintenance plan
Chapter 3
Project Management Workflow
Chapter 3
Conceive New Project
Chapter 3
Conceive New Project Steps
• Identify Potential Risks
• Analyze and Prioritize Risks
• Identify Risk Avoidance Strategies
• Identify Risk Mitigation Strategies
• Identify Risk Contingency Strategies
• Revisiting Risks during the Iteration
• Revisiting Risks at the End of an Iteration
Chapter 3
Evaluate Scope and Risk
Chapter 3
Software Development Plan
Chapter 3
Monitor and Control Project
Chapter 3
Plan for Next Iteration
Chapter 3
Manage Iteration
Chapter 3
Close Out Phase
Chapter 3
Close Out Project
Chapter 3
Activity Overview
Chapter 3
Artifact Overview
Chapter 3