Software Engineering
Natallia Kokash
email: [email protected]
N. Kokash, Software Engineering
Agenda
• Software quality
• Process and product quality
• Software metrics
What is quality?
• Transcendent ("I really like this program")
• User-based ("fitness for use")
• Value-based (balancing time and cost vs. profits)
• Manufacturing-based (conformance to specs)
• Product-based (based on attributes of the software)
Software quality
• Absence of defects?
  • Program does not crash
  • Computes correct output
• We cannot establish the absence of defects, only their presence.
• We can count the number of defects we find after X hours of testing.
Why is software quality difficult?
• Assessing quality is problematic for software systems:
  • There is a tension between customer quality requirements (efficiency, reliability, etc.) and developer quality requirements (maintainability, reusability, etc.);
  • Some quality requirements are difficult to specify in an unambiguous way;
  • Software specifications are usually incomplete and often inconsistent.
Approaches to quality
• Quality of the product versus quality of the process
• Conformance: check whether the product or process conforms to certain norms
• Improvement: improve quality by improving the product or process

            Conformance      Improvement
Product     ISO 9126         best practices
Process     ISO 9001, SQA    CMM, CMMI
Software quality management
• Concerned with ensuring that the required level of quality is achieved in a software product.
• Involves defining appropriate quality standards and procedures and ensuring that these are followed.
• Should aim to develop a 'quality culture' where quality is seen as everyone's responsibility.
Quality management activities
• Quality assurance
  • Establish organisational procedures and standards for quality.
• Quality planning
  • Select applicable procedures and standards for a particular project and modify these as required.
• Quality control
  • Ensure that procedures and standards are followed by the software development team.
• Quality management should be separate from project management to ensure independence.
Process-based quality
• In manufactured goods, there is a straightforward relation between process and product quality.
• This is more complex for software: the relationship between software processes and product quality is very complex and poorly understood.
  • The application of individual skills and experience is particularly important in software development;
  • External factors such as the novelty of an application or the need for an accelerated development schedule may impair product quality.
Process-based quality
Quality of a product depends on the quality of the people, process and technology!
[Diagram: people, process and technology feed into the product]
Process improvement cycle: define process → develop product → assess product quality → quality ok? If no, improve the process and repeat; if yes, standardize the process.
Quality assurance and standards
• Standards are the key to effective quality management.
• They may be international, national, organizational or project standards.
• Product standards define characteristics that all components should exhibit, e.g. a common programming style.
• Process standards define how the software process should be enacted.
Product and process standards

Product standards                  Process standards
Design review form                 Design review conduct
Requirements document structure    Submission of documents to CM
Method header format               Version release process
Java programming style             Project plan approval process
Project plan format                Change control process
Change request form                Test recording process
ISO 9001
• Model for quality assurance in design, development, production, installation and servicing.
• Basic premise: confidence in product conformance can be obtained by adequate demonstration of the supplier's capabilities in processes (design, development, …).
• ISO registration by an officially accredited body; re-registration every three years.
Software Process Improvement (SPI)
• Premise: "The quality of a product is largely determined by the quality of the process that is used to develop and maintain it."
[Diagram: PEOPLE, PROCESS, TECHNOLOGY]
• Approach to SPI:
  • Formulate hypotheses
  • Carefully select metrics
  • Collect data
  • Interpret data
  • Initiate improvement actions
  • Iterate
Capability Maturity Model (CMM)
• Initial level: software development is ad hoc
• Repeatable level: basic processes are in place
• Defined level: there are standard processes
• Quantitatively managed level: data is gathered and analyzed routinely
• Optimizing level: stable base; data is gathered to improve the process
Initial → repeatable level
• Requirements management
• Project planning
• Project monitoring and control
• Supplier agreement management
• Measurement and analysis
• Process and product quality assurance
• Configuration management
Repeatable → defined level
• Requirements development
• Technical solution
• Product integration
• Verification
• Validation
• Organization process focus
• Organization process definition
• Organizational training
• Integrated project management
• Risk management
• Decision analysis and resolution
CMM: critical notes
• Most appropriate for big companies
• Pure CMM approach may stifle creativity
• Focus mostly on activities and supporting artifacts associated with a conventional waterfall process
• Crude 5-point scale (now: CMMI)
Capability Maturity Model Integration (CMMI)
http://www.cdainfo.com/down/1-Desarrollo/CMM2.pdf
CMMI: process areas by maturity level
• 5 Optimizing (continuous process improvement): Organizational Innovation and Deployment; Causal Analysis and Resolution
• 4 Quantitatively Managed (quantitative management): Organizational Process Performance; Quantitative Project Management
• 3 Defined (process standardization): Requirements Development; Technical Solution; Product Integration; Verification; Validation; Organizational Process Focus; Organizational Process Definition; Organizational Training; Integrated Project Management; Integrated Supplier Management; Risk Management; Decision Analysis and Resolution; with Integrated Product and Process Development (IPPD): Organizational Environment for Integration; Integrated Teaming
• 2 Managed (basic project management): Requirements Management; Project Planning; Project Monitoring and Control; Supplier Agreement Management; Measurement and Analysis; Process and Product Quality Assurance; Configuration Management
• 1 Initial: no process areas
Documentation standards
• Particularly important: documents are the tangible manifestation of the software.
• Documentation process standards
  • Concerned with how documents should be developed, validated and maintained.
• Document standards
  • Concerned with document contents, structure, and appearance.
• Document interchange standards
  • Concerned with the compatibility of electronic documents.
Problems with standards / QA
• They may not be seen as relevant and up-to-date by software engineers.
• They often involve too much bureaucratic form filling.
• If they are unsupported by software tools, tedious manual work is often involved to maintain the documentation associated with the standards.
Quality models
• Boehm, McCall, ISO 9126, Dromey, …
• Decomposition of characteristics; bottom level: metrics
• Differences in
  • Relations between characteristics
  • Vocabulary
[Figure: Boehm's Quality Model]
ISO 9126
[Figure: ISO 9126 quality model]
McCall's quality factors and criteria
[Figure: McCall's quality factors and criteria]
Boehm's quality tree
[Figure: Boehm's quality tree]
Data collection
• A metrics programme should be based on a set of product and process data.
• Data should be collected immediately (not after a project has finished) and, if possible, automatically.
  • Don't rely on memory.
• Don't collect unnecessary data.
  • The questions to be answered should be decided in advance and the required data identified.
• Tell people why the data is being collected.
  • It should not be part of personnel evaluation.
Data collection
• Closed loop principle: the result of data analysis must be useful to the supplier of the data
  • Do not use data collected for other purposes
  • Focus on continuous improvement
  • Only collect data you really need
GQM approach
• Goal – Question – Metric (Basili)
• Approach to select metrics
  • Avoids "let's collect a lot of data and decide afterwards what we do with the values" (fishing for results)
• Approach:
  1. Classify the entities
  2. Express goals of the organization
  3. Generate questions to meet goals
  4. Analyze questions and define metrics
  5. Finally, check whether metrics can be collected
GQM: Example
Goal: evaluate the effectiveness of a coding standard.
Questions and associated metrics:
• Who is using the standard?
  • Proportion of coders using the standard / using the language
  • Experience of coders with the standard, with the language, with the environment, …
• What is coder productivity?
  • Code size (LOC, Function Points, …)
• What is code quality?
  • Errors, effort, …
Cost of quality
• Costs due to lack of quality
  • Internal costs of failure
  • External costs of failure
• Costs of achieving quality
  • Appraisal costs
  • Prevention costs
Cost of quality

Category           Definition                                    Typical costs for software
Internal failures  Quality failures detected prior to shipment   Defect management, rework, retesting
External failures  Quality failures detected after shipment      Technical support, complaint investigation, defect notification
Appraisal          Discovering the condition of the product      Testing and associated activities, product quality audits
Prevention         Efforts to ensure product quality             SQA administration, inspections, process improvements, metrics collection and analysis
Cost of repair
• The cost of defect repair increases exponentially: as a project progresses, more and more work depends on earlier decisions.
• Defects should be eliminated as soon as possible after introduction.
Question
• The highest probability of undetected defects is in modules which show…
  1. …the HIGHEST number of known defects
  2. …the LOWEST number of known defects
If you cannot measure it, then it is not science!
"In physical science the first essential step in the direction of learning any subject is to find principles of numerical reckoning and practicable methods for measuring some quality connected with it. I often say that when you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely in your thoughts advanced to the state of Science, whatever the matter may be."
— Sir William Thomson, Lord Kelvin (1824-1907)
From 'Electrical Units of Measurement', a lecture delivered at the Institution of Civil Engineers, London (3 May 1883), Popular Lectures and Addresses (1889), Vol. 1, 73. Quoted in American Association for the Advancement of Science, Science (Jan-Jun 1892), 19, 127.
If you cannot measure it, then it is not science?
"Not everything that is important can be measured, and not everything that can be measured is important."
— Albert Einstein
Why measure?
• Gilb's principle of fuzzy targets: projects without clear goals will not achieve goals clearly.
  Tom Gilb (1940), an American systems engineer, consultant, and author, known for the development of software metrics, software inspection, and evolutionary processes.
• You can neither predict nor control what you cannot measure.
  Tom DeMarco (1940), an American software engineer, author, teacher and speaker on software engineering topics. Best known as one of the developers of structured analysis in the 1980s.
Why measure?
• Controlling
• Understanding
• Comparing
• Predicting
Metrics are
• objective
• (often) automatically collectable
Why measure software?
• Determine the quality of the current product or process
• Predict qualities of a product/process
• Improve quality of a product/process
Representation condition
• A measure M is valid if it satisfies the representation condition, i.e. if A > B in the real world, then M(A) > M(B).
• E.g. if we measure complexity as the number of if-statements, then:
  • Two programs with the same number of if-statements are equally complex
  • If program A has more if-statements than program B, then A is more complex than B
• Example: M(Jan) = 200 cm, M(Joep) = 150 cm
Motivation for metrics
• Estimate the cost & schedule when bidding for future projects
• Evaluate the productivity impacts of new tools and techniques
• Establish productivity trends over time
• Monitor/improve software quality
• Forecast future staffing needs
• Anticipate and reduce future maintenance needs
Software measurement and metrics
• Software measurement is concerned with deriving a numeric value for an attribute of a software product or process.
• This allows for
  • objective comparisons between techniques and processes
  • predictions
  • management control
• Most organisations don't make systematic use of software measurement.
Measurement, metrics, indicators
• Measure: a quantitative indication of the extent, amount, dimension, capacity or size of some attribute of a product or process.
  • A single data point (e.g. number of defects from a single review)
• Measurement: the act of determining a measure.
• Metric: a measure of the degree to which a system, component or process possesses a given attribute.
  • Metrics relate measures (e.g. average number of defects found in reviews)
  • Relate data points to each other
• Indicator: a metric or series of metrics that provide insight into a process, project or product.
Has productivity improved over time?
• 1000 systems completed between 1996 and 2011.
• PI = Productivity Index, FP = Function Points, PM = Person Month
• http://www.qsm.com/blog/2011/has-software-productivity-declined-over-time
Productivity measures?
• The QSM (Quantitative Software Management, Inc.) methodology is based upon the use of a productivity parameter called the Productivity Index (PI).
• A PI is calculated using an empirical formula developed by Larry Putnam, Sr. in the 1970s.
• This index includes in its calculation the following measures related to a software project: size, duration, and effort.
• PI = size / (effort × duration)
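The simplified formula above can be sketched directly. Note this is only the illustrative form from the slide: QSM's actual PI uses a calibrated, non-linear empirical formula, and the example figures below are made up.

```python
# Simplified productivity index from the slide: PI = size / (effort * duration).
# Units assumed: size in function points, effort in person-months, duration in months.

def productivity_index(size_fp: float, effort_pm: float, duration_months: float) -> float:
    if effort_pm <= 0 or duration_months <= 0:
        raise ValueError("effort and duration must be positive")
    return size_fp / (effort_pm * duration_months)

# Example: a 600 FP system, 50 person-months of effort, 12 months elapsed.
print(productivity_index(600, 50, 12))  # 1.0
```

A larger system delivered with the same effort and schedule would yield a higher PI, which is the intended reading of the index.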
Has productivity improved over time?
[Figure: productivity trend over time]
Literature on software metrics
• Metrics and Models in Software Quality Engineering (2nd edition), Stephen H. Kan, Addison Wesley, 2002
• Software Metrics: A Rigorous & Practical Approach, Norman E. Fenton & Shari Lawrence Pfleeger, 2nd ed., International Thomson Computer Press, 1997
Measurement scales
• Nominal scale
  • Just categories, no ordering, no magnitude
  • Example: specification fault, design fault, …
• Ordinal scale
  • Ordered w.r.t. an attribute, ranking only
  • Example: preference, defect complexity (serious, moderate, simple)
• Interval scale
  • Preserves order and differences, no ratios
  • Example: grades A, B, C, …
• Ratio scale
  • Order, difference and ratio; has a zero element
  • Example: development time, lines of code
• Measurement scales restrict the kind of allowed analysis.
Scales may not be unique
[Figure]
Software metric
• Any type of measurement which relates to a software system, process or related documentation
  • Lines of code in a program
  • Number of methods per class
  • Number of requirements
  • Number of components in a system
  • Number of person-days required to develop a component
Applications of metrics
• Bidding
  • feasibility
  • cost prediction
• Tracking progress
• Identifying fault-prone elements
  • Focus QA activities
• Assessing the quality of a system
  • "Bad smells" → refactoring
Metrics domains
• Process metrics
  • Duration or effort of tasks
  • Number of changes in requirements
• Resource metrics
  • Number of staff working on a task
  • Staff turnover
  • Staff experience/skills
• Product metrics
  • Requirements document
  • Architecture document
  • Design document
  • Implementation (code, libraries): size (lines of code), complexity, functionality
Process metrics
• Effort/time per SE task
• Defects detected per review hour
• Scheduled vs. actual milestone dates
• Changes (number) and their characteristics
• Distribution of effort on SE tasks
Example of process metric
Distribution of effort over different activities in development
[Figure]
Product metrics
• Focus on the quality of deliverables
• Measures of the analysis model
• Complexity of the design
  • Internal algorithmic complexity
  • Architectural complexity
  • Data flow complexity
• Code measures
  • e.g., Halstead (1977)
Size metrics
• Lines of Code (LOC)
  • How do we deal with…
    • empty lines?
    • comments?
    • multiple statements in one line?
  • Variation of LOC for an equal program in different languages
  • Productivity = LOC / hour
    • Wrong incentive: verbose programming style
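The counting decisions above have to be made explicit in any LOC tool. A minimal sketch, assuming we skip empty lines and full-line comments (the "#" comment prefix is an assumption for Python-style source; other languages need other prefixes):

```python
# Minimal LOC counter: skips empty lines and full-line comments.
# Inline comments do not disqualify a line, and multiple statements
# on one line still count as one line (one of the ambiguities above).

def count_loc(source: str, comment_prefix: str = "#") -> int:
    loc = 0
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped:                          # empty line
            continue
        if stripped.startswith(comment_prefix):   # full-line comment
            continue
        loc += 1
    return loc

code = """\
# compute the answer
x = 40

y = x + 2  # inline comment still counts
print(y)
"""
print(count_loc(code))  # 3
```

Changing these decisions (e.g. counting comments) changes the metric, which is exactly why LOC numbers are hard to compare across tools and languages.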
Size metric: Function points
• Based on a combination of program characteristics:
  • external inputs and outputs;
  • user interactions;
  • external interfaces;
  • files used by the system.
• A weight is associated with each of these, and the function point count is computed by multiplying each raw count by the weight and summing all values:
  UFC = Σ (number of elements of given type) × (weight)
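The UFC formula is a plain weighted sum, sketched below. The weights and counts are illustrative placeholders, not the official IFPUG weight table:

```python
# Unadjusted function point count: UFC = sum of (raw count * weight)
# over the element types listed above. Weights here are assumed values.

def ufc(counts: dict, weights: dict) -> int:
    return sum(counts[t] * weights[t] for t in counts)

weights = {"inputs": 4, "outputs": 5, "interactions": 4,
           "interfaces": 7, "files": 10}   # hypothetical average weights
counts = {"inputs": 10, "outputs": 6, "interactions": 8,
          "interfaces": 2, "files": 3}     # raw counts for some system

print(ufc(counts, weights))  # 40 + 30 + 32 + 14 + 30 = 146
```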
Quality metrics
• What: testability, extensibility, maintainability, error-proneness, …
• How:
  • # of defects
  • Coupling, cohesion
  • Complexity
  • Inheritance metrics
• Novel:
  • Measure earlier: design rather than code
  • Completeness & consistency
  • Combining different views: structure & behavior
Structure-based metrics
• Degree of modularity is an indicator for quality.
• Modularity eases:
  • subdivision of work (design, implementation, test, maintenance)
  • reuse
McCabe's complexity metric
• McCabe's cyclomatic number
  • Measures the complexity of a module
  • Heuristic: should be < 10
• For a control flow graph G with e edges and n nodes:
  V(G) = e − n + 2
  (the number of linearly independent paths in G)
• Here: V(G) = 12 − 10 + 2 = 4
• More simply, if d is the number of decision nodes:
  V(G) = d + 1
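Both forms of the formula are easy to compute from an edge list. A sketch, using a made-up control flow graph for an if-else followed by a while loop:

```python
# Cyclomatic complexity V(G) = e - n + 2, computed from a list of edges.
# The shortcut V(G) = d + 1 (d = decision nodes) should agree with it.

def cyclomatic(edges):
    nodes = {v for edge in edges for v in edge}
    return len(edges) - len(nodes) + 2

# Hypothetical graph: entry -> if (two branches) -> join -> while test -> exit
edges = [("entry", "if"), ("if", "then"), ("if", "else"),
         ("then", "join"), ("else", "join"),
         ("join", "test"), ("test", "body"), ("body", "test"),
         ("test", "exit")]

print(cyclomatic(edges))  # 9 edges, 8 nodes: 9 - 8 + 2 = 3

d = 2                     # decision nodes: "if" and "test"
print(d + 1)              # 3, matching V(G) = d + 1
```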
Object-oriented metrics
• WMC: Weighted Methods per Class
• DIT: Depth of Inheritance Tree
• NOC: Number Of Children
• CBO: Coupling Between Object Classes
• RFC: Response For a Class
• LCOM: Lack of COhesion of Methods
• WMC, CBO, RFC and LCOM are the most useful:
  • Predict fault proneness during design
  • Strong relationship to maintenance effort
  • Many OO metrics correlate strongly with size
Weighted methods per class
• A measure for the size of a class
• WMC = Σ c(i), i = 1, …, n (n = number of methods)
• c(i) = complexity of method i
• Mostly, c(i) = 1
Depth of class in inheritance tree
• DIT = distance of a class to the root of its inheritance tree
• DIT is somewhat language-dependent
• Widely accepted heuristic: strive for a forest of classes, a collection of inheritance trees of medium height
Number of children
• NOC = number of immediate descendants
• Higher values of NOC are considered bad:
  • possibly improper abstraction of the parent class
  • also suggests that the class is to be used in a variety of settings
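DIT and NOC can both be computed by introspection. A sketch in Python, where counting up to `object` as the root is a convention choice and the class hierarchy is made up:

```python
# DIT: longest path from a class up to the root (object, by convention).
# NOC: number of immediate subclasses only, not all descendants.

def dit(cls) -> int:
    if cls is object:
        return 0
    return 1 + max(dit(base) for base in cls.__bases__)

def noc(cls) -> int:
    return len(cls.__subclasses__())

class Shape: pass
class Polygon(Shape): pass
class Circle(Shape): pass
class Triangle(Polygon): pass

print(dit(Triangle))  # Triangle -> Polygon -> Shape -> object: 3
print(noc(Shape))     # immediate children Polygon and Circle: 2
```

The language-dependence noted above shows here: Python roots everything at `object` and allows multiple inheritance, so DIT is defined as the longest path.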
Coupling between object classes
• Two classes are coupled if a method of one class uses a method or state variable of another class
• CBO = count of all classes a given class is coupled with
• High values: something is wrong
• All couplings are counted alike
Lack of cohesion of methods
• Cohesion = the glue that keeps the module (class) together
• If all methods use the same set of state variables: OK, that is the glue
• If some methods use a subset of the state variables, and others use another subset, the class lacks cohesion
• Two methods belong to the same set if they share at least one state variable
• LCOM = number of disjoint sets of methods in a class
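Under this definition LCOM is a connected-components count: methods are grouped transitively by shared state variables. A sketch, with a hypothetical method-to-attribute mapping:

```python
# LCOM as defined above: merge methods that (transitively) share at least
# one state variable, then count the disjoint groups that remain.

def lcom(uses: dict) -> int:
    """uses maps method name -> set of state variables it touches."""
    components = []  # each component: (set of methods, set of variables)
    for method, attrs in uses.items():
        ms, vs = {method}, set(attrs)
        rest = []
        for cm, cv in components:
            if cv & vs:      # shares a variable with this component: merge
                ms |= cm
                vs |= cv
            else:
                rest.append((cm, cv))
        components = rest + [(ms, vs)]
    return len(components)

# Hypothetical class: two methods touch only x, two touch only y.
uses = {"get_x": {"x"}, "set_x": {"x"},
        "get_y": {"y"}, "set_y": {"y"}}
print(lcom(uses))  # 2 disjoint groups: the class lacks cohesion
```

An LCOM of 1 means all methods are tied together through shared state; a value above 1 suggests the class could be split along the groups.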
Response for a class
• RFC measures the "immediate surroundings" of a class
• RFC = size of the "response set"
• Response set = {M} ∪ {Ri}, where M is the set of the class's methods and Ri are the sets of methods they directly call
[Figure: methods M1, M2, M3 calling into response set R1]
Dependency: coupling
• Coupling is the degree of interdependence between modules
[Figure: high coupling vs. low coupling]
• Heuristic: minimize coupling between modules
Complexity: fan-in & fan-out
• Fan-in = number of incoming dependencies
• Fan-out = number of outgoing dependencies
• Heuristic: a high fan-in/fan-out indicates high complexity
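Both counts fall out of a module dependency list. A sketch, using a made-up edge list (caller → callee):

```python
# Fan-in and fan-out per module, computed from dependency edges.
from collections import Counter

def fan_in_out(edges):
    fan_out = Counter(src for src, _ in edges)
    fan_in = Counter(dst for _, dst in edges)
    modules = {m for edge in edges for m in edge}
    return {m: (fan_in[m], fan_out[m]) for m in modules}

# Hypothetical system: ui and logic both call into db, report reads db.
edges = [("ui", "logic"), ("ui", "db"), ("logic", "db"), ("report", "db")]
metrics = fan_in_out(edges)

print(metrics["db"])  # (3, 0): three modules depend on db
print(metrics["ui"])  # (0, 2): ui depends on two modules
```

A module like `db` with high fan-in is a change hotspot: any interface change ripples to all its callers, which is what the heuristic warns about.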
Façade pattern
[Figure: client classes accessing a component through a Façade]
Note the effect on the fan-in/coupling of the component.
Dependency: cohesion
• Cohesion is concerned with the interactions within a module
[Figure: high cohesion vs. low cohesion]
• Heuristic: keep things together that belong together. High cohesion within a module is good.
Example: extensibility
Metrics:
• complexity of topology
• number of changes
[Figure: components A, B, C, X fully interconnected vs. connected via a Mediator]
• Fully interconnected: difficult to extend, complexity O(n²)
• With a mediator: easy to extend, complexity O(n)
Graphical tool for developer feedback
UML model → analysis tool → visualization of model + metrics
Quality metrics/rules:
• Completeness
• Consistency
• Conventions
MetricView video
http://www.youtube.com/watch?v=G3HJ_QR9EG4
Other software metrics
• Number of pages of requirements
• Team size
• Programmer experience
• Number of use cases
• Average change request completion time
• Are these product or process metrics?
Metrics for qualitative properties
• Scalability
• Maintainability (w.r.t. features & technology):
  • Comprehensibility, adaptability and extensibility
• Openness & interoperability
• Portability & reusability
Strengths and weaknesses of metrics
• Strengths
  • Objective
  • Incremental
  • Fast results (if automated)
  • Require little effort (if automated)
• Weaknesses
  • Interpretation is difficult
    • Rather than black-and-white: use as an indicator for weak spots
  • Results are sensitive to 'completeness' of input
Basic definitions
• A failure is an unacceptable behaviour exhibited by a system
  • The frequency of failures measures the reliability
  • An important design objective is to achieve a very low failure rate and hence high reliability
  • A failure can result from a violation of an explicit or implicit requirement
• A defect is a flaw in any aspect of the system that contributes, or may potentially contribute, to the occurrence of one or more failures
  • It could be in the requirements, the design or the code
  • It might take several defects to cause a particular failure
• An error is a slip-up or inappropriate decision by a software developer that leads to the introduction of a defect
error → defect → failure
Measuring defects
• Defect density: standard quality measure
  • Number of defects per KLOC or FP
• Defect arrival rates / fix rates: standard process and progress measurements
  • Defects detected/fixed per unit of time (or effort)
• Removed defects: defects which are identified and then taken out of the product
  • Due to some defect removal activity, such as code reviews
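Defect density is a simple ratio; a sketch with made-up project numbers:

```python
# Defect density as defects per KLOC (thousand lines of code).

def defect_density(defects: int, loc: int) -> float:
    return defects / (loc / 1000)

# Example: 45 defects found in a 30,000-line system.
print(defect_density(45, 30_000))  # 1.5 defects per KLOC
```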
Measuring the effectiveness of defect detection
• Develop program
• Inject (or label) defects
• Detect defects (testing, reviewing)
• Count how many of the injected defects you found
• Count how many non-injected defects you found
• Estimate: the number of remaining defects follows from the number of non-injected defects found and the percentage of injected defects not found
[Figure: program with defects and injected defects marked]
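The steps above can be turned into a concrete estimate with the standard defect-seeding assumption: if we found only a fraction of the seeded (injected) defects, we presumably found the same fraction of the real ones. A sketch with made-up counts:

```python
# Defect-seeding estimator: scale the real defects found by the fraction
# of injected defects found, then subtract what was already found.

def estimate_remaining(injected: int, injected_found: int, real_found: int) -> int:
    fraction_found = injected_found / injected
    estimated_total_real = real_found / fraction_found
    return round(estimated_total_real - real_found)

# Example: 25 defects seeded, 20 of them found, plus 40 genuine defects found.
print(estimate_remaining(25, 20, 40))  # 40 / 0.8 = 50 estimated total, so 10 remain
```

The estimate is only as good as the assumption that injected defects are as hard to find as real ones.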
Measure quality for continuous improvement
• Measures regarding
  • the quality of a software product
  • the quality of the process
• The number of defects found when inspecting a product
• The number of failures found when testing a product
• The number of failures encountered by users
• The number of questions posed by users to the help desk
  • As a measure of usability and the quality of documentation
• Root cause analysis
  • Determine the source of problems & improve
Predicted software failure arrival rates
[Figure: software failure arrival rate over 6 months for System B]
Projected software defects
In general, defect arrivals follow a Rayleigh distribution curve. Based upon project size and past defect densities, one can predict the curve, along with the upper and lower control bounds.
[Figure: defects over time, with upper and lower control limits]
Summary
• Product quality versus process quality
• Quality conformance versus quality improvement
• Quality has to be actively pursued
• There are different notions of quality
• Quality has many aspects
• Quality is hard to measure
Summary
• Software quality management is concerned with ensuring that software meets its required standards.
• Quality assurance procedures should be documented.
• Software measurement gathers information about both the software process and the software product.
• Product quality metrics should be used to identify potentially problematic components.
• There are no standardised and universally applicable software metrics.
Homework
• Read chapter 6
• Assignment
  • Start implementation!
  • Make estimates and track hours
  • Deliver Milestone 1 – 6
  • Prepare documentation
  • Prepare demonstration
Software Engineering (3rd Ed.)
1. Introduction
2. Introduction to Software Engineering Management
3. The Software Life Cycle Revisited
4. Configuration Management
5. People Management and Team Organization
6. On Managing Software Quality
7. Cost Estimation
8. Project Planning and Control
9. Requirements Engineering
10. Modeling
11. Software Architecture
12. Software Design
13. Software Testing
14. Software Maintenance
17. Software Reusability
18. Component-Based Software Engineering
19. Service Orientation
20. Global Software Development