
Project Management for Modern Software Development

Keeping Manager Activities up to date with Technology

Timothy Korson

Southern Adventist University

Software Project Management


Restricted Use

This copyrighted material is provided to attendees of courses taught by Timothy Korson under a restricted licensing agreement exclusively for the personal use of the attendees. Dr. Korson retains ownership rights to the material contained in these course notes.

Please respect these rights.

Copyright © 2008 Timothy D. Korson. All rights reserved.

Version 1.5


Successful Project Management

“Why do some companies succeed in managing object oriented projects and others fail? Most companies prepare their technical staffs by sending them to language and object-oriented analysis and design classes. Unfortunately, these same companies often ignore the training needs of managers who’ll be directing these technical staffs. Managers are left to fend for themselves, armed with yesterday’s tools, with little insight into the potential of today’s technologies.”

John D. Williams, “Managing Iteration in Object Oriented Projects,” IEEE Computer, September 1996, p. 39

Tutorial Prerequisites

• This course assumes previous project management experience, either as a project manager or as a project team member.

• This course is not an in-depth introduction to project management activities. Rather, it generally focuses on the differences in project management between classical, procedural software development and object oriented software development.

For an excellent compilation of information on general project management, see the Project Management Body of Knowledge, published by the Project Management Institute. Information on it can be found at www.pmi.org.

Recommended Reading

Addison-Wesley, ISBN 0-201-30958-0

Course Objectives

• Be a more effective project manager or team member on a modern software development project.

• Understand the differences between modern and classical project management.

• Effectively use an iterative/incremental software development life cycle.

Course Objectives

• Perform these project management activities for object oriented software development:
  • Manage risk
  • Plan the work
  • Measure the project
  • Plan resources
  • Ensure quality
  • Foster reuse
  • Manage the project team
  • Communicate with stakeholders

Table Of Contents

Chapter 1 Introduction To Project Management
Chapter 2 Modern Software Engineering: Activities, Principles, and Processes
Chapter 3 Direct Project Management Issues
Chapter 4 Pitfalls Of Modern Development
Chapter 5 Summing It Up

Chapter 1

Introduction To Project Management


Chapter Objectives

• At the end of this chapter you will be able to:
  • List the top 10 principles of modern software management
  • Define project, stakeholder, and project management
  • Give examples of why software development projects fail

Top 10 Principles of Modern Software Management

1. Base the process on an architecture-first approach.
2. Establish an iterative lifecycle process that confronts risk early.
3. Transition design methods to emphasize component-based development.
4. Establish a change management environment.
5. Enhance change freedom through tools that support round-trip engineering.
6. Capture design artifacts in rigorous, model-based notation.
7. Instrument the process for objective quality control and progress assessment.
8. Use a demonstration-based approach to assess intermediate artifacts.
9. Plan intermediate releases in groups of usage scenarios with evolving levels of detail.
10. Establish a configurable process that is economically scalable.

State your conclusions first.

[Royce]

Definitions

• Project
• Stakeholder
• Project Management

Project

• Organizations perform work. This work is done by people; constrained by limited resources; and planned, executed, and controlled.

• Operations are ongoing and repetitive; projects are temporary and unique.

Project

• A project is a temporary endeavor undertaken to create a unique product or service.

• Temporary means that every project has a definite beginning and a definite end.

• The end is reached when the project’s objectives have been met, or when it becomes clear that the objectives will not or cannot be met and the project is terminated.

Project

• Projects involve doing something that has not been done before and are therefore unique.

• The presence of repetitive elements does not change the fundamental uniqueness.

• “Because the product of each project is unique, the characteristics that distinguish the product or service must be progressively elaborated. Progressively means ‘proceeding in steps; continuing steadily by increments’ while elaborated means ‘worked out with care and detail; developed thoroughly.’” [Project Management Body of Knowledge, www.pmi.org]

Stakeholders

• A stakeholder is any individual who affects or is affected by the application being built:
  • Client (those who are paying)
  • User (those who interact with the application)

Stakeholders

“Involve real users [stakeholders] throughout the software development process; their presence is a constant reminder why and for whom the software is being crafted.” [Booch]

• What is your approach for formally identifying stakeholders and gathering their combined knowledge?

Project Management

• Project management is the application of knowledge, skills, tools, and techniques to project activities in order to meet or exceed stakeholder needs and expectations from a project.

• Meeting or exceeding stakeholder needs and expectations involves balancing competing demands:
  • Scope, time, cost, and quality
  • Differing needs and expectations among stakeholders
  • Identified requirements vs. unidentified requirements

Joke*

What is the difference between many software development projects and the Boy Scouts?

The Boy Scouts have adult leadership!

Can you give an example of a project you have observed that did not have “adult leadership”?


Why Do Software Projects Fail?

• We fail to properly manage risks.
• We don’t build the right thing.
• We are blindsided by technology.

Notice the “we.” As project managers, we develop idealistic plans, we set unrealistic schedules, we deceive ourselves and others, and we refuse to face reality. These projects eventually enter “free fall” with no one taking responsibility and everyone waiting for the crash (while sending out resumes).

Fail To Properly Manage Risks

• “Management must actively attack a project’s risks, otherwise they will actively attack you.” [Gilb]

• Examples:

Don’t Build The Right Thing

• Often, projects lose their way because there is no shared vision of clients, users, problems to be solved, methods to be used, or goals to be achieved.

• Examples:

Blindsided By Technology

• Concepts are more difficult than they seem; tools don’t scale up or they introduce errors; suppliers don’t deliver promised functionality or performance.

• Examples:

Blindsided By Technology

• Is OO for you? Ask the following questions:
  • Is it possible? Will OO work for us?
  • Is it attractive? Can OO make a difference? What advantages will it achieve for us?
  • Is it practical? Can we really do it?
  • Is it desirable? Does it provide significant return on our investment?

• If not, is this a project you want to be associated with?

Project Management To The Rescue

• Effective project management actively works to minimize these problems by:
  • Explicitly identifying and creating written mitigation and contingency plans for project risks
  • Evaluating the product being built against the clients’ documented needs and expectations
  • Assessing the suitability of the technologies for use on the project

Summary


This is what your staff should be doing.

Chapter 2

Modern Software Engineering Activities, Principles and Processes


Chapter Objectives

• At the end of this chapter you will be able to understand and apply the following concepts:
  • System requirements
  • Development phases
  • Sequencing the development work
  • Fundamental principles of modern software engineering

Start With System Requirements

• What do we mean by “requirements”?

Classical Requirements

• FUNCTIONS, ATTRIBUTES, and EXPECTATIONS, each either mandatory or a preference.

• “The difference between delight and disappointment.” [Weinberg]

IEEE 830 Standard

1. Introduction
  1.1 Purpose
  1.2 Scope
  1.3 Definitions, acronyms, abbreviations
  1.4 References
  1.5 Overview
2. Overall description
  2.1 Product perspective
  2.2 Product functions
  2.3 User characteristics
  2.4 Constraints
  2.5 Assumptions and dependencies

IEEE 830 Standard

3. Specific requirements
  3.1 External interfaces
    3.1.1 User interfaces
    3.1.2 Hardware interfaces
    3.1.3 Software interfaces
    3.1.4 Communications interfaces
  3.2 Functions
    3.2.1 Information flows
    3.2.2 Process descriptions
    3.2.3 Data dictionary

IEEE 830 Standard

  3.3 Performance requirements
  3.4 Logical database requirements
  3.5 Design constraints
  3.6 Software system attributes
    3.6.1 Reliability
    3.6.2 Availability
    3.6.3 Security
    3.6.4 Maintainability
    3.6.5 Portability

Modern Requirements

• Replace section 3.2 with …

Modern Requirements

Information Needed: Information Representation
• Stakeholders: enhanced actor model
• Business goals: textual descriptions, traced to the architecture
• System functions: use case hierarchy
• Technical system requirements: textual descriptions cross-referenced to the use cases
• Fundamental domain concepts and relationships: domain class model
• Domain algorithms: interaction diagrams
• Domain-level state behavior: state transition diagrams
• Application behavior, look and feel: prototypes

Management

• Don’t let yourself be talked into being satisfied with old-style functional requirements. Insist on all the artifacts specified on the previous page.

• You, as a manager, need to be familiar with each of these means of representing information.

• Be prepared to hold the architects responsible for implementing the business goals.

Development Phases* - Activities

(Figure: work flows from Systems Engineering through Domain Analysis, Application Analysis, Application Design, and Class Design and Development to Incremental Application Assembly and Testing; requirements, use cases, class specifications, and code are the artifacts handed between activities.)

* Called workflows in the Rational Unified Process.

Development Phases - Activities

Domain Analysis: Models of bodies of knowledge are constructed. The scope of these areas is only loosely constrained by the purpose of the application to be developed.

Application Analysis: The domain models and specific system requirements are used to produce a set of models that define the application to be built.

Application Design: A skeleton outline of the system is created. Utilizing standard frameworks for the various subsystems, interfaces for the subsystem interactions are defined.

Class Design and Development: Detailed designs of classes and small clusters of classes are created and implemented. Use of class libraries is an important part of this phase.

Incremental Application Assembly and Testing: The various classes, clusters, and subsystems are integrated incrementally to form larger units. A complete system evolves after sufficient iterations. The completed portion of the system is tested from a functional point of view.

Development Phases - Expertise

Systems Engineering: business stakeholders; business re-engineering experts
Domain Analysis: domain experts; business modeling experts
Application Analysis: clients; use case experts
Application Design: software architects; great designers
Class Design and Development: technology experts; programmers
Incremental Application Assembly and Testing: testers; integration experts

Management

• You must schedule the right people at the right time.

• Given the iterative nature of modern software engineering processes, this is not an easy task.

Systems Engineering

Inputs: business requirements; customer wants; constraints.

Outputs: allocation of functionality; high-level software requirements.

Management

Make sure the software perspective is heard during systems engineering.

Domain Analysis

Inputs: high-level software requirements; domain knowledge; existing domain models.

Outputs: new domain models; updated domain models.

Domain Analysis

• The goal of domain analysis is to understand the problem domain, the context or environment in which the problem exists.

• We identify things and relationships we perceive to be important in the problem domain:
  • Things, categories of things, concepts, attributes, and behaviors
  • Relationships between concepts

• We use the abstractions and relationships we find in the problem domain as the abstractions and relationships in the system being developed.

Domain Models

• Actor models
• Top level of the use case hierarchy
• Class models
• State models
• Behavior models

Use Cases

• A use case is an end-to-end scenario (a sequence of actions) that describes a use of the system by a user (an actor) to accomplish a specific goal.

• Use cases can be used to:
  • Describe a system’s functional requirements
  • Form the basis of the system requirements document

Use Cases

• Use cases provide the following benefits:
  • Capture requirements from the users’ perspective
  • Involve users in the requirements gathering process
  • Provide a basis for the identification of classes and relationships
  • Provide traceability from requirements through analysis and into design
  • Serve as the foundation for system test cases

Use Cases Are Best Developed Iteratively and Incrementally

• The only way to get quality is to iterate.
• Requirements change while the system is being developed.
• As the development team better understands the domain, they are better able to review the use cases.

Each Level is Complete

1. Define course policies
  1.1 Define late policy
  1.2 Define category weights

Use case 1.1 is a specific, more detailed, complete use case within the category of use cases defined by use case 1.

Class Diagrams

Class diagrams are UML diagrams that document the important concepts, their relationships, and their interactions at the domain level.

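A domain-level class diagram translates almost directly into skeleton classes. Here is a hypothetical sketch in Java, in the spirit of the grade-book examples used elsewhere in these notes; all class and member names are invented for illustration:

```java
// Invented grade-book domain sketch: "Course has a CoursePolicy" is a
// one-to-one association; "CoursePolicy has CategoryWeights" is a
// one-to-many aggregation, exactly as a class diagram would draw them.
import java.util.ArrayList;
import java.util.List;

class CategoryWeight {
    final String category;   // e.g. "Homework"
    final double weight;     // fraction of the final grade

    CategoryWeight(String category, double weight) {
        this.category = category;
        this.weight = weight;
    }
}

class CoursePolicy {
    double latePenaltyPerDay;                        // attribute from the diagram
    final List<CategoryWeight> weights = new ArrayList<>();

    void defineCategoryWeight(String category, double weight) {
        weights.add(new CategoryWeight(category, weight));
    }

    double totalWeight() {                           // behavior: weights should sum to 1.0
        return weights.stream().mapToDouble(w -> w.weight).sum();
    }
}

class Course {
    final String title;
    final CoursePolicy policy = new CoursePolicy();  // association: Course -> CoursePolicy

    Course(String title) { this.title = title; }
}
```

The point is not the code itself but the traceability: every class, attribute, and association in the sketch should be answerable to a box or line in the domain model.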

Management

• Without domain analysis, you will not get the correct requirements, nor get the requirements correct.

• You must have the courage to formally bound the domain.
• Insist that the top-level use cases are complete.
• Don’t let design or implementation considerations creep into the domain model.

Application Analysis

Inputs: domain bounds; domain-level class diagrams; domain-level interaction diagrams; domain-level state-transition diagrams; business requirements.

Outputs: application-level use cases; application model; detailed requirements specification; application-level interaction diagrams; application-level state transition diagrams.

Application And Domain Models

• The domain model is pruned, and software-specific detail (e.g. database, reports, networking, GUI) is added as we move from the domain to the application level.

Application Models

Application models include class diagrams that document all of the classes and their relationships. In addition, attributes and methods may be included.

Requirements

• During application analysis the requirements for the current increment are fleshed out:
  • Use cases
  • Standard components and frameworks
  • Hardware platforms and other resource requirements and constraints
  • GUI layouts
  • Overall performance objectives

Management

• Don’t start the design or implementation of an increment until the requirements are complete.

• An iterative/incremental process is no excuse for not having written requirements.

Application Design

Inputs: application-level use cases; application model; detailed requirements specification; application-level interaction diagrams; application-level state transition diagrams; reusable components, patterns, and frameworks.

Outputs: detailed class specifications (pre-conditions, post-conditions, invariants, state-transition diagrams); interaction diagrams; application architecture.

The old days are gone forever


You can no longer afford to build your architecture from scratch

• Product lines
• Frameworks within a unique product
• Patterns and components everywhere

Patterns

• I would no more hire a C++ or Java programmer who wasn’t familiar with patterns than I would hire a C programmer who didn’t understand recursion or binary sorts.
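As one concrete illustration of what “familiar with patterns” means, here is a minimal sketch of the classic Observer pattern from the Gang of Four catalog; the class names are invented for this example:

```java
// Observer pattern sketch: a Subject notifies registered Observers of
// state changes, decoupling the source of a change from its consumers.
import java.util.ArrayList;
import java.util.List;

interface Observer {
    void update(int newValue);
}

class Subject {
    private final List<Observer> observers = new ArrayList<>();
    private int value;

    void attach(Observer o) { observers.add(o); }

    void setValue(int v) {               // every state change notifies all observers
        value = v;
        for (Observer o : observers) o.update(v);
    }
}

class ValueLogger implements Observer {
    final List<Integer> seen = new ArrayList<>();
    public void update(int newValue) { seen.add(newValue); }
}
```

A programmer who knows this vocabulary can say “make it an observer” instead of re-explaining the whole collaboration, which is precisely the leverage patterns provide.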

Pattern Fever

CAUTION: INTELLIGENCE REQUIRED

• Many patterns aren’t.
• Many are misapplied.
• The terminology is often incorrectly used.

Architecture

• An architecture for a system is the pattern of connections among the basic components of the system; the pattern embodies the relations and constraints among the constituent pieces of the system.

• Our preferred method of documenting an OO architecture uses:
  • Static, dynamic, and functional models of the architecture
  • A pattern language that describes the interrelationships and constraints of the architecture, and governs instantiation of the architecture

Architectural Quality


“A good architecture provides a clear separation of concerns among disparate elements of a system, creating firewalls that prevent a change in one part of the system from rending the fabric of the entire architecture.”

Booch


Convergence

• Distributed objects (e.g. CORBA): scalability unproven, immature, but flexible and easy to program.
• TP monitors (e.g. CICS): scalable and mature, but hard to program.
• The two approaches are converging: EJB? MTS? TUXEDO?

• Metadata approach: PeopleSoft, SAP.
• Applets: the client runs a virtual machine/translator.

Frameworks

• A framework is a set of related and interacting classes that provide the basic structure, design, and functionality to implement a business process.

• Commercial application frameworks are sets of core business classes that cooperate in fixed ways to implement core business processes.

• As such, a framework is a significantly large “chunk” that can be reused to implement a number of similar applications.

Frameworks

• Frameworks allow analysts, designers, programmers, and testers to concentrate their efforts on building the unique parts of applications while reusing the standard, non-unique parts.

(Figure: the application is built by filling in the framework’s hot spots.)
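The hot-spot idea can be sketched in code: the framework owns the overall control flow (a frozen spot) and calls back into application-supplied methods (the hot spots). This miniature billing framework is an invented illustration, not from the course notes:

```java
// Hot spots via the template-method idiom: the framework fixes the
// process; each application subclass fills in the variable steps.
abstract class BillingFramework {
    // Frozen spot: the framework, not the application, defines this flow.
    final double runBillingCycle(double amount) {
        double total = amount + computeTax(amount);  // hot spot call
        record(total);                               // hot spot call
        return total;
    }

    // Hot spots: each application supplies these.
    protected abstract double computeTax(double amount);
    protected abstract void record(double total);
}

class UsBilling extends BillingFramework {
    double lastRecorded;

    protected double computeTax(double amount) { return amount * 0.07; }
    protected void record(double total) { lastRecorded = total; }
}
```

Note the inversion of control: with a component library the designer selects what to call; here the framework calls the application, which is exactly the “control flow” row in the comparison table that follows.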

Components vs. Frameworks

(Figure: a library of components contrasted with a framework.)

Components vs. Frameworks

Component:
• Complexity: fairly low
• Type of structure: static
• Control flow: designer selects
• Embodies: design and code
• Usability: can be used “as is”
• Encapsulation: functionality and data

Framework:
• Complexity: very high
• Type of structure: dynamic
• Control flow: framework defines
• Embodies: process, architecture, design, and code
• Usability: cannot be used “as is”
• Encapsulation: application

Management

• Components, patterns, and frameworks are here to stay, but there is much misinformation about reuse.

• When and where to invest in reuse assets is more of a business decision than a technology decision.
• It is no longer “will we use OO?” but “to what level will we use OO?”

Management

There should be at least three types of design reviews:
• Architecture review
• Cluster design review
• Detail design review

Do you know why cluster design reviews are so important?

Class Design And Development

Inputs: detailed class specifications (pre-conditions, post-conditions, invariants, state-transition diagrams); interaction diagrams; application architecture; performance considerations.

Outputs: classes, completely coded and tested; test plan, test cases.
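One lightweight way to make such class specifications executable is to express the pre-conditions, post-conditions, and invariants as assertions. A hypothetical sketch in Java (the account class is invented; assertions are checked when the JVM runs with -ea):

```java
// Contract-style class sketch: pre-conditions guard entry, post-conditions
// check results, and a private invariant is re-checked after each mutation.
class Account {
    private long balanceCents;

    private boolean invariant() { return balanceCents >= 0; }

    void deposit(long cents) {
        assert cents > 0 : "pre-condition: deposit must be positive";
        balanceCents += cents;
        assert invariant() : "invariant: balance never negative";
    }

    void withdraw(long cents) {
        assert cents > 0 && cents <= balanceCents : "pre-condition";
        long before = balanceCents;
        balanceCents -= cents;
        assert balanceCents == before - cents : "post-condition";
        assert invariant();
    }

    long balance() { return balanceCents; }
}
```

Whether a condition is checked weakly (throwing an exception to the caller) or strongly (assumed true, checked only in debug builds) is itself an architectural decision, which is why these specifications belong in the design review.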

Management

• Review the code to make sure it implements the specified design.

• Programmers are used to having the final say. This is not a good practice!
  • Clients should have the final say about the product.
  • Architects should have the final say about the implementation.

Application Assembly

Inputs: application models; classes, completely coded and tested.

Output: the application.

Application Assembly

• Traditional integration issues for this increment, but smaller in scale
• Test case development, based on the use cases for this increment
• Integration of this increment with previous increments
• Regression testing issues

Management

• Make sure each increment is tested and debugged as it is produced.
• Insist on a regression testing framework.
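A regression testing framework need not be elaborate to be useful. As a sketch of the idea (the harness and its names are invented; a real project would likely use an off-the-shelf tool), the point is that every registered test, from old increments as well as new ones, is re-run on every integration:

```java
// Minimal regression harness: tests accumulate across increments and
// runAll() re-executes all of them, reporting which ones now fail.
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

class RegressionSuite {
    private final Map<String, Runnable> tests = new LinkedHashMap<>();
    final List<String> failures = new ArrayList<>();

    void register(String name, Runnable test) { tests.put(name, test); }

    // Re-run everything registered so far; returns the number of passes.
    int runAll() {
        failures.clear();
        for (Map.Entry<String, Runnable> e : tests.entrySet()) {
            try {
                e.getValue().run();
            } catch (AssertionError | RuntimeException ex) {
                failures.add(e.getKey());   // a previously passing test may regress
            }
        }
        return tests.size() - failures.size();
    }
}
```

With such a harness in place, “integrate increment N” always means “and re-run increments 1 through N-1’s tests,” which is what catches the regressions incremental development would otherwise hide.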

Sequencing

Waterfall Model

Analysis → High-level design → Detailed design → Implementation → Testing → Production

• The waterfall model is very good for automating a known solution to a well-understood problem.

Waterfall Model

• The waterfall model is based on three (rarely considered) assumptions:
  • We can know everything from the beginning.
  • Nothing ever changes.
  • We never make mistakes.

• What do you think of the validity of these assumptions?

Why Can’t We Cookbook It?

• The system’s stakeholders typically do not know exactly what they want.

• The stakeholders are often unable to articulate what they do know.

• Even if we could state all the requirements, many details can be discovered only as we implement.

• Even if we could know all the details, there are limits to what we can understand and remember.

Why Can’t We Cookbook It?

• As we begin each new project, we bring intellectual baggage (both good and bad) from our previous experience that shapes our decisions independent of a system’s real requirements.

• Systems built by humans are always subject to error.

“For all of these reasons, the picture of the software designer deriving his design in a rational, error-free way from a statement of requirements is quite unrealistic. Fortunately, it is possible to fake it.” [Parnas]

Riddle

Question: How do you eat an elephant?

Answer:

How We Can Fake It

• Iterative: discovery, invention, and implementation come in cycles.

• Incremental: deliverables build on earlier ones, each moving closer to the delivery of the final application.

• “One important habit that distinguishes healthy projects from unhealthy ones is the use of an iterative and incremental software development life cycle.” [Booch]

Iterative Model

• Since we can’t know it all from the beginning, get it totally correct, and freeze the world outside, let’s use a development model that takes into account the fluid nature of software development.

Analyze a little • Design a little • Code a little • Test a little • Learn a LOT!

Management

• Developers have always iterated on an ad hoc basis as needed; the problem is that managers have pretended that the project was following a waterfall.

• When using an iterative model, the manager plans for development activities to be revisited: analyze a little, design a little, code a little, test a little, learn a LOT!

Planning Iteration

• Why plan to do certain activities over again?
  • Add more detail
  • Correct mistakes (iterate within an increment)
  • Improve quality
  • Account for cyclical dependencies

Incremental Model

• Build a system in increments that represent increasing levels of functionality.
• This is different from “waterfall until implementation, then incremental development.”

Domain Analysis → Application Analysis → Application Design → Implementation → Testing → Production

Incremental Model

• An increment is some subset of the system that is completely coded and tested.

(Figure: each increment cuts through Analysis → High-level design → Detailed design → Implementation → Testing → Production.)

Planning Incremental Development

• Rather than focusing on milestones, let’s consider inch-pebbles.

• Increments should generally be 2-6 weeks in length: shorter, and you spend too much time planning and evaluating and not enough time working; longer, and development can get out of control.

It’s Incremental Everything

(Figure: repeated analyze, design, implement cycles produce the requirements specification, domain model, and design document.)

Other important documents, such as the user manual and test plan, are also developed and verified incrementally.

Management

• As a manager you want to get the project done on schedule, but you also want it done correctly.

• Use iterative techniques to get it done right.
• Use incremental development to schedule getting it done on time.

Prototyping Model

• Some advocate a process with little formal analysis and design, evolving the prototype into the product.

• This is good for small, short-lived, in-house projects that will forever be supported by the same person(s).

• Today’s systems need a well-documented, solid architecture that is matched to the business needs of the product.

• It is tempting to skip domain analysis and architecture when using a powerful IDE such as JBuilder from Borland.

Note: we will use the term prototype to designate a disposable product.

Good Architects Create Lots of Disposable Prototypes

• Analysis prototypes answer questions about requirements.
• Design prototypes answer questions about the architecture; design prototyping is facilitated by simulation tools.
• Implementation prototypes answer questions about feasibility, performance, memory requirements, etc.

Management

How much code do you plan to throw away?

An OO Development Process

• The basic process is incremental.

• Iterate within increments, with prototype support as necessary. Often this will involve reworking one piece of the system several times before an increment is finished. Previous increments are only revisited to fix errors or serious flaws.

• How does one incrementally build an architecture? Do enough analysis and design to form a context, then concentrate on driving an increment through to completion (running, tested code).

Basic Development Process

(Figure: a matrix of activities, Domain Analysis through Application Assembly, against increments I n through I n+3, with each cell marked as light or heavy activity. Programmer-analyst teams work best.)

Getting Started

(Figure: the same matrix for preliminary increments I 0a, I 0b, and I 0c, which establish a basic context, followed by increment I 01.)

Stepping Through the Process

(Figures: three charts plot the phases DA, AA, AD, CD, and IT against the project scope, first for getting started, then for increment I1, then for increment I2.)

When performing the Ith increment, bring the entire analysis and design up to date.

Choosing Increments based on Risk Analysis

• What are the benefits of doing high-risk increments first?

• What are the benefits of doing low-risk increments first?

• How does one balance scheduling high-risk increments with low-risk increments?

Choosing Increments - “Use Case” versus Architectural

• A use-case driven (external) increment implements some subset of the user requirements (e.g. the traffic controller works for non-rush-hour straight traffic; turn traffic, pedestrians, and emergency vehicles are not yet considered).

• An architecturally based (internal) increment implements some subset of system functionality (e.g. the database access mechanism, inter-process communication).

• How does one balance scheduling external increments with internal increments?

Quality Engineered

• Architectural increments imply architectural test cases.
• The specification of an architecture should include a set of “architectural use cases.”

Building Complex Systems is Fundamentally an Incremental Process

“A complex system that works is invariably found to have evolved from a simpler system that worked... A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a working simple system.”

Ramamoorthy, C., and Sheu, IEEE Expert, Vol. 3 (3), p. 14

Incremental Architecture - Example

(Figures: two architecture diagrams for Adventist Accounting International, revision 1.0, revised 4/1/98 by Jim Davenport. Increment 1 shows the layers Client GUI, Business Model, and Database Server, annotated “weak preconditions that throw exceptions, including security,” “strong preconditions,” and “no access except via business model.” Increment 2 inserts CORBA between the Client GUI and the Business Model.)

Management

• With the incremental approach, every project plan is unique (as it should be); defining and scheduling increments is a skill that improves with experience.

• Multiple teams mean that multiple increments will be developed in parallel; the schedule for each team should depend on the nature of that team’s increment.

The Impedance Mismatch

• Our project: iterative/incremental
• Everything else in the world: waterfall

You sit between the two.

Modern Software Engineering

• The iterative/incremental approach is at the heart of modern software development techniques.
• There are additional issues.

Rational Unified Process - Terminology


Milestones (inch pebbles)


Terminology

Typical term: Unified Process term
• Development phase: core workflow
• Increment: release
• Iteration: iteration
• Project stage: phase

Basic Goals of a Software Development Process

• Correct, robust, high-quality software that is easy to use and meets real business needs
• Modifiable software
• Extensible software
• Portable, integrable software
• Rapid development capability
• Maximize productivity
• Minimize costs

The Object-Oriented Paradigm Promises to:

• Support RAD through reuse of components, design patterns, architectural frameworks, and domain models
• Improve product quality
• Facilitate business process reengineering
• Solve problems of greater complexity
• Improve maintainability and extensibility
• Improve communication with customers
• Improve communication between developers

Potential Benefits

• An optimal use of object-oriented technology requires a new development process.

• Any promised benefits from the use of OT are only potential benefits. They do not magically happen; they require specific management practices to make them happen.

• Rarely, however, do corporate policies and practices encourage managers to follow good, modern software engineering techniques.

In Fact:

• Most corporate policies and practices actively discourage managers from following good, modern software engineering techniques.

• Can you think of examples from your own experience?

The Object-Oriented Paradigm

Traditional mind-set:
• What does the system do?
• How do I design a system to provide these functions?
• Focus on version 1 of this system

Object-oriented mind-set:
• What are the fundamental business concepts in the domain?
• How can I model the problem and distribute responsibilities to achieve the system’s functionality?
• Focus on extensibility and families of systems

A Use Case Driven Approach

122 -263

Good Software Engineering is NOT Use Case Driven

123 -263

Unified Process

 Architecture-Centric  Use-Case Driven  Component Based

124 -263

Good Software Engineering is...

Domain Analysis Based

Architecture-Centric

Use-Case Configured

Component Biased

Quality Engineered

125 -263

Domain Analysis Based

 First major activity in software development is the identification of the fundamental business concepts and relationships  Design of the application server is based on the classes and relationships in the domain model 126 -263

Architecture-Centric

 Architecture definition and refinement is viewed as a core development activity  Architecture is not ad hoc, but is based on industry standards, patterns, commercial frameworks and components  All detail design and coding is done within a well defined architectural framework, often according to a specific pattern  Architecture team plays a key role throughout the entire product lifecycle 127 -263

From Domain Analysis to Architecture

Grading Domain Model (Major Domain) Registration Domain Spreadsheet Domain Reporting Domain Grading Application Model

  The architecture for each separate domain describes the way the classes within the domain interact and is based on domain relationships The architecture for the application describes the way the different domains interact 128 -263

Use-Case Configured

[Diagram: domain relationships, business goals & technical system requirements, and commercial standards & frameworks shape the architecture; use cases configure the application.]

129 -263

Use cases STOP at the system interface boundary

130 -263

Do not derive your design directly from your use cases

  Use cases DO describe sequences of actions that actors follow in using the system Use cases must NEVER specify what steps the system takes internally in responding to the stimulus from an actor 131 -263

Example Architectural Drivers

Finance

 Extensibility 

Telecommunications

 24X7 hot switchover 

Aerospace

 COTS and GOTS contract mandate 132 -263

Watch out for Shortcuts

  RAD tools that tie screens directly to databases give you RAD development, but not RAD modifiability  Screens should be tied to the business model.

 The business model should not directly embed SQL, or any other implementation technology, in its methods

[Diagram: GUI and Report Writer sit on the Business Model, which reaches the Relational Database through a Technology Interface layer.]

What are the trade-offs?

133 -263

Component Biased

 Robust libraries of domain-specific and domain-independent components  Frameworks that supply substantial portions of an application  Packaging techniques that require loose coupling and well specified interfaces between modules  Supporting management techniques that enable us to integrate these components 134 -263

Quality Engineered

 Testing starts with requirements  Iterative techniques are used to improve quality  set up and prototyping  functional increment  engineering increment  Extensive prototyping and demonstration by early increments.

 Reviews, inspections and walkthroughs are used to improve quality, not just to rubber stamp or find errors  A culture of refusing to sacrifice quality for functionality 135 -263

A Software Development Process Meta Model

1. Emphasis: Domain Analysis Based, Architecture-Centric, Use Case Configured, Component Biased, Quality Engineered

2. Activities: Domain Analysis, Application Analysis, Application Design, Class Design / Development, Incremental Integration and System Testing

3. Notation: UML

4. Scope: Comprehensive

5. Process Model: Iterative Process, Incremental Application Development, Prototype-supported 136 -263

Management

 Management is more than the ability to estimate, plan and track

 Good managers understand the fundamentals of good software engineering, and build an environment and culture that reward quality.

137 -263

Measurable Progress

 “

Every healthy software project I have encountered has a natural rhythm whose beat is sounded by a steady and deliberate forward motion toward the delivery of a meaningful product at a reasonably defined point in time

.” – Grady Booch  This is a fancy way of saying that healthy projects meet their deadlines for the defined increments.

138 -263

Whirlpool Model

[Diagram: Domain Analysis, Application Analysis, Application Design, Class Design/Development, and Application Assembly arranged in a whirlpool.]

 Retains the distinct phases while supporting incremental development.

 Phase order is not fixed. At any point in time we can choose the most effective thing to do next.

 What do you think of this idea?

139 -263

Management

 The basic questions:

 Where are we?  Are we making progress?  When will we be finished?  How much will it cost?  What is the quality?

140 -263

7 (±2) Habits Of Successful Projects

 A culture that is centered on results, encourages communication, and yet is not afraid to fail.

 The application of a well-managed iterative and incremental development life cycle.

 A ruthless focus on developing a system that provides a set of essential but minimal characteristics.

 Creating and communicating a strong, coherent, and resilient architectural vision.

 Effective use of object-oriented modeling.

 A management team that is obsessed with quality through adherence to the fundamental principles of software development 141 -263

Top 30 Principles of Conventional Software Engineering

1) Make quality #1.

• Depends what one means by quality

2) High-quality software is possible.

• Agreed, but bug-free software is next to impossible

3) Give products to customers early.

• Yes, but …

4) Determine the problem before writing the requirements.

• Domain analysis before detailed use cases • Develop the requirements incrementally

ISBN: 0070158401 McGraw Hill 142 -263

Top 30 Principles of Software Engineering cont.

5) • Evaluate design alternatives.

Make sure to test the design against the business goals 6) • Use an appropriate process model.

Configure yes, but don’t neglect the fundamentals 7) • Use different languages for different phases.

Use cases, class diagrams, … 8) • Minimize intellectual distance.

Absolutely 143 -263

Top 30 Principles of Software Engineering cont.

9) • Put techniques before tools.

The good old days are long gone….

10) Get it right before you make it faster.

• Yes, but 11) Inspect code.

• Not in the top 30 12) Good management is more important than good • technology.

Let me explain...

144 -263

Top 30 Principles of Software Engineering cont.

13) • People are the key to success.

Should be in the top 5 14) • Follow with care.

Good advice 15) • Take responsibility.

Your parents should have taught you this 16) • Understand the customer’s priorities.

Change customer to stakeholder.

17) • The more they see, the more they need.

True, but don’t let that stop you 145 -263

Top 30 Principles of Software Engineering cont.

18) Plan to throw one away.

• Too waterfall...

19) Design for change • Absolutely, but formally define the scope of anticipated change 20) Design without documentation is not design.

• False 21) Use tools, but be realistic.

• Also be realistic about what will happen if you don’t use tools!

22) Avoid tricks.

• How about: “document your tricks” 146 -263

Top 30 Principles of Software Engineering cont.

23) • Encapsulate.

Move this to the top 10 24) • Use coupling and cohesion.

A bit simplistic 25) • Use the McCabe complexity measure.

If you suspect the quality of your software engineers ...

26) • Don’t test your own software.

Don’t be the only one to test ...

147 -263

Top 30 Principles of Software Engineering cont.

27) Analyze causes for errors. • A principle of process improvement 28) Realize that software’s entropy increases.

• Only if you let it 29) People and time are not interchangeable.

• But they are linked 30) Expect Excellence.

• And you may get it 148 -263

Summary

     149 -263

150 -263

This is what your staff should be doing.

Chapter 3

Direct Project Management Issues

151 -263

Chapter Objectives

 At the end of this chapter you will be able to understand and apply the following concepts:  Planning the work  Planning for resources  Managing risk  Measuring the project  Managing the project team  Communicating with stakeholders  Ensuring quality  Automated tools  Fostering reuse 152 -263

Overview of the Development Artifact Set

Requirements Set: 1. Vision document 2. Requirements model(s) (domain models, use cases)
Design Set: 1. Software architecture description 2. Platform and process deployment model 3. Cluster and detail design model(s)
Implementation Set: 1. Source code baselines 2. Associated compile-time files 3. Component executables
Deployment Set: 1. Integrated product executable 2. Associated run-time files 3. User manual 4. Technical documentation
Test Set: 1. Test cases 2. Test scripts 3. Regression test environment

153 -263

Overview of the Management Artifact Set

Planning Artifacts: 1. Business case 2. Software development plan 3. Work breakdown structure 4. Release specifications

Operational Artifacts: 5. Release descriptions 6. Status assessments 7. Software change order database 8. Bug tracking database 9. Deployment documents 10. Environment

154 -263

Planning The Work

 A WBS is simply a hierarchy of elements that decompose the overall project plan into a discrete set of tasks. It is:  A delineation of all significant work  A clear task decomposition for assignment of responsibilities  A framework for scheduling, budgeting, and tracking  A good work breakdown structure (WBS) is key to a project’s success.

155 -263
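The hierarchy-plus-rollup idea can be sketched in a few lines of Python; every element name and estimate below is invented for illustration, not taken from the course.

```python
# Minimal WBS sketch: an element is (name, children) for an interior node
# or (name, estimated_weeks) for a leaf task. Leaf estimates roll up so
# every level can be budgeted and tracked. All names/numbers are invented.

def rollup(element):
    """Return the total estimated weeks under a WBS element."""
    _name, content = element
    if isinstance(content, list):                 # interior node: sum children
        return sum(rollup(child) for child in content)
    return content                                # leaf: its own estimate

wbs = ("Project", [
    ("Management", [("Inception phase management", 3),
                    ("Elaboration phase management", 4)]),
    ("Requirements", [("Vision specification", 2),
                      ("Use case modeling", 5)]),
])

print(rollup(wbs))   # total weeks across all leaf tasks
```

The same tree supports the other WBS uses the slide lists: attach owners for responsibility assignment, or actuals alongside estimates for tracking.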

Planning The Work

 But the contents of a WBS are very dependent on management style, organizational culture, customer preference, financial constraints, and other project parameters.

156 -263

Planning The Work

 Conventional Work Breakdown Structure:

Management
System requirements and design
Subsystem 1
 Component 11
  Requirements
  Design
  Code
  Test
  Documentation
 Component 1m (same as above)
Subsystem n (same as above)
Integration and test
 Test planning
 Test procedure preparation
 Testing
 Test reports
Other support areas
 Configuration control
 Quality assurance
 System administration

From: Software Project Management, Walker Royce 157 -263

Planning The Work

 A WBS for iterative/incremental development should be organized around the process rather than the product.

158 -263

Planning The Work

 The basic format is:  First level – workflows (management, environment, requirements, design, …)  Second level – life cycle phases ( Inception - systems engineering/domain and application analysis; Elaboration - application design; Construction - class design and development and application assembly; Transition - installation)  Third level – activities that produce the artifacts (deliverables) 159 -263

Planning The Work

 Iterative/Incremental Work Breakdown Structure:

Management

Inception phase management Business case development Software development plan Elaboration phase management Construction phase management Transition phase management

Environment

Inception phase environment specification Elaboration phase environment baselining Development environment installation and administration Change order database formulation Construction phase environment maintenance Change order database maintenance Transition phase environment maintenance

Requirements

Inception phase requirements development Vision specification Use case modeling Elaboration phase requirements baselining Construction phase requirements maintenance Transition phase requirements maintenance

From: Software Project Management, Walker Royce 160 -263

Planning The Work

 Iterative/Incremental WBS (continued):

Design

Inception phase architecture prototyping Elaboration phase architecture baselining Architecture design modeling Design demonstration planning and conduct Software architecture description Construction phase design modeling Architecture design model maintenance Component design modeling Transition phase design maintenance

Implementation

Inception phase component prototyping Elaboration phase component implementation Critical component coding demonstration Construction phase component implementation Initial release component coding and stand-alone testing Transition phase component maintenance 161 -263

Planning The Work

 Iterative/Incremental WBS (continued):

Assessment

Inception phase assessment planning Elaboration phase assessment Test modeling Architecture test scenario implementation Construction phase assessment Transition phase assessment Product release assessment and release descriptions

Deployment

Inception phase deployment planning Elaboration phase deployment planning Construction phase deployment Transition phase deployment Product transition to user 162 -263

Planning The Work

Defining Increments – Example

 Increment 1 – Eclipse Project basic business model classes - post a transaction, no database, no distribution  Increment 2 – Eclipse Project posted transaction is saved in the RDBMS 163 -263

Don’t Be Afraid of Early Increments

APPARENT RESULT

 The design is noncompliant (so far).

 Driving requirements issues are exposed, but detailed requirements traceability is lacking.

 The design is considered “guilty until proven innocent.”

REAL RESULT

 Early demonstrations expose design issues and ambiguities in a tangible form.

 Demonstrations expose the important assets and risks of complex systems early, when they can be resolved within the context of life-cycle goals.

 Understanding of compliance matures from important perspectives (architecturally significant requirements and use cases).

 Requirements changes are considered in balance with design trade-offs.

 Engineering progress and issues are tangible, for incorporation into the next iteration’s plans.

164 -263

Planning For Resources

 Planning requires estimating.

 What to estimate:  Size and scope  Schedule  Time  Money  People  Stability 165 -263

Planning For Resources

 Estimating techniques:  Top-down (this is not estimating – it is allocation)  Bottom-up  Parametric subsystems   weight*factor modules  size Example: Reusable class = 4 weeks Non-reusable class = 2 weeks Utility class = 2 weeks GUI screen class = 3 weeks 166 -263
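The parametric example above can be turned into a trivial calculator. The weeks-per-class factors come from the slide; the class counts are hypothetical.

```python
# Bottom-up parametric estimate using per-class-type factors.
# Weeks-per-class values are the slide's example figures; the project's
# class counts below are invented for illustration.

WEEKS_PER_CLASS = {
    "reusable": 4,      # reusable class = 4 weeks
    "non_reusable": 2,  # non-reusable class = 2 weeks
    "utility": 2,       # utility class = 2 weeks
    "gui_screen": 3,    # GUI screen class = 3 weeks
}

def estimate_weeks(class_counts):
    """Sum effort: count of each class type times its weeks-per-class factor."""
    return sum(WEEKS_PER_CLASS[kind] * n for kind, n in class_counts.items())

# Hypothetical project: 5 reusable, 20 non-reusable, 8 utility, 6 GUI classes.
print(estimate_weeks({"reusable": 5, "non_reusable": 20,
                      "utility": 8, "gui_screen": 6}))  # → 94
```

As the later slide advises, the factors themselves should be re-estimated incrementally as actuals come in.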

Planning For Resources

 Most parametric techniques use the following approach: Effort = (Personnel)(Environment)(Quality)(Size^Process)  One model, COCOMO, was developed by Barry Boehm in the 1980s. It has been used and refined over the years.

 The accuracy of the COCOMO model has been described as “within 20% of actuals, 70% of the time”. 167 -263
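The parametric form can be sketched directly. This is not calibrated COCOMO: the multiplier values and exponent below are invented for illustration only.

```python
# Sketch of Effort = Personnel * Environment * Quality * Size**Process.
# In real COCOMO the multipliers are calibrated cost drivers and the
# exponent reflects process/scale factors; these defaults are made up.

def parametric_effort(size_kloc, personnel=1.0, environment=1.0,
                      quality=1.0, process_exponent=1.1):
    """Effort (staff-months) as linear multipliers times a size power law."""
    return personnel * environment * quality * size_kloc ** process_exponent

# The exponent on size dominates: growing the system 10x costs more than
# 10x the effort, while each linear multiplier scales effort proportionally.
print(round(parametric_effort(10), 1))    # ~10 KLOC project
print(round(parametric_effort(100), 1))   # ~100 KLOC project
```

This is why the slide's "estimate incrementally" advice matters: the exponent and multipliers can only be calibrated against a project's own actuals.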

Planning For Resources

 Remember:  Use common sense; don’t expect miracles.

 Example: It’s your first object oriented project, everything is new, your people are inexperienced, the requirements are ill defined so … estimate the project at 1/2 the time and cost you would have for a conventional project.

 Estimate incrementally; you won’t get it right the first time, incrementally improve your estimates.

168 -263

Managing The Project Team

 The days of the lone hero/cowboy are over.

 Today’s systems are far too complex and the timeframes are far too short for them to be built by individuals.

169 -263

Managing The Project Team

 “Organize around outcomes [deliverables], not tasks.” – Hammer and Champy  Teams must be:  Cohesive  Focused  Specialized (System Planning, Project Monitoring, Architecture, Technology)  Linked 170 -263

Managing The Project Team

 Core Team Roles:  Executive Sponsor – critical to success; has clout! Can be from executive management, the user community, or the technical leadership.

 System Architect – holistic, high-level, artistic director.

 Requirements Analyst – to gather the requirements.

 Technical Lead – technical excellence combined with superior communication skills.

171 -263

Managing The Project Team

 Core Team Roles (continued):  Technicians – get the best team you can.

 Testers – participating from the beginning.

 Stakeholders  Clients – it’s their money  Users – get real ones full time, not intermediaries or surrogates. They must understand the operation; they must understand the business rules.

172 -263

Managing The Project Team

 Core Team Roles (continued):  Mentor – responsible for improving the team’s ability, not for project work.

 Project Manager – You! Have you got what it takes?

173 -263

Managing The Project Team

 Involve clients and users in:  Domain Analysis  Development of the Use Case Hierarchy

 Other Standard Techniques: – Facilitated teams – User interface prototypes – Usability labs – Surveys – Tests – Bulletin boards – Observational study – User groups – Requirements prototype – Trade shows – Interviews – Focus groups 174 -263

Managing the Project Team

 Organizing the team:  The Detroit Model – Specialists  The Volvo Model – Generalists 175 -263

Managing The Project Team

 Organizing the team:

System Analysts Architects Developers

176 -263

Managing The Project Team

 Service Team Roles:  Process Preacher – processes, techniques, standards, metrics  Technical Writer – documentation specialist  Toolsmith – locate, evaluate, install, support  System Administrator – development environment  Librarian – configuration management  Component Specialist – locate, evaluate, install, support  Technical specialists (GUI designer, DB administrator, …) 177 -263

Managing The Project Team

 Skills, Size, Timing  Analysis, design, code, test, tools, environment, …  Generalize vs. specialize  Training and Mentoring  Concentrated vs. spread out  Organizational patterns: Training team/ progress team, architect also implements, developer as user  Reward and recognition 178 -263

Managing The Project Team

 “Retraining in object-think swamps all other costs.” (Alistair Cockburn) 179 -263

Ensuring Quality

 Organizations typically are in one of the following six states of quality awareness: 1. Feeling of complacency and market dominance (no awareness) 2. Awareness of the importance of quality as market share erodes 3. Slogans: work harder, work smarter; search for a “Silver Bullet” 4. Massive inspection and testing efforts to detect errors 5. Defect prevention that focuses on the process, not the product 6. Quality is everyone’s job, and overall it is management’s responsibility  Where is your organization?

180 -263

Ensuring Quality

  “Quality comes not from inspection but from improvement of the process.” (W. Edwards Deming) Poorly managed organizations often spend 90% of their quality efforts on treating symptoms.  Well managed organizations spend 80% of their quality efforts on building and maintaining a quality culture.

Quality is “built in”

 The question is not “How much does quality cost?” The question is “How much does quality save?” 181 -263

Ensuring Quality

 Quality comes not so much from inspection as from requiring early demonstration  You can’t tell nearly as much about the quality of a system from its artifacts as from its realization.

182 -263

Fostering Reuse

 Mid-90’s the focus was on class libraries  Worked well for stacks, queues, GUI components, certain system level and middleware components  Royal failure of reusable business classes  e.g., the account class at Credit Suisse 183 -263

Class Libraries

 Much of the blame has been placed on management and cultural issues  Failure of libraries of Business Objects was mainly technical  the cost of employing current technology to capture commonality among business objects is more than the benefit achieved. 184 -263

Detail Design Reuse

 Patterns have caught on because they are widely applicable and save effort.

185 -263

Frameworks

 Successes:  generic application and middleware levels  MVC, MFC  specific design mechanism level  routing  ingest  “variations on a theme” component level  spreadsheets  reports 186 -263

Framework Technical Failures

  Applications with not enough commonality  Application frameworks only work well when there are fairly small variations from application to application  a game framework may work for checkers and chess, but not for chess and Monopoly 187 -263

Framework Failures - Corporation’s fault

 Numerous frameworks that could have succeeded from a technical viewpoint have failed for management reasons  managers are only responsible for getting version 1 to market as quickly and cheaply as possible  similar applications under different managers 188 -263

Product Lines

 Best current practice is to use components, patterns, and frameworks to produce software product lines http://www.sei.cmu.edu/plp/plp_prod_serv.html

189 -263

Fostering Reuse

 Why is it rarely done?

 It’s actually easier not to develop for reuse.

 Input from other projects must be sought, evaluated, and incorporated.

 The product being developed must be more generic.

 Increased testing is required to guarantee quality in many different environments.

 This causes delays in our project and probably means additional expense.

190 -263

Fostering Reuse

 Why is it rarely done?

 Clients and customers don’t want additional delays.

 Besides, Fred and Ethel wrote this and we all know the quality of their work.

191 -263

Product Line Economics

[Chart: development cost (d = dollars) and schedule (m = months) versus number of projects (1, 2, 5), with labeled points at 1.5 d / 2.0 m and 2.25 d / 2.5 m.]

192 -263
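The economics behind the chart can be sketched as a break-even comparison. The 1.5x build-cost factor echoes the slide's dollar figures; the per-reuse integration cost of 0.1 is an assumption for illustration.

```python
# Break-even sketch for product-line reuse: a reusable component costs
# more to build up front (a ~1.5x dollar factor, as on the slide) but is
# cheap to reuse afterward. The 0.1 reuse-integration cost is invented.

def cumulative_cost(n_projects, build_cost=1.0, reuse_factor=1.5,
                    reuse_cost=0.1):
    """Return (one-off cost, product-line cost) for n similar projects."""
    one_off = n_projects * build_cost                      # build each from scratch
    product_line = build_cost * reuse_factor + (n_projects - 1) * reuse_cost
    return one_off, product_line

# The product line loses money on project 1 and wins from project 2 on,
# matching the shape of the slide's 1 / 2 / 5 projects axis.
for n in (1, 2, 5):
    one_off, line = cumulative_cost(n)
    print(n, one_off, round(line, 2))
```

The same shape explains the management failures two slides back: a manager judged only on version 1 always sees the losing side of this curve.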

Fostering Reuse

 Reuse approaches:  End-lifecycle model  Generalization after the project is complete. (Often not done because of next project pressures. Often not done because of funding questions.) 193 -263

Fostering Reuse

 Reuse approaches (continued):  Constant creation model Created during the project (Funding covered, reuse as a by-product of development, increased time and cost dropped with budget or delivery pressure).

194 -263

Fostering Reuse

 Reuse approaches (continued):  Two-library model  Develop->library1 (potentially reusable) [during project], then [after project] evaluate, if fairly good then generalize and add to library 2 else drop.

195 -263

Fostering Reuse

 Reuse motivation:  Exhortation (“You really oughta wanta reuse!”)  Pay a bonus for putting things into the reuse library.

 Pay a bonus for taking things out of the reuse library (minimum number or class reuse ratio).

196 -263

Fostering Reuse

 When we reuse correctly, “reuse” is just “use”. Using is not the problem. Creating a high quality reuse asset is the problem.

 When it becomes a corporate goal as well as a technical goal, reuse happens.

 Trying to achieve all possible reuse is not a good idea 197 -263

Managing Risk

Risk analysis is the process used to decide where to allocate limited resources.

 In the object oriented paradigm, risk analysis becomes the basis for determining the development and testing priorities.

 Use risk analysis to determine the types of problems that are more critical to system success and focus resources in these areas.

198 -263

Managing Risk

Risk Dimensions

 Uncertainty – An event may or may not happen. What is the probability of its occurrence?

 Loss – What is the cost to the organization if the risk occurs? This is usually expressed in monetary units.

 Problem – A risk that has occurred.

199 -263
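These two dimensions are commonly combined into a risk-exposure figure (probability times loss) that can rank where limited resources should go first. A minimal sketch with an invented risk list:

```python
# Risk exposure = uncertainty (probability) x loss (monetary units).
# Sorting by exposure is one simple way to decide where limited
# mitigation/prevention resources go first. All entries are invented.

risks = [
    ("Requirements creep", 0.6, 200_000),
    ("Key developer leaves", 0.2, 150_000),
    ("Vendor framework abandoned", 0.1, 500_000),
]

def exposure(risk):
    """Expected loss of a (name, probability, loss) risk entry."""
    _name, probability, loss = risk
    return probability * loss

# Highest exposure first: the low-probability/high-loss vendor risk
# outranks the more likely but cheaper staffing risk.
for name, p, loss in sorted(risks, key=exposure, reverse=True):
    print(f"{name}: exposure ${p * loss:,.0f}")
```

Note that a single expected-loss number hides the distinction the next slides draw between prevention (reduce probability) and mitigation (reduce loss).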

Managing Risk

The most serious risk factors that affect development projects are:

1) Requirements Problems

Incorrect, incomplete, misunderstood, or creeping

2) Management Malpractice 2.1 Excessive cost or schedule pressure 2.2 Failure to plan, track or control within the framework of a modern development process -- inaccurate resource estimation -- denial 2.3 Poor Team Management 3) Poor Quality Software Engineering

…inadequate technical expertise

4) Technology Failure 200 -263

Managing Risk

Acceptance – The “whatever” approach.

Mitigation – Take steps to minimize the loss.

Prevention – Take steps to minimize the probability.

201 -263

Managing Risk

 Establish risk criteria:  Complexity of idea  Stability of specification  Risk of injury – financial, safety, etc.

 Use risk analysis to:  Prioritize component tests  Allocate testing resources  Choose the number of test cases / level of coverage 202 -263

Managing Risk

 Risk management strategies:  Clear the fog  Early and regular delivery  Prototype  Microcosm  Holistic diversity  Gold rush  Owner per deliverable  Someone always makes progress  Team per task  Sacrifice one person

For further information see:

www.bell-labs.com/cgi-user/OrgPatterns/OrgPatterns 203 -263

Early Risk Resolution

 80% of the engineering is consumed by 20% of the requirements.

 80% of the software cost is consumed by 20% of the components.

 80% of the errors are caused by 20% of the components.

 80% of software scrap and rework is caused by 20% of the changes.

 80% of the resource consumption (execution time, disk space, memory) is consumed by 20% of the components.

 80% of the engineering is accomplished by 20% of the tools.

 80% of the progress is made by 20% of the people.

204 -263

Risk profile of a typical modern project across its life cycle (Royce).

[Chart: risk level (Low–High) across Inception, Elaboration, Construction, and Transition. The Modern Project Risk Profile falls early – a Risk Exploration Period, then a Risk Resolution Period, then a Controlled Risk Management Period – while the Conventional Project Risk Profile stays high late in the life cycle.]

205 -263

Measuring The Project

“When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely in your thoughts advanced to the stage of science.” - Lord Kelvin 206 -263

Measuring The Project

 What project measurements do you currently make?

 What have you learned from these measurements? How have you changed your development process because of this information?

207 -263

Measuring The Project

 Goal-Question-Metric (GQM) approach

 Goal – an organizational objective  Question – to determine whether a goal is being met  Metric – to answer a question 208 -263
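A GQM breakdown is naturally tree-shaped, which a few lines of Python can illustrate; the goal, questions, and metrics below are invented examples, not part of the course material.

```python
# Goal-Question-Metric sketch: each goal maps to the questions that test
# it, and each question to the metrics that answer it. All content here
# is illustrative.

gqm = {
    "Improve schedule predictability": {
        "Are increments delivered on their planned dates?":
            ["planned vs. actual delivery date per increment"],
        "Is the estimate error shrinking over iterations?":
            ["estimate error % by iteration"],
    },
}

def metrics_for_goal(model, goal):
    """Flatten the metrics that ultimately serve one organizational goal."""
    return [m for metrics in model[goal].values() for m in metrics]

print(metrics_for_goal(gqm, "Improve schedule predictability"))
```

Working top-down this way enforces Weinberg's rules quoted later: any metric that cannot be traced up to a question and a goal has no reason to be collected.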

Measuring The Project

 A good measure must:

 Be considered meaningful by the customer, manager, and technician.

 Demonstrate quantifiable correlation between the process and business performance.

 Be objective and unambiguously defined.

 Be a natural by-product of the process and be supported by automation.

 Indicate trends

Measuring The Project

 The following types of metrics are important:  Management indicators  Quality indicators 210 -263

Measuring The Project Top 3 Management Metrics

Work and progress – Purpose: iteration planning, plan vs. actuals, management indicator – Perspectives: SLOC, function points, object points, scenarios, test cases, SCOs

Budgeted cost and expenditures – Purpose: financial insight, plan vs. actuals, management indicator – Perspectives: cost per month, full-time staff per month, percentage of budget expended

Staffing and team dynamics – Purpose: resource plan vs. actuals, hiring rate, attrition rate – Perspectives: people per month added, people per month leaving

211 -263

Measuring The Project Top 4 Quality Metrics

Change traffic and stability – Purpose: iteration planning, management indicator of schedule convergence – Perspectives: SCOs opened vs. SCOs closed, by type (0,1,2,3,4), by release/component/subsystem

Breakage and modularity – Purpose: convergence, software scrap, quality indicator – Perspectives: reworked SLOC per change, by type (0,1,2,3,4), by release/component/subsystem

Rework and adaptability – Purpose: convergence, software rework, quality indicator – Perspectives: average hours per change, by type (0,1,2,3,4), by release/component/subsystem

MTBF and maturity – Purpose: test coverage/adequacy, robustness for use, quality indicator – Perspectives: failure counts, test hours until failure, by release/component/subsystem

212 -263

Measuring The Project

 But, in the object oriented paradigm, how do we measure things like:  System size  Design quality 213 -263

Measuring The Project

 Proposed System Size metric  Number of Classes  What problems do you see with this metric?

    214 -263

Measuring The Project

 Proposed Design Quality metric  Depth of Inheritance Structure  What problems do you see with this metric?

    215 -263
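One reason this metric is popular is that it is trivially computable. In Python, for instance, a depth-of-inheritance count can be read straight off a class's method resolution order; the toy hierarchy below is hypothetical.

```python
# Depth of Inheritance (DIT): how many superclass links lie between a
# class and the root of the hierarchy. Toy hierarchy; names are invented.

class Account: ...
class SavingsAccount(Account): ...
class StudentSavingsAccount(SavingsAccount): ...

def depth_of_inheritance(cls):
    """DIT counted via the MRO, treating object as the root.

    Caveat: with multiple inheritance the MRO length is not the longest
    inheritance path -- one of the ambiguities that makes the raw number
    hard to interpret.
    """
    return len(cls.__mro__) - 1   # exclude the class itself

print(depth_of_inheritance(Account))                # 1 (Account -> object)
print(depth_of_inheritance(StudentSavingsAccount))  # 3
```

That ease of computation is exactly the "easily obtained count" Henderson-Sellers warns about two slides later: cheap to collect does not mean valid to act on.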

Measuring The Project

 Proposed Design Quality metric  Superclass:Subclass Ratio  What problems do you see with this metric?

    216 -263

Measuring The Project

 Brian Henderson-Sellers has written the following:  “There has been a natural tendency in industry to focus on easily obtained counts.” (p. 124)  “It is a sad indictment on much of the metrics literature that prognostic equations have been inferred where none exists.” (p.31)  “While managers might want a number as an ‘indicator’ … it is our professional responsibility not to succumb to delivering a number that has neither empirical nor theoretical validation.” (p. 32) 217 -263

Measuring The Project

 “Researchers have not yet been able to find measures that are practical, scientifically sound, and cost-effective to capture and analyze.” Pfleeger, Jeffery, Curtis, and Kitchenham, “Status Report on Software Measurement,” IEEE Software, 42(2) 32-43, 1997 218 -263

Measuring The Project

 Jerry Weinberg’s rules of metrics:  Don’t measure it unless you know what it means.

 Don’t measure it unless you are going to do something differently depending on the measurement.

219 -263

Communicating with Stakeholders

 What to communicate?

“The CCPDS-R acquisition included two distinct phases: CD and FSD. During CD a total of 4,163 source lines of prototype components were developed and executed. The Common Subsystem is comprised of six CSCIs. These are a first-generation middleware solution that enabled a true component-based approach to distributed architecture through the use of generic tasks, processes, sockets, and circuits in a real-time run-time infrastructure.” 220 -263

Communicating With Stakeholders

 When to communicate?

“Thank you for explaining the system requirements to us. I propose we meet back here three years from today at 9:00 am at which time we will demonstrate the completed system.” 221 -263

Communicating With Stakeholders

 How to communicate?

222 -263

Automated Tools

• Developer – CASE – Compiler – Debugger – Class browsers – GUI and DB builder – Configuration management – Testing – Installshield • Project Management – Project tracking – Estimating – Scheduling • Both – Word processor – Spreadsheet – Group communications – Defect tracking

Summary

    224 -263

225 -263

“I cannot tell you all the things whereby ye may commit sin; for there are … so many that I cannot number them.” -

Mosiah 4:29

Chapter 4

Pitfalls Of Object Oriented Development

226 -263

Chapter Objectives

 At the end of this chapter you will be able to identify some of the pitfalls of object oriented development and, for each:  Identify its symptoms  Predict its consequences  Aid in prevention 227 -263

Credit

 Some of the ideas in this chapter are taken from Bruce Webster’s book

Pitfalls of Object-Oriented Development

.

228 -263

Pitfalls

 Let’s consider the following pitfalls: 1) Going Object Oriented For All The Wrong Reasons 2) Confusing Training With Skill 3) Confusing Prototypes With Finished Products 4) Not Enrolling Management Before Starting 5) Underestimating The Resistance 6) Abandoning Good Software Engineering Practices 7) Lying To Yourself And Others 8) Focus on Design, Neglect of Analysis 9) Ignorant Management 10) Overconfidence in the Technology 229 -263

For All The Wrong Reasons

Pitfall #1

 There are many good reasons to move to the object-oriented paradigm. Unfortunately, there are many more bad reasons:
 Your boss read an article about it
 You want to immediately cut back the size of the development staff
 You think it will reduce the need for testing
 You think you can complete the project five to ten times faster
 You think it will eliminate all your software engineering problems
230 -263

Confusing Training With Skill

Pitfall #2

 Training is “the acquisition of information, knowledge and understanding toward a certain end.”
 Skill is the ability to properly apply the knowledge gained through training. Skill comes only through experience; experience comes only through time and effort.

 Knowing the language syntax does not mean you know how to best program in a language. Programming effectively does not guarantee skill in discovering classes and organizing them into useful hierarchies.

231 -263

Prototypes vs. Finished Products

Pitfall #3

 Object-oriented development combined with powerful class libraries and tools can significantly speed prototype development.

 But, people not involved in development – management, investors, and customers – often see prototypes as almost completed applications.

232 -263

Not Enrolling Management

Pitfall #4

 “It is easier to beg forgiveness than ask for permission” is an often quoted maxim but “easier” doesn’t necessarily mean “better”.

 It is common for a development group to begin using object-oriented technology with only minimal involvement and education of upper management.

 When additional tools and training are needed, when the schedule slips, when other unfortunate things happen, what will management say?

233 -263

Underestimating The Resistance

 Objects are wonderful. So you push ahead to use the technology.

Pitfall #5

 You are surprised that people are pushing back.

 Sometimes it’s a guerrilla attack, other times it’s a full frontal assault, but you are becoming fully involved in a political battle.

234 -263

Underestimating The Resistance

 Why are people resisting?
 They don’t understand object technology
 They don’t want to understand object technology
 They are afraid they won’t be able to understand object technology
 They like how they currently do things
 They don’t like the language, methodology, or tools you are proposing
 They want to adopt object technology, but want to do it their way
235 -263

Abandoning Software Engineering

 Many good software engineering practices are already under pressure.

Pitfall #6

 Many developers don’t know them and aren’t willing to learn them. Those who do know them are often not given the opportunity to practice them.

 And, since object technology is so wonderful, we don’t really need planning, estimating, tracking, reviews, testing, … 236 -263

Lying To Yourself And Others

Pitfall #7

 Self-delusion and group delusion are not uncommon in software development projects. This is not surprising: we have to be eternal optimists in this business; otherwise, we would never even try many of the projects we do.

237 -263

Lying To Yourself And Others

 We often deceive ourselves:
 About the time needed to specify and design what will be developed
 That no major roadblocks or difficulties will be encountered
 By assuming that because a project must be completed by a certain date, it can and will be
238 -263

Focus on Design, Neglect of Analysis

Pitfall #8

239 -263

Ignorant Management

Pitfall #9

240 -263

Overconfidence In The Technology

Pitfall #10

241 -263

Summary

242 -263

243 -263

Chapter 5

Summing it up

244 -263

What Do Project Managers Do?

 Team Management


 Plan, Schedule, Track
 Resource Allocation
 Project Direction
 Politics
245 -263

How Does The Use of New Technology Affect These Management Tasks?

 Team Management


 Plan, Schedule, Track
 Resource Allocation
 Project Direction
 Politics

What’s Different?

246 -263

Team Management


Manage the people on the team

Motivation

Conflict resolution

Evaluation, promotion

Recruitment, retention

Career development

Task assignment

What’s Different?

247 -263

Scheduling


 Plan and Track the project
 Detailed planning and scheduling
   Per person planning and tracking
   Iterations
   Increments
   Testing, Deployment, Support
 Big Picture
   Functionality vs Time Tradeoffs
   Delivery dates

What’s Different?

248 -263

Resource Allocation


 Staffing
 Software development tools
 Software components
 Computer resources
 External resources
 Space
 Etc.

What’s Different?

249 -263

Direction


Keep the project direction aligned with the stakeholders’ vision

Quality vs. Functionality vs. Cost tradeoffs

What’s Different?

250 -263

Politics


Project interface and team buffer

 Manage stakeholder relationships
 Protect the team from the whims of exterior forces
 Manage interaction with other teams, such as testing and quality assurance
 Negotiate with upper level management and project stakeholders
 Fight for resources

What’s Different?

251 -263

Top 10 Principles of Modern Software Management


1. Base the process on an architecture-first approach.
2. Establish an iterative lifecycle process that confronts risk early.
3. Transition design methods to emphasize component-based development.
4. Establish a change management environment.
5. Enhance change freedom through tools that support round-trip engineering.
6. Capture design artifacts in rigorous, model-based notation.
7. Instrument the process for objective quality control and progress assessment.
8. Use a demonstration-based approach to assess intermediate artifacts.
9. Plan intermediate releases in groups of usage scenarios with evolving levels of detail.
10. Establish a configurable process that is economically scalable.

Royce

252 -263

Software Management Best Practices


 Formal risk management - using an iterative process.
 Agreement on interfaces - same intent as the architecture-first principle.
 Formal inspections.
 Metric-based scheduling and management - directly related to the model-based notation and objective quality control principles.
 Binary quality gates at the inch-pebble level - the evolving levels of detail principle.
 Programwide visibility of progress versus plan.
 Defect tracking against quality targets - directly related to the architecture-first and objective quality control principles.
 Configuration management - same reasoning as the change management principle.
 People-aware management accountability.

Software Acquisition Best Practices Initiative, Airlie Software Council

253 -263

Thanks!

 Thanks for attending this course.

 Let us know about your project management work. We’d like to hear about your successes and your difficulties.

 My e-mail address is: [email protected]

254 -263

255 -263