SESSION CODE: ARC201
A Modular Approach to Development Process
Juval Lowy
IDesign
www.idesign.net
©2010 IDesign Inc. All rights reserved
About Juval Löwy
Software architect
Consults and trains on .NET architecture
Microsoft's Regional Director for the Silicon Valley
Recent book: Programming WCF Services (O'Reilly, 2010)
Participates in .NET/WCF design reviews
Publishes in MSDN and other magazines
Speaker at major international development conferences
Recognized as a Software Legend by Microsoft
Contact at www.idesign.net
Objectives
Describe only how key process areas are affected by componentizing your system
Today we call these things "services"
Yesterday it was "objects"
Each area has much more to it
Suitable for small teams (<7)
Scalable?
Everything described is practiced in real life
Metrics and charts are normalized data from real projects
What you want to build:
What is a Module?
Ideally, a single WCF service class
Can treat a few interacting services as a single logical service
Each module is perceived as a world in its own right
Project Planning
Staffing
Product life cycle
Service life cycle
Services integration plan
Staffing
Is this a good design?
Staffing
Balance the number of services against development effort
[Chart: cost or effort vs. number of services — cost per service falls as the system is split finer, cost to integrate rises, and total cost reaches a minimum in between]
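The U-shape above can be reproduced with a toy model. A minimal sketch, assuming made-up coefficients (total_size, effort_exp, and per_pair are illustrative, not figures from the talk): per-service construction effort shrinks as the system is split finer, while integration effort grows with the number of services.

```python
# Toy cost model (all coefficients are hypothetical, not from the talk).
def construction_cost(n, total_size=100_000, effort_exp=1.2):
    # n services of size total_size/n each; within-service effort grows
    # super-linearly with size, so finer decomposition lowers this term
    return n * (total_size / n) ** effort_exp / 1000

def integration_cost(n, per_pair=0.5):
    # every pair of services is a potential integration point
    return per_pair * n * (n - 1) / 2

costs = {n: construction_cost(n) + integration_cost(n) for n in range(1, 101)}
best = min(costs, key=costs.get)
print(f"minimum total cost at ~{best} services ({costs[best]:.0f} man-days)")
```

With these particular coefficients the minimum lands in the mid-teens of services; the point is the shape of the curve, not the number.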
Staffing
Not having a skilled architect is the #1 risk
Rather than the technology itself!
Requirements analysis and architecture are contemplative, time-consuming activities
More firepower does not expedite them
A single architect usually suffices
In large projects, have a senior architect and a junior/apprentice
Assign each service to an individual developer (1:1)
The assembly boundary is the team boundary
Staffing
Interaction between team members is isomorphic to interaction between services
Staffing
A good design (minimized interactions, loose coupling, encapsulation) minimizes communication overhead
Staffing Distribution
Get an architect
Architect breaks the product into services
[Chart: staff count (1-5) vs. calendar months (Jan-Nov) — staffing builds up across Marketing/Product Management, Architecture, Management, CM and System Testing, and Construction]
Product Life Cycle - Staged Delivery
Preparation: requirements & architecture
Stage 1: interface-compatible emulator
Stage 2: basic capabilities
Stage 3: setup, logging, configuration editor
Stage 4: UI elements
Release
Product Life Cycle - Staged Delivery
Preparation: requirements & architecture
Stage 0: software infrastructure
Stage 1: TA FileNet viewing
Stage 2 (Release 1.0): eFiling I
Stage 3 (Release 1.5): eFiling II
Stage 4: tax calculations
Stage 5 (Release 2.0): eFiling III
Stage 6: batch headers
Stage 7 (Release 2.5): surrogate data entry
Release
Services Integration Plan
Derived from the services dependency graph (see the sketch below)
Start bottom-up
Avoids the "big-bang" syndrome
Risk-reduction oriented
Incompatibilities discovered early
Daily builds and smoke tests exercise the evolving system
Regression testing if needed
Incrementally build the system
Provide a tested, working system at each increment
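A minimal sketch of deriving that order, assuming a hypothetical dependency graph (the service names below are invented for illustration): a topological sort of the graph gives a bottom-up integration order in which every service is integrated only after the services it depends on.

```python
# Bottom-up integration order via topological sort (Python 3.9+).
from graphlib import TopologicalSorter

dependencies = {  # service -> services it depends on (example graph)
    "Emulator": set(),
    "DataAccess": set(),
    "Config": {"DataAccess"},
    "CommandQueue": {"Emulator"},
    "Handler": {"CommandQueue", "Config"},
    "ClientApp": {"Handler"},
}

order = TopologicalSorter(dependencies).static_order()
print("integration order:", " -> ".join(order))
```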
Services Integration Plan
[Diagram: services integration plan — milestones S1-S5 dated between 01/12 and 05/05, integrating HW, DA, Config, Command, Queue, Setup, Handler, and Emulator bottom-up into the Client App and the full System]
Services Integration Plan
[Activity network for Release 1.0: numbered activities linked along the critical path, alternate paths, and milestones]
Activities:
2 Requirements
3 Infrastructure
4 Interface Data Access
5 Interface Manager
6 Security Data Access
7 Security
8 Admin Client
9 Utilities
10 TA FileNet
11 eFiling Client
12 eFiling Web Service
13 Form Manager
14 Form Data Access
15 Form Engine
16 eFiling Integration
17 System Testing
18 Release 1.0
Services Integration Plan
[Activity network for Release 1.5: numbered activities linked along the critical path, alternate paths, and milestones]
Activities:
3 Infrastructure
20 Tax Authority Data Access
21 Tax Authority Data Conversion
22 Tax Authority Manager
23 Tax Authority Integration
24 Accounts Data Access
25 Accounts Data Conversion
26 Accounts Manager
27 Accounts Integration
28 System Testing
29 Release 1.5
Service Life Cycle
[Flow diagram: SRS and SRS review; STP; detailed design (standard documentation) and design review (some construction may start early); construction and code review; test client; service testing; integration testing]
EVERY service has its own testing environment
Visible signs of progress to management
Spices up "boring" testing
Test all method calls, callbacks, and errors (white box)
Fall back on it to isolate problems
Assumption: no need to test the test software
System-level test software is provided to the customer as well
Estimation and Tracking
Service-based effort estimation
Service-based earned value tracking
There is No Silver Bullet
SOA/WCF projects do not take less time than regular .NET projects
Marginal overall improvement in time to market
Applications are more complex
Service-Based Effort Estimation
Use estimation tools
Team members participate in estimation
Itemize the lifecycle of all services (see the sketch after this list)
Do not omit:
Learning curves
Test clients
Installation
Integration points
Peer reviews
Documentation
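As a sketch of such an itemization (service names and numbers are hypothetical), list every lifecycle activity per service, including the easily forgotten ones, then roll up:

```python
# Itemized lifecycle estimate per service (all names/numbers hypothetical).
estimates = {  # man-days per lifecycle activity
    "QueueService": {
        "learning curve": 3, "detailed design": 5, "construction": 20,
        "test client": 4, "installation": 2, "integration points": 3,
        "peer reviews": 2, "documentation": 3,
    },
    "ConfigService": {
        "detailed design": 3, "construction": 10, "test client": 2,
        "installation": 1, "integration points": 2, "peer reviews": 1,
        "documentation": 2,
    },
}

for service, items in estimates.items():
    print(f"{service}: {sum(items.values())} man-days")
print("total:", sum(sum(i.values()) for i in estimates.values()), "man-days")
```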
Service-Based Effort Estimation
Both underestimation and overestimation are deadly
Padding allows for increased complexity
Gold plating
Inverted pyramid
Too aggressive a schedule guarantees failure
Cutting corners and dropping best practices
Nominal estimation maximizes the probability of success
Service-Based Effort Estimation
[Spreadsheet: requirements list and resource allocation. Columns: Owner (team-member initials), Status (% completed / % remaining), Planning and Design, Construction (code, test), and System Testing in man-days, and Size by LOC, with totals by category in man-days and man-months. Line items: perform market research, SW development plan, prototype SW interfaces, SRS, test plan, configuration management plan, build environment, requirements management, detailed estimation, user manual, architecture, SW detailed design, serial communication, emulator, commands queue, modular handler, error handling, error logging, installation program, configuration editor, integration and testing, client application, commands, save type, release activities, and project management. A training section itemizes C++, NT, MFC, Win32 interface, COM, advanced COM, and domain knowledge, with training totals in man-months. A Stage 3 (UI elements) section itemizes wafer ID, carrier ID, wafer management, and advanced error logging.]
Earned Value Planning
Assign a value to each work item toward the completion of its service
Compare earned value (the sum of all accomplished activities across services) against effort spent
Can predict completion date and cost
Earned Value - Example
Activity | Effort Estimated | Earned Value
Architecture | 40 days | 20%
DB Service | 30 days | 15%
UI | 40 days | 20%
Control Service | 20 days | 10%
Queue Service | 40 days | 20%
System Testing | 30 days | 15%
Total | 200 days | 100%
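A sketch of how the earned-value column is derived: each activity's value is simply its share of the 200-day total estimated effort.

```python
# Earned-value weights derived from estimated effort (table above).
efforts = {  # estimated effort in days
    "Architecture": 40, "DB Service": 30, "UI": 40,
    "Control Service": 20, "Queue Service": 40, "System Testing": 30,
}
total = sum(efforts.values())  # 200 days

for activity, days in efforts.items():
    print(f"{activity}: {days} days -> {100 * days / total:.0f}% earned value")
```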
Earned Value - Planning
Build a Gantt chart to derive actual dates
Convert work days to calendar days
Given each activity's scheduled date and earned value, plot the planned progress
[Chart: planned earned-value curve rising to 100% at the planned completion date]
Earned Value - Planning
Can detect unrealistically optimistic plans
[Chart: planned progress curve of an overly optimistic plan]
Earned Value - Planning
Can detect unrealistically pessimistic plans
[Chart: planned progress curve of an overly pessimistic plan]
Earned Value - Planning
The slope of the plan curve is the planned team throughput
Fixed teams and resources should yield a straight line
[Chart: straight-line planned progress curve]
Earned Value - Planning
Properly staffed and planned projects yield a shallow S curve
[Chart: S-shaped planned progress curve — core team only until planning is done, dev staffing ramps for construction, construction done, then end of system testing at 100%]
Earned Value - Planning
The plan covers both progress and effort
[Chart: planned progress % and planned effort % vs. date]
Earned Value - Example
When the requirements, detailed design, and test plan are completed, the service is 45% done:
Activity Phase | % Completed
Detailed Requirements | 15
Detailed Design | 20
Test Plan | 10
Construction | 40
Documentation | 15
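A sketch of that computation: a service's percentage complete is the sum of the weights of its finished phases, which is where the 45% above comes from.

```python
# Per-service completion from phase weights (weights from the table above).
phase_weights = {
    "Detailed Requirements": 15, "Detailed Design": 20,
    "Test Plan": 10, "Construction": 40, "Documentation": 15,
}
finished = {"Detailed Requirements", "Detailed Design", "Test Plan"}

percent_done = sum(w for phase, w in phase_weights.items() if phase in finished)
print(f"service is {percent_done}% done")  # -> 45% done
```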
Earned Value - Example
Finding the accumulated earned value:
Activity | Effort Estimated | Accomplished | Earned Value
Architecture | 20% | 100% | 20%
DB Service | 15% | 75% | 11.25%
UI | 20% | 45% | 9%
Control Service | 10% | 0% | 0%
Queue Service | 20% | 0% | 0%
Sys. Testing | 15% | 0% | 0%
Total | | | 40.25%
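A sketch of the roll-up: the accumulated earned value is each activity's planned value weighted by how much of it has been accomplished.

```python
# Accumulated earned value (numbers from the table above).
activities = [  # (activity, planned value %, % accomplished)
    ("Architecture", 20, 100), ("DB Service", 15, 75), ("UI", 20, 45),
    ("Control Service", 10, 0), ("Queue Service", 20, 0),
    ("Sys. Testing", 15, 0),
]

earned = sum(value * done / 100 for _, value, done in activities)
print(f"accumulated earned value: {earned}%")  # -> 40.25%
```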
Earned Value - Example
Can track effort spent on activities
Time spent is unrelated to progress
Earned Value Projections
Extrapolate the actual progress line to project the actual completion date early on
Extrapolate the actual effort line to project the actual cost early on
Take corrective action while it can still have an effect
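A minimal sketch of such a projection, with hypothetical status numbers; it uses the simplest straight-line extrapolation through the current point, whereas the charts that follow extrapolate the plotted trend lines.

```python
# Straight-line projection from current status (all numbers hypothetical).
elapsed_days = 60     # calendar days since start
earned_pct = 24.0     # accumulated earned value to date
effort_spent = 55.0   # man-months spent to date
planned_days = 200    # planned calendar duration
planned_cost = 180.0  # planned total effort, man-months

projected_days = elapsed_days * 100 / earned_pct   # days to reach 100%
projected_cost = effort_spent * 100 / earned_pct   # cost at 100%

print(f"projected completion: day {projected_days:.0f} "
      f"(schedule overrun {projected_days - planned_days:+.0f} days)")
print(f"projected cost: {projected_cost:.0f} man-months "
      f"(cost overrun {projected_cost - planned_cost:+.0f})")
```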
[Charts: plan vs. actual progress and effort curves over time — extrapolating the actual progress line past the planned completion date yields the projected completion date and projected schedule overrun; extrapolating the actual effort line yields the projected cost and projected cost overrun]
Earned Value - Symptoms
Life is good
[Chart: progress and effort both tracking the plan]
Earned Value - Symptoms
Under-estimating
[Chart: plan vs. progress and effort curves]
Earned Value - Symptoms
Resource leaks
[Chart: plan vs. progress and effort curves]
Earned Value - Symptoms
Over-estimating
[Chart: plan vs. progress and effort curves]
Division of Labor
The architect plans and designs the project
The project manager tracks
Both close the loop
Bonus Material
Metrics - Code Categories
[Pie chart: code by category — source code, contracts and proxies, misc (config, metrics, build…), installation scripts, test environment source code, and sample client program, with shares of 48%, 34%, 9%, 6%, 2%, and 1%]
Metrics - Code Growth
[Chart: V1.0 code growth by category — lines of code (0-120,000) vs. month (11/19 through 5/19) for services code, test code, example client code, contracts and proxies, misc, and install & setup]
Metrics – Complexity
Complexity is a better indicator than size for predicting:
Quality
Cost
Schedule
Complexity is easy to measure objectively
Cyclomatic complexity (see the sketch below)
Much easier than:
Function points
User screens and inputs
An indirect measure of the design
Not just the code
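As an illustration of how mechanical the measurement is, here is a rough sketch for Python source; it is a simplified approximation (1 plus the count of decision points; exact CC definitions vary), and for .NET assemblies the Reflector add-in on the next slide does this properly.

```python
# Simplified cyclomatic complexity: 1 + number of decision points.
import ast

DECISIONS = (ast.If, ast.For, ast.While, ast.ExceptHandler,
             ast.IfExp, ast.BoolOp)

def cyclomatic_complexity(source: str) -> int:
    return 1 + sum(isinstance(node, DECISIONS)
                   for node in ast.walk(ast.parse(source)))

sample = """
def classify(x):
    if x < 0:
        return 'negative'
    for i in range(x):
        if i % 2 == 0 and i > 2:
            print(i)
    return 'ok'
"""
print(cyclomatic_complexity(sample))  # -> 5 (if, for, if, and, +1)
```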
Metrics – Complexity
For .NET, use Reflector Add-In
http://reflectoraddins.codeplex.com/Wiki/View.aspx?title=CodeMetrics
Metrics scope
Method
Type
Assembly
Complexity metrics
Size
Types/fields/methods/properties
Call depth
Cyclomatic complexity
Metrics – Complexity
Can provide a graphical view of complexity
Represented by area
Non-linear, as it should be
Metrics – Complexity
When fighting poor quality, focus on the most complex types first
Set an acceptance threshold of no more than 100-150 CC
75 CC is probably the practical value
For internal and external code alike
A refactoring effort should yield at least a square-root reduction in CC
Set ongoing goals for decreasing CC metrics (see the gate sketch below)
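A sketch of such an acceptance gate, using the thresholds above (the type names are hypothetical). Note that a square-root reduction takes, for example, a type at CC 144 down to about 12.

```python
# Acceptance gate for cyclomatic complexity (thresholds from the slide;
# type names are hypothetical).
import math

HARD_CEILING = 150  # never accept beyond the 100-150 CC threshold
PRACTICAL = 75      # probably the practical working value

def review(type_name: str, cc: int) -> None:
    if cc > HARD_CEILING:
        # refactoring should yield at least a square-root reduction
        target = round(math.sqrt(cc))
        print(f"{type_name}: CC {cc} rejected; refactor toward ~{target}")
    elif cc > PRACTICAL:
        print(f"{type_name}: CC {cc} accepted; set a CC-reduction goal")
    else:
        print(f"{type_name}: CC {cc} OK")

review("FormEngine", 230)   # rejected; target ~15
review("FormManager", 96)   # accepted with an ongoing goal
review("Logger", 12)        # OK
```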
More at TechEd
AppFabric Service Bus Design Patterns
Thursday 9:45 AM
Resources
Programming WCF Services 3rd Edition
Juval Löwy, O'Reilly 2010
www.idesign.net
Code library
Coding standard
Sample architecture report
IDesign Method™
Master Classes
Architect’s Master Class
November 15-19, California
http://www.idesign.net/idesign/download/IDesignCD.zip
www.microsoft.com/teched
www.microsoft.com/learning
http://microsoft.com/technet
http://microsoft.com/msdn