Function Point Training & Certification Preparation



Function Point Training

Instructor: David Longstreet [email protected]

www.SoftwareMetrics.Com

816-739-4058 www.SoftwareMetrics.Com

119

Two Day Schedule

 Day One – Function Point Concepts – Measurement Theory – Estimating Models  Day Two – Function Point Case Studies – More on Estimating Models www.SoftwareMetrics.Com

120

Course Objectives

 Understand and apply function point concepts  Understand basics of measurement theory  Introduction to software economics  Review/remember some basic statistical concepts www.SoftwareMetrics.Com

121

Estimate the Surface Area of a Can of Diet Coke www.SoftwareMetrics.Com

122

A different perspective

www.SoftwareMetrics.Com

123

Who is David Longstreet

 Metrics  Background  Clients  Research  Publishing  Teaching www.SoftwareMetrics.Com

124

My Metrics

   Over 2 million frequent flyer miles Consulted on every continent except Antarctica Presented papers at conferences in USA, Europe, Middle East, Asia and Africa www.SoftwareMetrics.Com

125

Cities

www.SoftwareMetrics.Com

126

And…Milwaukee

www.SoftwareMetrics.Com

www.MAM.org

127

www.SoftwareMetrics.Com

128

www.SoftwareMetrics.Com

129

www.SoftwareMetrics.Com

130

Website Metrics

 About 15,000 unique visitors per month  Over 7,000 visitors view more than 5 pages  Over 1,000 visitors view more than 20 pages.

 Free manual is viewed over 15,000 times per month.

www.SoftwareMetrics.Com

131

Clients

 Clients include Banking & Finance, Aerospace, Retailers, Animal Food, Telephony, Consulting Companies, Medical Research, Defense Contractors, Automotive, Universities, Government Agencies and others  Some clients: MasterCard, Amadeus, Ralston Purina, Lockheed, Transamerica, DirecTV, Biologic, Accenture, Motorola, Nissan, others… www.SoftwareMetrics.Com

132

Type of Work

 Productivity Assessments  Benchmark Studies  Estimating Models  Mergers and Acquisitions  Venture Capital and Initial Public Offerings  Outsourcing Agreements  Expert Testimony www.SoftwareMetrics.Com

133

Learning from organizations

(ethnography)  Collect both quantitative and qualitative data  Observe behaviors, customs, rituals, myths and ways of life  Examine artifacts and physical evidence  Build a holistic picture of the organization  Trend the industry as a whole www.SoftwareMetrics.Com

134

Worst Practices

      No historical data Failure to monitor and report status Creating analysis documentation after coding Excessive and irrational schedule pressures Failure to establish clear acceptance criteria Reduce testing time to make schedule www.SoftwareMetrics.Com

135

Research

 Dale Jorgenson, Harvard Business School – Historical Study of Productivity Rates for Software Development (1950 – present).

 Bureau of Economic Analysis – Methods of collecting and reporting software productivity rates based upon Function Points.

– Measuring the IT Economy.

 Securities and Exchange Commission www.SoftwareMetrics.Com

136

Adjunct Professor

(Avila University MBA & Graduate Psychology)  Industrial Organization Psychology  Managerial Economics  Statistics  Quantitative Analysis  E-Commerce www.SoftwareMetrics.Com

137

There are lies, damned lies and statistics.

www.SoftwareMetrics.Com

Mark Twain Hannibal, Missouri 138

Negative things I have heard….

 Software Voodoo!

 Figures won't lie but liars will figure  You may prove anything with metrics. www.SoftwareMetrics.Com

139

Scientific Method

 Knowledge comes from – Systematic observation – Measurement of particular variables & events  Develop both descriptive & predictive metrics.

– Descriptive: describes current environment – Predictive: used to estimate www.SoftwareMetrics.Com

140

Theory of Measurement

 The use of numbers to represent events, variables and characteristics.

 Quantitative variables  Qualitative variables www.SoftwareMetrics.Com

141

www.SoftwareMetrics.Com

142

What gets measured gets done

 Partially Correct  What gets rewarded gets repeated  Measurements without consequences get ignored.

www.SoftwareMetrics.Com

143

Background Summary

 Constantly learning  Industry diverse client base  Geographically diverse client base  Research/Publishing  Growing industry knowledge  Better ways to teach and instruct www.SoftwareMetrics.Com

144

Measurement Theory

 Introduction to measurement  Productivity  Scientific method  Historical perspective  Introduction to software economics  The idea of function points www.SoftwareMetrics.Com

145

Measurement Theory

 Concept not new to many other disciplines – Scientific Method  Relatively new to software development www.SoftwareMetrics.Com

146

Productivity

 The output-input ratio within a time period with due consideration for quality.

 Productivity = outputs/inputs www.SoftwareMetrics.Com

147

Improving Productivity (reducing unit costs)

 Costs/FP  Hours/FP  We can mathematically reduce unit cost by  Reducing Cost  Increasing FP produced www.SoftwareMetrics.Com

148

Productivity

 It is difficult (maybe impossible) to improve productivity (reduce unit cost) by reducing cost.

 Cost / FP  In fact, each $1 reduction in cost increases cost/FP by about $1.18

www.SoftwareMetrics.Com

149

Effectiveness v. Efficiency

 Effectiveness is the achievement of objectives  Efficiency is the achievement of the ends with the least amount of resources.

www.SoftwareMetrics.Com

150

www.SoftwareMetrics.Com

151

International Weights & Measures

 International weights and standards – Standardization in 1863  Motivation for international weights & standards was driven by trade and specifically international trade.

www.SoftwareMetrics.Com

152

Concepts from Industrial Revolution

 Management methods 1901  Break - Even charts 1903  Centralized accounting 1908  Total Quality Management 1950’s www.SoftwareMetrics.Com

153

Size of Largest Projects Since 1970

[Chart: size of the largest projects (in function points) by year, 1970 to 2005; y-axis 0 to 25,000] www.SoftwareMetrics.Com

154

Hours/FP Since 1970

(Unit Cost) [Chart: hours per FP by year, 1970 to 2005; y-axis 0 to 70] www.SoftwareMetrics.Com

155

Software Past, Present, Future

www.SoftwareMetrics.Com

156

Failure Rates are High

 75% of metrics programs fail.

 70 – 80% of people who try to stop smoking do not succeed.

 60% of Alcoholics return to drinking.

 93% of those addicted to gambling gamble again within 1 year.

www.SoftwareMetrics.Com

157

If at first you don’t succeed

 One of the most thoroughly accepted notions in psychology is the principle that behavior eventually extinguishes if it is not followed by reward.

 What gets rewarded gets done!

www.SoftwareMetrics.Com

158

Psychology

Software Economics

Software Development

www.SoftwareMetrics.Com

159

Software Economics

 Study Prices and Costs  Study Behavior  Study the whys  Comparative analysis www.SoftwareMetrics.Com

160

All systems have…..

www.SoftwareMetrics.Com

Inputs Storage Outputs 161

Two Major Categories

 Transactions – Inputs – Inquiries (reads) – Outputs (calculations)  Storage – Maintained Data – Referenced Data www.SoftwareMetrics.Com

162

Validity of Function Points

 Face Validity – Does this make sense?

 Predictive Validity – Useful for predicting effort, time, cost, other?

 Convergent Validity – Do FP’s move in same direction as LOC, Test Cases, Use Cases, Objects www.SoftwareMetrics.Com

163

Face Validity

 Does this idea make sense?

 Can a software application be sized by looking at transactions and files?

 Are there other items we need to consider?

www.SoftwareMetrics.Com

164

Predictive Validity

 Do function points actually predict what they are supposed to predict?

 To what degree (how accurate) – The effort required to implement 5,000 fp's is more than 500 fp’s?

– How about 4,500 fp’s v. 5,000 fp’s?

www.SoftwareMetrics.Com

165

Predictive Validity

(other inputs)  How accurate is historical time reporting, staffing levels, defect tracking?

– Hours Per Function Point – Duration Per Function Point – Defects Per Function Point  How accurate are project plans?

www.SoftwareMetrics.Com

166

Convergent Validity

 Function Points x 1.2 approximates test cases.

 Lines of Code divided by 100 approximates function points.

– Varies by language  There seems to be a positive relationship between test cases and function points  More staff is needed as function points increase.

www.SoftwareMetrics.Com
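To make these rules of thumb concrete, here is a minimal Python sketch. The 1.2 test-case factor and the 100 LOC-per-FP ratio are just the slide's rough approximations (the LOC ratio varies widely by language), not fixed constants, and the 50,000 LOC input is a made-up example.

```python
def approx_function_points_from_loc(loc, loc_per_fp=100):
    """Slide rule of thumb: LOC / 100 approximates function points (varies by language)."""
    return loc / loc_per_fp

def approx_test_cases(function_points, factor=1.2):
    """Slide rule of thumb: FP x 1.2 approximates the number of test cases."""
    return function_points * factor

fp = approx_function_points_from_loc(50_000)   # 50,000 LOC -> about 500 FP
print(fp, approx_test_cases(fp))               # 500.0 600.0
```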

167

Measurement Theory

 Introduction to measurement  Productivity  Scientific method  Historical perspective  Introduction to software economics  The idea of function points www.SoftwareMetrics.Com

168

Understanding Software Development Costs

 Large Projects  Marginal Cost  Interval Estimating v. Point Estimating  Building Estimating Models  Samples v. Populations  Industry Data www.SoftwareMetrics.Com

169

Large Projects

 Increasing Marginal Cost – As size increases, unit costs rise.

 Any large engineering or construction project follows the same economic model.

www.SoftwareMetrics.Com

170

Marginal Cost

 The change in total cost attributable to a one-unit change in output.

 The unit cost of software is not fixed.

 Unit cost changes as the size of the project changes. www.SoftwareMetrics.Com

171

www.SoftwareMetrics.Com

172

www.SoftwareMetrics.Com

173

Individuals Range of Productivity

 Range of individual productivity can be 100 times  An expert developer may be 100 times more productive than a novice.

 True for other industries as well – Automotive Mechanics – Carpentry – Cake Decorating – Roofing www.SoftwareMetrics.Com

174

Cake Decorating?

www.SoftwareMetrics.Com

175

Paying by the hour

 What behavior is encouraged?

 What type of person is hired?

By the way, cake decorators charge by the piece of cake.

www.SoftwareMetrics.Com

176

Industry Data

 $1,500 per function point with a margin of error of $500  90% confidence level  Range is $1,000 – $2,000 per function point  Range in hours is 100 – 200 hours per function point.

www.SoftwareMetrics.Com

177

Transamerica Building

 499,000 square feet  $144 Million to construct  $290/Square Foot – National Average $102/Sq. Ft – San Francisco Average $140/ Sq Ft.

www.SoftwareMetrics.Com

178

Comparative Costs Per Square Foot

 Office Space $69 - $290  Warehouse $28 - $43  Single Unit Retail $46 - $71  Depending on the type of software application being constructed, unit costs vary.

www.SoftwareMetrics.Com

179

Some Statistics

 Range $28/Sq. Ft. - $290/Sq. Ft.

 Average $75/ Sq. Ft  Standard Deviation $54/ Sq. Ft.

 We see similar variations in software costs also.

www.SoftwareMetrics.Com

180

“Many other Factors”

 Size is size  All those other factors are part of the unit cost – $/Sq. Foot – $/ Function Point www.SoftwareMetrics.Com

181

Samples v. Populations

 You do not have to FP count every single application to understand organizational productivity!

 You do not have to examine every single project to understand organizational trends.

www.SoftwareMetrics.Com

182

Gathering Historical Data Samples v. Population

 Population – your entire application portfolio  Sample – key projects/applications to help you understand the entire application portfolio www.SoftwareMetrics.Com

183

Random v. Selective

 Select a range of projects – Best – Worst – In between  Need only 12 or so projects www.SoftwareMetrics.Com

184

How & What

 To really understand the quality of products you must first look at the organization that produces the software.

 How you do it & what you do.

www.SoftwareMetrics.Com

185

Principles

 It is not possible for an unstable organization to produce consistently high quality products.

 It is not possible for an unstable organization (inconsistent) to produce accurate estimates.

www.SoftwareMetrics.Com

186

Who makes software?

 People are the major input used to develop software.

 Understanding human behavior is important to understanding the software development process.

What type of behavior is being encouraged?

www.SoftwareMetrics.Com

187

www.SoftwareMetrics.Com

188

Cost to Develop Software

[Diagram: compares the $/FP unit cost of making changes during Planning with the $/FP unit cost of making changes during Testing/Implementation]

www.SoftwareMetrics.Com

189

90th Percentile

 Has a requirements glossary  Determines root cause & phase found  Establishes acceptance criterion at termination of each phase  Formal inspections – Including “test plans” for all phases.

 Metrics program www.SoftwareMetrics.Com

190

www.SoftwareMetrics.Com

191

www.SoftwareMetrics.Com

192

Statistics Review

 Margin of Error  Standard Deviation  Average (or mean)  Confidence Intervals www.SoftwareMetrics.Com

193

Margin of Error

 Used for interval estimating  Establish upper and lower boundaries  95% Confidence Interval: Average ± 1.96 × (σ / √n), where σ = the standard deviation and n = the sample size www.SoftwareMetrics.Com

194

Example

 You sample 16 projects – Average productivity rate is 164 hours per function point – Standard Deviation is 83 hours per function point

164 ± 1.96 × (83 / √16) ≈ 164 ± 41, i.e. 123 hrs/fp to 204 hrs/fp with a mean of 164 hrs/fp www.SoftwareMetrics.Com
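A short Python sketch that reproduces the slide's interval; the 1.96 multiplier is the usual 95% normal-approximation value.

```python
import math

def confidence_interval(mean, std_dev, n, z=1.96):
    """Interval estimate: mean +/- z * (std_dev / sqrt(n))."""
    margin_of_error = z * std_dev / math.sqrt(n)
    return mean - margin_of_error, mean + margin_of_error

# The slide's sample: 16 projects, mean 164 hrs/FP, standard deviation 83 hrs/FP.
low, high = confidence_interval(164, 83, 16)
print(f"{low:.0f} to {high:.0f} hrs/FP")   # about 123 to 205 hrs/FP (the slide rounds down to 204)
```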

195

Example Continued

We are 95% confident the project will be completed between 123 hrs/fp and 204 hrs/fp, with a mean of 164 hrs/fp. If the project size was 1,000 fp's, then….

The project is estimated at between 123,000 and 204,000 total hours.

In this case, we vary the delivery rate, not the size of the project.

www.SoftwareMetrics.Com

196

Probability of being late

 Let’s assume it is decided the estimate should be 140 hours per function point.

 lower = 123, mean = 160 and upper = 204  What is the probability of missing this estimate?

 40% based upon past historical performance www.SoftwareMetrics.Com

197

95% Certain?

 204 Hours Per Function Point.

 lower = 123 mean = 160 and upper = 204 www.SoftwareMetrics.Com

198

Scope Changes

 If size of project increases and estimate is not revised, then what must happen to productivity?

 Productivity has to increase at a much greater rate than increasing size of the project.

www.SoftwareMetrics.Com

199

Standard Deviation

 How spread out is your historical delivery rate  Measures consistency  Measures dispersion www.SoftwareMetrics.Com

200

In Other Words…

 The larger the standard deviation, the larger the margin of error – The wider the confidence interval – Risk is higher of not hitting an average productivity level.

www.SoftwareMetrics.Com

201

Standard Deviation Measures

 How consistent is the development process.

 Large dispersion of data indicates projects are developed differently every single time – Difficult to predict  Small dispersion of data indicates projects are developed consistently.

www.SoftwareMetrics.Com

202

Size is Relative

www.SoftwareMetrics.Com

203

What is a Function Point?

 Function Points are a unit of measure – Like an hour is to measuring time – An inch is to measuring distance – A degree is to measuring temperature  A unit is important to understanding and communicating such metrics as

Average Cost.

www.SoftwareMetrics.Com

204

Other Interval Measures

 Examples – Miles, Meters, Gallons, Liters, Pounds, Celsius, Feet, Kelvin  Familiarity of scale is important – Is 293.5K, warm, cold, hot, or what?

– Is 2,000 Function Points big, small, medium or what? www.SoftwareMetrics.Com

205

Industry Specific Metrics

 Calories  Octane  BTU’s  Watts  Joule www.SoftwareMetrics.Com

206

Sophisticated Users Perspective

www.SoftwareMetrics.Com

207

www.SoftwareMetrics.Com

208

Who is faster?

www.SoftwareMetrics.Com

209

Treating Symptoms

 Focusing on defect removal is not an effective way to improve quality.

– It is better to concentrate on and measure defect prevention.

 Holding teams accountable for estimates is not an effective way to improve estimating.

www.SoftwareMetrics.Com

210

Estimating Models

 Past performance is the best indicator of future performance.

 Best models are built based upon internal historical data  Develop estimates with confidence intervals.

www.SoftwareMetrics.Com

211

www.SoftwareMetrics.Com

212

www.SoftwareMetrics.Com

213

www.SoftwareMetrics.Com

214

Estimating Questions

 Three questions managers need to ask:

1. How did you come up with your estimate?

2. How do you plan on revising your estimate?

3. At what points in time will you provide an update to your estimate?

www.SoftwareMetrics.Com

215

Chapter 1 Questions

(pg.16)

Problem 1

How would you estimate the number of hot chocolates being sold at the AFC Championship game in Kansas City?

What are the key factors to consider? Who would you benchmark against and why?

Problem 2

What is the average cost per mouse pad if you produce 1,000 units at the following costs?

Artwork is a fixed cost at $500. Set-up costs are $250. Shipping costs are $10. Paper for production will cost $2.50 per unit.

Pads are $0.25 per unit.

Application of paper to pad costs $0.35 per unit.

Are the unit costs the same for all items? Is it correct to assume that unit costs are fixed for software? www.SoftwareMetrics.Com
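A hedged sketch of Problem 2 in Python, assuming artwork, set-up, and shipping are one-time fixed costs and paper, pad, and application are per-unit costs (the slide does not say whether shipping is fixed or per unit).

```python
UNITS = 1_000
fixed_costs = 500 + 250 + 10                  # artwork + set-up + shipping (assumed fixed)
variable_cost_per_unit = 2.50 + 0.25 + 0.35   # paper + pad + application of paper to pad

average_cost = fixed_costs / UNITS + variable_cost_per_unit
print(f"${average_cost:.2f} per mouse pad at {UNITS} units")   # $3.86 per mouse pad

# The fixed portion shrinks as volume grows, so the average unit cost is not
# constant, which is the slide's point about software unit costs as well.
```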

216

Function Points?

www.SoftwareMetrics.Com

217

Introduction to FPA

 What is a function point  When to count function points  Benefits of FPA  The classification of components of FPA  Problems with other metrics  Constraints to FPA www.SoftwareMetrics.Com

218

What is a Function Point?

 Function Points are a unit of measure – Like an hour is to measuring time – Or an inch is to measuring distance  A unit is important to understanding and communicating such metrics as

Average Cost.

www.SoftwareMetrics.Com

219

Function Point

 A software application is in essence a defined set of elementary business processes.

– Reports, Online's, Updates, – Interfaces between applications, dynamic menus, so on and so forth.

www.SoftwareMetrics.Com

220

www.SoftwareMetrics.Com

221

Update Management Functional Design Specifications

www.SoftwareMetrics.Com

222

www.SoftwareMetrics.Com

223

www.SoftwareMetrics.Com

224

www.SoftwareMetrics.Com

225

[Diagram: Function points are counted across the SDLC. An estimated FP count (Est. FPC) is made at Planning, Analysis, and Design; a Revised Estimate is made during Build and Implement; and an Actual FPC is made at Delivery.]

www.SoftwareMetrics.Com

226

Function Points Analysis

 Is a structured technique of classifying components of a system.  Is a method to break systems into smaller components, so they can be better understood and analyzed.  Measures software by quantifying its functionality provided to the user based primarily on the logical design.

 Logical functionality from a sophisticated user view rather than a physical view.

 A standard method for measuring software development from the customer's point of view www.SoftwareMetrics.Com

227

Components of Function Point Analysis

 Data Function Types – Internal Logical Files (ILF) – External Interface Files (EIF)  Transactional Function Types – External Inputs (EI) – External Outputs (EO) – External Inquiries (EQ) www.SoftwareMetrics.Com

228

[Process flow: Determine type of Count; Identify Boundary; Count Transactional Function Types; Count Data Function Types; Determine Unadjusted Function Point Count; Determine Adjusted Function Point Count] www.SoftwareMetrics.Com

229

Problems with LOC Metrics

 Higher level languages produce fewer LOC  Better programmers produce fewer LOC  Actual LOC are known too late to be used for estimating  No consistent method to “count” LOC especially between languages www.SoftwareMetrics.Com

230

Constraints to Counting FP’s

(constraints to software development)  Inconsistent Documentation  Incomplete Requirements  No Requirement Standards  Lack of Functional Understanding  Disorganization www.SoftwareMetrics.Com

231

Could we size a building

 No design or blue print available  Terminology used is inconsistent www.SoftwareMetrics.Com

232

Blueprints

 Imagine a blue print document where – Architect is not using standardized terminology – and has not created a glossary of terms www.SoftwareMetrics.Com

233

Consistent Terminology!

 Terminology used to describe a list of data read from a table.

– Get, Fetch, Retrieve, View, Find, Search, Look, Request, Query, and the list goes on and on.

www.SoftwareMetrics.Com

234

Verbs used to describe Add, Change and Delete: Add; Activate; Amend (change and delete); Cancel; Change; Convert (change); Create (add); Delete; Deassign; Disable; Enable; Edit (change); Disconnect (change or delete); Insert (add and change); Maintain (add, change, or delete); Memorize (add); Modify (change); Override (change); Post (add, change and delete); Remove (delete); Reactivate (change); Remit; Replace (change); Revise (change and delete); Save (add, change or delete); Store (add, change); Suspend (change or delete); Submit (add, change or delete); Update (add, change or delete); Voids (change and delete) www.SoftwareMetrics.Com

235

Variations in Counts

 Determine Root Cause – Requirement Not Clear  Single requirement became multiple requirements.

– New Requirement – Missed Requirement – Counted Incorrect www.SoftwareMetrics.Com

236

Project Growth all numbers in Function Points

[Chart: growth in function points for Project 1 (2,400 - 3,600) and Project 9 (3,000 - 3,300), broken down into Requirements Not Clear, New Requirements, and Missed Requirements] www.SoftwareMetrics.Com

237

Requirement Not Clear

 Inside every big requirement are a bunch of little ones trying to get out.

www.SoftwareMetrics.Com

238

Requirements Standards

Get describes the action of retrieving information from a database or data storage.

– Using another verb to describe this same action would be considered a defect.  Determine the frequency of using another verb such as retrieve or fetch.

 Determine the number of verbs used to describe the same exact action.

www.SoftwareMetrics.Com

239

Example (Project 1)

 Get Cycle Groups (new)  Get Bill Run Groups (new)  Get Bill Run Group Definition (new)  Get Cycle Groups For Renewal Run Groups (new)  Get Accounting Period Configurations (new)  Create/Update Accounting Configurations (new)  Get Aging Level Definition (new)  Create/Update Aging Level Definitions (new)  Get Accounting Period Aging Level Count (new) www.SoftwareMetrics.Com

240

Consistent Terminology

(Project 2)  Create/Update Retailer Product  Create/Update Price Structure  Get Price Structure  Create/Update Product Availability  Create/Update Price Structure Determinant  Get Eligible Products and Prices  Get Eligible Total Bill Discount  Get Total Bill Discount www.SoftwareMetrics.Com

241

Measurement of Requirements

 Number of verbs used to mean the same thing.

 Number of times verb is used to mean a different thing.

 Number of times verb used which did not conform to the “glossary.” www.SoftwareMetrics.Com
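A hypothetical sketch of these requirement measurements in Python; the glossary, the synonym list, and the sample sentence are made up for illustration and are not taken from the course material.

```python
from collections import Counter
import re

# Hypothetical glossary: the approved verb for each action, plus known synonyms
# that should be flagged as non-conforming uses of the same action.
GLOSSARY = {"get": "read", "create": "add", "update": "change", "delete": "delete"}
SYNONYMS = {"fetch": "read", "retrieve": "read", "insert": "add", "modify": "change"}

def verb_report(requirement_text):
    words = re.findall(r"[a-z]+", requirement_text.lower())
    used = Counter(w for w in words if w in GLOSSARY or w in SYNONYMS)
    non_glossary = {w: n for w, n in used.items() if w not in GLOSSARY}
    return used, non_glossary

used, defects = verb_report("Get cycle groups. Fetch bill run groups. Update aging levels.")
print(used)     # Counter({'get': 1, 'fetch': 1, 'update': 1})
print(defects)  # {'fetch': 1}  a second verb used for the same 'read' action
```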

242

How to test requirements

 Requirement test plans  Established acceptance criteria  Well defined “what is a requirement!”  Established (and enforced) standards for requirements.

 You test against a standard. www.SoftwareMetrics.Com

243

Question

 If we know there is a relationship between size and effort, then how good is an estimate when projects grow?

 How much, on average, do your projects grow?

 Knowing the unit cost is not constant, is it reasonable to charge a fixed price or fixed unit price?

www.SoftwareMetrics.Com

244

www.SoftwareMetrics.Com

245

Problems with Measurements

 Difficult to quantify anything disorganized  Time consuming  Benefits are not immediate  Consistency www.SoftwareMetrics.Com

246

Motivation of Metrics

1. Stabilizing Processes
2. Data for Estimating
3. Improve Quality
4. Cost Reduction
5. Schedule Reduction
6. Compare Methods
7. Compare Organizations
8. Better Controls

www.SoftwareMetrics.Com

247

Characteristics of Effective Measurement Programs

 Aligned with business objectives  Integrated with continuous process  Tied to decision making  Balanced metrics  Focused on measuring processes  Viewed as mission critical www.SoftwareMetrics.Com

248

Understanding Potential Scope Creep

 A project will grow at 1 percent per month during the entire development schedule.

 FP Analysis allows the ability to compare project size at the end of requirements, analysis, design, and implementation.

www.SoftwareMetrics.Com

249

Examples of Project Growth

 A project that is scheduled for 12 months will grow over 12 percent.

 A project that is scheduled for 48 months will grow over 48 percent.
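One way to read the 1%-per-month rule is as compound growth, which comes out slightly higher than 1% times the number of months. A small Python sketch of that interpretation (it is a reading of the rule, not the course's stated formula):

```python
def projected_growth(months, monthly_rate=0.01):
    """Total growth after compounding monthly_rate for the given number of months."""
    return (1 + monthly_rate) ** months - 1

for months in (12, 24, 48):
    print(f"{months} months -> about {projected_growth(months):.0%} growth")
# 12 months -> about 13% growth; 48 months -> about 61% growth
```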

www.SoftwareMetrics.Com

250

Calculating Function Points

 Three types of calculations  Development  Enhancement  Application or Baseline www.SoftwareMetrics.Com

251

Benefits of Function Point Analysis

 can be used to size software applications accurately.  can be counted by different people, at different times, to obtain the same measure within a reasonable margin of error.

 are easily understood by the non technical user.

 can be used to determine whether a tool, a language, an environment, is more productive when compared with others.

www.SoftwareMetrics.Com

252

Objectives of Function Point Analysis

 Measures functionality that the sophisticated user requests, receives and pays for.  Measures software development and maintenance independently of technology used for implementation.

 Simple enough to minimize measurement costs  A consistent measure www.SoftwareMetrics.Com

253

Function Points Analysis

 Is a structured technique of classifying components of a system.  Is a method to break systems into smaller components, so they can be better understood and analyzed.  Measures software by quantifying its functionality provided to the user based primarily on the logical design.

 Logical functionality from a sophisticated user view rather than a physical view.

 A standard method for measuring software development from the customer's point of view www.SoftwareMetrics.Com

254

Sophisticated User

 Defines Requirements  Participates in Acceptance Testing  Understands flow of Information  Subject Matter Experts (SMEs) www.SoftwareMetrics.Com

255

Issues of Function Point Analysis

 Does not accurately size “heavy algorithm” applications  Hard to understand and to use  Developed for MIS Type applications  Does not take into account many “other” factors www.SoftwareMetrics.Com

256

Counting Procedures

 Step 1 -- Determine Type of Count  Step 2 -- Establish the Boundary  Step 3 -- Identify and Rate Transactional Function Types (assume an average value)  Step 4 -- Identify and Rate Data Function Types (assume an average value)  Step 5 -- Determine the Value Adjustment Factor (assume a value of 1)  Step 6 -- Determine Adjusted Function Point Count www.SoftwareMetrics.Com

257

[Process flow: Determine type of Count; Identify Boundary; Count Transactional Function Types; Count Data Function Types; Determine Unadjusted Function Point Count; Determine Adjusted Function Point Count] www.SoftwareMetrics.Com

258

Boundary

[Diagram: the Application Boundary separates the User Domain and Other Applications from the application. External Inputs (EI), External Outputs (EO), and External Inquiries (EQ) cross the boundary; Internal Logical Files (ILF) sit inside it; External Interface Files (EIF) belong to Other Applications.] www.SoftwareMetrics.Com

259

Boundary

[Diagram: ILF A, ILF B, and ILF C inside the boundary; an EO crosses the boundary to Other Applications, where the data is an EIF]

www.SoftwareMetrics.Com

260

Components of Function Point Analysis

 Transactional Function  Data Function Types Types – Internal Logical Files (ILF) – External Inputs (EI) – External Interface Files (EIF) – External Outputs (EO) – External Inquiries (EQ) www.SoftwareMetrics.Com

261

Function Point Calculation Table

Functional Complexity (Low / Average / High):
External Inputs: __ x 3 = __ / __ x 4 = __ / __ x 6 = __
External Outputs: __ x 4 = __ / __ x 5 = __ / __ x 7 = __
External Inquiries: __ x 3 = __ / __ x 4 = __ / __ x 6 = __
Internal Logical Files: __ x 7 = __ / __ x 10 = __ / __ x 15 = __
External Interface Files: __ x 5 = __ / __ x 7 = __ / __ x 10 = __
Total Unadjusted Function Points: __
Multiplied by Value Adjustment Factor: __
Adjusted Function Points: __
www.SoftwareMetrics.Com
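A minimal Python sketch of this calculation table: multiply the number of components at each complexity by the weights above and sum to get the unadjusted function point count. The component counts in the example are made up.

```python
# IFPUG complexity weights from the calculation table above.
WEIGHTS = {
    "EI":  {"low": 3, "average": 4,  "high": 6},
    "EO":  {"low": 4, "average": 5,  "high": 7},
    "EQ":  {"low": 3, "average": 4,  "high": 6},
    "ILF": {"low": 7, "average": 10, "high": 15},
    "EIF": {"low": 5, "average": 7,  "high": 10},
}

def unadjusted_fp(counts):
    """counts example: {"EI": {"low": 3}, "ILF": {"average": 1}} -> weighted sum."""
    return sum(WEIGHTS[component][complexity] * n
               for component, by_complexity in counts.items()
               for complexity, n in by_complexity.items())

# Made-up example: 3 low EIs, 2 average EOs, 1 average ILF.
print(unadjusted_fp({"EI": {"low": 3}, "EO": {"average": 2}, "ILF": {"average": 1}}))  # 29
```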

262

High Level FPA Process

[Diagram: Application Documentation, Application Experts, and FPA Rules feed the FPA process, which produces the FP count.

FPA Major Processes: FPA for Transactional Function Types, FPA for Data Function Types, FPA for GSCs] www.SoftwareMetrics.Com

263

Function Point Domain

Transactions (EIs, EOs and EQs): the rating is dependent on transactions and files. Files (ILFs and EIFs): the rating is independent of transactions. www.SoftwareMetrics.Com

264

FPA for Transactional Function Types

Inputs: Application Documentation, Transaction Model, Data Model, FPA Rules (Transaction Rules), and the Functional Complexity Tables of Weight.

T1. Identify Transaction
T2. Determine Type of Transaction (EI, EO, EQ)
T3. Determine DET's & FTR's
T4. Classify as Low, Average or High
T5. Values Determined
T6. All Transactions are summed to obtain UFP for Transactional Function Types.

www.SoftwareMetrics.Com

265

[Diagram: an example application with EIs, an EQ, and an EO operating on ILF A, ILF B, and ILF C] www.SoftwareMetrics.Com

266

FPA for Data Function Types

Inputs: Application Documentation, Transaction Model, Data Model, FPA Rules (File Rules), and the Functional Complexity Tables of Weight.

F1. Identify logical groupings of information
F2. Determine if ILF or EIF
F3. Determine RET's & DET's
F4. Classify as Low, Average or High
F5. Values Determined
F6. All Files are summed to obtain UFP for Data Function Types.

www.SoftwareMetrics.Com

267

Chapter 2 Questions

Questions:

Is there any benefit to the sequence or order of counting function points? Are transactions independent or dependent on FTR’s (files referenced)?

What about FTR’s? Are they counted independent or dependent of Transactions?

www.SoftwareMetrics.Com

268

Identify components / Rate components

Component: rated using RET's, FTR's, and DET's
External Inputs: FTR's and DET's
External Outputs: FTR's and DET's
External Inquiries: FTR's and DET's
Internal Logical Files: RET's and DET's
External Interface Files: RET's and DET's

RET: Record Element Type; FTR: File Type Referenced; DET: Data Element Type www.SoftwareMetrics.Com

269

Identifying RET’s, DET’s and FTR’s

Record Element Type (RET): A RET is a user recognizable sub-group of data elements within an ILF or an EIF. It is best to look at logical groupings of data to help identify them.

Data Element Type (DET): A DET is a unique user recognizable, non-recursive field.

File Type Referenced (FTR): An FTR is a file type referenced by a transaction. An FTR must also be an internal logical file or an external interface file.

www.SoftwareMetrics.Com

270

Function Point Calculation Table

Functional Complexity (Low / Average / High):
External Inputs: __ x 3 = __ / __ x 4 = __ / __ x 6 = __
External Outputs: __ x 4 = __ / __ x 5 = __ / __ x 7 = __
External Inquiries: __ x 3 = __ / __ x 4 = __ / __ x 6 = __
Internal Logical Files: __ x 7 = __ / __ x 10 = __ / __ x 15 = __
External Interface Files: __ x 5 = __ / __ x 7 = __ / __ x 10 = __
Total Unadjusted Function Points: __
Multiplied by Value Adjustment Factor: __
Adjusted Function Points: __
www.SoftwareMetrics.Com

271

Boundary Defined

 The boundary must be drawn according to the sophisticated user’s point of view.

 The boundary indicates the border between the project or application being measured and the external applications.

 Once the border has been established, components can be classified, ranked and tallied.

www.SoftwareMetrics.Com

272

Chapter 3 Questions

Questions:

1. In theory, how does making the boundary too large impact a function point count?

2. What if the boundary is too small?

www.SoftwareMetrics.Com

273

Data Elements Types (DET’s)

 Transactional Functional Types -- data input fields, error messages, buttons, data fields on reports, and calculated values.

 Data Function Types -- unique user recognizable, non recursive fields (columns of information) www.SoftwareMetrics.Com

274

Inventory Report

University MousePads Inc.

Item / Description / Quantity
Hawk Pad / University of Iowa MousePad / 1,250
JayPad / University of Kansas MousePad / 500
HuskerPad / University of Nebraska MousePad / 3,000
Total MousePads: 4,750
www.SoftwareMetrics.Com

275

Inventory Report

University MousePads Inc.

[Same report annotated with callouts 1 through 4 marking the data element types]

Item / Description / Quantity
Hawk Pad / University of Iowa MousePad / 1,250
JayPad / University of Kansas MousePad / 500
HuskerPad / University of Nebraska MousePad / 3,000
Total MousePads: 4,750
www.SoftwareMetrics.Com

276

Data Element Types for GUI’s

 Radio Buttons  Check Boxes  Command Buttons  Result of a Pick List Box  Sound Bytes  Photographic Images www.SoftwareMetrics.Com

277

Data Element Types for Real Time and Embedded Systems

 Temperature  Lamp (on / off)  Channel  Pressure Units (psi / mBar)  Polarity (Normal / Reverse)  Pressure Type (Pressure / Vacuum) www.SoftwareMetrics.Com

278

Chapter 4 Questions

1. The following information is heard in the Rome Train Station. How many data elements are heard? The train arriving from Florence will arrive on Track 46 at 8:30 a.m.

The train arriving from Naples will arrive on Track 43 at 11:00 a.m.

2. The totals on a particular report change colors depending on whether the amount is above or below $500. For example, if the amount is -$250 it appears in red ($250), but if the amount is over $500 then the value appears in blue ($1,000).

How many data elements are represented by the number and by the color?

www.SoftwareMetrics.Com

279

External Inputs Defined

 Is an elementary process in which data or control information crosses the boundary from outside to inside.

 The data may come from a data input screen or another application.

 The data is used to maintain one or more internal logical files. Maintain means to add, change or delete information.

 An external input is rated based upon the number of data element types (DET's) and the number of files referenced (FTR's).

www.SoftwareMetrics.Com

280

EI ILF A ILF B ILF C EI & 1 FTR (ILF) EI & 1 FTR (ILF) EI & 1 FTR (ILF)

3 EI’s , 3 ILF’s www.SoftwareMetrics.Com

281

ILF A ILF B ILF C 1 EI and 2 FTR

www.SoftwareMetrics.Com

282

ILF A ILF B ILF C 1 EI and 2 FTR

www.SoftwareMetrics.Com

283

Control EI

ILF A ILF B ILF C 1 EI and 0 FTR

www.SoftwareMetrics.Com

284

Control EI

ILF A ILF B ILF C 1 EI and 0 FTR

www.SoftwareMetrics.Com

285

What Makes External Inputs Unique?

 Unique Processing Logic  Different ILFs and EIFs updated and referenced  Different Calculations and/or Algorithms  Data Elements identified are different from other external inputs for the application.

www.SoftwareMetrics.Com

286

Examples of External Inputs

Business Data: customer name, address, phone, and so forth that updates an internal logical file (ILF).

Control Data: sort sequence, printer port, number of copies; may or may not update an ILF.

Rules Data: number of days before a customer is placed for collection; updates an ILF.

www.SoftwareMetrics.Com

287

Examples of External Inputs Real Time Systems

 Hardware to Software states  Operator Controls  Volume Controls  Sensor Readings  Radio Frequencies  Standard and Limit Settings (Alarm Settings)  Other Subsystems Outputs  Initialization Files (control input) www.SoftwareMetrics.Com

288

Examples of Invalid External Inputs

 Log on Screens -- counted as an external inquiry  Menus -- impacts usability not functionality  Navigational Screens -- impacts usability not functionality  Reference Information -- counted as external interface file www.SoftwareMetrics.Com

289

Rating External Inputs

Data Elements (DET's): 1-4 / 5-15 / Greater than 15
File Types Referenced (FTR's):
Less than 2: Low / Low / Average
2: Low / Average / High
More than 2: Average / High / High
www.SoftwareMetrics.Com
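A small Python sketch of the lookup implied by this matrix; the band boundaries and the Low/Average/High cells come straight from the table above.

```python
def rate_external_input(ftrs, dets):
    """Look up EI complexity from the FTR/DET matrix above."""
    ftr_band = 0 if ftrs < 2 else 1 if ftrs == 2 else 2      # <2, 2, >2
    det_band = 0 if dets <= 4 else 1 if dets <= 15 else 2    # 1-4, 5-15, >15
    matrix = [["Low", "Low", "Average"],
              ["Low", "Average", "High"],
              ["Average", "High", "High"]]
    return matrix[ftr_band][det_band]

print(rate_external_input(ftrs=1, dets=10))   # Low
print(rate_external_input(ftrs=3, dets=16))   # High
```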

290

Examples of Data Elements for an External Input

Data Input Fields: customer name and other business information

GUI: radio buttons and check boxes

Calculated Values that are stored

Error Messages: a transaction was not completed www.SoftwareMetrics.Com

291

Identification Rules for an EI

 Data is received from outside the application boundary.

 The data in an ILF is maintained through an elementary process.

 The process is self contained and leaves the business of the application being counted in a consistent state.

 Processing Logic must be unique.

www.SoftwareMetrics.Com

292

Chapter 5 Questions

www.SoftwareMetrics.Com

293

External Outputs Defined

 An elementary process in which derived data passes across the boundary from inside to outside. The data creates reports or output files sent to other applications. These reports and files are created from one or more internal logical files and/or external interface files.

 Derived Data is data that is processed beyond direct retrieval and editing of information from internal logical files or external interface files. Derived data is usually the result of edits, algorithms, or calculations.

www.SoftwareMetrics.Com

294

What Makes an External Output Unique?

 Processing Logic Different  Different ILF’s and EIF’s read and referenced  Unique Set of Calculations www.SoftwareMetrics.Com

295

[Diagram: an External Output (EO) reads ILF A and ILF B and produces derived data (A/B = C, A*B = D, Ƒ(A,B) = X); 1 EO and 2 FTR's]

www.SoftwareMetrics.Com

296

Examples of External Outputs

 EO’s almost always contain business data

 Notification Messages are considered EO’s. A notification message is the result of some business logic processing.

 Textual Reports  Graphical Reports  Reports produced on different media  Electronic Outputs to other applications www.SoftwareMetrics.Com

297

[Diagram: an EO reads ILF A and ILF B and produces derived data; External Output (EO) and 2 FTR's] www.SoftwareMetrics.Com

298

Examples of Data Elements for an External Output

 Error Messages  Calculated values on a report  Values on a report that are read from a ILF or EIF  Non recursive values  Generally, do not count report headings (literals) as data elements unless they are dynamic.

www.SoftwareMetrics.Com

299

[Diagram: two EOs reading from ILF A, ILF B, and ILF C; 2 EO's] www.SoftwareMetrics.Com

300

External Outputs Real Time Systems

 Alarms  Displays to Operator Panels  Communication sent to Hardware Devices  Electronic transmission to other sub-systems  Graphical displays www.SoftwareMetrics.Com

301

[Diagram: an EO reads ILF A and ILF B and produces derived data plus an error message; 1 EO, 2 FTR's & 4 DET's (Blue, Yellow, Green, & the Error Message)] www.SoftwareMetrics.Com

302

Rating External Outputs

Data Elements (DET's): 1-5 / 6-19 / Greater than 19
File Types Referenced (FTR's):
Less than 2: Low / Low / Average
2 or 3: Low / Average / High
More than 3: Average / High / High
www.SoftwareMetrics.Com

303

Examples of Invalid External Outputs

 Error Messages  Reports that do not contain derived data  Output side of an inquiry  Undefined “Ad-Hoc” reports  Confirmation Message www.SoftwareMetrics.Com

304

Identification Rules for EO’s

 The process sends data or control information external to the application’s boundary.

 The data or control information is sent via an elementary process.

 The process is self contained and leaves the business of the application in a consistent state.

 Processing logic must be unique.

www.SoftwareMetrics.Com

305

Chapter 6 Questions

www.SoftwareMetrics.Com

306

External Inquiries Defined

 An elementary process with both input and output components that result in data retrieval from one or more internal logical files and/or external interface files.  The input process does not maintain any internal logical files.

 The output side does not contain derived data.

 Unique processing logic from other EQ’s (edits, a reference to or use of an ILF or EIF).

www.SoftwareMetrics.Com

307

Examples of External Inquiries

 Log On screens  Request for a specific record  Help request and answer  Listing of information www.SoftwareMetrics.Com

308

[Diagram: an EQ reads ILF A and ILF B; Ƒ(A,B) = X] www.SoftwareMetrics.Com

309

[Diagram: an EQ referencing ILF A, ILF B, and ILF C; Ƒ(A,B) = X; 1 EQ, 3 FTR's & 3 DET's] www.SoftwareMetrics.Com

310

Examples of External Inquiries Real Time Systems

 Request Current Parameter Settings  Request Current Hardware State  Display of Stored Data  Current Standards and Limits www.SoftwareMetrics.Com

311

Searching

[Diagram: a search request (input side) reads ILF A and ILF B; Ƒ(A,B) = X; the search field = 1 DET on the input side; EQ] www.SoftwareMetrics.Com

312

[Diagram: the request (input side) reads ILF A and ILF B; Ƒ(A,B) = X; the Not Found message = 1 DET on the output side; EQ] www.SoftwareMetrics.Com

313

Searching / Not Found

[Diagram: the EQ request (input side) reads ILF A and ILF B; Ƒ(A,B) = X; Total DET's = 4 DET (input + output side)] www.SoftwareMetrics.Com

314

Examples of Data Elements for an External Inquiry

 Input Side – a customer name to search on – a click on a scroll bar  Output Side – a listing of customers by name – display of a particular customer www.SoftwareMetrics.Com

315

Examples of Invalid External Inquiries

 Screen data that contains derived data  Navigational Screens  Error/Confirmation Messages www.SoftwareMetrics.Com

316

Identification Rules for an EQ

 An input request enters the application boundary.

 Output results exit the boundary.

 Data is retrieved.

 The data retrieved does not contain derived data.

 The input request and output results together make up a process that is considered an elementary process.   Process does not maintain or update an ILF.

Must be unique from other EQ’s.

www.SoftwareMetrics.Com

317

Chapter 7 Questions

www.SoftwareMetrics.Com

318

New Customer Exercise www.SoftwareMetrics.Com

319

www.SoftwareMetrics.Com

320

www.SoftwareMetrics.Com

321

Transaction Review

For each description or activity, mark whether it applies to EI, EO, or EQ transactions:

– DET's retrieved from FTR's
– Sorting of Data
– Updates an ILF
– Maintains an ILF
– Contains Derived Data
– Info from outside to inside
– Shares complexity matrix table
– Are valued the same for Low, Avg, and High
– Never Contains Derived Data
– At least one FTR is referenced
– Information from inside to outside

www.SoftwareMetrics.Com

322

Transaction Answers

Description or Activity DET’s retrieved from FTR’s Sorting of Data Updates an ILF Maintains an ILF Contains Derived Data Info from outside to inside Shares complexity matrix table Are valued the same for Low, Avg, and High Never Contains Derived Data At least on FTR is referenced Information from inside to outside X X X

EI Transactions EO EQ

X X X X X X X X X X X X X X X X X X

www.SoftwareMetrics.Com

323

Internal Logical Files Defined

 A user identifiable group of logically related data or control information that resides entirely within the application's boundary and is maintained through External Inputs. www.SoftwareMetrics.Com

324

Examples of Internal Logical Files

 Business Data (customer name, tax id, address)  Control Data (color, copies, printer port)  Rules Based Data (meta data, payment criteria, tax zones) www.SoftwareMetrics.Com

325

Examples of Internal Logical Files Real Time Systems

 Log Files  Diagnostic Files  Hardware Parameter Settings  Initialization Files  Data Files www.SoftwareMetrics.Com

326

Rating Internal Logical Files

Data Elements (DET's): 1-19 / 20-50 / Greater than 50
Record Element Types (RET's):
1: Low / Low / Average
2 to 5: Low / Average / High
More than 5: Average / High / High
www.SoftwareMetrics.Com
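The same lookup idea, sketched in Python from the matrix above for data functions (the EIF table later in the material uses the same RET/DET bands; only the weights differ between ILFs and EIFs).

```python
def rate_data_function(rets, dets):
    """Look up ILF/EIF complexity from the RET/DET matrix above."""
    ret_band = 0 if rets == 1 else 1 if rets <= 5 else 2     # 1, 2-5, >5
    det_band = 0 if dets <= 19 else 1 if dets <= 50 else 2   # 1-19, 20-50, >50
    matrix = [["Low", "Low", "Average"],
              ["Low", "Average", "High"],
              ["Average", "High", "High"]]
    return matrix[ret_band][det_band]

print(rate_data_function(rets=2, dets=18))   # Low
print(rate_data_function(rets=6, dets=55))   # High
```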

327

ILF Identification Rules

 Group of data or control information is logical, user identifiable, and fulfills specific user requirements.

 Data is maintained within the application boundary.

 Data is modified via an elementary process (one or more EI’s).  Has not been counted as an EIF for the application.

www.SoftwareMetrics.Com

328

Examples of Record Elements

 A RET is a user recognizable sub-group of data elements within an ILF or an EIF. It is best to look at logical groupings of data to help identify them.

 Groupings can be either optional or mandatory www.SoftwareMetrics.Com

329

1 ILF, 2 RET www.SoftwareMetrics.Com

2 ILF, 1 RET each 330

Temporary

ILF

1 EI and 1 ILF

www.SoftwareMetrics.Com

331

DET’s & Shared ILF’s

[Diagram: a shared student file with the fields Name, Address, Phone, Date of birth, Hobbies, and Favorite color. Application A counts 1 ILF (student) with 3 DETs; Application B counts 1 ILF (student) with 4 DETs; each application counts only the data elements it actually uses.]

www.SoftwareMetrics.Com

332

Chapter 9 Questions

www.SoftwareMetrics.Com

333

Cache?

ILF B EQ EO ILF C 1 EQ 1 EO

www.SoftwareMetrics.Com

334

External Interface Files Defined

 A user identifiable group of logically related data that resides entirely outside the application's boundary and is not maintained by the application.

 An external interface file is an internal logical file for another application. www.SoftwareMetrics.Com

335

Examples of External Interface Files

 Reference Information  Edit data  Control Information  Information that does not update an internal logical file www.SoftwareMetrics.Com

336

Examples of External Interface Files Real Time Systems

 Other systems hardware state  Data that belongs to another system or application  Control Information  Information that is used but that does not update any internal logical file www.SoftwareMetrics.Com

337

[Diagram: Application A contains ILF A, ILF B, and ILF C and sends an EO across the boundary to Application B, where the data is an EIF; count 1 ILF, 1 EIF, and 1 EO with 2 FTR's] www.SoftwareMetrics.Com

338

Application B ILF EI

ILF

ILF EI www.SoftwareMetrics.Com

339

Rating External Interface Files

Data Elements (DET's): 1-19 / 20-50 / Greater than 50
Record Element Types (RET's):
1: Low / Low / Average
2 to 5: Low / Average / High
More than 5: Average / High / High
www.SoftwareMetrics.Com

340

Function Point Calculation Table

Functional Complexity (Low / Average / High):
External Inputs: __ x 3 = __ / __ x 4 = __ / __ x 6 = __
External Outputs: __ x 4 = __ / __ x 5 = __ / __ x 7 = __
External Inquiries: __ x 3 = __ / __ x 4 = __ / __ x 6 = __
Internal Logical Files: __ x 7 = __ / __ x 10 = __ / __ x 15 = __
External Interface Files: __ x 5 = __ / __ x 7 = __ / __ x 10 = __
Total Unadjusted Function Points: __
Multiplied by Value Adjustment Factor: x 1.0
Adjusted Function Points: __
www.SoftwareMetrics.Com

341

EIF Identification Rules

 A group of data or control information that is logical, user identifiable, and fulfills specific user requirements.

 Group of data is referenced by, and external to, the application being counted.

 Group of data has not been counted as an ILF by the application.

www.SoftwareMetrics.Com

342

General System Characteristics

 There are 14 general system characteristics (GSC’s) that rate the general functionality of the application being counted.  Each characteristic has associated descriptions that help determine the degrees of influence of the characteristics.

www.SoftwareMetrics.Com

343

Boundary

[Diagram: the Application Boundary separates the User Domain and Other Applications from the application. External Inputs (EI), External Outputs (EO), and External Inquiries (EQ) cross the boundary; Internal Logical Files (ILF) sit inside it; External Interface Files (EIF) belong to Other Applications.] www.SoftwareMetrics.Com

344

Sophisticated User

 Defines Requirements  Participates in Acceptance Testing  Understands flow of Information  Subject Matter Experts (SMEs) www.SoftwareMetrics.Com

345

[Process flow: Determine type of Count; Identify Boundary; Count Transactional Function Types; Count Data Function Types; Determine Unadjusted Function Point Count; Determine Adjusted Function Point Count] www.SoftwareMetrics.Com

346

General System Characteristics

 There are 14 general system characteristics (GSC’s) that rate the general functionality of the application being counted.  Each characteristic has associated descriptions that help determine the degrees of influence of the characteristics.

www.SoftwareMetrics.Com

347

GSC Rating Scale

 0 Not present or no influence  1 Incidental influence  2 Moderate influence  3 Average influence  4 Significant influence  5 Strong influence throughout www.SoftwareMetrics.Com

348

Value Adjustment Factor

 Once all the 14 GSC’s have been rated, they should be tabulated using the IFPUG Value Adjustment Equation (VAF) --

VAF = 0.65 + [( ∑ Ci) / 100]

Where: Ci = the degree of influence for each GSC; i = 1 to 14, representing each GSC; ∑ = the summation of all 14 GSC's.

www.SoftwareMetrics.Com
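A short Python sketch of the VAF equation and its application to an unadjusted count; the 14 example ratings are made up.

```python
def value_adjustment_factor(gsc_ratings):
    """VAF = 0.65 + (sum of the 14 GSC ratings, each 0-5) / 100."""
    assert len(gsc_ratings) == 14 and all(0 <= c <= 5 for c in gsc_ratings)
    return 0.65 + sum(gsc_ratings) / 100

def adjusted_fp(unadjusted, gsc_ratings):
    return unadjusted * value_adjustment_factor(gsc_ratings)

ratings = [3, 1, 2, 0, 3, 5, 4, 3, 1, 0, 1, 2, 0, 2]        # made-up example ratings
print(round(value_adjustment_factor(ratings), 2))            # 0.65 + 27/100 = 0.92
print(round(adjusted_fp(250, ratings), 1))                   # 250 UFP x 0.92 = 230.0
```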

349

GSC's at a Glance

GSC / Description
1. Data Communications: How many communication facilities are there to aid in the transfer or exchange of information with the application or system?
2. Distributed DP: How are distributed data and processing functions handled?
3. Performance: Did the user require response time or throughput?
4. Heavily Used Config.: How heavily used is the current hardware platform where the application will be executed?
5. Transaction Rate: How frequently are transactions executed daily, weekly, monthly, etc.?
6. Online Data Entry: What percentage of the information is entered online?
7. End-User Efficiency: Was the application designed for end-user efficiency?
8. Online Update: How many ILF's are updated by online transactions?
9. Complex Processing: Does the application have extensive logical or mathematical processing?

www.SoftwareMetrics.Com

350

GSC’s at a Glance - continued

GSC / Description
10. Reusability: Was the application developed to meet one or many user's needs?
11. Installation Ease: How difficult is conversion and installation?
12. Operational Ease: How effective and/or automated are start-up, back-up, and recovery procedures?
13. Multiple Sites: Was the application specifically designed to be installed at multiple sites for multiple organizations?
14. Facilitate Change: Was the application specifically designed to facilitate change?

www.SoftwareMetrics.Com

351

Data Communications GSC

Score / Description to Determine Degree of Influence
0: Application is pure batch processing or a standalone PC.
1: Application is batch but has remote data entry or remote printing.
2: Application is batch but has remote data entry and remote printing.
3: Application includes online data collection or TP (teleprocessing) front end to a batch process or query system.
4: Application is more than a front-end, but supports only one type of TP communications protocol.
5: Application is more than a front-end, and supports more than one type of TP communications protocol.

Example Rating: An application that allows query of the application via a web-based solution and local access would receive a value of 3. An application that allows for the update of ILF's via the Internet and local update would receive a value of 5.

www.SoftwareMetrics.Com

352

Distributed DP GSC

Score / Description to Determine Degree of Influence
0: Application does not aid the transfer of data or processing function between components of the system.
1: Application prepares data for end-user processing on another component of the system, such as PC spreadsheets and PC DBMS.
2: Data is prepared for transfer, then is transferred and processed on another component of the system (not for end-user processing).
3: Distributed processing and data transfer are online and in one direction only.
4: Distributed processing and data transfer are online and in both directions.
5: Processing functions are dynamically performed on the most appropriate component of the system.

Example Rating:
* Copying files from a mainframe to a local PC, or copying files from the Internet or an intranet, would receive a value of 2.
* Reading via a client or via the Internet or an intranet would receive a value of 3.
* Reading and updating via the Internet or an intranet would receive a value of 4.
* An application that, depending on available resources, processes either locally, on a server, on an intranet, or on the Internet would receive a value of 5.

www.SoftwareMetrics.Com

353

Performance GSC

Score / Description to Determine Degree of Influence
0: No special performance requirements were stated by the user.
1: Performance and design requirements were stated and reviewed but no special actions were required.
2: Response time or throughput is critical during peak hours. No special design for CPU utilization was required. Processing deadline is for the next business day.
3: Response time or throughput is critical during all business hours. No special design for CPU utilization was required. Processing deadline requirements with interfacing systems are constraining.
4: In addition, stated user performance requirements are stringent enough to require performance analysis tasks in the design phase.
5: In addition, performance analysis tools were used in the design, development, and/or implementation phases to meet the stated user performance requirements.

www.SoftwareMetrics.Com

354

Heavily Used Config. GSC

Score / Description to Determine Degree of Influence
0: No explicit or implicit operational restrictions are included.
1: Some security or timing considerations are included.
2: Operational restrictions do exist, but are less restrictive than a typical application. No special effort is needed to meet the restrictions.
3: Specific processor requirements for a specific piece of the application are included.
4: Stated operation restrictions require special constraints on the application in the central processor or a dedicated processor.
5: In addition, there are special constraints on the application in the distributed components of the system.

www.SoftwareMetrics.Com

355

Transaction Rate GSC

Score / Description to Determine Degree of Influence
0: No peak transaction period is anticipated.
1: A peak transaction period (e.g., monthly, quarterly, seasonally, annually) is anticipated.
2: A weekly peak transaction period is anticipated.
3: A daily peak transaction period is anticipated.
4: High transaction rate(s) stated by the user in the application requirements or service level agreements are high enough to require performance analysis tasks in the design phase.
5: High transaction rate(s) stated by the user in the application requirements or service level agreements are high enough to require performance analysis tasks and, in addition, require the use of performance analysis tools in the design, development, and/or installation phases.

www.SoftwareMetrics.Com

356

Online Data Entry GSC

Score / Description to Determine Degree of Influence
0: All transactions are processed in batch mode.
1: 1% to 7% of transactions are interactive data entry.
2: 8% to 15% of transactions are interactive data entry.
3: 16% to 23% of transactions are interactive data entry.
4: 24% to 30% of transactions are interactive data entry.
5: More than 30% of transactions are interactive data entry.

www.SoftwareMetrics.Com

357

End User Efficiency GSC

End User Design Components:

• Navigational aids (for example, function keys, jumps, dynamically generated menus) • Menus • Online help and documents • Automated cursor movement • Scrolling • Remote printing (via online transactions) • Assigned function keys • Batch jobs submitted from online transactions • Cursor selection of screen data • Heavy use of reverse video, highlighting, colors underlining, and other indicators • Hard copy user documentation of online transactions • Mouse interface • Pop-up windows.

• As few screens as possible to accomplish a business function • Bilingual support (supports two languages; count as four items) • Multilingual support (supports more than two languages; count as six items) .

www.SoftwareMetrics.Com

358

End User Efficiency GSC

Score / Description to Determine Degree of Influence
0: None of the above.
1: One to three of the above.
2: Four to five of the above.
3: Six or more of the above, but there are no specific user requirements related to efficiency.
4: Six or more of the above, and stated requirements for end-user efficiency are strong enough to require design tasks for human factors to be included (for example, minimize key strokes, maximize defaults, use of templates).
5: Six or more of the above, and stated requirements for end-user efficiency are strong enough to require use of special tools and processes to demonstrate that the objectives have been achieved.

www.SoftwareMetrics.Com

359

Online Update GSC

Score / Description to Determine Degree of Influence
0: None.
1: Online update of one to three control files is included. Volume of updating is low and recovery is easy.
2: Online update of four or more control files is included. Volume of updating is low and recovery is easy.
3: Online update of major internal logical files is included.
4: In addition, protection against data loss is essential and has been specially designed and programmed in the system.
5: In addition, high volumes bring cost considerations into the recovery process. Highly automated recovery procedures with minimum operator intervention are included.

www.SoftwareMetrics.Com

360

Complex Processing GSC

Complex Processing Components:

• Sensitive control.

• Extensive logical processing.

• Extensive mathematical processing.

• Much exception processing.

• Complex processing to handle multiple I/O.

Score / Description to Determine Degree of Influence
0: None of the above.
1: Any one of the above.
2: Any two of the above.
3: Any three of the above.
4: Any four of the above.
5: All five of the above.

www.SoftwareMetrics.Com

361

Reusability GSC

Score As – Descriptions to Determine Degree of Influence

0 – No reusable code.
1 – Reusable code is used within the application.
2 – Less than 10% of the application considered more than one user's needs.
3 – Ten percent (10%) or more of the application considered more than one user's needs.
4 – The application was specifically packaged and/or documented to ease re-use, and the application is customized by the user at source code level.
5 – The application was specifically packaged and/or documented to ease re-use, and the application is customized for use by means of user parameter maintenance.

www.SoftwareMetrics.Com

362

Installation Ease GSC

Score As – Descriptions to Determine Degree of Influence

0 – No special considerations were stated by the user, and no special setup is required for installation.
1 – No special considerations were stated by the user, but special setup is required for installation.
2 – Conversion and installation requirements were stated by the user, and conversion and installation guides were provided and tested. The impact of conversion on the project is not considered to be important.
3 – Conversion and installation requirements were stated by the user, and conversion and installation guides were provided and tested. The impact of conversion on the project is considered to be important.
4 – In addition to 2 above, automated conversion and installation tools were provided and tested.
5 – In addition to 3 above, automated conversion and installation tools were provided and tested.

www.SoftwareMetrics.Com

363

Operational Ease GSC

Score As – Descriptions to Determine Degree of Influence

0 – No special operational considerations other than the normal back-up procedures were stated by the user.
1-4 – One, some, or all of the following items apply to the application. Select all that apply. Each item has a point value of one, except as noted otherwise.
  • Effective start-up, back-up, and recovery processes were provided, but operator intervention is required.
  • Effective start-up, back-up, and recovery processes were provided, but no operator intervention is required (count as two items).
  • The application minimizes the need for tape mounts.
  • The application minimizes the need for paper handling.
5 – The application is designed for unattended operation. Unattended operation means no operator intervention is required to operate the system other than to start up or shut down the application. Automatic error recovery is a feature of the application.

www.SoftwareMetrics.Com

364

Multiple Sites GSC

Score As – Descriptions to Determine Degree of Influence

0 – User requirements do not require considering the needs of more than one user/installation site.
1 – Needs of multiple sites were considered in the design, and the application is designed to operate only under identical hardware and software environments.
2 – Needs of multiple sites were considered in the design, and the application is designed to operate only under similar hardware and/or software environments.
3 – Needs of multiple sites were considered in the design, and the application is designed to operate under different hardware and/or software environments.
4 – Documentation and support plan are provided and tested to support the application at multiple sites, and the application is as described by 1 or 2.
5 – Documentation and support plan are provided and tested to support the application at multiple sites, and the application is as described by 3.

www.SoftwareMetrics.Com

365

Facilitate Change GSC

Facilitate Change Characteristics:

• Flexible query and report facility is provided that can handle simple requests.

• Flexible query and report facility is provided that can handle requests of average complexity.

• Flexible query and report facility is provided that can handle complex requests.

• Business control data is kept in tables that are maintained by the user with online interactive processes, but changes take effect only on the next business day.

• Business control data is kept in tables that are maintained by the user with online interactive processes, and the changes take effect immediately (count as two items).

www.SoftwareMetrics.Com

366

Facilitate Change GSC

Score As – Descriptions to Determine Degree of Influence

0 – None of the above.
1 – Any one of the above.
2 – Any two of the above.
3 – Any three of the above.
4 – Any four of the above.
5 – All five of the above.

www.SoftwareMetrics.Com

367

Chapter 11 Questions

www.SoftwareMetrics.Com

368

Calculating Adjusted FP Count

 Development
– DFP = (UFP + CFP) * VAF
 Where:
• DFP is the development function point count
• UFP is the unadjusted function point count
• CFP is the unadjusted function point count added by the conversion
• VAF is the value adjustment factor (assume = 1)

www.SoftwareMetrics.Com

369
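For readers who like to see the arithmetic spelled out, here is a minimal Python sketch of the development formula above. The function name and the example counts are illustrative assumptions, not figures from the course.

def development_function_points(ufp: float, cfp: float, vaf: float = 1.0) -> float:
    """DFP = (UFP + CFP) * VAF, per the slide above.
    ufp: unadjusted function point count of the delivered functionality.
    cfp: unadjusted function point count added by conversion functionality.
    vaf: value adjustment factor (the slides assume 1.0)."""
    return (ufp + cfp) * vaf

# Example with made-up counts: 320 UFP delivered plus 25 UFP of conversion functionality
print(development_function_points(320, 25))        # -> 345.0
print(development_function_points(320, 25, 1.07))  # -> 369.15 with a VAF of 1.07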

Calculating Adjusted Function Point Count

 Enhancement Project Function Point Calculations

EFP = [(ADD + CHGA + CFP) * VAFA] + (DEL * VAFB)

EFP is the enhancement project function point count.

ADD is the unadjusted function point count of those functions that were added by the enhancement project.

CHGA is the unadjusted function point count of those functions that were modified by the enhancement project. This number reflects the functions after the modifications.

CFP is the function point count added by the conversion.

VAFA is the value adjustment factor of the application after the enhancement project.

DEL is the unadjusted function point count of those functions that were deleted by the enhancement project.

VAFB is the value adjustment factor of the application before the enhancement project.

www.SoftwareMetrics.Com

370
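The enhancement formula above is easy to mis-apply because the two VAF terms differ. The Python sketch below (my own illustration, with hypothetical counts) computes it directly; with CFP = 0 and both VAFs set to 1 it reduces to the simplified form on the next slide.

def enhancement_function_points(add: float, chg_a: float, delete: float,
                                cfp: float = 0.0,
                                vaf_after: float = 1.0,
                                vaf_before: float = 1.0) -> float:
    """EFP = [(ADD + CHGA + CFP) * VAFA] + (DEL * VAFB), per the slide above.
    'delete' is used instead of DEL because del is a Python keyword."""
    return (add + chg_a + cfp) * vaf_after + delete * vaf_before

# Hypothetical enhancement: 40 FP added, 25 FP changed (counted after the change),
# 10 FP deleted, no conversion functionality, both VAFs assumed to be 1.
print(enhancement_function_points(add=40, chg_a=25, delete=10))   # -> 75.0
# With CFP = 0 and VAFA = VAFB = 1 this is simply ADD + CHGA + DEL.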

EFP Simplified

EFP = [(ADD + CHGA + CFP) * VAFA] + (DEL*VAFB)

 Assume that CFP = 0 and VAFA = VAFB = 1
 Hence, EFPs = ADD + CHGA + DEL

www.SoftwareMetrics.Com

371

Application After Enhancement Project

 AFP = [(UFPB + ADD + CHGA) - (CHGB + DEL)] * VAFA
 or AFP = (UFPB + ADD + CHGA - CHGB - DEL) * VAFA
 Where:
– UFPB = Unadjusted Function Points Before
– ADD = Added Function Points
– CHGA = Change After
– CHGB = Change Before
– DEL = Deleted
– VAFA = Value Adjustment Factor After Enhancement (assume = 1.0)

www.SoftwareMetrics.Com

372
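As a companion to the formula above, this Python sketch (illustrative names and numbers, not from the slides) updates an application's function point count after an enhancement project.

def application_fp_after_enhancement(ufp_before: float, add: float,
                                     chg_after: float, chg_before: float,
                                     delete: float,
                                     vaf_after: float = 1.0) -> float:
    """AFP = (UFPB + ADD + CHGA - CHGB - DEL) * VAFA, per the slide above."""
    return (ufp_before + add + chg_after - chg_before - delete) * vaf_after

# Hypothetical application of 500 UFP: 40 FP added, changed functions grow
# from 20 FP to 25 FP, 10 FP deleted, VAF after the enhancement assumed 1.0.
print(application_fp_after_enhancement(500, 40, 25, 20, 10))   # -> 535.0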

AFP Simplified

 AFP = (UFPB + ADD + CHGA - CHGB - DEL) * VAFA
– Assume CHGA = CHGB and DEL = 0
– Then AFPs = (UFPB + ADD) * VAFA

www.SoftwareMetrics.Com

373

Application Function Point Count

AFP = ADD * VAF

Where:

AFP is the initial application function point count.

ADD is the unadjusted function point count of those functions that were installed by the development project.

VAF is the value adjustment factor of the application (assume = 1).

www.SoftwareMetrics.Com

374
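A one-line Python sketch of the initial application count, again with a hypothetical figure rather than one from the course:

def initial_application_fp(add: float, vaf: float = 1.0) -> float:
    """AFP = ADD * VAF for the first (baseline) application count."""
    return add * vaf

# Example: a development project that installed 345 UFP, with VAF assumed = 1
print(initial_application_fp(345))   # -> 345.0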

Importance of Measurement

"Weights and Measures may be ranked among the necessaries of life to every individual of human society… They are necessary to every occupation of human industry…The knowledge of them..is among the first elements of education, and is often learned by those who learn nothing else, not even to read and write.

John Quincy Adams 1821

www.SoftwareMetrics.Com

375