Function Point Analysis
Function Point Analysis (FPA) What is Function Point Analysis (FPA)? • Function points are a standard unit of measure that represents the functional size of a software application. • FPA is designed to estimate and measure the time, and thereby the cost, of developing new software applications and maintaining existing ones. • It is also useful for comparing projects and highlighting opportunities for productivity improvement in software development. • It was developed by A. J. Albrecht of IBM in 1979.
Objectives of Function Point Analysis • Measure software by quantifying the functionality requested by and provided to the customer. • Measure software development and maintenance independently of technology used for implementation. • Measure software development and maintenance consistently across all projects and organizations.
Important FPA notes • Measured from the user's perspective • Technology-independent • Low cost • Repeatable • Works well with use cases
FPA • How is Function Point Analysis done? • Working from the project design specifications, the following system functions are measured (counted): • External Inputs (EI) • External Outputs (EO) • Internal Logical Files (ILF) • External Inquiries (EQ) • External Interface Files (EIF)
FPA Diagram: the application boundary, with External Inputs, External Outputs, and Inquiries crossing it, Internal Logical Files inside it, and External Interface Files outside it.
FPA • EI: An elementary process in which data crosses the boundary from outside to inside. • Data may come from a data input screen or from another application • Business data: does update an ILF • Control data: does not update an ILF • EO: An elementary process in which derived data passes across the boundary from inside to outside. • Creates reports • Creates output files sent to other applications • Derived from ILFs and EIFs
FPA • EQ: An elementary process with both input and output components that results in data retrieval from one or more ILFs and EIFs • Output is sent outside the application boundary • The input side does not update an ILF • The output side does not contain derived data • ILF: A user-identifiable group of logically related data that resides entirely within the application boundary and is maintained through External Inputs • EIF: A user-identifiable group of logically related data that is used for reference purposes only • Resides entirely outside the application • Maintained by another application • It is an ILF for that other application
Unadjusted FP Calculation • Each function is counted and weighted by its complexity (see the sketch below) • Complexity is rated in three categories: • Simple • Average • Complex • Each of the five functional components has its own complexity weight matrix based on the level of complexity
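As a rough illustration, the sketch below tallies an unadjusted count from component totals and a weight table. The weights are the commonly cited IFPUG Simple/Average/Complex values, and the example counts are hypothetical; an actual count should follow the counting manual your organization uses.

```python
# Minimal sketch of an unadjusted function point (UFP) count.
# Weights are the commonly cited IFPUG values (Simple / Average / Complex).

WEIGHTS = {
    "EI":  {"simple": 3, "average": 4,  "complex": 6},
    "EO":  {"simple": 4, "average": 5,  "complex": 7},
    "EQ":  {"simple": 3, "average": 4,  "complex": 6},
    "ILF": {"simple": 7, "average": 10, "complex": 15},
    "EIF": {"simple": 5, "average": 7,  "complex": 10},
}

def unadjusted_fp(counts):
    """counts: {component: {complexity: occurrences}} -> UFP total."""
    return sum(
        WEIGHTS[comp][cx] * n
        for comp, by_cx in counts.items()
        for cx, n in by_cx.items()
    )

# Hypothetical counts for a small application:
counts = {
    "EI":  {"simple": 6, "average": 2, "complex": 1},
    "EO":  {"simple": 4, "average": 1},
    "EQ":  {"simple": 3},
    "ILF": {"simple": 2, "average": 1},
    "EIF": {"simple": 1},
}
print(unadjusted_fp(counts))  # 32 + 21 + 9 + 24 + 5 = 91 UFP
```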
Degrees of Influence (DI) • General system characteristics, each ranked by its degree of influence from 0 to 5: • Data communications • Distributed functions • Performance objectives • Heavily used configuration • Transaction rate • On-line data entry • End-user efficiency • On-line update • Complex processing • Reusability • Installation ease • Operational ease • Multiple sites • Facilitate change • Degree of Influence scale: • Not present, or no influence = 0 • Insignificant influence = 1 • Moderate influence = 2 • Average influence = 3 • Significant influence = 4 • Strong influence throughout = 5
FP Calculation • Complexity Adjustment Factor (CAF) • CAF = 0.65 + 0.01 × DI • Each degree of influence contributes 1 percent to the adjustment factor, which can range from 0.65 to 1.35 • Adjusted Function Points (AFP) • AFP = CAF × UFP (see the sketch below)
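Continuing the sketch, the adjustment step applies the two formulas above. The 14 degree-of-influence ratings and the UFP value are carried over from the hypothetical example, purely for illustration.

```python
# Sketch of the adjustment step: CAF = 0.65 + 0.01 * DI, AFP = CAF * UFP.
# The ratings below (one 0-5 score per general system characteristic) are invented.

di_scores = [3, 2, 4, 1, 3, 5, 4, 3, 2, 1, 2, 3, 1, 2]  # 14 illustrative ratings
DI = sum(di_scores)              # 36
CAF = 0.65 + 0.01 * DI           # 1.01  (bounded between 0.65 and 1.35)
UFP = 91                         # unadjusted count from the previous sketch
AFP = CAF * UFP                  # ~91.9 adjusted function points
print(DI, CAF, round(AFP, 1))
```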
Complexity of Files & Transactions • Data Element Type (DET) • A unique, user-recognizable field from a business perspective that participates in a transaction or is stored on a logical data file. • Record Element Type (RET) • A user-recognizable subgroup of data elements within an ILF or EIF (e.g., order types). • The complexity of a transaction is determined by counting the number of logical File Types Referenced (FTRs) and the number of DETs (a sketch follows).
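To make the FTR/DET rule concrete, the sketch below rates a transaction using the widely quoted IFPUG matrix for External Inputs; EO and EQ use slightly different break points, so treat the thresholds as illustrative rather than definitive.

```python
# Rating an External Input's complexity from its FTR and DET counts,
# using the commonly quoted IFPUG thresholds.

def ei_complexity(ftrs: int, dets: int) -> str:
    if ftrs <= 1:
        return "average" if dets > 15 else "simple"
    if ftrs == 2:
        if dets <= 4:
            return "simple"
        return "average" if dets <= 15 else "complex"
    return "average" if dets <= 4 else "complex"  # 3 or more FTRs

print(ei_complexity(ftrs=1, dets=8))   # simple
print(ei_complexity(ftrs=2, dets=12))  # average
print(ei_complexity(ftrs=3, dets=20))  # complex
```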
Productivity Index • The function point method can also be used to measure the productivity of development activities, for example as function points delivered per person-month of effort.
Criticisms of FPA • The calculation of function counts tends to take a black-box view of the system. • The user-defined function types currently established may not be wholly appropriate for current technology. • Function point counts are affected by project size. • FPA is difficult to apply to massively distributed systems or to systems with very complex internal processing. • It is difficult to define logical files from physical files.
Criticisms of FPA • The classification of the user function types into simple, average, and complex appears oversimplified. • The choice of weights was determined by debate and trial. • The restriction to 14 processing complexity factors is not going to be satisfactory for all time.
Benefits of FPA • Organizations that adopt Function Point Analysis as a software metric realize many benefits, including: • improved project estimating • understanding project and maintenance productivity • managing changing project requirements • gathering user requirements
3D Function Points • Each class is an internal file • Messages sent across the system boundary are counted as transactions • 3D function points require a greater degree of detail in order to determine size, which makes early counting more difficult.
Object-Oriented Function Points (OOFP) • Characterized by a mapping of FP concepts (logical files and transactions) to OO concepts (classes and methods), and by a flexible method for handling specific OO concepts such as inheritance and aggregation.
OOFP • Uses the OMT model • Object Model • A static representation of classes and objects • Developed first, so it can be measured at an early stage • Function Model • Data flow diagrams • Identifying and designing some methods at an early stage • Dynamic Model • State machines • Use cases and scenarios
OOFP • Central analogy used to map FP to OOFP: • Logical files (collections of user-identifiable data) map to classes (which encapsulate collections of data items) • Transactions map to methods • Application boundary: • External classes encapsulate non-system components (external services and reused library classes); these are EIFs • Classes within the application boundary are ILFs
OOFP • OOFP Calculation
OOFP Process • Analyze the object model and identify the units to be counted as logical files (LFs). • Calculate the complexity of each LF and service request (SR). • Convert complexity values to numeric weights. • If an LF is reused, its OOFP value is scaled by a factor f ≤ 1. • All OOFP values are summed up (see the sketch below).
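A rough sketch of these steps is given below. The complexity-to-weight vectors mirror the IFPUG ILF and EI tables, and the class/method records plus the reuse scale factor are hypothetical placeholders, not part of any published standard.

```python
# Sketch of the OOFP process: sum weighted logical files (LFs) and
# service requests (SRs), scaling reused LFs by a factor f <= 1.

LF_WEIGHT = {"simple": 7, "average": 10, "complex": 15}  # ILF-style vector
SR_WEIGHT = {"simple": 3, "average": 4,  "complex": 6}   # EI-style vector

def oofp(logical_files, service_requests, reuse_factor=1.0):
    total = 0.0
    for lf in logical_files:          # each LF = a class or class cluster
        weight = LF_WEIGHT[lf["complexity"]]
        total += weight * (reuse_factor if lf.get("reused") else 1.0)
    for sr in service_requests:       # each SR = a counted method
        total += SR_WEIGHT[sr["complexity"]]
    return total

lfs = [{"complexity": "simple"}, {"complexity": "average", "reused": True}]
srs = [{"complexity": "simple"}, {"complexity": "simple"}, {"complexity": "average"}]
print(oofp(lfs, srs, reuse_factor=0.5))  # 7 + 10*0.5 + 3 + 3 + 4 = 22.0
```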
OOFP • Identifying LFs • Classes are mapped to LFs • Aggregation and inheritance must be handled • They are mainly an implementation concern • At the analysis phase: • Each class is an LF • Scale factor = 1 (the origin of a class does not matter) • At the design phase: • Scale factor < 1, since reuse makes classes easier to develop • From the designer's perspective, each class is an LF • From the user's perspective, the picture is more complicated
OOFP • Ways to identify LFs • Simple LF • A single class is an LF • Composite LF • Aggregation • Generalization/specialization • Mixed (combining aggregation and generalization)
Diagram: LF identification strategies — single class, aggregation, generalization, and mixed.
OOFP • Calculation of DETs and RETs • One RET is counted for each ILF/EIF • Simple LF: • Simple attributes such as integers and strings are counted as DETs • Associations are counted as DETs or RETs according to their cardinality: • A single-valued association is a DET • A multi-valued association is a RET • (A counting sketch follows below.)
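The sketch below applies these counting rules to a single-class LF. The class description used in the example is invented; only the rules themselves (simple attribute → DET, single-valued association → DET, multi-valued association → RET, one RET for the LF itself) follow the text above.

```python
# Counting DETs and RETs for a simple LF (one class).

def count_det_ret(simple_attrs: int, single_assocs: int, multi_assocs: int):
    dets = simple_attrs + single_assocs  # plain attributes + single-valued associations
    rets = 1 + multi_assocs              # one RET for the LF + multi-valued associations
    return dets, rets

# e.g. an Order class: 6 plain fields, 1 link to Customer, 1 collection of OrderLines
print(count_det_ret(simple_attrs=6, single_assocs=1, multi_assocs=1))  # (7, 2)
```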
OOFP • Composite LF • DETs and RETs are counted as in a simple LF, except for aggregation • Aggregations act as subgroups within the composite LF • One RET is counted for each aggregation • Each LF's OOFP contribution is then taken from the weighted vector table for ILF and EIF defined by IFPUG (the International Function Point Users Group)
OOFP • Service Requests • Concrete methods are counted only once; abstract methods are not counted • Simple items (by analogy to DETs): • Simple data items referenced as arguments • Simple global variables referenced by the method • Complex items (by analogy to FTRs): • Complex arguments, objects, and complex global variables referenced by the method • Each SR's OOFP contribution is then taken from the weighted vector table for EI and EQ in IFPUG