Executive Guide: Measuring Performance and Demonstrating Results of
Information Technology Investments (Exposure Draft) (Guidance, 09/01/97,
GAO/AIMD-97-163).

GAO published a guide to aid federal agencies in understanding and
devising effective information technology (IT) measurement
implementation approaches. The guide presents information on: (1) the
demand for performance measurement; (2) fundamental practices and the
foundation of IT performance measurement; (3) practice areas for: (a)
following an IT results chain; (b) following a balanced scorecard
approach; (c) target measures, results, and accountability at
decisionmaking tiers; (d) building a comprehensive measurement, data
collection, and analysis capability; and (e) strengthening IT processes
to improve mission performance; and (4) key lessons learned for
effective implementation.

--------------------------- Indexing Terms -----------------------------

 REPORTNUM:  AIMD-97-163
     TITLE:  Executive Guide: Measuring Performance and Demonstrating 
             Results of Information Technology Investments
             (Exposure Draft)
      DATE:  09/01/97
   SUBJECT:  Information technology
             Systems evaluation
             Information resources management
             Systems design
             Accountability
             Strategic information systems planning

             
******************************************************************
** This file contains an ASCII representation of the text of a  **
** GAO report.  Delineations within the text indicating chapter **
** titles, headings, and bullets may not be preserved.          **
**                                                              **
** No attempt has been made to display graphic images, although **
** figure captions are reproduced.  Tables are included, but    **
** may not resemble those in the printed version.               **
**                                                              **
** Please see the PDF (Portable Document Format) file, when     **
** available, for a complete electronic file of the printed     **
** document's contents.                                         **
**                                                              **
******************************************************************

                    United States General Accounting Office

GAO                 Accounting and Information Management Division
________________________________________________
September 1997      Executive Guide

                    Measuring Performance and Demonstrating Results of
                    Information Technology Investments



     

                    

                    EXPOSURE DRAFT





GAO/AIMD-97-163



Preface

The Government Performance and Results Act of 1993 requires government 
executives to focus on defining missions, setting goals, measuring 
performance, and reporting accomplishments.  In addition, with the 
passage of the Federal Acquisition Streamlining Act of 1994 (FASA) and 
the Clinger-Cohen Act of 1996, performance-based and results-oriented 
decision-making is now required for all major investments in 
information technology (IT).  Clearly, this intense focus on results 
is one of the most important management issues now confronting federal 
agencies.

To assist federal agencies in understanding and devising effective IT 
measurement implementation approaches, we examined certain public and 
private organizations well-known for their IT performance leadership 
and management expertise.  Similar to our past efforts examining 
comprehensive information management practices of other leading 
organizations,[1] we have taken the lessons learned from these 
organizations and developed a suggested framework for agencies to 
consider when designing and implementing their IT performance 
management approaches.  We have briefed numerous Chief Information 
Officers, agency executives, and agency IT managers on our work over 
the last 6 months as part of our effort to advance a pragmatic 
understanding of what is required to effectively measure the 
contribution of IT to mission performance and program outcomes.

Using comprehensive performance information for information management 
and technology decisions can advance more informed decision-making 
about IT investments at a time when resources are limited and public 
demands for better government service are high.  Ultimately, the 
success of results-oriented reform legislation will demand concerted 
management effort and long-term commitment.  The key practices and 
steps outlined in this guide can help agencies achieve success. 

This exposure draft was prepared under the direction of Dave McClure, 
Senior Assistant Director for Information Resource Management Policies 
and Issues.  If you have questions or comments about the report, he 
can be reached at (202) 512-6257.  Other major contributors are listed 
in appendix IV.



Gene L. Dodaro
Assistant Comptroller General
Accounting and Information Management Division









Contents

                                                                Page

Preface                                                            1

The Demand for Performance Management                              4

Fundamental Practices:  The Foundation of IT Performance Management  11

Practice Area 1:
Follow an Information Technology "Results Chain"                  18

Practice Area 2:
Follow a "Balanced Scorecard" Approach                            31

Practice Area 3:
Target Measures, Results, and Accountability at Decision-making Tiers  45

Practice Area 4:
Build a Comprehensive Measurement, Data Collection, and Analysis 
Capability                                                        53

Practice Area 5:
Strengthen IT Processes to Improve Mission Performance            64

Key Lessons Learned for Effective Implementation                  70

Appendix I:  Selected Bibliography                                76
Appendix II:  Objectives, Scope, and Methodology                  79
Appendix III: Case Study Organizations and Participants           81
Appendix IV: Major Contributors to This Report                    82




Figures
Page

Figure 1:  IT Performance Management Approach                     12

Figure 2:  Performance Measurement - A Strategic Information
           Management Best Practice                               15

Figure 3:  Implementing the Results Act - Key Steps and Critical 
Practices                                                         16

Figure 4:  An IT Results Chain                                    19

Figure 5:  A Hypothetical Chain of Events/Evidence for Achieving
           Environmental Quality                                  27

Figure 6:  An IT Balanced Scorecard Approach                      33

Figure 7:  Balanced Scorecard - IT Strategic Measures             37

Figure 8:  Balanced Scorecard - IT Customer Measures              39

Figure 9:  Balanced Scorecard - IT Internal Business Measures     41

Figure 10:  Balanced Scorecard - IT Innovation and Learning Measures  43

Figure 11:  Performance Measurement Tiers                         46

Figure 12:  A Tiered Performance Scorecard Example                50

Figure 13:  Information Systems Process Architecture Version 2.0 
Process Framework                                                 65

Figure 14:  IT Measurement Implementation Roadmap                 72

Figure 15:  Stages in IT Performance Management                   74












The Demand for Performance Management
______________________________________________________________________

Increasingly, federal policy-makers are insisting that government 
executives provide hard facts on mission and program results.  Program 
authorizations, resource decisions, and oversight requirements 
increasingly hinge on how well agencies perform against expectations 
and improve performance over time.  As such, a new standard for 
management expertise is evolving:  setting performance targets, 
designing efficiency and effectiveness measures, systematically and 
accurately measuring outcomes, and then using the results for informed 
decision-making.

Information technology (IT) products, services, and delivery processes 
are important resources for results-driven government programs and 
operations.  For purposes of this guide, IT also includes the 
organizational unit or units and contractors primarily responsible for 
delivering IT.  Line managers--the operational customers[2] relying on
IT products and services--and IT managers themselves want to know,
"How are information technology products and services, including the
information infrastructure, supporting the delivery and effectiveness
of the enterprise's (agency) programs?"  As we pointed out in an
earlier report, successful organizations rely heavily on performance 
measures to operationalize mission goals and objectives, quantify 
problems, evaluate alternatives, allocate resources, track progress, 
and learn from mistakes.[3]  Operational customers and IT managers in 
these organizations form partnerships to design, manage, and evaluate 
IT systems that are critical to achieving improved mission success.

Performance Management:
What Are the Benefits?

In an effective performance management approach, measures are not used
to assign blame or merely to comply with reporting
requirements.  Quite simply, they are used to create and facilitate
action to improve performance.  Measures and performance information 
must link to strategic management processes.  An effective performance
management system produces information that delivers the following 
benefits.[4]   

  - Provides an early warning indicator for correcting problems and
    for examining whether corrective action is having any effect.

  - Provides input to resource allocation and planning.  It can help
    organizations prepare for future conditions that will likely
    affect program and support function operations and the demand for
    products and services, such as decreases in personnel or financial
    resources or changes in workload.  Using measures can give
    organizations a long lead time for adjustment if these conditions
    are known in advance.

  - Provides periodic feedback to employees, customers, stakeholders,
    and the general public about the quality, quantity, cost, and
    timeliness of products and services.

Across all of these benefits is an overarching one--measures build a
common results language among all decision-makers.  The measures an
organization picks say, in effect, what the organization is
accountable for and what it should benchmark and compare itself
against.


Results-Oriented Legislation Provides
Impetus for Performance-Based Management

For the past several years, the Congress has emphasized federal 
performance improvement, accountability for achieving results, and 
cost reduction.  Legislative requirements in the Chief Financial 
Officers Act (CFO) of 1990, the Government Performance and Results Act 
(Results Act) of 1993, the Federal Acquisition Streamlining Act (FASA) 
of 1994, the Paperwork Reduction Act of 1995, and the Clinger-Cohen 
Act of 1996 expect improvements in IT performance management.  These 
laws reinforce financial accountability, emphasize results-oriented 
management, define cost performance and schedule goals, and improve 
the acquisition of IT to streamline federal programs.

Under the CFO Act, CFOs are responsible for developing and maintaining 
integrated accounting and financial management systems that include 
systematic measurement information on agency performance.  OMB
Circular A-11 (agency guidance for preparing and submitting budget
estimates) encourages agencies to review program performance
information contained in the most recent financial statements prepared
under the CFO Act when developing their program performance
indicators.  The Results Act directs federal agencies to improve
their program management by implementing outcome-oriented performance 
measurement systems.  Agencies are to prepare annual performance plans 
with objective, quantifiable, and measurable performance indicators to 
measure relevant outputs, service levels, and outcomes of each program 
activity.  Meeting these requirements is critical in developing the 
mission goals and performance expectations which IT products and 
services will support.  Our Executive Guide: Effectively Implementing 
the Government Performance and Results Act contains additional 
guidance on results-oriented management under the Results Act.[5]

The Federal Acquisition Streamlining Act requires federal agencies to
assess the cost, performance, and schedule of major acquisitions.
Agency
heads must determine if there is a continuing need for programs that 
are significantly behind schedule, over budget, or not in compliance 
with performance or capability requirements.  Congressional policy is 
that each executive agency should achieve, on average, 90 percent of 
the cost and schedule goals established for agency programs.

The Paperwork Reduction Act requires agencies to establish information
resources management goals for improving the productivity, efficiency,
and effectiveness of agency operations, as well as methods for
measuring progress in achieving those goals.  OMB's Circular A-130
(guidance resulting from
the Paperwork Reduction Act) highlights the importance of evaluation 
and performance measurement.  It recommends that agencies seek 
opportunities to improve the effectiveness and efficiency of 
government programs through work process redesign and the judicious 
application of IT.  Agencies must perform various benefit-cost 
analyses to support ongoing management oversight processes and conduct 
post-implementation reviews of information systems to validate 
estimated benefits and document effective management.

The most recent legislation affecting IT performance management, the
Clinger-Cohen Act, requires agencies to establish goals for using IT
to improve the efficiency and effectiveness of agency operations and
programs.  Performance
measurements must assess how well IT supports agency programs.  Agency 
heads must benchmark agency process performance against comparable 
processes in terms of cost, speed, productivity, and quality of 
outputs and outcomes.  Agency heads must analyze agency missions and
make appropriate changes in mission and administrative processes
before making significant IT investments to support missions.  Annual
performance reports must describe how well each agency is improving
agency operations through IT.  OMB has issued specific guidance, which
requests specific information on IT cost, benefit, and risk, to assist
agencies in implementing the Clinger-Cohen Act.  Most notable are OMB
Circular A-11, Part 3, which provides instructions on agency budget
submissions, and the OMB Director's policy memorandum M-97-02 (also
known as "Raines' Rules"), which specifies investment criteria that IT
projects are to meet to qualify for inclusion in the President's
budget submission to the Congress.

These various legislative and executive branch requirements create
pressure for increased top management attention to IT performance.
However, for federal agency managers, the challenge is not merely
complying with these legislative and regulatory requirements, but
"managing and measuring to results" using a well-crafted IT
performance management system.  At a high level, federal managers can
start framing a system by asking:

  - What are enterprise (agencywide) and key operational customer IT
    performance expectations?
  - What are the vital few IT objectives given these expectations?
  - What measures are appropriate for these IT objectives?
  - What is IT's current baseline performance and what should be the
    target performance?
  - How will IT management and customers work together to use these
    measures to leverage and improve IT performance in ways that will
    improve mission delivery?

Answering these questions signals a significant change in determining 
how IT contributes to achieving improved program outcomes.  
Traditional measures such as response time and systems availability 
are by themselves insufficient to answer IT performance questions.  
"Measures such as machine hours are easy," said one manager we 
interviewed, "what is difficult is how to measure the business value 
of the applications."
Identifying Best Practices:
Learning from Leading Organizations

Federal managers are seeking guidance on developing and implementing
agency IT performance management systems.[6]  To assist agencies, we
examined how certain leading organizations approach IT performance 
management, studying practices of both public and private sector 
organizations recognized by peers and independent researchers for 
their IT performance efforts.  (A more detailed description of our
case study selection methodology is found in appendix II.)

The private sector companies used as case studies were

  - Xerox Corporation,
  - Eastman Kodak Company,
  - Texas Instruments,
  - Motorola Semiconductor Products Sector, and
  - American Express Travel Related Services Company.

In the public sector, we studied

  - the Oregon Department of Transportation,
  - the city of Sunnyvale, California, and
  - the city of Phoenix, Arizona.

We also selectively included the U.S. Immigration and Naturalization 
Service, the U.S. General Services Administration Information 
Technology Service, and the U.S. Department of Agriculture to assess 
early federal practices.[7]  We gathered additional information from 
general and IT performance management literature and reports.

We preceded the organizational research with an extensive review of
the literature, guides, and reports on general performance management
and on IT performance management and measurement.  We also consulted
with
experts involved in IT performance management and measurement efforts.  
We collected organizational data through interviews and documentary 
analysis, not direct observation.  Case study organizations reviewed 
our results for accuracy and completeness.  We also gave briefings to 
federal officials to discuss our results.  Appendix II provides a more 
detailed description of our scope and methodology.

Our guide contains key practices we found from our organizational 
research and key concepts and practices extracted from available 
literature, guides, and reports.  They supplement recent GAO reports 
and testimonies.[8]  Much still remains to be learned.  This guide is 
an initial effort in an area very much in its infancy.  Without 
exception, those in the organizations we studied noted that IT 
performance management and measurement practices are not completely 
defined.  However, their experiences did translate into the key
practices we describe in this guide.

Understanding the Context of IT
Performance Management

We have found from our research that there is not one "best" approach 
to IT performance management.  How IT performance management is 
designed, implemented, and sustained in each organization depends on a 
multitude of contextual factors, such as

  - whether the organization's culture--leadership, decision-making,
    appraisal and reward systems--supports IT performance management;
  - how important IT is for program (mission) delivery;
  - how widely IT is used in the organization;
  - which IT activities are centralized, dispersed, or decentralized;
    and
  - the availability of resources such as skills and tools to support
    performance management.

Factors such as these, taken together, provide a unique environment
affecting IT performance.  However, as with most things important to
organizational health, another significant contextual factor is top
management ownership of, involvement in, and use of IT performance
information.  What the top leaders pay attention to, what messages
they send about IT performance management, and what action they take
based on the measures tell the true story of IT performance
management acceptance in any organization.

In the organizations we studied, there is strong management attention
to IT measures and rigorous use of them in decision-making at all
management levels to improve IT performance.  A second significant
factor for effective IT performance is the partnership forged between 
IT and the enterprise and operational customers.  In a large sense, 
the enterprise and operational customers "co-produce" IT results 
because they are the consumers and users of IT products and services.  
The starting point for IT objectives is organizational goals and 
objectives.  What IT does and what is measured must directly align 
with those organizational goals and objectives.  While IT management 
and staff serve as business consultants on how current and emerging IT 
can aid mission objectives, they must rely on the ongoing engagement 
of organizational customers in defining how IT can facilitate mission 
accomplishment.
Fundamental Practices:  The Foundation of 
IT Performance Management
______________________________________________________________________

Our case study research clearly indicated that knowing what 
performance management is and is not is the starting point for 
developing an IT performance management system.  In simple terms, any
performance management system assesses how well an organization
delivers expected products and services that are directly tied to its
goals and objectives.  It also incorporates the products and services
from enterprise and program support functions such as IT, financial 
management, or human resources management.  Within this context, IT 
performance management and measures are considered subsets of overall 
performance management systems.  

Our IT performance management approach includes several important 
distinguishing characteristics that are discussed in greater depth in 
different parts of this guide.  These characteristics include:
      
  - differentiating between IT's impact on intermediate versus final
    program outcomes,
  - using a good balance of different kinds of IT measures,
  - understanding that measures may differ by management tier within
    an organization, and
  - evaluating both the overall performance of the IT function within
    an organization and the outcomes for individual IT investments.

Our approach suggests three distinct practice areas that involve  
aligning IT systems with agency missions, goals, and programs; 
constructing measures that determine how well IT is supporting 
strategic, customer, and internal business needs; and implementing 
performance measurement mechanisms at various decision-making levels 
within an organization.  

Two supporting practice areas are important to keep the overall IT 
measurement process working.  Data collection and analysis 
capabilities must effectively support the performance management 
system being used in such a way that performance data is accessible, 
reliable, and collected in the least burdensome manner.  The benefit 
of effective automated data and management information systems is that 
performance information can be effectively and efficiently used to 
make strategic, managerial, and day-to-day operational decisions.  In 
addition,  a constant focus on strengthening the processes and 
practices being used to deliver IT products and services is essential 
for building and maintaining effective IT organizations.
Figure 1 shows the generic model produced by our case study research 
on IT performance measurement.  




Figure 1:  IT Performance Management Approach




   Practice Area 1:  Follow an IT "results chain"

Leading organizations build and enforce a disciplined flow from goals 
to objectives to measures and individual accountability.  They define 
specific goals, objectives, and measures, use a diversity of measure
types, and develop a picture of how IT outputs and outcomes directly
affect operational customer and enterprise (agency) program
delivery requirements.  The IT performance management system does not 
optimize individual customer results at the expense of an enterprise 
(agency) perspective.  Operational customer goals and measures meet IT 
department or unit objectives that are matched to enterprise strategic 
directions or goals.

   Practice Area 2:  Follow a balanced scorecard approach

Leading organizations use an IT goal, objective, and measure approach 
that translates organizational strategy and IT performance 
expectations into a comprehensive view of both operational and 
strategic measures.  Four generic goal areas include meeting the 
strategic needs of the enterprise, meeting the needs of individual 
operational customers, addressing internal IT business performance, 
and addressing ongoing IT innovation and learning.

   Practice Area 3:  Target measures, results, and accountability at
                     different decision-making tiers

For the balanced scorecard areas, leading organizations match measures 
and performance results to various decision-making tiers or levels.  
These tiers cover enterprise executives, senior to mid-level managers 
responsible for program or support units, and lower-level management 
running specific operations or projects.  The organizations we studied 
place IT goals and measures in widely distributed IT performance 
improvement plans.  Individual appraisals tie IT performance to 
incentives.

   Practice Area 4:  Build a comprehensive measurement, data
                     collection, and analysis capability

Leading organizations give considerable attention to baselining, 
benchmarking, and the collection and analysis of IT performance 
information.  They use a variety of data collection and analysis tools 
and methods which not only keep them on top of IT performance 
management, but reduce the burden of collection and analysis.  They 
also periodically review the appropriateness of their current 
measures.

   Practice Area 5:  Improve performance of IT business processes to
                     better support mission goals

In the leading organizations, IT performance improvement begins and
ends with IT business processes.  The organizations map their IT
business processes and select those which must be improved to support
the business processes of the enterprise and its operational
customers.

Measurement Maturity:  Start With the Basics
and Increase Sophistication Over Time

Developing performance measures that demonstrate the impact of 
information technology on mission performance requires management 
commitment, experience in constructing and evaluating measures, and a 
constant learning environment.  Many of the organizations we talked to
indicated that they had attempted to develop strategic or mission
impact measures without first realizing that they had to demonstrate
strong capability and sound performance in the basics of IT
management.  In
short, if an IT unit was not being very successful in providing 
quality products and services to the rest of the organization, it had 
little credibility in measuring strategic contributions to mission or 
business results.

As such, several IT managers emphasized the need to start with the 
basics by assessing the quality and effectiveness of existing internal 
IT operations.  This evaluation and early measurement construction 
exercise can focus on such things as (1) delivery of reliable, 
cost-effective, high-quality IT products and services, (2) adherence
to industry standards for systems design, cost estimation, 
development, and implementation, (3) internal customer satisfaction, 
(4) staff productivity, and (5) technical skills and capability.  All 
of these factors are important dimensions in providing effective IT 
support to business operations and improving overall organizational 
performance.

Starting with measures of internal IT operations offers some
advantages, even though such measures should not be viewed as
substitutes for other measurements of IT's contribution to specific
program and mission area results.  First, it avoids the problem of IT
organizations waiting for the development of, and consensus on,
mission- or business-specific performance measures.  Second, it
provides the IT organization with
valuable experience in performance measurement construction and 
evaluation which is easily transferable to mission-related measures.  
Third, constructing performance measures for IT operations conforms 
with a balanced scorecard approach which emphasizes the need for a 
diversity of measures in examining IT performance.  And fourth,
maturing the measurement program over time is a critical factor
affecting the overall success of implementing performance management
in an organization--an issue we discuss in greater detail in the final
section of this report.

IT Performance Is Essential for
Strategic Information Management

Performance measurement is not an end, but rather the means to 
achieving better management results.  In our May 1994 Executive Guide 
on strategic information management,[9] we noted that leading 
organizations use performance measures to objectively evaluate 
mission, business, and project outcomes.  These organizations  (1) 
focused performance measures on gauging service to key management 
processes, (2) embedded performance measures in key management 
processes, (3) used internal and external benchmarks to assess 
relative performance, and (4) tailored performance measures to gauge 
whether information technology made a difference in improving 
performance. 
As shown in figure 2, performance measurement is a cornerstone
practice that GAO advocates as part of an integrated strategic
information management approach.



Figure 2:  Performance Measurement--A Strategic Information         
Management Best Practice


In June 1996, GAO issued a companion Executive Guide on a suggested 
performance measurement implementation approach for the Results 
Act.[10]  The approach, depicted in figure 3, identifies certain key
steps and associated practices that agencies may find useful for
implementing the Results Act.  The approach is based on the actions
taken by certain organizations that have successfully undertaken
performance improvement initiatives similar to those required by the
act.

Figure 3:  Implementing The Results Act -- Key Steps and
         Critical Practices





Each organization GAO studied set its agenda for management reform
according to its own environment, needs, and capabilities.  In
striving to become more results oriented, however, the organizations
commonly took three steps in implementing a performance-based
management approach.  First, they defined clear missions and desired
outcomes.  Second, they measured performance to gauge progress.
Third, they used performance information as a basis for
decision-making.

Along with these steps, certain practices proved especially important 
to the success of their efforts.  Taken together, these steps and 
practices were useful in making changes necessary for these 
organizations to become results oriented.  These fundamental steps and 
practices are consistent with the Results Act requirements and provide 
a useful framework for federal agencies to adopt in implementing key 
provisions of the law.

The IT performance management approach outlined in this guide works in
tandem with GAO's Results Act implementation model and demonstrates 
how IT performance measurement can be implemented within an overall 
performance management framework.  Most importantly, mission goals, 
objectives, and strategies must be understood in order to evaluate how 
IT contributes to performance improvements.

In the sections that follow, each practice area in our approach is 
explained in detail, listing specific characteristics and providing 
case study examples to illustrate how they are implemented.  A final 
section discusses key steps involved in implementing an IT performance 
management system.



Practice Area 1:
Follow An Information Technology "Results Chain"





Practice Area Characteristics:

1. Directly map IT goals and measures to organizational mission goals, 
   objectives, and measures.
2. Prepare a chain of events and evidence to understand IT's
   contribution to enterprisewide and operational customer objectives.
3. Use a diversity of measures to evaluate IT performance.












Practice Area Overview

To maximize the results of IT investments, leading organizations 
ensure that IT programs align with and directly support high-level 
organizational missions, goals, and objectives.  This practice 
provides an approach for linking organizational goals and objectives 
to the "vital few" IT performance measures needed to manage for 
effective results.  This framework, formal or informal, follows a 
systematic movement through what can be called an "IT results chain."  
The results chain approach provides discipline for aligning 
performance expectations and measures at all levels.

Any effort to measure IT performance must begin with clearly defined 
organizational and programmatic goals and objectives.  In other words, 
an organization cannot properly define its IT goals and objectives 
(much less measure to determine the degree of success in meeting them) 
unless it has clearly defined goals and objectives for the programs 
that IT supports.  The resulting IT goals and measures must, in all 
cases, map back to program or strategic (enterprise-level) goals.  To 
help understand the relationships between operational programs and the
IT contribution to their success, many organizations prepare a chain
of events and evidence to show how programs work and how success might
be measured.  Finally, a diversity of qualitative and quantitative
measures is used for the inputs, outputs, and outcomes of IT programs.

In short, a results chain approach

  - defines what the organization is attempting to accomplish,
  - allows an organization to identify success, and
  - links IT projects directly to business goals and objectives.

As shown in figure 4, the chain shows the links from organizational 
goals and objectives to IT performance measures.  In the organizations 
we studied, measuring IT's contribution begins by defining 
organizational goals and objectives--for the enterprise and for 
internal and external customers.  These goals and objectives should be 
based on defined organizational mission statements. 



Figure 4:  An IT Results Chain



As with other programs, IT management and staff can then develop a
purpose statement that specifically defines how IT products and 
services will be used to support the achievement of organizational and 
customer goals.  The purpose statement is then translated into IT 
goals, objectives, and measures.

IT goals and objectives should be consistent with the IT purpose 
statement and clearly linked to organizational goals and objectives.  
Leading organizations focus on a "vital few IT objectives" to
demonstrate results in selected key areas.  Similarly, the number of
measures for each IT goal should be limited to the "vital few."  These
should be limited to the key IT performance dimensions that will
enable the IT organization to assess accomplishments, make decisions,
realign processes, and assign accountability.  Attempting to manage an
excessive number of IT measures increases the risk of burying key IT
performance issues in excess data.  Lastly, management and staff
performance evaluations are linked to the performance measures as a 
way of comparing achievements with planned results. 

As part of a results-oriented management approach, IT performance
measurement must be used in the decision-making process.  Measurement
development and alignment involves considering all organizational
goals and objectives and converging on the vital few IT goals,
objectives, and measures.  Effective measurement must be supported by
sound data collection and analysis methods and by communication of
results to management and staff.

Practice Area Characteristics

1. Directly Map Information Technology and Management Goals and 
   Measures to Strategic Goals

Use of an IT results chain is only as good as the clarity and 
specificity of the overall organizational goals and objectives.  
Leading organizations build consensus among program managers, IT 
managers, customers, stakeholders, and staff to establish joint 
ownership for performance management.  They work together to achieve a 
common understanding of goals, objectives, measures, and anticipated 
outcomes.  As a practical matter, those who will judge the success of
programs and the supporting functions should agree on the links in the
results chain from IT's purpose to the vital few measures.[11]

In the organizations we examined, IT goals and measures flow directly 
from strategic goals.  IT managers and staff do not develop 
performance management systems that optimize operational customer 
results without considering an enterprisewide perspective.  IT goals 
and measures in support of individual operational customers must meet 
IT department or unit objectives.  In turn, IT department or unit 
objectives must map directly to both programmatic and enterprisewide 
strategic directions or goals.  The result is that IT goals and 
measures track in a seamless fashion back to enterprise strategic 
directions or goals.  If such mapping is not obvious when comparing 
measures and high-level goals, the IT function is probably not 
measuring the right things. 
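
The sketch below, written in Python for illustration, shows one way an
IT organization might record and check this goal-to-measure mapping.
It is a minimal example under our own assumptions: the goal,
objective, and measure names are hypothetical, and the structure is
not one prescribed by the organizations we studied.

  from dataclasses import dataclass

  @dataclass
  class Measure:
      name: str
      objective: str        # the IT objective this measure supports

  @dataclass
  class ITObjective:
      name: str
      enterprise_goal: str  # the strategic goal this objective maps to

  def unmapped_measures(measures, objectives, enterprise_goals):
      """Return names of measures that do not trace back to an
      enterprise goal -- candidates for revision or removal."""
      goal_of = {o.name: o.enterprise_goal for o in objectives}
      return [m.name for m in measures
              if goal_of.get(m.objective) not in enterprise_goals]

  # Hypothetical results chain, for illustration only.
  goals = {"Improve citizen access to services"}
  objectives = [ITObjective("Reduce service delivery cycle time",
                            "Improve citizen access to services")]
  measures = [Measure("Average days to fulfill a service request",
                      "Reduce service delivery cycle time"),
              Measure("Mainframe CPU hours used", "(no objective)")]

  print(unmapped_measures(measures, objectives, goals))
  # prints: ['Mainframe CPU hours used']

A measure that the check flags, like the CPU-hours measure here, is
exactly the kind of measure the text above warns about: one that may
be easy to collect but does not track back to a strategic goal.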


Case Study 1:
Linking Measures to Specific IT Goals and Objectives 

The city of Phoenix's Information Technology Department (ITD) wanted a
set of business-oriented, customer-based measures to gauge the
effectiveness of its operations.  One manager said, "We set forth
to come up with a 'net result measuring scheme.'  It would (1) focus 
on customer satisfaction, (2) be employee friendly, (3) not affect 
program work, (4) improve customer satisfaction, and (5) give the 
customer the ability to understand the value our organization gives to 
them, and let them suggest how to help us get better."

The department wanted to (1) lay the groundwork for continuous 
monitoring and improving customer satisfaction with ITD service, (2) 
better define the value-added of ITD services from a business 
perspective, (3) identify continuous improvement needs in internal ITD 
operations, and (4) strengthen ITD relationships with other city 
departments and manage customer expectations using hard facts and 
measures.  

The department identified measures based on input from key selected 
customers and staff and the ITD objectives.  Objectives were developed 
using the City of Phoenix Strategic Directions Report and the IT 
architecture vision.  Generally, the strategic directions were to (1) 
create and maintain a well-informed and involved community with a 
clean and safe environment, efficient transportation systems, a 
quality education system, and economic opportunity for all citizens, 
(2) provide cost-effective, high-quality community services through a 
productive, efficient organization and an empowered work force, and 
(3) generate and maintain desirable revenue flows and sound financing 
within a system of rational resource allocation.  Goals for the 
architecture vision were to make all applications integrated and fully 
compatible, using a city-wide deployed IT architecture supported by 
commonly shared data repositories.  The department then revised the 
measures to meet all of the ITD objectives and correlated the measures 
to customer-defined goals of time, cost, and customer satisfaction.  

The table on the following page shows how Phoenix measures progress 
towards achieving department objectives.  These measures focus on 
timeliness (service delivery), cost (producing city services at 
acceptable or reduced cost), or customer satisfaction (evidence that 
internal and external customers are satisfied and that ITD's 
contribution can be quantified).


     













Case Study 1:
The City of Phoenix:  Relating Performance Metrics to IT Objectives

No. IT Department Objective     Net Results Metric           Type of       IT Objectives
    for Meeting City Strategies (Example)                    Measure       Linked to Measures
---------------------------------------------------------------------------------------------
 1  Build partnerships with     Successful delivery of ITD   Time          2, 3, 4, 5, 6, 11
    city departments.           products or services on
                                time v. goal.

 2  Provide enabling            Reduced delivery time        Time          3, 4, 5, 6, 11
    technology to city          v. goal.
    departments.

 3  Increase customer           Problem responsiveness       Time          1, 3, 4, 5, 6,
    satisfaction.               v. goal.                                   8, 12

 4  Reduce cycle times in       Reduction attributed to      Cost          3, 9, 10, 11, 12
    delivery of ITD products    ITD v. goal.
    and services.

 5  Reduce service delivery     Customer satisfaction and    Customer      all
    costs.                      relationships v. goal.       satisfaction

 6  Improve service delivery    Reliability of products      Customer      3, 6, 8, 12
    processes.                  and services v. goal.        satisfaction

 7  Implement citywide          Systems using citywide       Customer      1, 3, 4, 7, 8,
    technology architecture.    architecture/new emerging    satisfaction  9, 10
                                technologies v. goal.

 8  Improve citizen access to   ITD staff trained in         Customer      1, 2, 3, 9, 10,
    information and services    customer service skills      satisfaction  11
    through technology.         and new technology v. goal.

 9  Increase resource           New ideas received and       Customer      1, 3, 6, 9, 10,
    versatility.                adopted from ITD staff       satisfaction  11
                                v. goal.

10  Improve leadership skills.  Employee satisfaction        Customer      3, 9, 10, 11
                                v. goal.                     satisfaction

11  Increase employee           Effective communications     Customer      1, 2, 3, 4, 11,
    confidence.                 v. goal.                     satisfaction  12

12  Achieve world-class         ITD involvement in           Customer      1, 2, 3, 4, 7,
    results.                    departmental technology      satisfaction  11, 12
                                planning v. goal.






Case Study 2:
Connecting Goals, Strategies, Objectives and Measures

To better meet customer needs and respond to a changing market
environment, in the mid-1990s Xerox developed its Xerox 2000 business
strategy.  In response to the new strategy, the information management
organization--Global Process and Information Management
(GP&IM)--developed four strategies and three breakthrough goals to
align IM programs with the outcomes anticipated by the corporate-level
business drivers.  Senior executives ultimately
decided on eight measures to determine whether IM programs were 
supporting corporate-level strategies.

GP&IM concluded that five factors were significantly influencing how 
it developed a new IT strategy for Xerox.  These were (1) the push to 
reengineer Xerox business processes, (2) knowledge that the current 
information management environment was not the foundation for the 
future of Xerox, (3) information systems investment decisions were 
currently driven by "entitlements," not a strong selection strategy, 
(4) renewal of the existing infrastructure was too costly, and (5) the 
cycle time for information management was not meeting business 
requirements.

GP&IM then developed four strategies to respond to these five IT
strategy "drivers":  (1) reduce and redirect information spending
through global outsourcing, reduced legacy system spending, and
consolidated and shared resources, (2) manage the infrastructure,
including assets, (3) leverage worldwide information resources of
people, hardware, and software, and (4) develop business
process-driven solutions.

As shown in the figure on the following page, GP&IM has several 
outcomes identified for each of the four strategies.  GP&IM also 
developed three proposed high-level "breakthrough" goals covering the 
four strategies, shown on page 24.  The goals are defined as follows:

Manage IM spending suggests that IM 2000 must be managed to a flat
ceiling for base spending while doubling the amount of investment in
new development.

Deliver infrastructure renewal tracks the penetration of the new
infrastructure in conjunction with some renewal of the old
infrastructure (known as Globalview).

Deliver business process solutions covers reengineering process 
deployment and drives operational optimization and the development of 
a baseline for day-to-day work.  The baseline is for comparison with 
Electronic Data Systems (EDS) outsourcing arrangements.  Each
breakthrough goal area has example measures.

Xerox top management decided that the eight measures cited in figure 8
provided the right information on whether their IT strategies were
being achieved.  For example, the measures told them how well they
were performing in redirecting IT spending into new applications and
systems development, retiring the old infrastructure and replacing it
with more companywide solutions, and reducing operations and
maintenance costs, and whether new process-driven IT solutions were
providing timely, high-quality products at close to estimated costs.





  Case Study 2
  Xerox IM 2000:  Connecting Strategies With Outcomes





  Case Study 2:
  Xerox IM 2000:  Breakthrough Goals and Related Measures




2. Prepare a Chain of Events and Evidence

A common measurement problem is determining what impact support 
functions such as IT have on operational customer or enterprise 
outcomes in comparison to other factors.  Since final enterprise or
program outcomes are difficult to measure for a support function like
IT, it is useful to think in terms of a "chain of events and
evidence."

If given careful consideration, a chain of events and evidence can 
help IT managers understand how IT resources are being used to support 
mission or specific program-related outcomes.  In some cases, IT is 
not directly related to final program results.  However, there are 
intermediate outcomes that must occur before the program or business 
outcome can be achieved.  In contemporary organizations, IT can play a 
pivotal role in achieving these intermediate outcomes.

A chain of events is a theory of how inputs in an action chain move to 
outputs and finally to end outcomes.  A corresponding chain of 
evidence shows the evidence and measures that match to the chain of 
events.  The chains of events and evidence recognize that agency 
programs set in motion a sequence of events which are expected to 
achieve desired goals.  The sequence of events bridges inputs to 
outputs to outcomes.
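
As a minimal sketch of this structure (in Python, with invented
stages, events, and evidence that loosely anticipate the hypothetical
environmental program in figure 5), each event in the action chain can
be paired with its matching evidence:

  # Stages, events, and evidence are invented for illustration.
  chain = [
      ("input",   "Monitoring data collected from sites",
                  "Percent of sites with current data"),
      ("output",  "Violation notices issued",
                  "Notices issued per quarter"),
      ("outcome", "Discharges into waterways reduced",
                  "Pollutant levels measured against target"),
  ]

  for stage, event, evidence in chain:
      print(f"{stage:8} {event:38} -> {evidence}")

Laying the chain out this way makes the bridge from inputs to outputs
to outcomes explicit, so a gap in either the events or the evidence is
easy to spot.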

The hypothetical example presented in figure 5 illustrates a simple 
chain of events, evidence, and measures for an environmental water 
quality program.  At the beginning point of the program results chain, 
a series of simple events are mapped out to indicate steps essential 
to achieving the end outcome of a desired level of environmental 
quality.  Certain types of evidence would then be used as indicators 
of whether the steps are occurring.  IT provides essential support for 
several of the steps in the results chain which can indirectly affect 
the achievement of the program outcome.  Measuring how well IT is 
supporting these intermediate steps is necessary to demonstrate how 
well it is performing from a mission perspective. 
Figure 5:  A Hypothetical Chain of Events/Evidence for Achieving
           Environmental Quality






In an iterative process, organizations examine the chain of events and 
evidence for enterprise and operational customer goals and objectives.  
The chains explain how an enterprise or operational customer produces 
its results and just what those results are.  The assumption in using 
the chains of events and evidence is that there is a clear 
understanding of just how a program, for example, is supposed to work 
in producing the expected results.

In turn, a support function such as IT needs to define its own chains 
of events and evidence, demonstrating how its products and services 
affect enterprise and operational customer chains.  For IT, the 
operational customer is interested in having IT applications that help 
achieve efficient and effective program operations and service 
delivery.  In essence, final IT outcomes are often efficient and
effective program operations--actually an operational customer's
goal.  The organization in large part gauges IT failure or success by
how well IT supports the chain of events the operational customer has
in place to achieve its desired effect.

Building chains of events and evidence in partnership with enterprise 
and organizational customers can be a difficult process, but it helps 
IT managers understand exactly how IT products and services support 
customers and how IT performance should be measured.  Using a chain of 
events approach also enhances customer understanding of just how IT 
can contribute to the eventual enterprise or program outcome.  

3. Use a Diversity of Measures to Evaluate IT Performance

An effective IT performance management system should have a diversity
of measures, with each measure matched to the right organizational
need and level of decision-making, and with action taken on the
results.  These measures can capture performance at the individual,
organizational, program, and process levels.  Generically,
measures--both qualitative and quantitative--are often categorized
into four main types:

  - Input Measures are assessments of the resources used to carry out
    a program or activity over a given time period with the purpose of
    achieving an outcome or output.  Input measures can include the
    number of IT managers and employees, labor hours, IT funds,
    computer and telecommunications equipment or facilities, or
    supplies.

  - Output Measures are assessments of the actual level of work
    accomplished or services provided over a given time period.  They
    are often used to control resources.  Number of reports issued,
    number of projects completed, number of answers on hot line
    services, and function points delivered are examples of output
    measures.  These, too, are often process measures.

  - Outcome Measures assess the actual results, effects, or impacts of
    a program or support function compared to its intended purpose.
    Outcomes can be difficult to measure because results may not be
    immediately evident, or because several organizational units and
    external suppliers or customers are involved, making it difficult
    to assign relative contributions to them.  Outcome measures may be
    the level of customer satisfaction with IT services or the cycle
    time reduction attributable to automated work processes.

  - Combinations of Single Measures combine single output, outcome,
    and/or input measures into measures designed to demonstrate
    improvements in efficiency or effectiveness.  An efficiency
    measure is output over input, such as the number of PC
    applications installed per IT staff member.  An effectiveness
    measure may compare actual results to estimated or expected
    results, or compare existing levels of performance (output) to
    accepted industry standards or target performance goals.  (A
    simple worked example follows this list.)
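
Here is the simple worked example referred to above: a minimal Python
sketch of one efficiency measure and one effectiveness measure, using
invented figures.

  # All figures are invented for illustration.
  pc_apps_installed = 240       # output over the period
  it_staff = 16                 # input over the same period
  target_cycle_days = 10.0      # expected or target result
  actual_cycle_days = 12.0      # actual result (lower is better)

  # Efficiency: output over input.
  efficiency = pc_apps_installed / it_staff

  # Effectiveness: actual results compared to target results.  Since a
  # shorter cycle time is better, target over actual expresses the
  # share of the target achieved.
  effectiveness = target_cycle_days / actual_cycle_days

  print(f"Efficiency: {efficiency:.1f} applications per IT staff member")
  print(f"Effectiveness: {effectiveness:.0%} of cycle-time target achieved")

Run as written, this reports an efficiency of 15.0 applications per
staff member and an effectiveness of 83 percent of the cycle-time
target.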

A wide range of measures provides balance for different
decision-makers' needs.  Input and output measures assess the workload
of an enterprise or specific program and the demand for its products
and services.  Combination measures assess efficiency and
effectiveness.  Outcome measures assess results compared to
expectations.  The key point is that the right measure is used at the
right time and for the right reason.  Input and output measures are
absolutely vital for measuring how well a process to deliver IT
products and services is performing.

Use Contextual Information to Augment
Performance Measures

The organizations we studied track contextual or explanatory
information to use with their IT performance measures.  This
information describes the broader environment surrounding IT
activities, which can influence IT inputs, outputs, and outcomes.  For
example, changes in mainframe-to-client/server use, policy changes
that affect performance, or changes in IT organizational structure
could be important explanatory information.

Capturing contextual and explanatory information can help managers 
understand the measures, assess performance, and evaluate the 
significance of underlying factors that may affect reported 
performance.  Managers can also capture actions that have been taken
or are being taken in response to reported information, particularly
for unexpectedly high or low performance.  Often, IT performance data
displays show the contextual or explanatory information either as a
footnote to the performance data or in an explanatory page attached to
the performance information.
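
A minimal Python sketch of this display convention, with invented
values, keeps the explanatory note attached to the measure and renders
it as a footnote:

  # Measure, values, and context are invented for illustration.
  report = {
      "measure": "Help desk calls resolved on first contact",
      "actual": 0.71,
      "target": 0.85,
      "context": "A mainframe-to-client/server migration in the third "
                 "quarter tripled call volume.",
  }

  print(f"{report['measure']}: {report['actual']:.0%} "
        f"(target {report['target']:.0%}) [a]")
  print(f"[a] {report['context']}")

Carrying the note with the data, rather than in a separate document,
helps a reviewer judge whether the shortfall reflects the IT function
or the surrounding environment.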


How to Get Started

To proceed with the development and use of an IT results chain 
approach, organizations should:

  - clarify--with top management participation and stakeholder
    involvement--major organizational goals and objectives;

  - establish a simple chain of events and evidence for key mission
    areas of the organization;

  - create supporting IT outcome and process measures for each
    organizational mission goal area;

  - use one of the organizational goals and objectives to develop IT
    goals, specific objectives, and related performance measures for
    that IT goal;

  - examine existing IT measures, categorize them as input, output, or
    outcome measures, and decide on the combination of measures that
    best provides performance results; and

  - test the performance measurement system and make revisions based
    on initial lessons learned.






Practice Area 2:
Follow A "Balanced Scorecard" Approach









Practice Area Characteristics

1. Develop IT goals, objectives, and measures in operational
   and strategic areas.
2. Focus on the most important "vital few" objectives and
   measures in four IT goal areas:
     - Achieving the strategic needs of the enterprise
     - Satisfying the needs of individual customers
     - Fulfilling IT internal business performance
     - Accomplishing IT innovation and learning














        
        
Practice Area Overview

A second key practice is to use a balanced scorecard approach to IT
performance measurement.  The approach attempts to create a 
measurement balance across the overall performance management 
framework.  A balanced approach to measuring the contribution of IT to 
mission outcomes and performance improvement recognizes the broad 
impact of IT's supporting role.  By measuring IT performance across 
four goal areas that are critical to overall IT success, the scorecard 
forces managers to consider measurement within the context of the 
whole organization.  This limits the possibility of overemphasizing 
one area of measurement at the expense of others.  In addition, 
measuring IT performance from different perspectives helps strengthen 
the analysis of intangible and tangible benefits attributable to 
technology.

In the four IT goal areas discussed in this section, we present three 
or four key objectives that were common among the organizations we 
examined. Corresponding to each objective we provide some sample 
measures that come from our case study research and from supporting 
literature.  Our purpose is to illustrate possible types of measures, 
not to prescribe a definite list of measures that all organizations 
should be using.  Some of the measures are very basic, but they are 
clearly related to the objectives.  Also, many of the measures are 
percentages or ratios.  This is important because successful 
organizations begin with good baseline data on performance and, 
therefore, can accurately measure progress against the baseline as 
they move forward.

In several of the organizations we studied, management is developing 
measures across key areas covering both long- and short-term 
strategies and activities.  This approach is best captured in Robert 
Kaplan and David Norton's balanced scorecard, which many of the 
organizations either used directly or incorporated into the 
development of their own approaches.[11]  The Kaplan and Norton 
scorecard evaluates performance in four areas:  financial (how does 
the organization look to shareholders?), customer (how do customers 
see performance?), internal business (at what must the organization 
excel?), and innovation and learning (can the organization continue 
to improve and create value?).  

In order to summarize the IT performance methods being used by the 
organizations we studied, we have adopted a balanced scorecard 
approach similar to the Kaplan and Norton framework.  However, a 
balanced scorecard is just one approach available for agencies to 
adopt in conducting IT performance management and measurement.  Other 
approaches such as the value management framework, critical success 
factor analysis, and information economics also offer useful IT 
performance measurement methodologies.[12]  Like other methodologies, 
a balanced scorecard approach translates organizational strategy into 
specific measurable objectives, operating from several key concepts:

  no single measure provides clear performance targets or places 
   attention on critical mission areas,

  goal, objective, and measure areas should give a comprehensive view 
   of all levels of activities, from the project level to the 
   strategic level,

  limiting the number of measures used minimizes information 
   overload, and

  a scorecard guards against optimizing one goal area at the expense 
   of others.
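
A minimal sketch of this structure follows, in Python.  The four goal 
areas come from the guide; the objectives and measures shown are 
illustrative placeholders, not a prescribed list.

    # Hypothetical balanced scorecard structure: four goal areas, each
    # with a "vital few" objectives and associated measures.

    scorecard = {
        "Strategic needs of the enterprise": {
            "Portfolio analysis and management":
                ["percent of low-value applications retired"],
        },
        "Individual customer needs": {
            "Customer satisfaction":
                ["percent of customers rating service satisfactory or better"],
        },
        "IT internal business performance": {
            "Applications development and maintenance":
                ["dollars expended per function point"],
        },
        "Innovation and learning": {
            "Workforce competency and development":
                ["training hours per IT employee"],
        },
    }

    for goal_area, objectives in scorecard.items():
        print(goal_area)
        for objective, measures in objectives.items():
            print(f"  {objective}: {', '.join(measures)}")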

          
          Practice Area Characteristics

          1. Develop IT Goals, Objectives, and Measures in
             Operational and Strategic Areas

For IT, measures cover a great diversity of value-added activities, 
including those for projects, a portfolio of applications, and 
infrastructure development.  Organizations should know about success 
in all of them.  As shown in figure 6, an IT results chain can be 
translated into a scorecard framework that looks at goals, objectives, 
measures (tiered for various decision-making levels), and 
accountability in key goal areas.  The key starting point in 
developing a balanced scorecard is the question of purpose from the IT 
results chain--"What is the current and future purpose of IT?"  Then 
to meet that purpose, the IT organization must answer the goal 
question--"If we succeed, how will we differ?"--in terms of specific 
goals.

Figure 6:  An IT Balanced Scorecard Approach


The IT goals and objectives of the organizations we studied most often 
focused on the following:

  customer commitments and satisfaction,
  cycle and delivery time, 
  quality, 
  cost, 
  financial management,
  IT infrastructure availability,
  internal IT operations, 
  IT skill availability, and 
  customer business process support.  

For example, Motorola's Semiconductor Products Sector (SPS) focuses 
its goals and objectives on four areas: (1) delivering reliable 
products (quality environment, best-in-class staff, worldwide resource 
optimization, communication),  (2) providing integrated IT solutions 
(integrated data and systems architecture, technology roadmap and 
migration planning, distributed computing effort),  (3) building 
client partnerships (client involvement, lead the information 
technology community), and (4) achieving competitive advantage 
(prioritize the projects that benefit SPS, deploy resources for 
maximum impact, speed of execution).

We developed four balanced scorecard goal areas, objectives, and 
measures for IT that were among the most common across the 
organizations we studied.  As such, the four goal areas illustrate an 
approximate consolidation of the performance management efforts of the 
organizations involved in our research.  The four balanced scorecard 
goal areas are designed to measure how well IT is

  achieving the strategic needs of the enterprise as a whole, in 
   contrast to specific individual customers within the enterprise,
  satisfying the needs of individual customers with IT products and 
   services,
  fulfilling internal IT business performance that delivers IT 
   products and services for individual customers and the enterprise, 
   and
  accomplishing ongoing IT innovation and learning as IT grows and 
   develops its skills and IT applications.

The first two goals address whether IT is providing the right products 
and services for the enterprise and individual customers.  The latter 
two goal areas address how well IT is performing in its own capability 
to deliver those products and services.   The strategic and customer 
perspectives are key for linking to mission planning requirements in 
the Results Act, the Chief Financial Officers Act, the Paperwork 
Reduction Act, and the Clinger-Cohen Act.

Managers in our case study organizations emphasized that "balance" 
does not necessarily mean "equality."  Use of a balanced approach only 
means the consideration of several goal areas and the development of 
objectives and measures in each.  For example, Kodak managers liked 
the balanced scorecard approach because it was a multivariable 
approach.  Before using it, the IT organization was very cost 
conscious and tended to judge investments in new applications or 
skills largely from a cost perspective.  The balanced scorecard 
examines short-term cost goals and potential business value in the 
context of various other nonfinancial operating parameters.

          2. Focus on the "Vital Few" Objectives and Measures

Each leading organization customizes a set of measures appropriate for 
its organizational goals and, for IT, how IT fits into the 
enterprise's strategic direction and mission delivery plans.  The 
organizations concentrate their IT performance management efforts on a 
vital few objectives and measures within the goal areas.  The 
organizations did not severely limit the number of measures developed 
at the beginning.  But, over time, and with experience, the 
organizations became more focused in the measures they used.  In 
addition, a balanced scorecard approach helps eliminate "safety net" 
measures, which organizations often collect but do not use for 
decision-making, resource allocation, or oversight reporting purposes.

As is explained in the sections that follow, the measure examples 
illustrate the need for diversity.  Within some of our case study 
organizations, similar measures are being used, but the measures 
remain under development, requiring more refinement and documentation.  
The measures presented here do not represent the full universe of what 
an organization might use.  Also, in practice, the goal and objective 
areas may be more specific than those presented on the following 
pages.  For example, one of our goal areas centers on the customer 
perspective.  One objective of this goal area is customer 
satisfaction.  In practice, an actual customer objective statement 
might be stated as "This fiscal year, at least 98 percent of customers 
will be satisfied with IT products, services, and processes."  In 
short, the following sections discuss a general categorization of IT 
goals, objectives, and sample measures.
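
The customer satisfaction objective quoted above can be made 
operational with very simple arithmetic, as in the following 
hypothetical sketch (the survey returns are invented).

    # Testing the illustrative objective: "at least 98 percent of
    # customers will be satisfied."  Survey data are invented.

    responses = ["satisfied"] * 196 + ["dissatisfied"] * 4
    percent_satisfied = 100.0 * responses.count("satisfied") / len(responses)

    target = 98.0
    status = "met" if percent_satisfied >= target else "not met"
    print(f"Satisfied: {percent_satisfied:.1f} percent "
          f"(target {target:.0f} percent) -- {status}")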


Balanced Scorecard Goal Area 1:
Achieving the Strategic Needs of the Enterprise






IT strategic measures are designed to evaluate the aggregate impact of 
IT investments on the organization.  In short, these measures provide 
insights into impacts made by the organization's entire portfolio of 
IT investments.  This goal area focuses on ways to measure how IT 
supports the accomplishment of organizational strategies.  The 
strategic perspective recognizes that in successful organizations, all 
components, including IT, must align with enterprise goals and 
directions.  

When evaluating the impact of IT in terms of strategic needs, the 
following questions should be considered:

  How well integrated are our IT strategies with business needs?
  How well is the overall portfolio of IT investments being managed?
  Is IT spending in line with expectations?
  Are we consistently producing cost-effective results?
  Are we maximizing the business value and cost effectiveness of IT?
IT managers and staff often attempt to satisfy individual operational 
customers without a check against enterprise interests.  This goal 
area guards against targeting IT efforts at individual operational 
customers in ways that are counterproductive to enterprise IT needs 
and expectations.  Maintaining the enterprise view is difficult.  As 
one manager said, "It has 
been a cultural plan to look at [the company] as a whole versus 
maximizing for individual business partners.  The reward and incentive 
systems are set up to emphasize that all senior managers must succeed 
together."

The four IT strategic enterprise objectives presented in figure 7 
reflect several key objective areas of the organizations we studied.  
These objectives cover enterprise strategic planning and goal 
accomplishment, enterprise management of the portfolio of IT 
applications, IT financial and investment performance, and use of IT 
resources across the enterprise.

The first objective in this goal area addresses how well IT plans and 
efforts reflect enterprise mission goals.  This objective area assumes 
the enterprise has defined its mission goals and can make the clear 
link to how IT supports those goals.  The sample measures capture the 
contribution of IT solutions and services, compare what was planned 
for IT benefits and IT strategies against what actually happened, and 
compare IT strategies and planning and enterprise strategies and 
planning.  The overall measurement thrust is to make sure that 
enterprise mission goals direct IT activities.

The second objective, portfolio analysis and management, is a growing 
concern among the organizations we studied.  Leading organizations 
want to make sure they have the right portfolio of IT applications 
either planned or in place that will enhance business or mission 
performance.

Figure 7:  Balanced Scorecard-IT Strategic Measures





Kodak defines an application portfolio as a comprehensive inventory of 
computer applications that were developed or purchased to manage an 
organization's processes and information.  The inventory contains 
detailed data relative to each application's size and characteristics, 
effectiveness in meeting business needs, potential for growth, and 
development and maintenance costs.  The application portfolio forms 
the foundation for an overall IT investment strategy.

Xerox has defined its IT inventory in a similar manner and made IT 
portfolio management a key objective area as part of its overall 
enterprise strategy.  As described in an earlier case study of its 
IM2000 strategy, Xerox wanted to improve 
information management spending, deliver IT infrastructure renewal, 
and deliver process-driven IT solutions to customers.  A key part of 
the overall IT strategy was to evaluate existing IT applications and 
determine how they supported, if at all, the IM2000 strategy.  Xerox 
ran each of its existing applications through a rigorous analysis 
process, categorizing each into one of nine "disposition" categories 
ranging from stopping those of low usage and value to keeping others 
as corporatewide applications.

For Xerox, the IT portfolio strategy helps accomplish several 
performance goals.  The strategy reduces unnecessary operational costs 
and increases support productivity; identifies and consolidates 
similar applications and eliminates low value applications; identifies 
and retires legacy applications, data, and infrastructure as new 
solutions are deployed; and identifies consolidation and sharing 
opportunities.  The principle is to view application systems as 
corporate assets.[13]
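
The guide does not enumerate Xerox's nine disposition categories, so 
the sketch below collapses the analysis into three illustrative 
categories; the applications and scores are invented.

    # Simplified portfolio disposition triage in the spirit of the
    # Xerox example.  Scores run from 1 (low) to 5 (high).

    def disposition(usage, business_value):
        """Assign an application to an illustrative disposition category."""
        if usage <= 2 and business_value <= 2:
            return "stop (low usage and value)"
        if business_value >= 4:
            return "keep as corporatewide application"
        return "evaluate for consolidation or rework"

    portfolio = {
        "order entry": (5, 5),
        "legacy report writer": (1, 1),
        "parts lookup": (3, 3),
    }
    for application, (usage, value) in portfolio.items():
        print(f"{application}: {disposition(usage, value)}")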

The third objective in this goal area examines financial and 
investment performance. While the two objectives above cover mission 
goals and portfolio management, this objective addresses management of 
IT costs and returns.  The sample measures capture costs in major IT 
financial categories such as hardware and software.  They also provide 
information on the balance of spending between legacy and new 
development applications and between in-house and outsourced 
operations.  Another sample measure compares the IT budget to the 
enterprise operational budget and benchmarks that ratio against 
industry standards.

Sample measures also look at the return on the IT investments, 
offering several different methodologies such as rate of return and 
net present value.   Much of this information is traditionally 
benchmarked with other IT organizations of similar size and IT 
penetration.  These measures are tied to customer and enterprise 
strategic perspectives to assess if scarce resources are being 
invested wisely.  This is an especially important area in the federal 
government with the emphasis on cost reduction and the best possible 
use of existing resources.
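
Net present value, one of the return methodologies mentioned above, 
reduces to a short calculation.  The cash flows and discount rate in 
the sketch below are hypothetical.

    # Net present value of an IT investment: discount each year's cash
    # flow back to present dollars and sum.  cash_flows[0] is the
    # up-front cost (a negative number).  All figures are invented.

    def net_present_value(rate, cash_flows):
        return sum(cf / (1 + rate) ** year
                   for year, cf in enumerate(cash_flows))

    # $1,000,000 up front, five years of $300,000 benefits, 7 percent
    # discount rate.
    flows = [-1000000, 300000, 300000, 300000, 300000, 300000]
    print(f"NPV at 7 percent: ${net_present_value(0.07, flows):,.0f}")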

Lastly, IT resource usage as an objective targets how well the 
organization can leverage and share its IT resources across the 
enterprise.  The measures evaluate factors such as what resources can 
be shared, what has been consolidated, and employee access to 
computing services.  From a strategic perspective, this objective 
recognizes the need for shared, enterprisewide applications and the 
use of an IT infrastructure and architecture for the entire 
organization.



Balanced Scorecard Goal Area 2:  
Satisfying the Needs of Individual Customers





IT customer measures are designed to measure the quality and cost 
effectiveness of IT products and services.  When evaluating the impact 
of IT on customer satisfaction, the following questions should be 
considered:

  How well are business unit and IT staff integrated into IT systems 
   development and acquisition projects?
  Are customers satisfied with the IT products and services being 
   delivered?
  Are IT resources being used to support major process improvement 
   efforts requiring information management strategies?  If so, are 
   the IT projects delivering the expected share of process 
   improvement?

The purpose of the second goal area is to meet the needs of individual 
operational customers.  The three objectives shown in figure 8 capture 
the key objective areas we found in our research.  



Figure 8:  Balanced Scorecard-IT Customer Measures



Two of the objective areas, customer satisfaction and business process 
support, address direct IT support.  Customers were especially 
interested in time, cost, quality, overall customer satisfaction, and 
business process support.  One official we talked to said, "[Our IT 
organization] looks at the business process characteristics of our 
customers.  IT personnel ask: are there better ways to support product 
innovation and development?  How does IT support that?  The question 
is the effectiveness of IT in supporting business processes--not 
cranking out function points."[14]

The first objective area, customer partnership and involvement, 
stresses a mutual partnership between the IT organization and 
customers in developing the best possible IT products and services.  
The sample measures examine a variety of areas, ranging from 
cooperation and joint development to involvement in project 
management.

Customer satisfaction measures assess how satisfied customers are 
with a range of IT activities.  Sample measures also cover the 
accomplishment of system design requirements, complaints, problem 
resolution, error and defect rates, timeliness, and service-level 
agreement accomplishments.
  
The business process support objective area emphasizes the importance 
of business process improvement as organizations streamline and 
reengineer.  Business process improvement is a central objective area 
for many of the organizations we studied.  The sample measures capture 
how well IT supports business process improvement plans and process 
analysis.  They also examine the adaptability of IT solutions, 
training for new IT solutions and the effectiveness of the training, 
and costs in moving applications to new hardware.



Balanced Scorecard Goal Area 3:
Addressing IT Internal Business Performance





Internal IT business measures are designed to evaluate the operational 
effectiveness and efficiency of the IT organization itself.  The 
ability of the IT shop to deliver quality products and services could 
have a direct impact on decisions to outsource IT functions.  When 
evaluating internal IT business functions, the following questions 
should be considered:

  Are quality products delivered within general industry standards?
  Are quality products being delivered using accepted methods and 
   tools?
  Is our infrastructure providing reliable support for business needs?
  Is the enterprise architecture being maintained and sustained?

One manager we interviewed said, "There are two dimensions of [IT] 
performance.  One is the dominant or visible component--the use of IT 
in the context of [customer] business processes. The other is 
transparent--the functional excellence of IT." The first two goal 
areas stress the use of IT as it supports enterprise and operational 
customers.  On a day-to-day basis, it is the functional excellence of 
IT internal business processes which delivers that support.  Figure 9 
shows four objective areas and sample measures we synthesized from our 
case study organizations and the general IT literature.  




Figure 9:  Balanced Scorecard-IT Internal Business Measures

IT managers and staff, along with enterprise senior management, decide 
which of the many IT processes truly must excel for meeting customer 
and enterprise goals in the short and long term.  For example, is the 
IT process for identifying the right technology for customer 
applications the best it can be?  IT managers and staff set specific 
goals for improvement of internal IT business processes.

The first objective covers IT's performance in developing and 
maintaining applications.  The sample measures include dollars 
expended per function point, average application development cycle 
time, and cost.  The second objective area examines the performance in 
delivering projects, capturing traditional measurements on project 
time, budget, functionality, and use of widely accepted methods and 
tools.  Measures also capture backlogs in both development and 
enhancement or maintenance of applications.
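
To illustrate the arithmetic behind two of these sample measures, the 
sketch below computes dollars expended per function point and average 
development cycle time from invented project records.

    # Illustrative computation of two internal business measures.
    # The project records are invented for this sketch.

    projects = [
        # (name, function points, cost in dollars, cycle time in days)
        ("claims intake", 420, 504000, 270),
        ("case tracking", 150, 195000, 120),
    ]

    total_fp = sum(fp for _, fp, _, _ in projects)
    total_cost = sum(cost for _, _, cost, _ in projects)
    avg_cycle = sum(days for _, _, _, days in projects) / len(projects)

    print(f"Dollars expended per function point: ${total_cost / total_fp:,.0f}")
    print(f"Average development cycle time: {avg_cycle:.0f} days")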

The third objective area addresses IT infrastructure availability in a 
variety of areas, as well as response time and transactions.  Many of 
the organizations we studied stressed the importance of infrastructure 
availability, an area totally transparent to the customer until 
something goes wrong.  These measures keep managers on top of 
infrastructure performance where there is little tolerance for down 
time.  The last objective area covers architectural standards.[15]  
The measures assess how well IT is meeting set standards, most often 
developed for interconnectivity and interoperability and efficient IT 
support.

Many of the traditional IT measures fall into the internal business 
performance goal area, often focusing on the efficiency of computing 
and communications hardware and software.  The measures in this goal 
area frequently are used for individual manager and staff IT 
accountability, as described in a later practice.

Some of the organizations we studied were using the Software 
Engineering Institute's five-level capability maturity model to guide 
their IT process improvement efforts.  The objective areas and 
measures are, in contrast to some of the other balanced scorecard 
areas, highly integrated.  For example, project performance relies on 
effective applications development and maintenance.



Balanced Scorecard Goal Area 4:
Addressing Innovation and Learning





Innovation and learning measures evaluate the IT organization's skill 
levels and capacity to consistently deliver quality results.  This 
goal area recognizes that without the right people with the right 
skills using the right methodologies, IT performance will surely 
suffer.  Measures in this goal area should be 
used to answer the following questions:

  Do we have the right skills and qualified staff to ensure quality 
   results?
  Are we tracking the development of new technology important to our         
   business/mission needs?
  Are we using recognized approaches and methods for building and 
   managing IT projects? 
  Are we providing our staff the proper tools, training, and 
   incentives to perform their tasks? 

The four objective areas shown in figure 10 include workforce 
competency and development, advanced technology use, methodology 
currency, and employee satisfaction and retention.

Figure 10:  Balanced Scorecard-IT Innovation and Learning Measures




This goal area develops the continuous improvement aspect of IT 
activities.  It speaks to capabilities of bringing new technologies to 
bear on customer problems, practicing the best methodologies, and 
retaining and developing the best employees.  The first objective area 
stresses the importance of having a capable
and competent workforce.  In particular, the organizations we studied 
were very concerned with workforce competence and development.  Key 
measures included training hours and skill development.  Most were 
transitioning from core competencies in operations and maintenance to 
business process improvement and reengineering, new business 
solutions, and technical direction of applications development done by 
others.

The second and third objectives, advanced technology use and 
methodology currency, speak to the ability to recognize and deploy 
advanced technologies and methodologies in doing IT's work.  The last 
objective, employee satisfaction and retention, measures how well 
employees themselves are satisfied with the quality of their work 
environment and general IT strategies and accomplishments.

How to Get Started

   To begin developing a balanced scorecard approach for IT, 
   organizations should:

    get agreement among business and IT management on the approach 
     that will be used for developing IT-related performance 
     indicators and measures,

    using the agreed upon approach, define and develop the key goal 
     areas and objectives for the IT organization,

    develop a full set of measures in one or two priority IT goal 
     areas, then expand out to other goal areas, and

    test the performance measurement system and make revisions based 
     upon initial lessons learned.







Practice Area 3:
Target Measures, Results, and Accountability at Decision-making Tiers

Practice Area Characteristics:

    1. Track IT measures and appropriate reports to each 
       decision-making level
    2. Align measures from the bottom to the top
    3. Directly link tiered measures to the balanced scorecard
    4. Align individual accountability to IT scorecard goals



Practice Area Overview

Organizations in our study targeted measures and performance reports 
at specific decision-making levels or tiers.  These tiers cover the 
enterprise (agency) level, senior to mid-level management (program) 
level, and specific operations or project level. This approach offers 
several advantages, including (1) enhanced communication and 
understanding of performance measurement throughout the organization, 
(2) systematic links among measures and enterprise, program, project, 
and individual objectives, and (3) alignment of measures with mission 
results.  IT performance information should drive management actions 
and decisions that support the attainment of organizational goals.

Practice Area Characteristics

1.  Track IT Measures and Reports to Decision-making Levels

As shown in figure 11, performance measures and reports at each tier 
have specific purposes.  At the enterprise tier, IT performance and 
measures focus on mission results, or how well IT is meeting its 
purpose in supporting enterprisewide goals and objectives.  
Information on final and intermediate outcomes of programs facilitated 
by IT projects and investments would be shown.  A summary report may 
be prepared for 
external reporting to stakeholders and the general public.  Reports 
may be prepared on an annual or quarterly basis and highlight IT 
policy-oriented information showing areas of progress, problems, and 
contextual or explanatory information to supplement the performance 
data.




Figure 11:  Performance Measurement Tiers

At the IT program or support unit level, senior to mid-level managers 
want to know how specific units or processes are performing.  
Measurement most often covers specific IT business processes such as 
applications development or lines of business (or core business 
areas).  At this tier, more detailed performance information is used 
for management and the improvement of operations and integrating 
activities across IT processes or programs.

In the third tier, or bottom level, the measurement emphasis is 
generally at the project level and individual systems performance.  
Highly detailed tactical and execution information supports immediate 
and day-to-day decision-making on funding, contract development and 
monitoring, project priorities, and possible adjustments in program 
operating procedures.  Here the emphasis is on input and output 
measures.

Tiering in this manner can help assign performance accountability and 
determine where IT measurement ownership lies.  The organizations 
decide where pieces of IT performance accountability rest, such as 
user or operational departments, central IT organizations, and/or 
department or operational IT units.  In decentralized organizations, 
many people are involved in the IT processes that deliver products and 
services.

Case Study 3: 
Using Performance Measures for Different Management Tiers

Xerox information management operational measures are targeted at 
three distinct organizational tiers.  As shown in the following 
figure, Tier 1 consists of corporatewide metrics, targeted for senior 
managers and executives.  Tier 2 focuses on measures used by specific 
organizations, divisions, and core business areas.  These measures 
assess what is happening below the enterprise level and actually roll 
up Tier 3 metrics collected at the project level.

   Case Study 3:
   Xerox Tiered Information Management Matrix



Case Study 4:  
Decision-making Levels Use Different Types of Measures

American Express' Technologies group is also developing a tiered 
measurement process for decision-making at all management levels.  
Tier 1 measures consist of executive information that represents the 
Technologies group's overall effectiveness against such goals as (1) 
achieving world class time-to-market,  (2) developing new business, 
and (3) enabling business partners.  Specific measures include 
development cycle time, time requirements for ongoing operations, cost 
of ongoing operations, cost of quality, leading quality indicators, 
and elimination of root causes of failures.  

Tier 2 measures consist of management information for senior and 
mid-level managers and have direct links to tier 1 measures.  To 
illustrate, tier 2 measures supporting the tier 1 measure of 
development cycle time include "elapsed time per function point" and 
"effort hours per function point." 

Tier 3 measures are operational information used by project 
development and operations staff and by project leaders.  Tier 3 
information links 
directly to tier 2 and forms the basis for management decisions, root 
cause analysis, and continuous process improvement evaluations.  
Specific performance measures are used to evaluate individual projects 
and applications.

2.  Align Measures From the Bottom to the Top of the Organization

A key performance feature found in the organizations we studied is the 
notion of aligning--but not necessarily "rolling-up"--measures from 
the bottom to the top of the organization.  IT measures used at the 
lowest tier--the specific operations or project level--must align 
upwards with the subsequent tiers.  In other words, the IT input and 
output information collected for tactical and execution management 
must directly relate to unit results needs and then upwards to the 
enterprise level.  This alignment helps to ensure that performance 
measures and information at the lowest level directly support policy 
and mission decisions and strategies.  

Only rarely will an organization have a single performance measure 
appropriate for all three levels, or, in other words, those that can 
"roll-up" to the top of the pyramid.  A temptation in performance 
measures is to layer measures from lower levels on top of each other 
and pass the information along to higher-level officials.  This 
approach may provide an overload of information not easily 
understandable or digestible by top executives or even the public.  It 
also creates the potential of hiding embarrassing details in a 
mountain of data, or can promote the self-selection of favorable data.  
A few of the bottom tier measures may "roll 
up" into strategic measures of interest at the other two tiers.  
However, the type and formatting of IT information and measures and 
timing of performance reporting appropriate at one tier may not be 
appropriate for others.  Use of all three tiers gives the organization 
a comprehensive picture of the value of IT and of whether individual 
IT products, services, and processes were worth the investment.

3. Directly Link Tiered Measures to a Balanced Scorecard

Once the IT organization has agreement on balanced scorecard goals and 
objectives, then it would develop limited tiered measures to address 
specific operational or project measures, program and support unit 
measures, and enterprise-level measures.  The tiering of measures 
across the balanced scorecard area facilitates the use of measures for 
decision-making and performance improvement action.  If measures are 
not used, they lose their decision-making impact; less effort then 
goes into collecting and maintaining them, which further erodes their 
value for decision-making.

One tier is not more important than the other two.  As one manager 
noted, an operational customer will never ask for reports containing 
performance data found in the bottom tiers, such as mainframe 
availability or application availability.  But this operational data 
provides vital information about how well IT is supporting program 
operations and is indirectly linked to customer satisfaction measures.  

The use of performance measurement tiers is not a novel concept, but 
reflects a change in how measurement information is used by an 
organization.  Traditional IT performance measures are used to examine 
lines of code generated, number of reports issued, data center 
downtime, transactions, and the number of projects on time.  These can 
be considered bottom tier measures.  More recently, management 
emphasis has shifted towards performance-oriented customer 
requirements, productivity and quality improvements, selection of 
strategic projects, and weighing value of delivered systems.  

Using some specific objectives under the balanced scorecard goal 
areas discussed in Practice 2, figure 12 provides some hypothetical 
examples of IT measures that might be used at different organizational 
tiers.  A combination of input, output, and outcome measures is 
sprinkled throughout the tiers to accommodate different management 
information decision-making and reporting needs.

Figure 12:  A Tiered Performance Scorecard Example


4.  Align Individual Accountability to IT Goals

The last question that ties together the IT balanced scorecard and the 
IT results chain is "Who is accountable for results and how are they 
held accountable?"  The leading organizations have learned that 
managing performance well depends on making the connection between 
program and IT purpose, goals, and objectives, and responsible teams 
and individual staff.   

Alignment begins when organizational and program goals and objectives 
are translated into action plans for the improvement of IT products, 
services, systems, processes, and practices.  Just as measure 
development involves many staff, IT performance plans and targets must 
be communicated to all levels of the organization.   Smaller 
components develop goals and objectives that support the higher level.  
In turn, individual accountability is based on the actions that each 
individual can take to contribute to the organization's goals.  In 
addition, the system of accountability must recognize that the final 
outcomes of IT support activities are usually intermediate outcomes 
for the higher level program goals and targets. 

Actual levels of performance should be tracked against targets, and 
both IT and program managers should be held accountable for the 
results.  Measure and results reviews should report on the actual 
outcomes, and individual performance appraisals should link IT 
performance to merit pay, bonuses, promotional opportunities, and 
other incentives.

Case Study 5:
Aligning Individual Performance with Organizational Goals

Texas Instruments Information Systems and Services (IS&S) business 
excellence improvement process has many events involving measures and 
results.  Operational assessments include quarterly metrics and 
project reviews.  Information sharing events include a presentation of 
annual results and goals in January and quarterly communication 
meetings and results reports.  The assessments focus on sharing 
lessons learned, identifying opportunities to transfer best practices, 
uncovering potential risks, and assigning corrective action.  IS&S 
holds quarterly communication meetings open to all IS&S staff to 
discuss current topics and answer general questions.  IS&S leaders use 
quarterly department meetings to further disseminate information and 
plans.  Several electronic information sharing sources are available 
to IS&S staff.

IS&S uses a performance evaluation and development process to allow 
IS&S personnel the opportunity to detail their previous year's 
accomplishments and identify short and long-term job and career goals.  
The employee uses the process to align his/her goals to those of the 
organization.  The process is key to promotions, bonuses, and 
selection for a technical ladder--a designation recognizing technical 
staff for outstanding technical contributions and quality leadership.  
IS&S also uses recognition actions to support quality and performance 
improvement.  For example, recognition display boards are located in 
highly visible areas.  These display individual and team pictures of 
IS&S personnel honored for technical and quality contributions, as 
well as for involvement in team problem solving activities and 
community projects.

Each year, IS&S leadership sends a memorandum to Texas Instruments' 
top management on policy deployment and key performance measures.  
Policy deployment is Texas Instruments' term for aligning the 
individual to overall corporate business objectives and for continuous 
or dramatic improvement of key performance measures. 

The memorandum's performance areas match the Texas Instruments 
strategic directions described in an earlier chapter--customer 
satisfaction, continuous improvement, people involvement, and cycle 
time improvement.  Texas Instruments chose cycle time as a measure to 
drive dramatic improvement in a specific IT process (solutions 
provisioning).  Improving this one process as a starting point 
addresses IT structural problems that affect cycle time.  The IS&S 
leadership team has a business excellence improvement plan.  This 
plan cascades into action plans for each IS&S division supporting the 
strategic business units and other key organizational units, and from 
there into team action plans and individual contributions.

Although this approach is controversial in the public sector, leading 
private sector organizations have found that failure to translate 
organizational strategy into program, team, and individual IT goals 
can result in a focus on short-term and tactical issues rather than 
the accomplishment of strategic goals.




How to Get Started

To begin targeting measures, results, and accountability at different 
decision-making levels, organizations should:

  identify enterprise tier IT performance information requirements,

  use the enterprise tier information to develop measures across one 
   balanced scorecard area, 

  begin adjusting management performance review processes at the 
   unit, team, and individual level to reflect the tiered approach, 
   and

  test the performance measurement system and make revisions based on 
   initial implementation efforts.



Practice Area 4:
Build a Comprehensive
Measurement, Data Collection,
and Analysis Capability


Practice Area Characteristics:

    1. Use data collection tools
    2. Develop and use baseline and benchmark information
    3. Assess maturity and develop complete performance definitions
    4. Utilize concise, understandable performance reporting
    5. Conduct measure reviews and audits



Practice Area Overview

Building a balanced scorecard and tiered measures is only one step in 
designing an effective IT performance management system.  In the 
organizations we studied, management paid careful attention to the 
"back end" of performance management: data collection and analysis.  
As one manager explained, "You need a lot of analysis to figure out 
the key drivers for performance, like what will it take to raise 
customer satisfaction one or more percentage points.  And is that goal 
good for other measures such as cost and time?  What is the gain for 
the effort we must put in and is it worth it?"  

Most organizations began by benchmarking existing performance against 
different IT units within the organization, external IT organizations 
in other businesses, or industry benchmarks published by research 
organizations.  Next, performance definitions were agreed upon and 
existing data used to "baseline" existing performance and identify 
information gaps that needed to be addressed with additional data 
collection and analysis.

Performance data are needed at all tier levels.  Even so, the 
collection and reporting of this information should not impose an 
unnecessary 
burden on management and staff.  Data collection should utilize 
efficient manual or automated methods.  The organizations we studied 
developed a clear rationale for new and continued data collection and 
specifications for accuracy, reliability, timeliness, and use before 
setting out measurement reporting requirements.  Most organizations we 
examined designed efficient and effective ways to present the 
performance information to management so that it could facilitate 
better decision-making.  Finally, the organizations regularly 
conducted reviews of their performance measurement systems and revised 
or updated measures in accordance with management feedback or changing 
business needs.

          Practice Area Characteristics

          1.  Use Data Collection Tools

For each data collection requirement--whether qualitative or 
quantitative--most of our case study organizations developed manual 
and automated tools to reduce the burden of collecting IT performance 
information.  These included personal observation, formal performance 
measure reports, customer satisfaction surveys and interview 
questions, reviews of records and documents, and automated hardware 
and software productivity data collection tools.


Case Study 6:
Collecting IT Performance Information Using Multiple Techniques Across 
Different Performance Dimensions

The Immigration and Naturalization Service's Office of Information 
Resources Management (OIRM) collects information both on its 
Information Technology Partnership (ITP) contract, the primary IT 
support services contract, and on program-level performance regarding 
mission outcomes.  As shown in the following figure, ITP contract 
tasks are measured on within-budget, on-time, customer satisfaction, 
and technical performance (such as quality of project completeness, 
use of resources, and completeness and quality of documentation).  
The program-level performance measures cover cost, efficiencies, and 
quality in the improvements realized.

The program level covers all major operational areas, such as the 
Border Patrol.  INS has specifically designated tools for its data 
collection under the Information Technology Partnership mentioned 
earlier.  These tools include a performance measures report, a 
customer satisfaction survey, and a technical performance evaluation 
form.  For mission-critical tasks, quantitative methods, ratios, and 
performance reports are the tools that are used for data collection 
and analysis.   







   Case Study 6:
   INS ITP Performance Measurement Plan


Most of the organizations we studied feature customer surveys, 
interviews, and focus groups as important sources of performance 
information.  These data collection exercises are well-designed, 
tailored to IT performance measurement ownership, and fit into 
management processes to take action on the results.  For example, 
Motorola uses a  companywide survey that asks for ratings ranging from 
"superior" to "poor" on such issues as

  availability of Information Systems (IS) personnel, 
  responsiveness of IS personnel, 
  effectiveness of IS in project management communication, 
  reliability of IS in meeting commitments, 
  IS cycle time for completing projects, 
  quality of IS work, 
  rate of improvement during the past year, 
  alignment of IS resources with business needs, 
  overall credibility of information systems support, and 
  IS performance with respect to total customer satisfaction.
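
Tallying such ratings is straightforward, as the hypothetical sketch 
below shows.  The intermediate rating labels and the responses are 
invented; only the "superior" and "poor" endpoints come from the text.

    # Tallying one question from a superior-to-poor rating survey.
    from collections import Counter

    RATINGS = ["superior", "good", "fair", "poor"]   # illustrative scale

    responses = ["superior", "good", "good", "fair",
                 "good", "superior", "poor"]         # invented data
    tally = Counter(responses)

    total = len(responses)
    for rating in RATINGS:
        count = tally.get(rating, 0)
        print(f"{rating:>8}: {count:2d} ({100.0 * count / total:.0f} percent)")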

In all cases, the organizations started with what they considered a 
primitive data collection strategy, beginning with data definitions 
and working with customers and IT staff to gain understanding and 
agreement.  In the early stages, some of the data collection may be 
entirely manual, later supplemented or replaced by automated systems.  
All 
of the organizations cautioned that IT should never wait for automated 
data collection systems, but start small and with measures meaningful 
to enterprise and operational customers, such as IT cycle time and 
customer satisfaction.


Case Study 7:
Using Multiple Data Collection Mechanisms and Ensuring System 
Integrity

Texas Instruments' Information Systems & Services (IS&S) gathers, 
prioritizes, and communicates customer and supplier concerns through a 
series of teaming activities.  IS&S conducts annual strategic intent 
conferences and objective reviews with top-level customer management.  
Several times a year senior customer managers on an Internal 
Information Systems Leadership Team meet with IS&S managers to offer 
strategic planning and guidance.  Other activities include ongoing 
executive one-on-one interviews and management-level and project-level 
customer surveys, employee development, reengineering efforts, 
customer focus workshops, steering teams, quality improvement teams, 
leadership teams, and quality steering teams.  Senior IS&S management 
uses these opportunities to confer with the customer on business 
priorities, critical success factors, and operational plans.  A 
service-focused survey is sent to end users of IS&S products and 
services.  Each Texas Instruments organization has a metrics 
coordinator who handles the manual and automated databases.

Much of Texas Instruments' IS&S performance information has been 
tracked for at least 5 years, with the majority of the information 
available online.  A combination of software error checking, password 
control, and independent audits ensures the reliability, 
accessibility, and integrity of the performance databases.  IS&S 
promotes system consistency and maximum data sharing through 
standardized data formats and system interfaces.  IS&S uses an 
electronic data interchange network to communicate electronically with 
suppliers and customers.  An online complaint system assists customers 
in registering complaints about IS&S products and services.


While there is widespread use of annual customer surveys, most 
organizations note that such surveys provide limited information that 
can be tied to corrective action strategies.  One manager said his 
organization was putting customer surveys on hold until it could 
develop questions whose answers would support concrete corrective 
action.  Another manager said he planned to do "just in time" surveys 
on specific projects or IT processes instead of a survey at the end of 
the year.  His rationale was that year-end surveys only capture 
problems or feelings, while he wanted data that the IT organization 
could act on, that is linked to organizational activities, and that 
lets him see trends in particular areas and focus on skills and 
problem areas.  His point was that asking general questions is no way 
to develop actionability or determine significant trends.

Managers in the organizations we studied emphasized that it is 
important to have consistency of some hard data collection from year 
to year.  Some performance information, such as computer and 
communications availability, customer satisfaction percentages, and 
software capability maturity levels, is relatively durable and 
comparable over longer time periods.  These measures track trends in 
areas where products and services do not change significantly.  One 
manager suggested that measures should not be policy-based, as policy 
can change from year to year.

2.  Develop and Use Baseline and Benchmark Information

The organizations we studied spent considerable time and effort on 
baselining and benchmarking, two entirely different activities.  They 
assessed what performance information they had for the measures they 
had selected (baselining) and how that information might compare to 
that of other organizations or similar processes within their 
organization if there were discrete IT units (benchmarking).

In baselining, available IT performance information, or information 
that will have to be collected, becomes part of the performance 
baseline for each scorecard objective.  The current performance 
becomes the "baseline" against which further performance is measured.  
Without baselining, there is no standard to measure progress.  In 
fact, one of the initial tasks in IT performance management is 
determining current performance using the measures designed for the 
balanced scorecard or a similar approach.  The organizations we 
studied found that performance data for a balanced scorecard cannot 
simply be layered on top of all the IT performance data previously 
collected.

To set actual IT performance targets, organizations often do 
benchmarking, an activity that is much different from baselining.  
Benchmarking was done with other IT organizations in the enterprise, 
with IT organizations outside the enterprise, or with similar 
processes but in other industries.  For example, Kodak benchmarks with 
companies in markets where it competes (competitive benchmarking), 
with leading or best-of-class organizations within any industry 
(functional benchmarking), and among the operating units within Kodak 
(internal benchmarking).  Handling customer support phone lines, for 
instance, could be benchmarked against mail order telephone 
operations.

Another organization we studied, Xerox Corporation, was outsourcing 
many of its IT operations to Electronic Data Systems (EDS).  Xerox is 
benchmarking EDS service content, service delivery, and pricing 
against the best in many countries.  Initially, Xerox established a 
price and service baseline for comparison of EDS services and prices 
against the best organizations.

Many of the organizations we studied have made benchmarking an 
integral part of their IT performance management activities.  
Benchmarking information is often required in decision-making packages 
sent forward to senior executives.  Most have devoted at least one 
staff member to benchmarking.  Texas Instruments, for example, has a 
corporate office of benchmarking and best practice sharing.

Some organizations we studied cautioned that benchmarking requires 
focus.  Benchmarking often emphasizes process improvements rather than 
strategic value.  While an organization may learn how another 
organization performs a particular IT process or set of activities, 
that knowledge may do little to improve outcomes.  If the benchmarking 
focus is on key measures, such as reducing cycle time or how well IT 
business processes support customers, then benchmarking realizes its 
strategic potential.  The organizations use baselining and 
benchmarking information to identify performance gaps between current 
IT performance and desired achievement levels.  Leading organizations 
recognize that improvement goals must flow from a fact-based analysis 
of IT performance aligned to organization mission.  At least one of 
the organizations we studied believed that goals, such as six sigma, 
make benchmarking irrelevant.[16]  The standard for performance is 
already set at a "zero defect" goal.  One manager noted that "the key 
to performance is how the [IT] organization supports the business, not 
how the IT organization compares to other IT organizations."  However, 
others believed that benchmarking provides comparison data on 
exemplary organizations and sets "stretch" performance standards for 
the IT organization.
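
The gap analysis itself reduces to a subtraction, as the hypothetical 
sketch below shows; the measure names, baselines, and benchmarks are 
invented.

    # Comparing baselined performance to a benchmark to size the
    # improvement gap.  All values are invented.

    measures = {
        # measure: (internal baseline, external benchmark)
        "help desk first-call resolution (percent)": (62.0, 85.0),
        "application availability (percent)": (97.1, 99.5),
    }

    for name, (baseline, benchmark) in measures.items():
        gap = benchmark - baseline
        print(f"{name}: baseline {baseline}, benchmark {benchmark}, "
              f"gap {gap:+.1f}")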

3.  Assess Performance Maturity and Develop Complete Performance 
    Definitions

A common stumbling block for organizations is the tendency to struggle 
to develop "perfect" measures instead of thinking in terms of 
improving measures over time.  But as one manager told us, in 
performance measurement you simply cannot be good at everything right 
away, which forces you to undertake a phased approach.  As an 
organization gains more performance management experience, better and 
more appropriate goals will be defined, and the supporting measures 
will, in turn, be modified.  For many measures, the definitions, data 
collection techniques, and reporting will need to be refined over 
time.  

Measure maturity assessment can be a part of the measure definitions 
that organizations develop.  Generally, these definitions cover what 
the measure is intended to show and why it is important, how 
performance data are generated, who is responsible for collecting the 
data, how the measure is specifically calculated, any limitations on 
the measurement data (for example, factors beyond the organization's 
control), and whether the data are cumulative or noncumulative.  As 
this process is repeated and refined over time, there is increased 
confidence that the measure accurately describes performance in the 
relevant area.  
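
A complete measure definition can be kept as a structured record 
whose fields mirror the elements just listed.  The sketch below is 
illustrative; the example content is invented.

    # A measure definition record.  Field names follow the elements
    # described in the text; the example values are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class MeasureDefinition:
        name: str          # what the measure is called
        intent: str        # what it shows and why it is important
        data_source: str   # how the performance data are generated
        owner: str         # who is responsible for collecting the data
        calculation: str   # how the measure is specifically calculated
        limitations: str   # factors beyond the organization's control
        cumulative: bool   # cumulative or noncumulative data

    example = MeasureDefinition(
        name="On-time project delivery rate",
        intent="Shows reliability of IT delivery commitments",
        data_source="Project tracking system milestone records",
        owner="IT program management office",
        calculation="Projects delivered by planned date / projects closed",
        limitations="Customer-requested scope changes shift planned dates",
        cumulative=False,
    )
    print(example.name, "-", example.calculation)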



Case Study 8:
Gaining Experience with Fundamental Measures and then Expanding Out 

Kodak is one organization that is systematically defining the maturity 
of each measure it plans to use in its balanced scorecard.  Kodak 
categorizes measure maturity as either fundamental, growing, or 
maturing.  Established indicators are considered fundamental.  
Growing measures are evolving from the fundamental, but are not the 
best they can be.  Maturing measures are defined as best-in-class for 
whatever they are measuring.  For example, for internal performance a 
fundamental measure is to meet all service-level agreements, a growing 
measure is information delivery excellence, and a maturing measure is 
defect-free products and services.  Kodak believes it is important to 
build the right fundamental practices first in developing an initial 
IT performance management system.

As part of its development of measures for EDS services, Xerox is 
starting with what it calls "primitive metrics."  These include size, 
measured in function points; effort, measured in work hours; defects, 
measured by number of defects; changes, measured by number of changes; 
and duration, measured by elapsed days.  Over time, quarterly 
performance reviews and examination of measures are expected to result 
in revisions to these measures.


4.   Utilize Concise, Understandable Performance Reporting

Leading organizations take great care in designing IT performance 
reports that are concise, easy to understand, and tailored to various 
management needs and audiences.  Executive managers, in particular, 
often require data presentations and displays that focus on bottom 
line performance results.  When information is presented in this 
manner, executives can quickly digest it, focus on problem areas, seek 
pertinent follow-up data, and be more efficient in making or 
recommending project or program decisions.

Performance reports should be tailored to the audience for which the 
information is intended.  Operational managers may need more detail 
and supporting contextual information, while external stakeholders may 
require far less.


     Case Study 9:
     INS/ITP Quarterly Performance Reporting

Case Study 9:
Presenting Performance Data in Usable Formats

The Immigration and Naturalization Service's Office of Information 
Resources Management (OIRM) collects information on both its 
Information Technology Partnership contract and on program-level 
performance regarding mission outcomes.  The contract is run by INS in 
partnership with Electronic Data Systems (EDS).  

The following figure shows how overall contract-level performance is 
reported on a quarterly basis for each of the four key critical 
success factors:  (1) customer satisfaction, (2) technical 
performance, (3) on time, and (4) within budget.  These quarterly 
measures can then be plotted to show a performance trend over the life 
of the contract.  Gathering such data will also allow INS to provide 
documented and objective feedback on a contractor's past performance. 
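
Plotted quarter over quarter, even a simple text chart reveals the 
trend; the quarterly scores in the sketch below are invented.

    # Quarterly scores for one critical success factor, plotted as a
    # crude text trend chart.  All values are hypothetical.

    quarters = ["Q1", "Q2", "Q3", "Q4"]
    satisfaction = [3.1, 3.4, 3.3, 3.8]   # illustrative 1-to-4 scale

    for quarter, score in zip(quarters, satisfaction):
        print(f"{quarter}: {score:.1f} {'*' * int(score * 10)}")

    print(f"Trend over the year: "
          f"{satisfaction[-1] - satisfaction[0]:+.1f} points")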






Case Study 10:
Effective Presentation of Performance Data and Contextual Information

The Minnesota Department of Administration provides business 
management and administrative services to Minnesota agencies.  Most of 
the Department's operations are fee-based operations in areas such as 
data processing, printing, vehicle rental, and the sale of office 
supplies.  The Department's InterTechnologies Group (InterTech) 
provides services in managing and operating information technology 
resources.

Figure 21 illustrates one objective for customer service performance.  
The objective is written with a specific performance rating standard 
and the use of a specific survey for data collection.  The performance 
data are reported in this format, with the objective, definition, 
rationale, and data source.  The data also include a discussion of 
past performance and the plan to achieve targets. These data and 
reports on other measures are given to the Minnesota Legislature in a 
formal performance report.



    Case Study 10:
    Minnesota Department of Administration's Customer Service
    Performance Data Template



5.  Conduct Measure Reviews and Audits

Most organizations we studied regularly assess their measures to see 
if they are still appropriate for measuring results and assigning 
accountability.  Most of these organizations do regular external 
audits of IT measure appropriateness and data collection efficiency.  
One manager said, "We believe any successful organization needs a good 
basic independent look at measures, challenging what are the right 
questions."  The reviews and audits serve as an oversight of measures 
and the data, challenging the necessity of the measures (i.e., are 
they being used?) and their linkage to enterprise strategic plans and 
individual performance expectations (i.e., are they having an 
impact?).  Reviewers consider the accuracy, completeness, timing, 
match to actual conditions, and technical features such as the method 
of results calculations.

For example, as new technology is introduced, measures can become 
obsolete or less important, or customer measures may supplant 
traditional ones.  In one organization, IT used thousands of lines of 
code per calendar month as a measure of software productivity.  From 
the customer point of view, the measure was ineffective, and it was 
changed to cycle time.  Over time, the organization will develop more 
reliable measures.

The reviews also analyze key performance drivers in each goal area.  
They question what a stress in one area will do to the results in 
other areas.  They also can question what performance gain might be 
achieved for a certain level of performance effort, and whether the 
performance results are worth that effort.

How to Get Started

To build a comprehensive measurement, data collection, and analysis 
capability, organizations should

  designate specific IT staff to gain skills in measurement, data 
   collection, and analysis,

  review existing data collection and reporting and determine what is 
   still appropriate and what should be changed or deleted to match 
   possible measurement scorecard areas, and

  determine preliminary interdependencies among scorecard goal areas, 
   objectives, and measures (a sketch follows this list).
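
As a sketch of that last step, the fragment below (Python) records 
preliminary interdependencies among scorecard goal areas so reviewers 
can ask what a stress in one area will do to results in another.  The 
goal areas follow the balanced scorecard discussion in this guide; the 
measures and dependency links are hypothetical.

# Sketch: preliminary interdependencies among scorecard goal areas.
# The measures and dependency links below are hypothetical.
scorecard = {
    "strategic":               ["mission outcome improvement"],
    "operational customer":    ["customer satisfaction rating"],
    "internal business":       ["on-time system delivery"],
    "innovation and learning": ["staff skill index"],
}
# depends_on[area] lists the goal areas expected to drive that area
depends_on = {
    "strategic":            ["operational customer"],
    "operational customer": ["internal business"],
    "internal business":    ["innovation and learning"],
}
for area, drivers in depends_on.items():
    print(f"{area} <- driven by: {', '.join(drivers)}")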
























Practice Area 5:
Strengthen IT Processes to
Improve Mission Performance




Practice Area Characteristics:

1. Define the IT business processes that produce IT products and 
   services meeting mission goals.
2. Using IT performance information, prioritize IT business processes 
   essential for improving mission performance.

Practice Area Overview

In many of the organizations we studied, business process improvement 
is a high priority enterprise strategy.  Often, enterprise and 
operational customer business process improvement can only be 
accomplished using IT products and services.  That means that IT must 
ensure that its own business processes are the best they can be.  
Simply put, IT internal business processes deliver IT products and 
services which the enterprise and operational customers depend on to 
support their missions.  If IT does not have the capability to 
deliver high quality products and services, then organizational goals 
can suffer.  This is an important reason why the balanced scorecard 
approach includes internal IT business processes.

        Practice Area Characteristics

          1. Define IT Business Processes That Produce Products
             and Services Critical for Meeting Mission Goals

All leading organizations define their key IT business processes.  
This helps the IT organization focus on primary activities, identify 
IT competencies, eliminate processes which do not add value, and 
facilitate IT process innovation.  Several of the organizations we 
studied noted 
that process measures are tightly linked with tasks or activities.  
Should organizational structures change and tasks and activities move 
within the IT organization, measures can more easily move.  In other 
words, IT processes and subprocesses, once defined, are "portable" 
from one part of the organizational structure to another.

Some of the organizations developed their IT process orientation based 
on work done by the Ernst and Young Center for Information Technology 
and Strategy and later published by a workgroup of the Society for 
Information Management (SIM).[17]  SIM's Information Systems Process 
Architecture (ISPA) process framework is a model of how a typical 
organization obtains and applies information systems and technology.  
In describing its initial framework issued in September 1993, SIM 
indicated that its process model (1) defines strategically important 
IT processes in an overall framework, (2) communicates with IT 
stakeholders on the value, activities, and organization of IT, and (3) 
provides a basis for allocating IT resources that is congruent with 
activity-based costing.

As shown in figure 13, SIM's ISPA version 2.0 process framework, 
issued in March 1996, includes eight IT processes which overlap.  This 
framework, like the earlier version, provides an example of a way to 
organize IT, determine core competencies, and identify process owners.



Figure 13:  ISPA Version 2.0 Process Framework

As explained below, each process in this suggested framework has a 
specific purpose and suggested metrics (measures).

  Perform Customer Relations
This process is used to develop and maintain working relationships 
with the customers of IT products and services.  Performance metrics 
focus on the business impact of IT, customer satisfaction, and product 
and service measurements, such as the accuracy and response time for 
solving problems or number of viable IT projects identified.

  Market IT
This process is used to ensure that customers want, need, and buy the 
products and services offered.  Sample metrics focus on the increased 
business value of IT, IT return on investment, customer satisfaction, 
and product and service effectiveness measures.

  Align Business and IT
This process is used to incorporate IT into strategic business change 
activities in a way that captures opportunities from current and 
emerging technologies, and to promote process innovation leadership 
using IT as the catalyst and using proven visioning and change 
management techniques.  Sample metrics focus on the business value of 
IT, net present value of projects approved by the strategic business 
unit, and improvements to business processes.

  Manage Enterprise Architecture
This process is used to provide a framework for delivering consistent 
products and services.  Sample metrics focus on IT architecture design 
and implementation, the number of viable IT projects identified, how 
service chargebacks align with customer views of services, and the 
degree of technology standardization.

  Develop and Deploy Products and Services
This process is used to acquire, develop, deliver, and implement new 
information services for the organization.  Sample metrics might 
include the percent of projects on-time and with the desired 
functionality, projects completed within budget, customer 
satisfaction, and degree of technology usage in conducting core 
business processes.

  Deliver and Support the Products and Services
This process is used to ensure that the products and services are 
deployed in the most effective and efficient manner.  Metrics can 
focus on customer satisfaction survey ratings, increased demand for IT 
information, adequacy of equipment and facilities, availability and 
accessibility of data, number of problems received and requests 
satisfied, and mean time to data recovery.

  Plan the IS Organization
This process is used to shape and support business unit strategies, 
establish strategy and vision for long-term IS use, develop tactical 
plans for development and infrastructure resources over a 12- to 
18-month horizon, and design key processes within the IS organization.  
Sample metrics could include employee satisfaction with the IS vision 
and knowledge about technology plans.

  Manage IS Organization Business
This process is used to manage the processes within IS that deal with 
the health and state of the IS organization, its employees, and 
vendors.  Sample metrics include an employee satisfaction and 
commitment index, team efficiency and effectiveness measures, product 
and service measurements, and an index of employee skills.
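
One way to make such a framework operational is to record each process 
with its sample metrics so that metrics can be looked up by process 
and assigned to process owners.  The sketch below (Python) abridges 
the sample metrics listed above; the selection and data structure are 
illustrative, not part of the SIM framework itself.

# Sketch: the eight ISPA 2.0 processes with a few of the sample
# metrics described above (abridged selection, for illustration).
ispa_processes = {
    "Perform Customer Relations":
        ["customer satisfaction", "problem response time"],
    "Market IT":
        ["IT return on investment"],
    "Align Business and IT":
        ["net present value of approved projects"],
    "Manage Enterprise Architecture":
        ["degree of technology standardization"],
    "Develop and Deploy Products and Services":
        ["percent of projects on time", "projects within budget"],
    "Deliver and Support the Products and Services":
        ["mean time to data recovery"],
    "Plan the IS Organization":
        ["employee satisfaction with IS vision"],
    "Manage IS Organization Business":
        ["employee skills index"],
}
for process, metrics in ispa_processes.items():
    print(f"{process}: {', '.join(metrics)}")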


Case Study 11:
Texas Instruments IT Process Segmentation Effectively Supports 
Business Needs

At Texas Instruments, many IT customer business process reengineering 
activities established a clear need for rapid provisioning of IT 
solutions, flexibility with respect to business rules and work flows, 
and cost effective use of advanced technology.  In response, the 
Information Systems and Services Group (IS&S) developed its IT process 
map with five processes, as shown in the following figure.

The process map defines activities that "map" into both the 
operational customer and IT organization sphere.  For example, 
strategy management as a process is the joint responsibility of both 
the business customers and IS&S (the enterprise IT group).  Strategy 
management sets the vision and timing of all the elements of IT needed 
to support Texas Instruments.  Solution provisioning is the mechanism 
for assembling the hardware and software pieces together to form 
solutions for IT customers.  The service and product support area 
deploys and maintains IT products and services.  The research and 
architecture definition area conducts research and experiments with, 
and sets standards and methodologies for, current and future 
architectures.  The 
component provisioning area provides reusable applications, building 
blocks, assembly tools, and integration expertise.





     Case Study 11:
     Texas Instruments' Information Technology Process Map

2. Using IT Performance Information, Prioritize IT Business Processes 
   Essential for Improving Mission Performance

Given the many IT business processes that an IT organization manages, 
which ones are the most important for improvement?  In the 
organizations we studied, customer business process improvement 
strategies and performance requirements frequently identify and drive 
major improvements in IT business processes.  The key is to determine 
the most efficient and effective way to organize roles, 
responsibilities, and delivery of IT products and services within the 
organization.  This requires a clear understanding of which IT 
functions are the domain of business units, unit-organized IT groups, 
and the corporate or enterprisewide IT organization.  Knowing how 
effective this arrangement is in providing IT support, which IT 
functions are performing well, and where service improvements are 
needed is critical to targeting IT management attention.

Case Study 12:
Using Business Process Focus to Shape IT Service Delivery

Kodak is deciding what IT processes it should pursue to support the 
enterprise and operational customers' business processes.  The focus 
is to look at each business process and its use of IT and determine if 
the information systems organization identified the right technology 
for the business needs.

For Kodak, enterprise goals drive a process focus for strategic 
business units and support functions such as IT.  The company's 
enterprise goals are threefold:
   
   (1)  a 10-fold defect reduction every 3 years,
   (2)  reaching six sigma by the year 2000, and
   (3)  a 10-fold cycle time reduction in 3 years.

Kodak defines a defect as a variation in a product or service which, 
if not caught, prevents the company from meeting the needs of its 
customers.  Cycle time is the total time to move a unit of work from 
the beginning to the end of a process.  Kodak believes that optimizing 
its business processes is the only way to achieve these dramatic 
improvement goals.
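
A short worked illustration shows what these goals imply.  Assuming a 
hypothetical starting rate of 1,000 defects per million opportunities, 
a 10-fold reduction every 3 years reaches the six sigma level of 3.4 
defects per million within a decade:

# Worked illustration: 10-fold defect reduction every 3 years.
# The starting rate of 1,000 defects per million is hypothetical;
# six sigma corresponds to 3.4 defects per million opportunities.
rate = 1000.0
year = 0
while rate > 3.4:
    year += 3
    rate /= 10.0
    print(f"year {year}: {rate:g} defects per million")
# prints: year 3: 100, year 6: 10, year 9: 1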

How to Get Started

To begin focusing resources on IT process improvement, organizations 
should

    use a model such as that suggested by the Society for Information 
     Management to define IT business processes,

    determine which IT business processes are the most critical to 
     meeting enterprise and operational customer performance goals and 
     requirements, and

    choose one or more critical IT business processes to baseline 
     existing performance and benchmark against industry standards or 
     leading public/private organizations, as sketched below.
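
As a sketch of that baselining and benchmarking step, the fragment 
below (Python) compares baselined performance of a critical IT process 
against an industry benchmark to surface the gaps that deserve 
management attention.  The measures and all figures are hypothetical.

# Sketch: compare a baselined IT process against an industry
# benchmark.  Measures and figures are hypothetical.
baseline  = {"on-time delivery (%)": 72, "cost per function point": 1250}
benchmark = {"on-time delivery (%)": 90, "cost per function point": 950}
for measure in baseline:
    gap = benchmark[measure] - baseline[measure]
    print(f"{measure}: baseline {baseline[measure]}, "
          f"benchmark {benchmark[measure]}, gap {gap:+}")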


Key Lessons Learned for Effective Implementation 
______________________________________________________________________


Agency managers just starting to develop IT performance management 
systems or those who want to enhance existing ones have a formidable 
task.  They often are faced with resource constraints, demands for 
immediate IT support and solutions as program areas reduce staff and 
reengineer their business processes, and skepticism about the value of 
performance management.

From the organizations we studied and the experiences of other 
organizations described in the literature, three key activities are 
essential in putting the practices in place:  (1) assessing 
organizational readiness for a successful IT performance management 
system and staging the system's development, (2) following a simple 
measure selection process, and (3) recognizing that system maturity 
will change over time.

Assess Organizational Readiness

The leading organizations find that assessing organizational readiness 
for a successful IT performance management system is essential.  Here, 
the organizations look for the involvement, commitment, and day-to-day 
support of enterprise senior managers.  They also determine if they 
have adequate resources, including staff allocation, skills, time, 
tools, and use of consultants or technical assistance if needed.  A 
manager characterized a good performance management system as one that 
"has complete buy-in from top management, involves front line 
employees in system design, and results in front line employees 
understanding what they do and why they are doing it."

Organizational readiness also means making sure that existing planning 
and decision making structures can accept performance results so they 
can be used.  As the introductory chapter to this guide explained, 
performance measures are a central piece of alignment around mission 
planning, budgeting, and evaluation.  These are essentially separate 
processes linked by performance measures.  The organization needs the 
capability to specify clear goals and objectives to set the focus and 
direction of IT performance, creating an IT performance improvement 
plan and revisiting it every one or two years.  The IT organization 
has to understand the business of operational customers and make sure 
IT measures are consistent with business measures.  That means the 
capability to develop a "theory" of how IT supports enterprise and 
operational customers so the organization can build the chain of IT 
events and evidence described in practice 1.

The organizations also determine if they have the support of other 
stakeholders and funding sources, such as legislative staff.  As 
mentioned under practice 1, stakeholders are one of the parties which 
have to reach a common understanding of IT goals, objectives, 
appropriate measures, and anticipated outcomes.

Lastly, organizational readiness means paying attention to 
organizational culture -- is it receptive to data collection, 
measurement, and analysis, and to accountability for performance and 
decisions as part of an overall performance improvement system?  The 
organization should have a philosophy that is positive towards 
performance management and measurement and views it as a way to focus 
on quality and operational customer satisfaction.  That means the 
organization should be willing to assess organizational values and 
principles, and how they are working.  That culture is the key to 
success.  As one manager explained, "We have the culture to support 
this -- people get to where they track and report metrics, they think 
about quality goals.  [IT performance management] is a hard sell where 
there is not a total quality culture, where the focus is on 
technology, not satisfying the customer."

Follow a Simple Measure Selection Process

Selecting and implementing IT performance measures is extremely 
complex.  Each enterprise and its operational customers have a mission 
and goals that often differ significantly from other agencies.  A set 
of IT performance measures that works for one organization likely will 
not completely work for another.  Performance measures differ 
according to what is valued most in terms of IT performance.  To 
illustrate, IT goals and objectives that stress cost savings would 
have more measures related to efficiency than to effectiveness.  IT 
goals and objectives stressing customer satisfaction and other 
service-oriented goals might have fewer efficiency measures.  However, 
well-managed IT activities should take a balanced approach to goals, 
objectives, and related measures.

In the organizations we studied and in the literature we reviewed, one 
element of success is sifting through the many possible goals, 
objectives, and measures before finalizing them in a balanced 
scorecard or similar approach.  The sifting process identifies 
potential objectives and measures for all parts of IT activities.  It 
assesses which measures will be valuable for which purposes and to 
whom and eliminates measures which are not relevant to customer and 
stakeholder needs.  And it 
eliminates measures for which good quality data cannot be practically 
obtained.  One manager said, "We want to put the best measures in 
place.  We want them simple, collectable, repeatable, and concrete."
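
The sifting logic lends itself to a simple filter.  The sketch below 
(Python, with hypothetical candidate measures) keeps only measures 
that are relevant to customer and stakeholder needs and for which good 
quality data can be practically obtained:

# Sketch of the sifting process: keep a measure only if it is relevant
# to customers or stakeholders and its data are practically obtainable.
# The candidate measures below are hypothetical.
candidates = [
    {"name": "cycle time",            "relevant": True,  "obtainable": True},
    {"name": "lines of code/month",   "relevant": False, "obtainable": True},
    {"name": "customer satisfaction", "relevant": True,  "obtainable": True},
    {"name": "total business value",  "relevant": True,  "obtainable": False},
]
vital_few = [c["name"] for c in candidates
             if c["relevant"] and c["obtainable"]]
print("selected measures:", ", ".join(vital_few))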

A good example of the selection process is the measurement "roadmap" 
followed by the Minnesota Department of Transportation.  The roadmap 
examines the total pool of potential goals, objectives, and related 
measures and reduces it to a vital few.  The roadmap, shown in figure 
14 and enhanced with additional information from the United Kingdom's 
Royal Mail, is a general performance measure selection framework that 
all program and support functions follow.  In the roadmap, the word 
"organization" can refer to an agency, a program, or a support 
function.



Figure 14:  IT Measurement Implementation Roadmap



In stage I, the organization clarifies its overall goals and 
objectives, first by fully listing all possibilities, and then 
converging on the vital few goals and objectives.  In stage II, the 
organization would develop its measures and make sure its measurement 
system has a diversity of measures and follows a tiering approach.  In 
the last stage, the organization uses the measures, establishing 
baselines and targets, comparing against benchmarks, and monitoring 
progress for continual improvement.

For support functions such as IT, two roadmaps are in play.  One 
roadmap develops the objectives and measures for an agency or program 
area; another roadmap takes that agency or program information and 
develops objectives and measures for IT.  This is the intent of the 
balanced scorecard approach discussed in practice 2.

Develop IT Performance Management System Maturity Over Time

In discussing the development of an IT performance management system, 
we attempted to determine if there was a maturity aspect to system 
development.  Several of the managers we talked to believe there is a 
staging in performance management emphasis and expectations, although 
the staging is not precise or formal.  Figure 15 shows possible stages in 
performance management and their linkage to the balanced scorecard 
areas, drawing on work done for the city of Phoenix and discussed in 
the literature.  While shown as discrete stages for purposes of 
illustration, the distinction between them often is not clear-cut as 
they overlap over time.



Figure 15:  Stages in IT Performance Management


The premise is that IT organizations have to be good at the basics in 
stage one -- traditional internal IT operations -- and then move to 
stage two (linking enterprise and customer mission with new and 
improved IT services) and stage three (being mission-results 
oriented).  In other words, an IT organization that is viewed as a 
failure in day-to-day operations will have neither the credibility nor 
the support of the rest of the organization to play a strategic role 
directly tied to mission results.

In stage one, the IT organization is developing and implementing a 
performance management system that examines internal operations 
against standards and acceptable levels of performance.  Traditional, 
activity-based measures such as number of reports issued or mainframe 
availability are used.  In stage two, the IT organization is 
eliminating root causes of defects, building its competencies to 
consistently deliver new and improved systems and solutions on time 
and within budget, and linking operational measures to mission 
performance.  The goal is to improve IT processes and prevent defects.  
For both stage one and two, most of the measures fall in the internal 
business goal and innovation 
and learning goal of the balanced scorecard approach discussed in 
practice 2.

In stage three, the IT organization has the capability of applying IT 
expertise to enterprise and operational customer mission requirements 
and putting its IT outcomes in those mission terms.  Measures are 
based on customer needs and benefits, expressing measures in terms 
customers understand, such as business outcomes.  Stage three IT 
organizations also specify who is responsible for corrective action.  
In this stage, the measures found in the strategic and operational 
customer goals of the balanced scorecard approach are prominent.

The staging or maturity perspective suggests that an organization 
should consider implementing the balanced scorecard in a building 
block fashion.  The organization might initially develop major goals, 
objectives, and measures in each of the four areas.  However, it 
likely must perform well in stages one and two, reflecting the 
balanced scorecard areas of internal business and innovation and 
learning, before it can perform well in the strategic and operational 
customer areas.

A Final Note

The organizations we studied cautioned that the practice of IT 
performance management and measurement is in its infancy.  Most of the 
organizations we studied have worked on their IT performance 
management systems for several years and most of those efforts are 
part of strong performance goals at the enterprise level.  The cities 
of Phoenix and Sunnyvale, for example, have long-standing reputations 
for being well-managed and their general and specific IT measurement 
approaches have evolved over many years.

In both these organizations, as with others we studied, there is a 
strong performance management culture.  The organizations share many 
similar performance management values and management objectives that 
stress IT results and accountability.  For them, IT measures make a 
valued and positive difference in mission and business performance.

                    Selected Bibliography

Allen, Dan. "Performance Anxiety," Computerworld, February 15, 1993.

Allen, Dan. "Take a Good Look at Yourself," Computerworld, February 
15, 1993.

Association for Federal Information Resources Management. The 
Connection: Linking IRM and Mission Performance. Washington, DC: 
Association for Federal Information Resources Management, September 
1995.

Baatz, E. B. "Altered Stats," CIO, October 15, 1994.

Bjorn-Andersen, Niels and Davis, Gordon B. (eds.). Information Systems 
Assessment: Issues and Challenges. Amsterdam: North-Holland, 1986.

Brizius, Jack A. and Campbell, Michael D. Getting Results. Washington, 
DC: Council of Governors' Policy Advisors, 1991.

Carlson, W. and McNurlin, B. Measuring the Value of Information 
Systems. Rockville, MD: I/S Analyzer, 1989.

Davis, Dwight B. "Does Your IS Shop Measure Up?" Datamation, September 
1, 1992.

Earl, Michael J. and Feeny, David F. "Does the CIO Add Value?" 
Information Week, May 30, 1994.

Financial Management Service, Department of Treasury. Performance 
Management Guide. Washington, DC: Financial Management Service, 
November 1993.

Gaskin, Barbara and Sharman, P. "IT Strategies for Performance 
Measurement," CMA Magazine, April 1994.

Glover, Mark. A Practical Guide for Measuring Program Efficiency and 
Effectiveness in Local Government. Tampa, FL: The Innovation Groups, 
1994.

Gold, Charles L. IS Measures -- A Balancing Act. Boston: Ernst & Young 
Center for Information Technology and Strategy, 1992. 

Government Centre for Information Systems. Benchmarking IS/IT. London: 
HMSO, 1995.

Governor's Office of Budget and Planning, Legislative Budget Board, 
and State Auditor's Office. Guide to Performance Measurement for State 
Agencies. Austin, TX: State Auditor's Office, 1995.

Harrington, H. James. Total Improvement Management. New York: 
McGraw-Hill, Inc., 1995.

Jackson, Peter and Palmer, Bob. First Steps in Measuring Performance 
in the Public Sector. London: Public Finance Foundation, 1989.

Kaplan, Robert S. and Norton, David P. "The Balanced Scorecard -- 
Measures that Drive Performance," Harvard Business Review, Vol. 70, 
No. 2 (January-February 1992).

Kaplan, Robert S. and Norton, David P. "Putting the Balanced Scorecard 
to Work," Harvard Business Review, Vol. 71, No. 5 (September-October 
1993).

Kaplan, Robert S. and Norton, David P. "Using the Balanced Scorecard 
as a Strategic Management System," Harvard Business Review, Vol. 74, 
No. 1 (January-February 1996).

Klinger, Daniel E. "A Matter of Marketing," CIO, September 15, 1995.

LaPlante, Alice. "IT's Got What It Takes," Computerworld, October 3, 
1994.

LaPlante, Alice and Alter, Allan E. "IT All Adds Up," Computerworld, 
October 31, 1994.

Lingle, John H. and Schieman, William A. "From Balanced Scorecard to 
Strategic Gauges:  Is Measurement Worth It?," Management Review, 
March 1996.

Moad, Jeff. "New Rules, New Ratings as IS Reengineers," Datamation, 
November 1, 1993.

National Academy of Public Administration. Information Management 
Performance Measures.  Washington, DC: National Academy of Public 
Administration, January 1996.

Parker, Marilyn M., Benson, Robert J., and Trainor, H.E. Information 
Economics: Linking Business Performance to Information Technology. 
Englewood Cliffs, NJ: Prentice Hall, 1988.

Pastore, Richard. "Benchmarking Comes of Age," CIO, November 1, 1995.

Quinn, James B. and Baily, Martin N. "Information Technology: The Key 
to Service Performance," The Brookings Review, Summer 1994.

Rubin, Howard. "Measurement: Shifting the Focus to Business Value," 
Capacity Management Review, Vol. 19, No. 1 (January 1991).

Rubin, Howard. "Measure for Measure," Computerworld, April 15, 1991.

Rubin, Howard A. "In Search of the Business Value of Information 
Technology," Application Development Trends, November 1994.

Sankaran, Chandran and Taylor, Henry D. "Taking New Measures," CIO, 
October 1, 1993.

Saunders, Carol S. and Jones, Jack W. "Measuring the Performance of 
the Information Systems Function," Journal of Management Information 
Systems, Vol. 8, No. 4 (Spring 1992).

Society for Information Management Working Group on ISPA. Information 
Systems Process Architecture 1.0. Chicago: Society for Information 
Management, September 1993.

Society for Information Management Working Group on ISPA. Information 
Systems Process Architecture 2.0. Chicago: Society for Information 
Management, March 1996.

Society for Information Management Advanced Practices Council. 
Practitioner's Guide to I. S. Performance Measurement. Chicago: 
Society for Information Management, March 1995.

Strassmann, Paul A. The Business Value of Computers. New Canaan, CT: 
The Information Economics Press, 1990.

Strassmann, Paul A. and others. Measuring Business Value of 
Information Technologies. Washington, DC: International Center for 
Information Technologies, 1988.

                Objectives, Scope, and Methodology

The objectives of our research were to (1) identify information 
technology performance management practices used by leading private 
and public sector organizations with demonstrated success in 
developing and using information technology solutions and 
infrastructure to improve mission or business performance and 
outcomes, and (2) share our results with federal agencies to help 
improve overall mission performance.

Scope

Our research focused on information technology performance management 
practices used by management teams and staff in five private sector 
companies, three state and local governments, and three federal 
agencies.  The organizations were chosen purposefully, not at random 
or to ensure representation of a larger group.  We selected the 
private, state, and local organizations based on (1) recognition by 
other organizations and independent researchers for their progress in 
successfully developing and using information technology performance 
management systems, (2) recognition by professional publications for 
their performance management systems, and (3) willingness to 
participate as a case study organization.  The federal agencies were 
selected based on their recognition by federal information resources 
management officials for initiating comprehensive work on information 
technology performance management.  Because our work often involved 
data that these organizations regarded as proprietary or sensitive, we 
agreed not to disclose any data they wished to protect.

To supplement our findings from these private and public sector 
organizations, we gathered additional information from other federal, 
state, and local organizations.  This information included both 
generic and information technology performance management information, 
ranging from guides to specific practices.

Methodology

Our research was conducted with an illustrative case study approach 
using open-ended and focused interviews and documentary analysis, not 
direct observations.  In conducting the case studies, we interviewed 
senior executives, line managers, and information technology 
professionals to learn how the organization measured and managed the 
contribution of information technology towards organizational goals.  
Interview information was supplemented by documentary analysis of each 
organization's information technology performance management approach.

For quality assurance, we conducted a meeting of case study 
participants to obtain group comments on an initial draft of this 
guide, followed by individual case study participant reviews of a 
subsequent draft.  We also distributed the draft to other experts on 
information technology performance management, the Office of 
Management and Budget, and the General Services Administration.  We 
also made numerous presentations to federal executives, managers, and 
staff, representing both line and information technology functions, as 
the guide was developed, to test our preliminary findings and their 
applicability to the federal government.  We have incorporated changes 
where appropriate.



Caveats

Information technology performance management and measurement is very 
much in its infancy in both public and private sectors.  As an initial 
step, this guide presents a framework that begins to document the 
state of the practice drawn from our analysis of a relatively small 
number of case studies.  Much more research and analysis remains to be 
done.  The practices we have presented can serve as a starting point for 
any organization, tailored to the strategic directions and performance 
requirements unique to each organization.

             Case Study Organizations and Participants

We would like to acknowledge the invaluable assistance of the 
following individuals, who served as key contacts for the case study 
organizations.

American Express Travel Related Services Company, Inc.
Cliff Shoung (Project Leader, Benchmarking)
Paula Bouthillier (Project Leader, Strategic Advantage)

U.S. General Services Administration
Jon Desenberg (Management Analyst)

Immigration and Naturalization Service
Janet Keys (Director, Systems Policy and Planning)
J. T. Lazo (Senior Systems Manager, EDS, INS ITP Contract)

Eastman Kodak Company
Jeff Duell (Technology Director, IS Measurement and Benchmarking)

Motorola Semiconductor Products Sector
Dorothy Hines (Controller, Sector Quality and Support Operations)
Bret Wingert (Manager, SEI Projects)

Oregon Department of Transportation
Craig Holt (formerly Chief Information Officer)












City of Phoenix
Joe Motto (Information Technology Director)
Bob Wingenroth (Deputy City Auditor)

City of Sunnyvale
Shawn Hernandez (Director, Information Technology)
Marilyn Crane (Information Technology Administrative Services Manager)

Texas Instruments
Gary Pollard (Total Quality Director)
Vicki Flewelling (Total Quality Management)

U.S. Department of Agriculture
Joseph Ware (Director, IT Planning and Program Management)

Xerox Corporation
M. Catherine Lewis (Manager, IM2000 Integration Planning)
Margie Tomczak (Manager, Global Applications)











                 Major Contributors to This Report


Accounting and Information Management Division
IRM Policy and Issues Group

Christopher W. Hoenig, Director
Dr. David L. McClure, Senior Assistant Director
Bernard R. Anderson, Senior Analyst

Los Angeles Office

Dr. Sharon Caudle, Project Director
Barbara House, Senior Analyst


1. Executive Guide:  Improving Mission Performance Through Strategic 
Information Management and Technology--Learning From Leading 
Organizations (GAO/AIMD-94-115), May 1994; Assessing Risks and 
Returns:  A Guide for Evaluating Federal Agencies' IT Investment 
Decision-making (GAO/AIMD-10.1.13), February 1997.

2. For purposes of this guide, we define an "operational customer" as 
a program or other function to which the information technology 
organization or units deliver IT products and services.  The 
operational customer can include organizations and entities outside 
traditional agency boundaries affected by the use of IT products and 
services.  

3. GAO/AIMD-94-115.

4. This information is based on material in Jack A. Brizius and 
Michael D. Campbell, Getting Results (Washington, D.C.,  Council of 
Governors' Policy Advisors, 1991) and performance management guidance 
materials prepared by the National Academy of Public Administration 
and the Innovation Group.

5. See Executive Guide: Effectively Implementing the Government 
Performance and Results Act (GAO/GGD-96-118, June 1996).

6. Recent efforts include The Connection: Linking IRM and Mission 
Performance, a resource paper sponsored by the Association for Federal 
Information Resources Management, September 1995; Practitioner's Guide 
to I.S. Performance Measurement, a guide issued by the Society for 
Information Management's Advanced Practices Council, March 1995; 
Information Management Performance Measures, a report for the U.S. 
Department of Defense issued by the National Academy of Public 
Administration, January 1996; Performance-Based Management:  Eight 
Steps to Develop and Use Information Technology Performance Measures 
Effectively, a guide prepared by the General Services Administration, 
December 1996; and Guide for Managing Information Technology (IT) as 
an Investment and Measuring Performance, Department of Defense 
guidance issued by Assistant Secretary of Defense for Command, 
Control, Communications, and Intelligence, March 3, 1997. 

7. Other organizations we consulted were the Florida Legislature Joint 
Committee on Information Technology Resources, the Oregon Department 
of Administrative Services, the Oregon Secretary of State Audits 
Division, the Office of the Texas State Auditor, the Minnesota Office 
of the Legislative Auditor, the Minnesota Department of 
Transportation, the Federal Emergency Management Agency, the city of 
Portland, the Treasury Board of Canada, the United Kingdom's 
Government Centre for Information Systems and Royal Mail, the Society 
for Information Management, and the MITRE Corporation.

8. These include Government Reform: Goal-Setting and Performance 
(GAO/AIMD/GGD-95-130R, March 27, 1995); Managing for Results: 
Experiences Abroad Suggest Insights for Federal Management Reform 
(GAO/GGD-95-120, May 2, 1995);  Managing for Results: Critical Actions 
for Measuring Performance (GAO/T-GGD/AIMD-95-187, June 20, 1995); 
Managing for Results: Achieving GPRA's Objectives Requires Strong 
Congressional Role (GAO/T-GGD-96-79, March 6, 1996); The Government 
Performance and Results Act:  1997 Governmentwide Implementation Will 
Be Uneven (GAO/GGD-97-106, June 2, 1997); Managing For Results:  The 
Statutory Framework for Improving Federal Management and Effectiveness 
(GAO/T-GGD/AIMD-97-144, June 24, 1997).

9. GAO/AIMD-94-115.

10. GAO/GGD-96-118.

11. For related guidance, see Agencies' Strategic Plans Under GPRA:  
Key Questions to Facilitate Congressional Review (GAO/GGD-10.1.16, May 
1997). 

11. Kaplan and Norton published a series of articles in the Harvard 
Business Review which explain the concept and its application.  The 
articles are listed in the bibliography.

12. A complete description of IT performance measurement approaches 
can be found in the Society for Information Management's 
Practitioner's Guide to I.S. Performance Measurement, referenced in 
appendix I.

13. The Office of Management and Budget and GAO issued a guide, 
Evaluating Information Technology Investments:  A Practical Guide, in 
November 1995 to assist federal agency and oversight staff in 
evaluating a portfolio of information technology investments in a 
similar manner. The approach is also an underlying feature of GAO's 
guide Assessing Risks and Returns:  A Guide for Evaluating Federal 
Agencies' IT Investment Decision-making (GAO/AIMD-10.1.13, February 
1997).

14. A function point measures an IT application in terms of the amount 
of functionality it provides users.  Function points count the 
information components of an application, such as external inputs and 
outputs and external interfaces.

15. IT architectures explicitly define common standards and rules for 
both data and technology, as well as mapping key processes and 
information flows.
technical components.  The logical architecture provides the 
high-level description of the organization's mission, functional 
requirements, information requirements, systems components, and 
information flows among the components.  The technical architecture 
defines the specific IT standards and rules that will be used to 
implement the logical architecture.

16. Six sigma is a measure defining the quality level of a product, 
service, or process.  This organization defined six sigma as 99.9997 
percent perfect, or 3.4 defects per million opportunities to create a 
defect.  If set as a goal, it basically calls for a virtual "zero 
defect" standard.

17. See the Society for Information Management Working Group on ISPA 
process architecture documents listed in the bibliography.

*** End of document. ***