Performance Reporting: Few Agencies Reported on the Completeness
and Reliability of Performance Data (26-APR-02, GAO-02-372).
The Government Performance and Results Act (GPRA) requires
federal agencies to set goals for program performance and to
report on their annual progress toward achieving those goals.
Although no data are perfect, agencies need credible performance
data to provide transparency of government operations so that
Congress, program managers, and other decisionmakers can use the
information. To improve the quality of agencies' performance
data, the Reports Consolidation Act of 2000 requires that
agencies assess the completeness and reliability of their
performance data. Only five of the 24 Chief Financial Officers
(CFO) Act agencies' fiscal year 2000 performance reports included
assessments of the completeness and reliability of their
performance data in their transmittal letters. The others
discussed to some degree the quality of their performance data
elsewhere in their performance reports. None of the agencies
identified any material inadequacies with their performance data.
However, concerns about the quality of performance data were
identified by the inspector general as either a major management
challenge or included in the discussion of other challenges for
11 of the 24 agencies. Although not required, discussing in performance
reports the standard or method used to assess the completeness and
reliability of performance data can provide helpful contextual
information on the credibility of the reported performance data.
-------------------------Indexing Terms-------------------------
REPORTNUM: GAO-02-372
ACCNO: A03208
TITLE: Performance Reporting: Few Agencies Reported on the
Completeness and Reliability of Performance Data
DATE: 04/26/2002
SUBJECT: Accountability
Agency missions
Best practices
Data integrity
Performance measures
Productivity in government
Program evaluation
Reporting requirements
Strategic planning
GAO-02-372

United States General Accounting Office

Report to the Ranking Minority Member, Committee on Governmental Affairs,
U.S. Senate

April 2002

PERFORMANCE REPORTING
Few Agencies Reported on the Completeness and Reliability of Performance
Data
Page 1 GAO-02-372 Performance Reporting
April 26, 2002

The Honorable Fred Thompson
Ranking Minority Member
Committee on Governmental Affairs
United States Senate

Dear Senator Thompson:

The Government Performance and Results Act (GPRA)
seeks to improve the efficiency, effectiveness, and accountability of
federal programs by requiring federal agencies to set goals for program
performance and to report on their annual progress toward achieving those
goals. While no data are perfect, agencies need to have sufficiently
credible performance data to provide transparency of government operations
so that Congress, program managers, and other decisionmakers can use the
information. However, limited confidence in the credibility of performance
data has been one of the major weaknesses in GPRA implementation. To help
improve the quality of agencies' performance data, Congress included a
requirement in the Reports Consolidation Act of 2000 that agencies assess
the completeness and reliability of their performance data. Under the act,
agencies were to begin including this assessment in the transmittal letter
with their fiscal year 2000 performance reports. 1 Agencies were also
required to discuss in their report any material inadequacies in the
completeness and reliability of their performance data and discuss actions
to address these inadequacies.
To assess the initial year's progress in improving performance data under
the Reports Consolidation Act, you asked us to determine the 24 Chief
Financial Officers (CFO) Act agencies' compliance with the Reports
Consolidation Act's requirements and to identify any useful practices for
describing the credibility of performance data in agencies' performance
reports. As agreed, this report describes (1) whether or not the 24 CFO Act
agencies' fiscal year 2000 performance reports contained an assessment of
the completeness and reliability of their performance data, (2) the
standards and methodologies agencies reported they used to assess their
performance data and whether the agencies included information as to how
they used them, and (3) useful discussions in agencies' performance reports
on the completeness and reliability of their performance data and actions
to resolve any inadequacies, discussions that may be useful to other
agencies in their future reports.

1 Agencies had the option of using one of three formats for their fiscal
year 2000 performance reports: as a stand-alone document, combined with
their fiscal year 2000 accountability report, or combined with their fiscal
year 2002 performance plan. References to "performance reports" in this
report are used to cover any of the three formats.

United States General Accounting Office, Washington, DC 20548

Results in Brief
Only five of the 24 CFO Act agencies' fiscal year 2000 performance reports
included assessments of the completeness and reliability of their
performance data in their transmittal letters. Those five agencies were the
Department of Energy (DOE), the Department of Labor (DOL), the Federal
Emergency Management Agency (FEMA), the National Science Foundation (NSF),
and the Nuclear Regulatory Commission (NRC). The other 19 agencies
discussed, at least to some degree, the quality of their performance data
elsewhere in their performance reports.
None of the agencies identified any material inadequacies with their
performance data in their performance reports. However, concerns about the
quality of performance data were identified by the inspector general as
either a major management challenge or included in the discussion of other
challenges for 11 of the 24 agencies. None of the 11 agencies reconciled the
IGs' views with those of agency management, which did not identify any
material inadequacy with the performance data.
Although not required, discussing in performance reports the standard or
method used to assess the completeness and reliability of performance data
can provide helpful contextual information to decisionmakers on the
credibility of the reported performance data. For example, four agencies
said that they used the Office of Management and Budget's (OMB) suggested
standards for the completeness and reliability of performance data. Another
agency mentioned that it did a self-assessment of the quality of its
performance data but did not describe the standards or methods it used.
Still another agency hired an external third party to assess the quality of
some of its performance data.
We identified additional practices among the 24 agencies' performance
reports that could enhance the usefulness of agencies' future performance
reports. These examples fall into two categories: (1) discussions of data
quality, including known data limitations and actions to address the
limitations, and (2) discussions of data verification and validation
procedures and data sources, including proposals to review data collection
and verification and validation procedures.
We provided a draft of this report to the director of the Office of
Management and Budget for his review and comment. While we did not receive
comments from the director, OMB staff provided us with oral comments on the
draft report. OMB staff generally agreed with the information contained in
the draft report. The staff provided technical clarifications and
suggestions that we incorporated where appropriate. OMB staff also said that
the draft report implied that problems with an agency's data identified by
an IG always equate with material inadequacies in the completeness and
reliability of performance data. While we do not agree that our report
implies this, we agree that data quality problems identified by an IG do not
always equate with a material inadequacy. Our point was that none of the 11
agencies' performance reports addressed whether these conclusions on the
part of their IGs constituted material inadequacies. OMB staff
acknowledged that in cases where an IG identified a problem with the quality
of an agency's performance data, the agency should have addressed the
problem in the performance report.

Background
Annual performance reports are essential for communicating to decisionmakers
the progress an agency made towards achieving its goals during a given year
and, in cases where goals are not met, identifying opportunities for
improvement or whether goals need to be adjusted. In passing GPRA, however,
Congress emphasized that the usefulness of agencies' performance data
depends, to a large degree, on the reliability and validity of their
performance data. Our work over the past several years has identified
limitations on agencies? abilities to produce credible performance data. 2
In addition, agencies typically have not clearly articulated in their annual
performance plans the policies and procedures they plan to use to ensure the
credibility of their performance data.
One of the purposes of the Reports Consolidation Act of 2000 is to improve
the quality of agency financial and performance data. Thus, the act requires
that an agency's performance report include a transmittal letter from the
agency head containing, in addition to any other content, an assessment of
the completeness and reliability of the performance and financial data used
in the report. It also requires that the assessment describe any material
inadequacies in the completeness and reliability of the data and the
actions the agency can take and is taking to resolve such inadequacies. In
addition, the act allows agencies that prepare accountability reports to
combine this report with their performance report. This combined report is
then called the performance and accountability report. When an agency
chooses to issue a performance and accountability report, the act requires
that the report include a summary of the most serious management and
performance challenges facing the agency, as identified by its IG, and a
brief assessment of the agency's progress in addressing those challenges.
Agency heads are allowed to comment on the IG's statements but not change
them. Seven of the 24 CFO Act agencies had a performance and accountability
report.

2 U.S. General Accounting Office, Managing for Results: Challenges Agencies
Face in Producing Credible Performance Information, GAO/GGD-00-52
(Washington, D.C.: Feb. 4, 2000).
The remaining agencies had either stand-alone performance reports or
combined their report with their performance plans. In their efforts to
develop goals and measures for their major management challenges, as
suggested by OMB guidance, agencies have included in their annual
performance plan, performance report, or both, a listing of the major
management challenges they face. Typically, these major management
challenges were identified by our prior work, the work of an agency's IG,
or both, and are deemed problems that are of a mission-critical nature or
could affect achievement of major program goals.
OMB's guidance to agencies on preparing annual performance reports (OMB
Circular No. A-11, Part 2) includes guidance on how agencies may comply
with the Reports Consolidation Act's requirements and suggested standards
for assessing the completeness and reliability of performance data. The
suggested standards are shown in figure 1.
Figure 1: OMB Circular No. A-11 Standards on Complete and Reliable
Performance Data

Source: Office of Management and Budget, Circular No. A-11, Part 2,
Preparation and Submission of Strategic Plans, Annual Performance Plans, and
Annual Program Performance Reports (2000).
This recent trend toward linking government programs to their results and
outcomes is not isolated to the United States. There is widespread attention
in other countries, as well, on the importance of performance reporting to
help enhance government performance, transparency, and accountability. As in
the United States, the national audit offices of other countries have
identified opportunities to make performance reporting more useful. For
example, the United Kingdom's National Audit Office issued a report in 2000
on good practices in performance reporting. 3 The Canadian Office of the
Auditor General has also conducted similar work. 4

3 United Kingdom's National Audit Office, Good Practice in Performance
Reporting in Executive Agencies and Non-Departmental Public Bodies (HC 272,
Session 1999-2000) (London: March 2000).

4 Auditor General of Canada, Chapter 19, Reporting Performance to
Parliament: Progress Too Slow, Report to the House of Commons (Dec. 2000).
Performance data are considered complete if
- actual performance is reported for every performance goal and indicator in
the annual plan, including preliminary data if that is the only data
available when the annual report is sent to the President and Congress; and
- the agency identifies, in the report, any performance goals and indicators
for which actual performance data are not available at the time the annual
report is transmitted, and notes that the performance data will be included
in a subsequent annual report.

Performance data are considered reliable if
- there is neither a refusal nor a marked reluctance by agency managers or
decisionmakers to use the data in carrying out their responsibilities; and
- data are further defined as reliable when agency managers and
decisionmakers use the data contained in the annual report on an ongoing
basis in the normal course of their duties.
Scope and Methodology
To meet our objectives, we did a content analysis of the 24 CFO Act
agencies' fiscal year 2000 annual performance reports. To specifically
address the first and second objectives, we also reviewed the GPRA
requirements for agencies' performance reports; the requirements of the
Reports Consolidation Act of 2000; and guidelines contained in OMB Circular
No. A-11, Part 2. Additionally, we reviewed the IGs' lists of major
management challenges to determine whether data problems or issues had been
identified. To address our third objective, we also reviewed work done by
other national audit organizations to determine whether they identified
useful reporting practices consistent with those examples we identified in
agencies' fiscal year 2000 performance reports.
We conducted our work from September 2001 through February 2002 in
Washington, D.C., in accordance with generally accepted government auditing
standards. We requested comments on a draft of this report from OMB.
Although the Reports Consolidation Act requires agencies to include in the
transmittal letters of their performance reports assessments of the
completeness and reliability of their data, 19 of the 24 CFO Act agencies'
fiscal year 2000 performance reports lacked such statements. The five
agencies that included statements assessing the completeness and reliability
of their data in their reports' transmittal letters were DOE, DOL, FEMA,
NSF, and NRC. OMB told us that it intends to underscore to agencies the
importance of complying with the performance reporting requirements of the
Reports Consolidation Act of 2000.
While 19 agencies did not have statements assessing the completeness and
reliability of their performance data in their reports' transmittal letters,
they either included related statements or commented to some degree on the
quality of their data elsewhere in their performance reports. For example,
the Department of the Interior's (DOI) performance report had a statement on
the completeness and reliability of its performance data in a section
entitled "Additional GPRA Information." The preface to the General Services
Administration's (GSA) performance report included a comment that the
performance data were "generally complete and reliable." However, the agency
also stated that it was reviewing its procedures for collecting performance
data and the basis for making its comment on the data. The Department of
Veterans Affairs' (VA) performance report had a data quality section in
which VA noted that, while the quality of its performance data was much
better than it was when VA started its results-oriented management efforts,
data quality is
not yet where VA wants it to be. VA further states that improving its data
is a long- term project that it will continue to pursue. The agency
describes some of the specific actions it is taking to improve the quality
of the data. While the Department of Agriculture?s performance report did
not have a statement on the completeness and reliability of its performance
data for the department as a whole, several agricultural agencies, such as
the Food Safety and Inspection Service and the Food and Nutrition Service,
commented on the completeness and reliability of some or all of the data
they used in their reports.
In addition to discussing the completeness and reliability of their
performance data, agencies are required by the act to identify in their
performance reports any material inadequacies in their performance data and
actions to address these inadequacies. None of the 24 agencies' reports
identified any material inadequacies regarding the performance data.
However, for 11 agencies, the respective agency's IG either identified
performance data quality as a major management challenge or included
concerns about data quality in discussions of the agency's other major
management challenges. None of the 11 agencies reconciled these views with
those of agency management, which did not identify any material inadequacy
with the performance data. For example, even though DOL stated in
its performance report that it had no material inadequacies in its
performance data, DOL's IG identified the quality of program and cost data
as one of the more serious management and performance challenges facing DOL.
While not specifically citing fiscal year 2000 performance data, the IG
raised concerns about the quality of DOL's program results data and briefly
summarized its concerns about limitations in DOL's performance data. The
Environmental Protection Agency's (EPA) IG also included data management as
one of the agency's top management challenges. Again, while not addressing
specific data in EPA's fiscal year 2000 performance report, the IG stated
that its audits of EPA's programmatic areas typically cover environmental
information systems, and it frequently identifies deficiencies in these
systems. Such problems included EPA's and the states' reporting inconsistent
data because they use different data definitions and, at times, collect and
input different data. EPA's IG provided comments in the report indicating
that these problems continue to exist. The Small Business Administration's
(SBA) IG cited the need for SBA to improve its managing-for-results
processes and produce reliable performance data as a new management
challenge for fiscal year 2001.
Some Agencies' Reports Discussed Standards and Methods Used for Assessing
Performance Data
Although not required, including in performance reports discussions of the
standards and methods agencies used to assess the quality of their
performance data provides decisionmakers greater insight into the quality
and value of the performance data. Four agencies-- DOL, DOI, the Department
of Justice (DOJ), and the Nuclear Regulatory Commission (NRC)-- stated that
they used OMB's suggested standards for completeness and reliability of
performance data. For example, NRC's performance report included a
descriptive section on how it assessed the completeness and reliability of
its data. As shown in figure 2, NRC stated that, based on OMB's standards on
completeness and reliability, "the data used by the NRC meet this test for
completeness . . . and meet the test for reliability."

Figure 2: Excerpt of NRC's Performance Report's Discussion on Assessing Its
Performance Data Using the OMB Standards

Source: U.S. Nuclear Regulatory Commission, Performance and Accountability
Report Fiscal Year 2000 (Washington, D.C.: 2001).

Also, DOJ's performance report indicated that each of its reporting
components assessed the credibility of its own data, and the department
surveyed the components to ensure that their reported data met the OMB
suggested standards.
Verification and Validation of Data
Data Completeness and Reliability
Assessing the reliability and completeness of performance data is critical
to managing for results. Comparing actual performance with the projected
levels of performance can only be accomplished if the data used to measure
performance are complete and reliable. The Reports Consolidation Act of 2000
requires that agency heads assess the completeness and reliability of the
performance data used in this report. A draft revision to Part 2 of OMB
Circular A-11, part 232.10, describes specifically how an agency should
assess the completeness and reliability of the performance data. The
following discussion on data completeness and reliability is based on the
guidance provided in the draft revision to OMB Circular A-11.

Data Completeness

OMB's draft A-11 guidance indicates that data are considered complete if
actual performance data is reported for every performance goal and indicator
in the annual plan. Actual performance data may include preliminary data if
those are the only data available when the report is sent to the President
and Congress. The agency must identify those goals for which actual data are
not available at the time the annual report is transmitted and note that the
data will be included in a subsequent annual report. The data used by the
NRC meet this test for completeness. Actual or preliminary data have been
reported for every strategic and performance measure.
Data Reliability
OMB's draft A-11 guidance indicates that data are considered reliable when
there is neither a refusal nor a marked reluctance by agency managers or
decision makers to use the data in carrying out their responsibilities. Data
need not be perfect to be reliable and the cost and effort to secure the
best performance data possible may exceed the data's value. The agency
managers and decision makers use the data contained in this report on an
ongoing basis in the normal course of their duties. There is neither a
refusal nor a marked reluctance by agency managers or decision makers to use
the data in carrying out their responsibilities. The data used by the NRC
meet the test for reliability.
Similar to the agencies above, DOE did a self-assessment of the quality of
its performance data. Specifically, DOE stated in its performance report
that the "reliability of the data is based on the Department's policy that
the primary tool used at all levels to assess and evaluate results is
self-assessment. The DOE program offices provided the performance
information and concurred with this report." However, unlike the agencies
above, DOE did not elaborate on the standards or methods used for the
self-assessment, including whether it used OMB's suggested standard.
NSF used an approach different from a self-assessment; it hired an
independent third party to assess selected NSF performance data. NSF stated
in its performance report that it contracted with PricewaterhouseCoopers to
verify and validate selected performance data as well as the process used in
collecting and compiling data. NSF stated that PricewaterhouseCoopers
concluded that

"NSF was reporting its GPRA measures with sufficient accuracy such that any
errors, should they exist, would not be significant enough to change the
reader's interpretation as to the Foundation's success in meeting the
supporting goal. . . ."

NSF continued that PricewaterhouseCoopers concluded that NSF

"relies on sound business processes, system and application controls, and
manual checks of system queries to confirm the accuracy of reported data. We
believe that these processes are valid and verifiable."

Additional Practices of Useful Discussions about the Quality of Performance
Data
In addition to discussing standards and methods used to assess the quality
of performance data, we saw additional practices in several agencies'
performance reports that would help foster transparency to the public and
assist decisionmakers in understanding the quality of an agency's data. The
additional practices we observed were useful discussions that fall into two
categories:

Discussion of data quality, including known data limitations and actions
to address the limitations.

Discussion of data verification and validation procedures, including
proposals to review data collection and verification and validation
procedures.

Several of the useful practices we identified are consistent with those
identified in the United Kingdom's National Audit Office report, Good
Practice in Performance Reporting in Executive Agencies and
Non-Departmental Public Bodies. Specifically, the document states that one
good reporting practice is for an agency to discuss the quality of data by
explaining (1) the sources of data collected by external sources, (2)
actions taken by the agency where data are unavailable or poor, (3) survey
methodologies, and (4) the approach used by an agency to validate
performance data.
We previously reported that the usefulness of agency performance plans could
be improved by including discussions on an agency's capacity to gather and
use performance information. Some of the practices we identified associated
with performance plans-- identifying internal and external data sources and
identifying actions to compensate for, and discussing implications of, data
limitations for assessing performance-- would make performance reports more
useful. 5 Discussing data credibility issues in performance reports provides
important contextual information to congressional and executive branch
decisionmakers to help them understand the data and proposed actions to
address any data weaknesses.

Data Quality
A few of the agencies' fiscal year 2000 performance reports incorporated
some of these practices and discussed data quality issues, including (1) why
an agency thought some data are credible and (2) when problems were known,
actions being taken to address them. For example, the Department of
Transportation's (DOT) performance report included a section entitled
"Performance Measurement, Verification and Validation." In this section, DOT
summarized some general rules it had established regarding the data it uses
and how they are evaluated, and discussed data verification and validation
procedures, data limitations, and data needs for each strategic goal. DOT
also included an appendix describing, for each performance measure, the
scope of the measure, the source of the data, data limitations, statistical
issues, verification and validation procedures, and a comment on the
usefulness of the data. (See fig. 3.)

5 U.S. General Accounting Office, Agency Performance Plans: Examples of
Practices That Can Improve Usefulness to Decisionmakers, GAO/GGD/AIMD-99-69
(Washington, D.C.: Feb. 26, 1999).
Figure 3: Excerpt from DOT's Performance Report Discussion on Data Issues
Associated with Each Measure

Source: U.S. Department of Transportation, Fiscal Year 2000 Performance
Report and Fiscal Year 2002 Performance Plan (Washington, D.C.: April 2001).
VA's performance report contains a section discussing the quality of its
performance data. In this section, VA summarizes some of the
departmentwide data quality issues and its response to them. VA also
describes data quality issues within each of its administrations and the
actions, either in place or planned, intended to improve the quality of the
data.
EPA's performance report also provides a useful discussion of data quality.
The agency discusses the source and quality of the data associated with each
performance goal. (See fig. 4.)
Figure 4: Excerpt from EPA's Performance Report

Source: U.S. Environmental Protection Agency, Fiscal Year 2000 Annual
Report (Washington, D.C.: March 1, 2001).
Data Verification and Validation
While agencies are required to discuss in their performance plans the
procedures they will use to verify and validate performance data, there is
no similar requirement for performance reports. Although not required, some
agencies' performance reports included discussions of their data
verification and validation procedures. This additional information helps to
place the credibility of an agency's reported performance data in context
for decisionmakers. For example, as shown in figure 5, the Department of
Education's performance report also described the validation procedure
related to each performance measure. In addition, the department included,
for each performance measure, information on the frequency of data
collection and any data limitations and planned improvements to address
the limitations.

Figure 5: Excerpt from the Department of Education's Performance Report

Source: Interim U.S. Department of Education Department-wide Fiscal Year
2000 Performance Report (Washington, D.C.: April 13, 2001).

In addition, Education's report contained an appendix that showed the
department's draft quality standards. These standards cover the issues of
validity, accurate definitions, accurate counts, editing, calculation,
timeliness, reporting, and burden reduction.

Concluding Observations
While limited confidence in the credibility of performance data has been one
of the major weaknesses in GPRA implementation, few agencies took the step
of increasing confidence in their performance data by including a statement
in their performance report's transmittal letter assessing the completeness
and reliability of their data. Although agencies often discussed data
quality issues elsewhere in their reports, statements attesting to the
completeness and reliability of performance data are important so that
decisionmakers can rely with confidence on the performance data when making
decisions. This issue should be addressed by OMB's stated intention to
underscore, in its guidance on performance reporting, the importance of
compliance with the provision of the Reports Consolidation Act of 2000 that
an agency's transmittal letter for its performance report contain either a
statement that the report's performance data are complete and reliable or a
statement identifying material inadequacies in the data and the agency's
actions to address these inadequacies.

Agency Comments
We requested comments on a draft of this report on March 18, 2002, from the
director of OMB or his designee because of OMB's leadership responsibilities
for government-wide implementation of GPRA. We did not request comments from
individual agencies; we did, however, provide a draft of this report to each
of the 24 CFO Act agencies for informational purposes. Although we did not
receive comments from the OMB director, OMB staff provided oral comments on
the draft report on April 8. OMB staff generally agreed with the information
contained in the draft report.
OMB staff had three specific comments on the draft report. First, the staff
agreed that if an agency did not have a completeness and reliability
statement in the transmittal letter of its performance report, or at least
in the report's preface, foreword, or elsewhere in the front of the report,
then the agency fell short of the Reports Consolidation Act's requirement.
Second, OMB staff asked that we clarify that the Reports Consolidation Act
requires an agency's performance and accountability report to include a
summary of the agency's most serious management and performance challenges,
as identified by its IG office, and the agency's progress in addressing
those challenges. Performance and accountability reports are created when an
agency's performance report is combined with its accountability report. The
act's requirement does not pertain to standalone performance reports or to
performance reports combined with performance plans. We made clarifications
in the report where appropriate.
Third, OMB staff stated that our report implies that data quality problems
identified by an IG always equate to material inadequacies in the
completeness and reliability of performance data. While we do not agree that
our report implies this, we agree that data quality problems identified by
an IG do not always equate to a material inadequacy. However, as our draft
noted, the IGs for 11 agencies specifically identified performance data
quality as either a major management challenge in itself or a part of other
major management challenges, and none of the 11 agencies' performance
reports addressed whether these IG conclusions did or did not constitute
material inadequacies. OMB staff acknowledged that in cases where the IG
identified a problem with the quality of an agency's performance data, the
agency should address the problem in its performance report.
As agreed with your office, unless you publicly announce its contents
earlier, we plan no further distribution of this report until 30 days from
its date. At that time, we will send copies to the chairman, Senate
Committee on Governmental Affairs; the chairman and ranking minority member,
House Committee on Government Reform; and the director of OMB. In addition,
we will make copies available to others upon request.
If you have any questions about this report, please contact me or Boris
Kachura on (202) 512-6806. Allen Lomax, Sharon Hogan, and Adam Roye were
key contributors to this report.
Sincerely yours,
J. Christopher Mihm
Director, Strategic Issues
(450075)
GAO's Mission
The General Accounting Office, the investigative arm of Congress, exists to
support Congress in meeting its constitutional responsibilities and to help
improve the performance and accountability of the federal government for the
American people. GAO examines the use of public funds; evaluates federal
programs and policies; and provides analyses, recommendations, and other
assistance to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony
The fastest and easiest way to obtain copies of GAO documents is through the
Internet. GAO's Web site (www.gao.gov) contains abstracts and full-text
files of current reports and testimony and an expanding archive of older
products. The Web site features a search engine to help you locate documents
using key words and phrases. You can print these documents in their
entirety, including charts and other graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its Web
site daily. The list contains links to the full-text document files. To
have GAO e-mail this list to you every afternoon, go to www.gao.gov and
select "Subscribe to daily e-mail alert for newly released products" under
the GAO Reports heading.

Order by Mail or Phone
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent of
Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more
copies mailed to a single address are discounted 25 percent. Orders should
be sent to:
U.S. General Accounting Office, P.O. Box 37050, Washington, D.C. 20013
To order by phone: Voice: (202) 512-6000, TDD: (202) 512-2537, Fax: (202)
512-6061

Visit GAO's Document Distribution Center
GAO Building, Room 1100, 700 4th Street, NW (corner of 4th and G Streets,
NW), Washington, D.C. 20013

To Report Fraud, Waste, and Abuse in Federal Programs
Contact: Web site: www.gao.gov/fraudnet/fraudnet.htm, e-mail:
fraudnet@gao.gov, or 1-800-424-5454 or (202) 512-7470 (automated answering
system).

Public Affairs
Jeff Nelligan, Managing Director, NelliganJ@gao.gov, (202) 512-4800, U.S.
General Accounting Office, 441 G Street NW, Room 7149, Washington, D.C.
20548

*** End of document. ***
*** End of document. ***