[Senate Report 118-291]
[From the U.S. Government Publishing Office]
Calendar No. 697
118th Congress } { Report
SENATE
2d Session } { 118-291
_______________________________________________________________________
PROMOTING RESPONSIBLE EVALUATION
AND PROCUREMENT TO ADVANCE
READINESS FOR ENTERPRISE-WIDE
DEPLOYMENT (PREPARED) FOR ARTIFICIAL INTELLIGENCE ACT
__________
R E P O R T
of the
COMMITTEE ON HOMELAND SECURITY AND
GOVERNMENTAL AFFAIRS
UNITED STATES SENATE
to accompany
S. 4495
TO ENABLE SAFE, RESPONSIBLE, AND AGILE PROCUREMENT,
DEVELOPMENT, AND USE OF ARTIFICIAL INTELLIGENCE BY
THE FEDERAL GOVERNMENT, AND FOR OTHER PURPOSES
December 16, 2024.--Ordered to be printed
_______
U.S. GOVERNMENT PUBLISHING OFFICE
59-010 WASHINGTON : 2025
COMMITTEE ON HOMELAND SECURITY AND GOVERNMENTAL AFFAIRS
GARY C. PETERS, Michigan, Chairman
THOMAS R. CARPER, Delaware            RAND PAUL, Kentucky
MAGGIE HASSAN, New Hampshire          RON JOHNSON, Wisconsin
KYRSTEN SINEMA, Arizona               JAMES LANKFORD, Oklahoma
JACKY ROSEN, Nevada                   MITT ROMNEY, Utah
JON OSSOFF, Georgia                   RICK SCOTT, Florida
RICHARD BLUMENTHAL, Connecticut       JOSH HAWLEY, Missouri
ADAM SCHIFF, California               ROGER MARSHALL, Kansas
David M. Weinberg, Staff Director
Alan S. Kahn, Chief Counsel
Michelle M. Benecke, Senior Counsel
Evan E. Freeman, Counsel
William E. Henderson III, Minority Staff Director
Christina N. Salazar, Minority Chief Counsel
Andrew J. Hopkins, Minority Counsel
Kendal B. Tigner, Minority Professional Staff Member
Laura W. Kilbride, Chief Clerk
Calendar No. 697
118th Congress } { Report
SENATE
2d Session } { 118-291
======================================================================
PROMOTING RESPONSIBLE EVALUATION AND PROCUREMENT TO ADVANCE READINESS
FOR ENTERPRISE-WIDE DEPLOYMENT (PREPARED) FOR ARTIFICIAL INTELLIGENCE
ACT
_______
December 16, 2024.--Ordered to be printed
_______
Mr. Peters, from the Committee on Homeland Security and Governmental
Affairs, submitted the following
R E P O R T
[To accompany S. 4495]
[Including cost estimate of the Congressional Budget Office]
The Committee on Homeland Security and Governmental
Affairs, to which was referred the bill (S. 4495) to enable
safe, responsible, and agile procurement, development, and use
of artificial intelligence by the Federal Government, and for
other purposes, having considered the same, reports favorably
thereon with an amendment, in the nature of a substitute, and
recommends that the bill, as amended, do pass.
CONTENTS
I. Purpose and Summary
II. Background and Need for the Legislation
III. Legislative History
IV. Section-by-Section Analysis of the Bill, as Reported
V. Evaluation of Regulatory Impact
VI. Congressional Budget Office Cost Estimate
VII. Changes in Existing Law Made by the Bill, as Reported
I. PURPOSE AND SUMMARY
S. 4495, the Promoting Responsible Evaluation and
Procurement to Advance Readiness for Enterprise-wide Deployment
for Artificial Intelligence Act (PREPARED for AI Act), would
create a risk-based model for the governance, procurement, and
use of artificial intelligence (AI) by federal agencies. The bill
requires federal agencies to institute certain safeguards
around the acquisition and use of AI, with special attention to
high-risk use cases that will impact the rights and safety of
individuals and entities. The bill requires government
contracts for AI capabilities to include terms for data
ownership, security, civil rights, civil liberties, privacy,
adverse impact reporting, and other key areas. It instructs
agencies to identify, test, and monitor potential risks before,
during, and after acquiring AI capabilities, including through
ongoing testing and evaluation to mitigate those risks. The
bill also requires agencies to establish AI governance
structures, including Chief AI Officers, to lead and coordinate
AI efforts. The legislation establishes pilot programs to
streamline how agencies purchase AI and other commercial
technology, helping to bolster innovative adoption. Finally, the
bill includes key provisions to encourage transparency in the
government's use of AI through improved
public disclosures and reporting.
II. BACKGROUND AND NEED FOR THE LEGISLATION
Artificial intelligence has the potential to transform how
the federal government serves the American people. From
accelerating service delivery, to informing data-driven
decisions, to enhancing agency operations, successful AI
adoption by the government can increase government efficiency
and improve customer service. However, especially when
improperly tested or deployed, AI systems can produce
incorrect, irrelevant, or harmful output, and can fail to
fulfill intended functions due to issues such as inaccurate
data, flawed algorithms, or situations outside their training
parameters. These failures can lead to errors, biases, or
unintended consequences that can negatively impact the safety
and rights of Americans.\1\ This bill would guide the federal
government's activities, personnel, and processes to
effectively and responsibly procure and use AI.
---------------------------------------------------------------------------
\1\Maximizing the public good: How Generative AI can enhance
government programs and services, Deloitte (Apr. 11, 2023)
(www2.deloitte.com/content/dam/Deloitte/us/Documents/public-sector/
genai-maximizing-for-public-good.pdf).
---------------------------------------------------------------------------
Already, there are documented instances of automated
systems inadvertently harming Americans. A federally
unregulated drug addiction assessment AI model deployed across
several states wrongfully denied painkillers to patients who
suffered from severe pain and who had no previous history of
drug abuse.\2\ A U.S. immigration court deployed an AI-powered
translation tool that was not properly trained on non-English
languages, resulting in the wrongful denial of asylum to a
Pashto-speaking Afghan refugee due to inaccurate
translation.\3\ An AI algorithm used by government-sponsored
entities to approve U.S. mortgage loan applications has been
shown to deny applicants of color despite their having similar
backgrounds to white applicants. The responsible
federal agency, the Federal Housing Finance Agency, lacks
insight into how the automated system scores applicants.\4\
These instances highlight the need for federal legislation to
set parameters for responsible AI procurement and use,
especially in high-risk use cases that impact people's rights
or safety.
---------------------------------------------------------------------------
\2\The Pain Was Unbearable. So Why Did Doctors Turn Her Away?,
Wired (Aug. 11, 2021) (www.wired.com/story/opioid-drug-addiction-
algorithm-chronic-pain/).
\3\How Language Translation Technology is Jeopardizing Afghan
Asylum-Seekers, PBS News (May 7, 2023) (www.pbs.org/newshour/show/how-
language-translation-technology-is-jeopardizing-afghan-asylum-seekers);
AI Translation is Jeopardizing Afghan Asylum Claims, Rest of the World
(Apr. 19, 2023) (restofworld.org/2023/ai-translation-errors-afghan-
refugees-asylum/).
\4\The Secret Bias Hidden in Mortgage-Approval Algorithms, The
Markup (Aug. 25, 2021) (themarkup.org/denied/2021/08/25/the-secret-
bias-hidden-in-mortgage-approval-algorithms).
---------------------------------------------------------------------------
Limited resources, a lack of technical capacity, and
insufficient governance structures have led many federal
agencies to outsource AI development and deployment to external
AI vendors, forcing agencies ``to rely on automated systems to
make important policy decisions without understanding why those
decisions were made.''\5\ Moreover, without minimum standards
for responsible AI procurement, these agencies lack both
visibility into the AI systems they are deploying and the
necessary information to evaluate whether the AI system is
suitable for their purposes.\6\
---------------------------------------------------------------------------
\5\Electronic Privacy Information Center, Outsourced and Automated:
How AI Companies Have Taken Over Government Decision-Making (Sept.
2023) (epic.org/wp-content/uploads/2023/09/FINAL-EPIC-Outsourced-
Automated-Report-Appendix-Included.pdf).
\6\Id.
---------------------------------------------------------------------------
The PREPARED for AI Act would enable agencies to evaluate
AI tools based on the intended use cases and to mitigate any
associated risks before procurement and deployment. By building
AI governance structures within and across agencies, this
legislation seeks to empower agencies to choose use cases where
AI can best serve their respective missions, while harmonizing
minimum standards and best practices for AI use across the
federal government.
III. LEGISLATIVE HISTORY
Senator Gary Peters (D-MI) introduced S. 4495, the
Promoting Responsible Evaluation and Procurement to Advance
Readiness for Enterprise-wide Deployment for Artificial
Intelligence Act (PREPARED for AI Act), on June 11, 2024, with
original cosponsor Senator Thomas Tillis (R-NC). The bill was
referred to the Committee on Homeland Security and Governmental
Affairs.
The Committee considered S. 4495 at a business meeting on
July 31, 2024. At the business meeting, Senator Peters offered
a substitute amendment, as well as a modification to the
substitute amendment. The Peters substitute amendment, as
modified, adds provisions to increase cross-agency
harmonization, focuses the bill more narrowly on high-risk AI
use cases, adds procedures to ensure competition for small
businesses, updates required documentation, requires a report
to identify extremely low-risk AI use cases and a report on the
impact of this Act on small businesses, and updates key
definitions for ``adverse outcome,'' ``deployer,'' and
``developer.'' The Committee adopted the modification to the
Peters substitute amendment, and the substitute amendment as
modified, by unanimous consent, with Senators Peters, Carper,
Hassan, Sinema, Rosen, Ossoff, Blumenthal, Butler, Paul,
Lankford, and Scott present.
The bill, as amended by the Peters substitute amendment as
modified, was ordered reported favorably by roll call vote of 8
yeas to 3 nays, with Senators Peters, Carper, Hassan, Sinema,
Rosen, Ossoff, Blumenthal, and Butler voting in the
affirmative, and Senators Paul, Lankford, and Scott voting in
the negative. Senators Johnson, Romney, Hawley, and Marshall
voted nay by proxy, for the record only.
IV. SECTION-BY-SECTION ANALYSIS OF THE BILL, AS REPORTED
Section 1. Short title
Section 1 establishes the short title of the bill as the
``Promoting Responsible Evaluation and Procurement to Advance
Readiness for Enterprise-wide Deployment for Artificial
Intelligence Act'' or the ``PREPARED for AI Act.''
Section 2. Definitions
Section 2 defines key terms in the bill including ``adverse
outcome,'' ``agency,'' ``artificial intelligence,'' ``biometric
data,'' ``commercial technology,'' ``council,'' ``deployer,''
``developer,'' ``director,'' ``government data,'' ``impact
assessment,'' ``relevant congressional committees,'' ``risk,''
and ``use case.''
Section 3. Implementation of requirements
Section 3 requires the Director of the Office of Management
and Budget (OMB) to (1) brief Congress on the bill's
implementation 180 days after enactment and (2) facilitate the
implementation of the bill's requirements after enactment.
Section 4. Procurement of Artificial Intelligence
Subsection (a) requires the Federal Acquisition Regulatory
Council (FAR Council) to review, update, and harmonize Federal
Acquisition Regulation (FAR) requirements as needed to ensure
agency procurements of AI address key risk
management provisions in this bill, including those related to
data ownership and privacy, scope of use, cybersecurity
standards, and risk mitigation plans for addressing adverse
outcomes.
Subsection (b) requires agencies to update and incorporate
certain terms and conditions into contracts and agreements
before procuring AI for high-risk uses. Such provisions cover,
for example, the purpose and risks of the intended AI use;
rights to government data; data stewardship; testing and
evaluation; documentation; and how to report any adverse
outcomes. This subsection ensures that appropriate leaders in
agencies review the terms of contracts and agreements prior to
obtaining AI for high-risk use cases.
Section 5. Interagency governance of Artificial Intelligence
Subsections (a) through (e) establish a Chief AI Officers
(CAIO) Council to share AI program best practices between
agencies, coordinate development and use of AI across
government, and harmonize agency risk management processes.
Subsection (f) requires a Comptroller General report to
Congress about the Council's utility and coordination with
other federal councils.
Subsection (g) requires guidance from OMB one year after
enactment on topics including AI impact assessments,
documentation requirements, and model templates for agency risk
evaluation and procurement. Subsection (h) requires OMB to
develop procedures for adverse outcome reporting involving AI
procured, obtained, or used by federal agencies.
Section 6. Agency governance of Artificial Intelligence
Subsection (a) clarifies the responsibility of federal agency
heads for the responsible governance, procurement, and use of
AI, and for workforce training, within their agencies.
Subsection (b) requires each agency head to designate a
Chief AI Officer (CAIO), with criteria outlined for
designation.
Subsection (c) requires CFO Act agencies to each establish
an AI Governance Board, led by the deputy head and CAIO at that
agency.
Subsection (d) requires agencies to have these officials and
bodies in place within 120 days after enactment of the bill,
and before procuring or obtaining AI.
Section 7. Agency requirements for use of Artificial Intelligence
Subsection (a) requires agency CAIOs (in consultation with
their AI Governance Board) to develop and implement a risk
evaluation process for high-risk AI usage within 180 days of
enactment. The section sets minimum requirements for ``high
risk'' classification, which would entail additional safeguards
such as periodic review requirements, targeted impact
assessments, and consultation with affected communities, if
appropriate. Agencies could alter their classification, if
needed, based on results of AI testing or new information about
the AI. Agencies must also provide a rationale for their high-
risk classification that follows the OMB Director's model
template provided for in Section 5(g)(1)(B).
Subsection (b) sets requirements for documentation that
agencies must obtain from developers and deployers supplying
the government with high-risk AI systems.
Subsection (c) lists the information this documentation must
include, regarding data source types, evaluation
methodologies, risk evaluation measures, data management, and
known limitations and guidelines of the AI system. Each
agency's CAIO retains the discretion to determine the
sufficiency of the documentation.
Subsection (d) creates pre-deployment and evaluation
requirements for high-risk AI use cases, while subsection (e)
requires agency heads to make certain determinations required
under subsection (d) available to Congress or the Director upon
request. Subsection (f) requires a process for ongoing
monitoring of high-risk uses.
Subsections (g), (h), and (i) address how agencies may respond
to changes in the risk classification of a use case, exceptions
to the requirements of this section, and waiver processes for
certain use cases.
Subsection (j) requires infrastructure security risks and
protocols associated with AI use cases to be reviewed.
Subsection (k) requires agencies to comply with this
section within 270 days after enactment for AI already in use
at the time of the bill's enactment.
Section 8. Prohibition on select Artificial Intelligence use cases
Section 8 prohibits agencies from procuring, developing,
obtaining, or using AI for the purposes of (1) mapping facial
features to assign emotion; (2) categorizing and taking action
against individuals based on biometric data, with the exception
of deducing or inferring age in the context of investigating
child sexual abuse; or (3) creating a social scoring system.
Section 9. Agency Procurement Innovation Labs
Subsection (a) recommends that CFO Act agencies that do not
already have Procurement Innovation Labs establish a lab or
similar entity to test new approaches and share lessons learned
in the procurement of commercial technology, such as AI.
Subsections (b) and (c) outline the functions and structures of
Procurement Innovation Labs based on those already established
by some agencies.
Section 10. Multi-phase commercial technology test program
Subsections (a) through (d) allow for agencies to procure
commercial technology through a voluntary three-phase pilot
program. Subsection (e) limits agencies to a maximum amount of
$25 million for a pilot program. Subsection (f) requires the
FAR Council to revise the FAR to implement this section and
prevents agencies from awarding contracts under this test
program until issuing public guidance. Subsection (g) sunsets
the authority for a test program 5 years after the FAR updates.
Section 11. Research and development project pilot program
Subsections (a) through (f) authorize and set parameters
for agencies to implement a voluntary pilot program to conduct
basic or applied research and carry out certain prototype
projects. This section also outlines contracting procedures and
allows agencies to award follow-on contracts after the
successful completion of an initial pilot project. Subsection
(g) limits agencies to a maximum amount of $10 million for a
pilot program. Subsection (h) requires the FAR Council to
revise the FAR to implement this section and prevents agencies
from awarding contracts under this test program until issuing
public guidance. Subsection (i) requires contracts to be
reported to the Federal Procurement Data System, and subsection
(j) sunsets the authority for a test program 5 years after the
FAR updates.
Section 12. Development of tools and guidance for testing and
evaluating Artificial Intelligence
Subsection (a) requires each agency CAIO to submit an
annual report on obstacles encountered in testing and
evaluation of AI systems to the CAIO Council. Subsection (b)
requires the Council to annually review these reports to
identify common challenges and opportunities for cross-agency
collaboration. Additionally, under this subsection, the OMB
Director must convene a working group to develop tools and
guidance, support interagency coordination, and address any
additional matters determined appropriate by the Director.
Subsection (c) requires agencies or OMB to provide these
reports to relevant congressional committees. Subsection (d)
requires the CAIO Council to submit a report with a framework
to identify extremely low-risk AI use cases and opportunities
to streamline their deployment and use within the federal
government. Subsection (e) sunsets the section's requirements
after 10 years.
Section 13. Updates to Artificial Intelligence use case inventories
Subsections (a) and (b) amend the Advancing American AI Act
(Public Law 117-263; 40 U.S.C. 11301 note) and create public
disclosure requirements for each agency's AI use case
inventory, with guidance from the OMB Director. Subsection (c)
requires the head of each agency to submit an annual AI use
case inventory report to Congress. Subsection (d) requires the
Comptroller General to submit an annual report regarding
whether agencies are appropriately classifying use cases, the
impact of AI procurement on the federal workforce and small
businesses, and overarching trends in federal government use of
AI.
V. EVALUATION OF REGULATORY IMPACT
Pursuant to the requirements of paragraph 11(b) of rule
XXVI of the Standing Rules of the Senate, the Committee has
considered the regulatory impact of this bill and determined
that the bill will have no regulatory impact within the meaning
of the rules. The Committee agrees with the Congressional
Budget Office's statement that the bill contains no
intergovernmental or private-sector mandates as defined in the
Unfunded Mandates Reform Act (UMRA) and would impose no costs
on state, local, or tribal governments.
VI. CONGRESSIONAL BUDGET OFFICE COST ESTIMATE
S. 4495 would authorize federal agencies to research and
implement new processes to accelerate the development and
acquisition of artificial intelligence (AI) systems. The bill
would require federal agencies to assess the safety of their AI
systems and to incorporate protections for data privacy and
cybersecurity into AI procurement regulations. S. 4495 also
would require agencies and the Government Accountability Office
to report to the Congress on the effectiveness and security of
federal AI systems.
Spending Subject to Appropriation. S. 4495 would authorize
two pilot programs for acquiring AI systems. The authority for
each program would expire five years after regulations
governing its use are promulgated. CBO expects agencies would
need two years to update their acquisition procedures before
entering into the authorized contracts. First, S. 4495 would
authorize federal agencies to procure AI systems through a new
Commercial Technology Test Program. Under the program, agencies
could enter contracts to develop new technologies in phases so
that private sector entities could test and demonstrate the
feasibility of such technology. Individual contracts under that
authority would be limited to a maximum of $25 million.
Assuming that one new contract is signed per year at the
maximum funding limit under S. 4495 and using historical rates
of spending for similar activities, CBO estimates that
operating the test program would cost $55 million over the
2025-2029 period.
Second, S. 4495 would authorize federal agencies to conduct
research and acquire prototypes of new AI systems. Individual
contracts under that authority would be limited to a maximum of
$10 million. Assuming that one new program is created per year
at the maximum funding limit under S. 4495 and using historical
rates of spending for similar activities, CBO estimates that
implementing the research and development program would cost
$22 million over the 2025-2029 period.
S. 4495 also would require agencies to update acquisition
governance and information technology policy procedures before
acquiring new AI systems. In addition, agencies would have to
regularly update the Congress on their adoption of AI systems
and adherence to safety guidelines. On the basis of costs for
similar activities, CBO estimates that satisfying the policy
development and reporting requirements of S. 4495 would cost $9
million over the 2025-2029 period.
The costs of the legislation, detailed in Table 1, fall
within budget function 800 (general government). Such spending
would be subject to the availability of appropriated funds.
TABLE 1.--ESTIMATED BUDGETARY EFFECTS OF S. 4495
----------------------------------------------------------------------------
                                       By fiscal year, millions of dollars
                                       2025  2026  2027  2028  2029  2025-2029
----------------------------------------------------------------------------
                   SPENDING SUBJECT TO APPROPRIATION
Commercial Technology Test Program:
  Estimated Authorization..........       *     *    25    25    25      75
  Estimated Outlays................       *     *    10    21    24      55
Research and Development Pilot Program:
  Estimated Authorization..........       *     *    10    10    10      30
  Estimated Outlays................       *     *     4     8    10      22
Agency Reports and Policy Development:
  Estimated Authorization..........       5     1     1     1     1       9
  Estimated Outlays................       5     1     1     1     1       9
Total Spending Subject to Appropriation:
  Estimated Authorization..........       5     1    36    36    36     114
  Estimated Outlays................       5     1    15    30    35      86
----------------------------------------------------------------------------
* = between zero and $500,000.
In addition to the amounts shown here, enacting S. 4495 would increase
direct spending by less than $500,000 over the 2025-2034 period.
Direct Spending. Enacting the bill could affect direct
spending by some agencies that are allowed to use fees,
receipts from the sale of goods, and other collections to cover
operating costs. CBO estimates that any net changes in direct
spending by those agencies would be negligible because most of
them can adjust amounts collected to reflect changes in
operating costs.
The CBO staff contact for this estimate is Aldo Prosperi.
The estimate was reviewed by Christina Hawley Anthony, Deputy
Director of Budget Analysis.
Phillip L. Swagel,
Director, Congressional Budget Office.
VII. CHANGES IN EXISTING LAW MADE BY THE BILL, AS REPORTED
This legislation would make no change in existing law,
within the meaning of clauses (a) and (b) of subparagraph 12 of
rule XXVI of the Standing Rules of the Senate, because this
legislation would not repeal or amend any provision of current
law.