[House Hearing, 111 Congress]
[From the U.S. Government Publishing Office]


                                     

                         [H.A.S.C. No. 111-10]
 
                 THE FUTURE OF MISSILE DEFENSE TESTING

                               __________

                                HEARING

                               BEFORE THE

                     STRATEGIC FORCES SUBCOMMITTEE

                                 OF THE

                      COMMITTEE ON ARMED SERVICES

                        HOUSE OF REPRESENTATIVES

                     ONE HUNDRED ELEVENTH CONGRESS

                             FIRST SESSION

                               __________

                              HEARING HELD

                           FEBRUARY 25, 2009

                                     

                                     

                  U.S. GOVERNMENT PRINTING OFFICE
51-659                    WASHINGTON : 2010
-----------------------------------------------------------------------
For Sale by the Superintendent of Documents, U.S. Government Printing Office
Internet: bookstore.gpo.gov  Phone: toll free (866) 512-1800; (202) 512-1800  
Fax: (202) 512-2104 Mail: Stop IDCC, Washington, DC 20402-0001
  


                     STRATEGIC FORCES SUBCOMMITTEE

                ELLEN O. TAUSCHER, California, Chairman
JOHN SPRATT, South Carolina          MICHAEL TURNER, Ohio
LORETTA SANCHEZ, California          HOWARD P. ``BUCK'' McKEON, 
ROBERT ANDREWS, New Jersey               California
JAMES R. LANGEVIN, Rhode Island      MAC THORNBERRY, Texas
RICK LARSEN, Washington              TRENT FRANKS, Arizona
MARTIN HEINRICH, New Mexico          DOUG LAMBORN, Colorado
                 Frank Rose, Professional Staff Member
                 Kari Bingen, Professional Staff Member
                      Zach Steacy, Staff Assistant


                            C O N T E N T S

                              ----------                              

                     CHRONOLOGICAL LIST OF HEARINGS
                                  2009

                                                                   Page

Hearing:

Wednesday, February 25, 2009, The Future of Missile Defense 
  Testing........................................................     1

Appendix:

Wednesday, February 25, 2009.....................................    45
                              ----------                              

                      WEDNESDAY, FEBRUARY 25, 2009
                 THE FUTURE OF MISSILE DEFENSE TESTING
              STATEMENTS PRESENTED BY MEMBERS OF CONGRESS

Tauscher, Hon. Ellen O., a Representative from California, 
  Chairman, Strategic Forces Subcommittee........................     1
Turner, Hon. Michael, a Representative from Ohio, Ranking Member, 
  Strategic Forces Subcommittee..................................     3

                               WITNESSES

Coyle, Hon. Dr. Philip E., III, Former Director, Operational Test 
  and Evaluation, U.S. Department of Defense.....................    28
Francis, Paul L., Director, Acquisition and Sourcing Management, 
  U.S. Government Accountability Office..........................    30
McQueary, Hon. Dr. Charles E., Director, Operational Test and 
  Evaluation, U.S. Department of Defense.........................     4
Mitchell, Donald C., Chief Engineer for Ballistic Missile 
  Defense, Air and Missile Defense Systems Department, Applied 
  Physics Laboratory, Johns Hopkins University...................    31
Nadeau, Maj. Gen. Roger A., USA, Commanding General, Test and 
  Evaluation Command, U.S. Army..................................     9
O'Reilly, Lt. Gen. Patrick J., USA, Director, Missile Defense 
  Agency, U.S. Department of Defense.............................     7

                                APPENDIX

Prepared Statements:

    Coyle, Hon. Dr. Philip E., III...............................    82
    Francis, Paul L..............................................   113
    McQueary, Hon. Dr. Charles E.................................    49
    Mitchell, Donald C...........................................   128
    Nadeau, Maj. Gen. Roger A....................................    73
    O'Reilly, Lt. Gen. Patrick J.................................    59

Documents Submitted for the Record:

    [There were no Documents submitted.]

Witness Responses to Questions Asked During the Hearing:

    [There were no Questions submitted during the hearing.]

Questions Submitted by Members Post Hearing:

    Mr. Heinrich.................................................   163
    Ms. Tauscher.................................................   147
                 THE FUTURE OF MISSILE DEFENSE TESTING

                              ----------                              

                  House of Representatives,
                       Committee on Armed Services,
                             Strategic Forces Subcommittee,
                      Washington, DC, Wednesday, February 25, 2009.
    The subcommittee met, pursuant to call, at 1:00 p.m., in 
room 2212, Rayburn House Office Building, Hon. Ellen Tauscher 
(chairman of the subcommittee) presiding.

 OPENING STATEMENT OF HON. ELLEN O. TAUSCHER, A REPRESENTATIVE 
    FROM CALIFORNIA, CHAIRMAN, STRATEGIC FORCES SUBCOMMITTEE

    Ms. Tauscher. The committee will come to order. The 
Strategic Forces Subcommittee meets today to gather testimony 
on the future of missile defense testing programs. We are 
expecting a series of votes at around 1:30.
    So what I would like to do is: I will do my opening 
statement; the ranking member will do his opening statement; 
and then, as best we can, for our first panel, generals and Dr. 
McQueary, if you could summarize your statements in five 
minutes or less, I expect we will be at about the time the 
votes are called; and then we will come back and have our 
questions, if that will work for you.
    During the past eight years, there has been a vigorous 
debate over the Bush Administration's approach to testing and 
deploying missile defense systems. Many, including myself, have 
expressed concerns about the previous Administration's approach 
to testing. Those concerns don't come from naivete or 
confusion; they come from the fact that we all want an 
operationally effective, suitable, and survivable system.
    However, the objective of today's hearing is not to debate 
what the Bush Administration did or did not do. We are well 
past that point. Instead, our objective today is to look 
forward and to see what specific actions need to occur to make 
sure that the missile defense systems we have deployed are 
operationally effective, suitable, and survivable.
    The United States, its deployed forces, and its friends and 
allies around the world face real threats from ballistic 
missiles. That is why I voted for the Missile Defense Act of 
1999, which made it the policy of the United States ``to 
deploy, as soon as technologically possible, an effective 
national missile defense system capable of defending the 
territory of the United States against limited ballistic 
missile attacks.''
    So far, the testing record for missile defense systems is 
mixed. According to the Director of Operational Test and 
Evaluation's (DOT&E's) fiscal year 2008 Annual Report to 
Congress, theater missile defense systems, such as Aegis 
Ballistic Missile Defense (BMD) and Terminal High Altitude Area 
Defense (THAAD), continued to make significant progress in 
fiscal year 2008.
    For example in 2008, the Navy's operational test and 
evaluation command declared Aegis BMD to be ``operationally 
effective and suitable.'' This is a major accomplishment that 
we should all take pride in.
    The same cannot be said of the long-range, Ground-based 
Midcourse Defense (GMD) system. For the third year in a row, 
the Office of the Director of Operational Test and Evaluation 
stated in its annual report, ``GMD flight testing will not 
support a high level of confidence in its limited capabilities. 
Additional test data under realistic conditions is necessary to 
validate models and simulations and to increase confidence in 
the ability of these models and simulations to accurately 
predict system capability.''
    I would also note that, due to technical challenges, the 
Missile Defense Agency (MDA) was unable to conduct any GMD 
intercept tests in fiscal year 2008. This situation
needs to improve. Better testing must be the foundation of our 
forward progress on a ground-based missile defense. It is in 
this context that Congress has said the proposed expansion of 
the GMD system in Europe cannot move forward without more 
testing, so that we can have the highest level of confidence in 
the system's capabilities.
    We have two distinguished panels of witnesses for today's 
hearing. Panel one includes Dr. Charles McQueary, the 
Pentagon's Director of Operational Test and Evaluation; 
Lieutenant General Patrick O'Reilly, the Director of the 
Missile Defense Agency; and Major General Roger Nadeau, 
Commander of the Army Test and Evaluation Command.
    Panel two includes Mr. Philip Coyle, the former Director of 
Operational Test and Evaluation; Mr. Paul Francis, Director of 
Acquisition and Sourcing Management at the Government 
Accountability Office (GAO); and Dr. Donald Mitchell, chief 
engineer for ballistic missile defense at the Johns Hopkins 
University Applied Physics Laboratory. Thank you for agreeing
to testify, gentlemen.
    At today's hearing, I am particularly interested in having 
our witnesses address the following issues. For all of our 
witnesses, I need you to answer one fundamental question: What 
specific actions need to take place during the next several 
years to make sure that we have a high degree of confidence 
that the Ballistic Missile Defense System (BMDS), especially 
the long-range, Ground-based Midcourse Defense system, will 
work in an operationally effective, suitable, and survivable 
manner?
    Furthermore, General O'Reilly, welcome. Welcome to your 
first hearing as the new director.
    General O'Reilly. Thank you, ma'am.
    Ms. Tauscher. I understand that you have begun a review of 
the Missile Defense Agency's entire testing program to 
determine your long-term data requirements and testing needs. I 
would like you to provide the committee with an update on that 
effort and share with us any initial results that you have at 
this point. I look forward to an interesting and thoughtful 
discussion.
    On that note, let me turn the floor over to our 
distinguished Ranking Member, Mr. Turner of Ohio, who is here 
at his first hearing as the new ranking member of the 
subcommittee.
    Mr. Turner, we are interested in any opening comments you 
might have. And the floor is yours.

 STATEMENT OF HON. MICHAEL TURNER, A REPRESENTATIVE FROM OHIO, 
         RANKING MEMBER, STRATEGIC FORCES SUBCOMMITTEE

    Mr. Turner. Thank you, Madam Chair. And I am very honored 
to be serving with you as the ranking member of this important 
subcommittee.
    We are going to be dealing with very complex and 
challenging national security issues, and I look forward to 
working with you on these. We have had a very good bipartisan 
relationship, as you and I have served on this committee and 
also traveled abroad to discuss the important issues of missile 
defense, and bipartisan support is so important.
    What is unfortunate about the topic that we are going to 
discuss today is that, while many of us will dive into the 
issues of technical satisfaction of requirements and the 
importance of how we verify what our systems are capable of, 
there are those
who are outright opposed to these systems and will use the 
concept of testing to undermine the concept of the United 
States having an active missile defense system.
    It is so important that we get it right, so that we have 
the ability to have credible answers, and that we have a system 
in place where we can defend against those that would use the 
lack of testing to try to undermine our systems.
    And with that, I want to discuss a little bit about where I 
understand that we are. As we begin our discussion on missile 
defense testing, we should start by establishing a baseline of 
where we are today. The missile defense capability our Nation 
has fielded today consists of 26 ground-based interceptors 
(GBIs) in Alaska and California, 18 Aegis missile defense 
ships, 13 Patriot battalions, 5 radar tracking systems, and 
command and control systems.
    As I have learned from intelligence analysts at the 
National Air and Space Intelligence Center (NASIC), which is in 
my home district, the threat doesn't wait for us to perfect our 
defenses. If, for example, North Korea were to launch a long-
range Taepodong missile today, we could use this system to 
protect the American people, our forces abroad, and our allies.
    As Secretary Gates recently suggested, the Pentagon was 
prepared to use its missile defense capabilities to bring down 
a North Korean missile if necessary. Having this missile 
defense capability today as an option is the direct result of 
U.S. leadership and the hard work and dedication of a strong 
Government and industry team.
    The chairman and I agree that our missile defense 
assets must be effective and credible. I was particularly 
interested in Mr. Mitchell's written statement that our 
Nation's ballistic missile defense capability cannot be 
disregarded today, and will provide an even more effective 
defense in the future. Therefore, continued testing to increase 
the effectiveness, credibility, and flexibility of an already 
deployed system against evolving threats is a commitment we all 
make.
    A common misconception about missile defense is that the 
technology doesn't work, and that the tests are not realistic. 
Even today, you can find news stories, and we will hear about 
some even in this hearing, where people attempt to conflate 
testing with the issue of whether or not the system works.
    A good starting point for us here today is to better 
understand the progress made to date. What is the state of our 
missile defense capabilities? As I understand it, the Missile 
Defense Agency is reviewing their test plan. And there is good 
alignment between them and the test community in this process. 
I am interested in hearing more about what our test objectives 
are, how assessments are made, where gaps and shortfalls exist, 
and how the rebaselined testing program should address these.
    Flight tests tend to get the most attention. However, 
ground tests and modeling and simulation play equally important 
roles in the test program. How are we progressing in these 
areas? Are there limiting factors in testing? I am particularly 
concerned about targets being the pacing item for testing. And 
I am interested in an update on the targets program.
    Our missile defenses are designed to counter limited 
threats from North Korea and Iran. We need a better 
understanding of the threats we are likely to see from these 
countries, so we even know what level of countermeasures, salvo 
launches, and multiple engagement launches MDA should address 
in their test plans.
    Testing should not be used as an impediment. On the 
contrary, I worry about the impact that potential cuts may have 
on testing. As we all know from experience, testing is always 
the first to go when cuts are made to defense programs. I hope 
the chairman and I can work together to ensure that this does 
not happen. This is too important for the safety of the people 
of the United States.
    Lastly, let us look at the testing history. Since 2001, 37 
of 47 tests have resulted in hit-to-kill intercepts, a nearly 
80 percent success record. However, as the threat continues to 
evolve and we evolve our missile defense capabilities, we will 
continue to need more tests.
    Madam Chair, I look forward to working with you and our 
witnesses to manage these challenging issues to the benefit of 
the protection of the American people. Thank you.
    Ms. Tauscher. Thank you, Mr. Turner.
    Now we will go off to our first panel. Dr. McQueary, thank 
you for your thoughtful statement. It is part of the record. If 
you can summarize for us, we would appreciate it. Dr. McQueary, 
the floor is yours.

     STATEMENT OF HON. DR. CHARLES E. MCQUEARY, DIRECTOR, 
  OPERATIONAL TEST AND EVALUATION, U.S. DEPARTMENT OF DEFENSE

    Dr. McQueary. Thank you very much.
    Madam Chairman, Congressman Turner, distinguished members 
of the committee, good afternoon. I am pleased to be here to 
have this opportunity to speak to you about the testing of the 
Ballistic Missile Defense System, or BMDS as we will refer to 
it.
    As requested in Chairman Tauscher's letter, I will address 
three areas: my assessment of the missile defense testing 
programs as described in my annual report submitted on January 
28; my assessment of the Missile Defense Agency's three-phase 
review of BMDS; and three, test evaluation actions I see as 
needed to ensure that BMDS and its elements will work in an 
effective, suitable, and survivable manner.
    But before I get into my prepared statement, I would like 
to address a news article from Bloomberg that came out 
yesterday, if I may. And I will do that very briefly. That 
article fundamentally misconstrued my position on ballistic 
missile testing. And I would like to set the record straight.
    Specifically, the article stated, ``According to McQueary, 
the U.S. defense probably wouldn't be effective, even without 
the distraction of decoys.'' This is a complete distortion of 
anything I have said to date on this subject. In my annual 
report, I said Ground-based Midcourse Defense has demonstrated 
a limited capability to defend against simple, long-range 
ballistic missile threats launched from North Korea toward the 
United States. And I stand by that wording, both this year and 
in the past year.
    So if I may, I will get back to my main statement.
    Ms. Tauscher. Yes, sir.
    Dr. McQueary. First, my assessment of missile defense 
testing programs to date. Overall, the MDA experienced a good 
year with its ground and flight test programs, notwithstanding 
the continuing challenges with targets. Aegis Ballistic Missile 
Defense demonstrated the capability to detect, track, and 
engage simple short- and medium-range ballistic missile targets 
for a variety of mission scenarios.
    The Navy's Operational Test Agency, as you observed, 
indicated that the program was effective and suitable. And that 
was good news. And I completely agree with you. I have already 
commented upon the BMD, so I won't belabor that point.
    Terminal High Altitude Area Defense, or THAAD, demonstrated 
the capability to detect, track, and engage realistic short-
range targets. The Command, Control, Battle Management, and 
Communication element, or the C2BMC, demonstrated the 
capability to provide situational awareness (SA) to warfighters 
worldwide and to control the Army Navy/Transportable Radar 
Surveillance (AN/TPY-2) radar in its forward-based mode.
    The MDA continued to increase operational realism in its 
testing. The ground-test program is robust, although the MDA is 
still using unaccredited models and simulations. Target 
availability and performance limitations continue to impact 
both the pace and productivity of MDA flight testing. Even
with MDA's target program improvements, there remains 
significant risk in this area.
    Second, my assessment of the MDA's three-phase review of 
BMDS. The MDA, under General O'Reilly, has embarked on a process to 
develop a revamped Integrated Master Test Plan, or IMTP, that 
will document planned testing through the Future Years 
Development Plan. A principal focus is to ensure that the 
future testing will provide sufficient validation data to 
anchor the models and simulations.
    This effort directly addresses the concerns I raised last 
year in my testimony before you. The three-phased review offers 
a logical, well-engineered approach. And although I must 
caution it will be a challenging test, I do applaud General 
O'Reilly's personal commitment to the initiative.
    Third, future test and evaluation (T&E) actions: a 
combination of flight and ground testing, together with 
verified, validated, and accredited models and simulations, is 
needed to characterize the capabilities of the BMDS and its 
elements. The approach being developed by MDA in the three-
phased review, if fully resourced and executed as planned, 
could provide a solid foundation for an independent assessment 
of the operational effectiveness, operational suitability, and 
survivability of each capability block.
    I see the operational test community participating in all 
phases of testing to the degree that is appropriate for the 
stage of development. An integrated approach that leverages 
combined developmental and operational testing to the maximum 
extent feasible is essential. I anticipate that much of the 
data needed for the Operational Test Agency's evaluation will 
be collected during the developmental phase, and from the use 
of models and simulations that are validated and accredited 
based upon developmental flight tests.
    As we all recognize, the complexity of the systems and the 
physical constraints on flight testing will necessitate 
examination of much of the system's capability in ground tests 
that leverage modeling and simulation.
    As I discussed in my written testimony, once the MDA has 
completed its developmental test objectives for a given block 
of capability--and this is a key point, I believe--I would 
foresee a dedicated operational test, led by the Operational 
Test Agency, that would be confirmatory in nature and would 
exercise the planned capability in an end-to-end fashion against a 
realistic portrayal of the threat. A concurrent assessment of 
training and supportability will ensure delivery of an 
operationally suitable block capability.
    And in conclusion, the MDA has experienced a good year, as 
I said. The renewed commitment to a rigorously engineered, 
disciplined and event-driven approach to flight and ground 
testing is welcome. I look forward to the development of an 
integrated test campaign that will ensure the delivery of 
operationally effective, suitable, and survivable capabilities 
to our warfighter. This concludes my remarks.
    [The prepared statement of Dr. McQueary can be found in the 
Appendix on page 49.]
    Ms. Tauscher. Thank you, Dr. McQueary.
    General O'Reilly, it is an honor to have you before the 
committee. Welcome.
    General O'Reilly. Thank you, ma'am.
    Ms. Tauscher. It is your maiden voyage in front of the 
committee. And let me tell you that in the few months that I 
have got to know you as the new director of MDA, I am very, 
very impressed by your willingness to work with the committee 
and to be responsive. And I hope we have been equally 
responsive back to you.
    We are anxious to hear your summarization of your 
testimony. I have read your testimony. I think it is excellent. 
And the floor is yours.

   STATEMENT OF LT. GEN. PATRICK J. O'REILLY, USA, DIRECTOR, 
       MISSILE DEFENSE AGENCY, U.S. DEPARTMENT OF DEFENSE

    General O'Reilly. Thank you, ma'am.
    Good afternoon, Madam Chairman, Mr. Turner, distinguished 
members of the committee. It is an honor and greatly 
appreciated opportunity to testify before you today on the 
Department of Defense's (DOD's) Ballistic Missile Defense 
System, or BMDS, testing program.
    The Missile Defense Agency, or MDA, recently initiated a 
systematic review of BMDS testing in partnership with the Army, 
Navy, and Air Force Operational Test Agencies, with the support 
of the Director for Operational Test and Evaluation, Dr. 
McQueary.
    Our goal is to set test objectives that measure the 
performance of critical functions necessary for robust missile 
defense operations and create an event-oriented plan that 
extends out as many years as necessary to collect sufficient 
data to determine the operational effectiveness, suitability, 
survivability, and supportability of the system.
    First, I would like to describe the challenges in our 
approach to testing the BMDS. Given the unique characteristics 
of short-, medium-, intermediate- and long-range ballistic 
missiles that threaten our deployed forces, our friends, our 
allies and our Nation, no one missile defense interceptor or 
sensor system can effectively counter all ballistic missile 
threats.
    Warfighters are not only faced with the challenge of 
intercepting small objects at great distances and very high 
velocities, but they have to simultaneously counter large raid 
sizes involving combinations of threat missile types, and in 
the future countermeasures associated with ballistic missile 
attacks.
    Since it is difficult to develop countermeasures that 
degrade fundamentally different missile interceptor systems 
operating in different phases of a threat ballistic missile's 
flight, the most effective missile defense architecture to 
handle the large missile raid sizes is a layering of endo-
atmospheric and exo-atmospheric missile interceptors with a 
network of sensors connected and managed by a robust command 
and control and communications system.
    Consequently, a comprehensive test program must not only 
measure the operational effectiveness of individual sensors and 
interceptors, but also must measure the performance of an 
integrated Ballistic Missile Defense System.
    Evaluating the BMDS is likely one of the most challenging 
endeavors ever attempted by the Department of Defense. Ideally, 
comprehensive and rigorous testing is enabled by a stable 
configuration of the system being tested, a clearly defined 
threat, a consistent and mature operational doctrine, 
sufficient resources to repeat tests under the most stressing 
conditions, and a well-defined set of criteria of acceptable 
performance. Unfortunately, none of these situations apply to 
the Ballistic Missile Defense System.
    The hardware and software configurations of the BMDS change 
as the system continues to be developed. There are many 
significant uncertainties surrounding the nature and specifics 
of missile defense threats. And the creation of operational 
doctrine for simultaneous theater, regional, and homeland 
defense continues. Moreover, the cost of each missile defense 
flight test ranges between $40 million and $200 million, making 
the repetition of flight tests cost-prohibitive.
    In light of these challenges, our strategy is to develop 
models and simulations of the BMDS and compare their 
predictions to comprehensive flight and ground test results to 
validate the accuracy of those models and simulations. However, 
due to the complex phenomena associated with missile launches 
and associated environments, some performance measures cannot 
be predicted and must be measured in flight.
    I will now summarize the status of this ballistic missile 
defense testing to date. Although we have had three intercepts 
in three attempts in the currently deployed hardware 
configuration, Ground-based Midcourse Defense flight testing to 
date has been limited to the performance of the 
most basic Block 1 capability against intermediate-range 
ballistic-class threats.
    On 5 December, 2008, we were able to demonstrate a 
significant milestone by integrating space-, land- and sea-
based sensors to form a common track and intercept a 4,000-
kilometer threat-class missile. However, we were not able to 
demonstrate capability against simple countermeasures due to 
the failure of a component of the target. Significantly more 
GMD testing is needed when considering the tremendous potential 
capability of this system designed to destroy intercontinental 
ballistic missiles (ICBMs).
    In fiscal year 2008, THAAD intercepted target missiles both 
in space and inside the Earth's atmosphere, and demonstrated 
cueing to the Aegis system. THAAD testing to date has been 
highly successful, with five intercepts in five attempts 
against short-range ballistic missiles, four of which were 
actually foreign-threat missiles. But more testing is needed 
against salvo and medium-range threats.
    The Aegis BMD element has successfully tested against seven 
short-range ballistic missiles, one an actual threat missile, 
in eight launches of the SM-3 Block IA missile, including a 
successful salvo test, conducted in November 2007, that 
simultaneously destroyed two short-range ballistic missiles.
    As we continue to pursue the root cause of the failure of 
an SM-3 Block IA missile last November, we are preparing to 
test again against an intermediate-range ballistic missile 
class this spring, once the root cause of the failure has been 
identified and corrected. The sensors element, which consists 
of our early-warning radars, forward-based radars, and the Sea-
Based X-band radar (SBX), demonstrated its capability in July 
2008, when they all worked together to create a common track. 
And that architecture also supported the intercept by the GMD 
interceptor last November.
    Finally, I would like to describe our current test process 
and emerging results. The BMDS test review is being conducted 
in three phases. Phase 1, we determine the body of data 
necessary to validate the BMDS models and simulation and the 
data needed to evaluate operational effectiveness, suitability, 
and survivability. In Phase 1, we identified 85 critical 
variables and parameters that must be tested to validate our 
simulations, and 31 additional variables that we cannot model 
adequately and that can only be measured in flight and ground 
tests.
    We are currently in Phase 2 of our test review, where we 
determine the test venues and scenarios to acquire the data 
associated with those 116 variables identified in Phase 1. An 
advantage to developing a campaign of test objectives rather 
than developing objectives one test at a time is that we don't 
always have to test those objectives that have previously been 
tested. This will reduce the cost and increase the frequency of 
BMDS testing.
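    [To illustrate the campaign-level planning General O'Reilly 
describes, the sketch below assigns each test only those data 
objectives that earlier tests have not already covered. It is a 
hypothetical illustration with made-up objective names, not 
MDA's actual planning tool.]

        # Illustrative sketch only: objective names are hypothetical,
        # not MDA's actual test parameters.
        def plan_campaign(required, tests):
            """required: set of data objectives the campaign must cover.
            tests: list of (test_name, objectives_the_test_can_measure)."""
            remaining = set(required)
            plan = []
            for name, measurable in tests:
                new_objectives = remaining & set(measurable)  # skip anything already covered
                if new_objectives:
                    plan.append((name, sorted(new_objectives)))
                    remaining -= new_objectives
            return plan, remaining  # remaining = data gaps the campaign still leaves

        required = {"radar_track_error", "divert_margin", "debris_field", "kill_assessment"}
        tests = [("FT-A", ["radar_track_error", "kill_assessment"]),
                 ("FT-B", ["radar_track_error", "divert_margin"]),  # track error already covered
                 ("GT-1", ["debris_field"])]
        plan, gaps = plan_campaign(required, tests)
        print(plan)  # each test carries only its new objectives
        print(gaps)  # empty set: the campaign covers all required data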
    In Phase 3, we will identify the resources and planning 
infrastructure, including targets and test ranges, to execute 
those scenarios identified in Phase 2. Our goal is to complete 
this work by the end of May.
    In conclusion, I greatly appreciate your support as we 
address issues associated with testing the Ballistic Missile 
Defense System. BMDS test results send a very credible message 
to the international community on our ability to defeat 
ballistic missiles in flight, thus reducing their value to 
potential adversaries using ballistic missiles as a strategy to 
threaten our Nation, our deployed forces, our friends, and our 
allies.
    Contribution to U.S. non-proliferation goals is one of the 
most important benefits of robust and comprehensive missile 
defense testing. With your permission, I would like to submit 
the remainder of my remarks in written testimony and look 
forward to answering your questions.
    Ms. Tauscher. Without objection, so ordered.
    [The prepared statement of General O'Reilly can be found in 
the Appendix on page 59.]
    Ms. Tauscher. Thank you very much, General O'Reilly.
    General Nadeau, this is also your first time before the 
subcommittee. We thank you very much; very comprehensive 
testimony that you submitted to us. The floor is yours, sir.

    STATEMENT OF MAJ. GEN. ROGER A. NADEAU, USA, COMMANDING 
        GENERAL, TEST AND EVALUATION COMMAND, U.S. ARMY

    General Nadeau. Thank you, ma'am.
    Good afternoon, Chairwoman Tauscher, Ranking Member Turner, 
distinguished members of the subcommittee. Thank you for the 
opportunity to appear before you today. In my invitation to 
appear, you asked me to address three specific questions, which 
I have done in my written statement to the subcommittee.
    I would like to take this opportunity to describe the 
independent, multi-service, Operational Test Agency team that 
assesses Ballistic Missile Defense System performance, the Army 
Test and Evaluation Command's role as the lead Operational Test 
Agency and how our team works with the Missile Defense Agency, 
the Office of the Director, Operational Test and Evaluation, 
and the warfighter community.
    The Ballistic Missile Defense System's development cuts 
across multiple service lines. It is only natural that a multi-
service operational test team was formed to assess performance 
at the comprehensive systems level. While individual service 
operational test and evaluation agencies focus on the equipment 
being developed in that particular service, the lead 
Operational Test Agency's mission is to provide independent, 
collective operational assessments of the total integrated 
Ballistic Missile Defense System's performance capabilities and 
limitations.
    In our role as the lead Operational Test Agency for the 
Ballistic Missile Defense System, we have established an Operational 
Test Agency team that interfaces with the test planning and 
execution cell within the Missile Defense Agency on a daily 
basis. Members from the other service operational test 
agencies, as well as the Joint Interoperability Test Command 
(JITC) are also part of that team.
    To better facilitate a close working relationship between 
us and our Missile Defense Agency counterparts, we have 
positioned significant personnel resources in Huntsville, 
Alabama, and Colorado Springs, Colorado to enable daily contact 
and coordination with the Missile Defense Agency test planners, 
modeling and simulation developers, and the warfighter. This 
enables our participation in all facets of test planning and 
execution among the various agencies.
    We essentially sit and work side-by-side with our Missile 
Defense Agency counterparts every day. We have found this 
operating relationship to be extremely productive and the best 
use of our collective resources. The communication flow among 
agencies is greatly enhanced through the co-location of 
personnel, while the independent integrity of the Operational 
Test Agency is preserved through separate reporting chains.
    In addition to the daily operational contact we have with 
the Office of the Director, Operational Test and Evaluation, 
the warfighter represented by the United States Strategic 
Command (USSTRATCOM) and the service operators, and the Missile 
Defense Agency--the multi-service operational test team 
produces an annual operational assessment report, co-signed by 
the service Operational Test Agency commanders. This document 
is the Capstone Operational Test Agency document for that year.
    The Army Test and Evaluation Command, as the lead 
Operational Test Agency, approves the report for release to the 
Director, Operational Test and Evaluation. It is used as one of 
the source documents from which they prepare their annual 
report to Congress.
    Another activity worth mentioning is our participation in 
the Missile Defense Agency's post-test reviews. At the 
invitation of the Missile Defense Agency, we are able to 
provide performance feedback from our perspective that assists 
in identifying performance issues early, allowing for 
corrective action, which saves time and money in the long run.
    Madam Chairwoman, I thank the subcommittee for the 
opportunity to testify today. I look forward to answering your 
questions.
    [The prepared statement of General Nadeau can be found in 
the Appendix on page 73.]
    Ms. Tauscher. Thank you, General Nadeau.
    Members, we have two votes, a 15 and a 5. The subcommittee 
will be in recess. The hearing will be in recess until the 
subcommittee returns in approximately 25 minutes. I ask members 
to come back as quickly as we can. We will go into questions of 
this panel, and then we have a second panel, as you know. We 
are in recess.
    [Recess.]
    Ms. Tauscher. Members are making their way back over from 
votes. And we expect another series of votes in perhaps about 
an hour. So I am going to start with our questions of our first 
panel, so that we can have enough time to assess the second 
panel and be sure to ask questions there too.
    I had a question for General O'Reilly and Dr. McQueary. 
Several of our potential adversaries have demonstrated the 
capability to conduct coordinated missile attacks--and this is 
something we could face in a real-world situation. Several 
missile defense systems, such as Aegis BMD and Patriot Advanced 
Capability-3 (PAC-3), have demonstrated the capability to 
conduct both salvo launches and multiple simultaneous 
engagements.
    In your opinions, is a salvo test, firing two or more 
interceptors at a single target, necessary to understand the 
operational performance of GMD and to provide confidence that 
the system works as we intended to operate it? Additionally, 
what are your thoughts on the need for the GMD system to 
conduct a multiple simultaneous engagement by firing multiple 
GMD interceptors at multiple targets?
    Dr. McQueary or General O'Reilly, whoever.
    Dr. McQueary. Let me----
    General O'Reilly. Well, ma'am if----
    Ms. Tauscher. General O'Reilly.
    General O'Reilly [continuing]. I may, I support that the 
testing of all of our systems, including GMD, must include 
salvo launches, because that is our doctrine. And we have a lot 
of theoretical estimations on the impact of one intercept on 
another interceptor flying in that area. But the phenomenology 
is very complex. And there would be a tremendous amount of 
empirical data gathered if we did that.
    I also support multiple simultaneous intercepts, including 
for GMD. However, I will need some assistance because of 
the amount of telemetry at Vandenberg Air Force Base and the 
safety considerations. I don't believe in their history they 
have launched two interceptors at once. I do know that they 
have not handled four missiles in flight at one time, which 
that would require.
    So ma'am, I do believe it would be very beneficial to do that. 
It is important. But moving beyond the salvo, there will need 
to be some investment, or some commitment from national 
resources in order to accomplish that.
    Ms. Tauscher. Well, I appreciate that very much, General 
O'Reilly. Perhaps in a subsequent conversation, you can give us 
a sense for what that will entail. And----
    General O'Reilly. Yes, ma'am.
    Ms. Tauscher [continuing]. As the budget is going to be 
coming up soon, we can try to figure out how we can accommodate 
something like that.
    Dr. McQueary.
    Dr. McQueary. I fully agree with what the general just 
said. I think salvo launch and multiple-target issues are 
phenomenology that absolutely must be examined as a part of the 
program.
    Ms. Tauscher. Right. Let us see.
    General O'Reilly, the classified version of DOT&E's 2008 
assessment for the Ballistic Missile Defense System raised a 
number of concerns about the BMDS. I know that you are working 
with DOT&E to address these concerns outlined in the report. 
Can you just, once again, aside from what you had in your 
testimony, can you kind of give us a more detailed summary on 
exactly how that cooperation is unfolding, and just give us 
some more detail on that?
    General O'Reilly. Yes, ma'am. With the DOT&E report, it 
covered what had been tested, and emphasized what has not been 
tested, in order to validate the models and simulations. So I 
work very closely, and my staff does, with the operational test 
agencies. But Dr. McQueary has been very generous in providing 
his staff to observe every one of those meetings and provide us 
assistance or assessments, primarily, on where there are areas 
in the models and simulations that they believe we need to 
re-look at. And we have done that.
    So I believe that the results of this first phase are very 
comprehensive, because of the fact that we have had the benefit 
of DOT&E supporting us onsite, in the meetings, instead of us 
just trying to interpret what was missing from the reports. And 
all indications I have had from Dr. McQueary and his staff are 
that we are addressing those areas in which we need data to be 
collected.
    Ms. Tauscher. I appreciate that, General O'Reilly.
    Dr. McQueary, you know, I think that there has been a 
significant change in the level of cooperation between MDA and 
DOT&E. And I am very appreciative of it. This subcommittee 
weighed in very seriously over the last three years about that. 
And I think these are very good results.
    Can you give us a sense for your expectations? Let me 
restate that. Can you give us a sense for the reality of that 
and how that is accruing to our expectations of having the 
testing regime be much more robust going forward?
    Dr. McQueary. Well, I think the relationship is very good. 
The relationship began to improve, from my standpoint, when 
General Obering was here. We worked out some issues with him.
    I am particularly pleased, as I mention in my oral 
statement, about the approach that General O'Reilly has taken 
toward his three-phase approach of deciding what the test 
program really needs to be. And we are participants in the 
discussion thereof, because there is no such thing as the test 
that identifies everything. These are highly technical, complex 
kinds of things.
    And having discussions with the various parties who have an 
interest in seeing that we do the right thing, I think is 
very--right testing, I should say--is really the right thing to 
do. And so I don't have anything that I would add to it.
    I would also point out one other thing, excuse me--General 
O'Reilly and I try to meet once a week, although that is very 
difficult to do, as you might imagine. But we do have a 
time on the calendar. But we each know that we can cancel out 
if we need to do so, in order to just be able to go over issues 
that might be there. We meet for about 30 minutes. But I think 
that is an important way of keeping the communication channel 
open, because the issues are large, but they are not 
insurmountable.
    Ms. Tauscher. Well, the committee very much appreciates 
that new level of cooperation and the congruency of your effort 
to work together. And that I think is really accruing very 
significantly to the program.
    General O'Reilly and Dr. McQueary, a key sensor critical 
for defense of the United States from a Northeast Asia attack 
is the upgraded COBRA DANE radar in Alaska. We have flown at 
least one missile across COBRA DANE for data collection, but 
have never performed an intercept flight test using the primary 
sensor and the fire control loop.
    Why have we not performed a GMD intercept test engagement 
using the upgraded COBRA DANE radar? Are we planning to do so? 
And what test range issues need to be dealt with, if any, to 
perform such a test?
    Dr. McQueary. I support you.
    General O'Reilly. Ma'am, we are looking at that in the next 
phase. It is my expectation that we will do that. However, when 
it was done last time in September 2005, we required, or we 
needed, the cooperation of the Russian Government, because the 
launch was of an air-launched target within 12 miles of the 
Russian flight information region, as it is referred to. To 
have an operationally realistic trajectory does require 
skirting very closely to Russian airspace.
    At the time, President Putin had provided agreement and 
concurred with us doing the test, in exchange for our allowing 
Russian officers to observe the test. So that is 
one issue that has to be addressed for any test up in that 
region. I would look forward to that engagement with the 
Russians, because I do believe that would be a very informative 
test.
    I believe the infrastructure and the other issues 
associated with a test over the North Pacific could be 
addressed straightforwardly through our normal processes.
    Ms. Tauscher. Thank you.
    Dr. McQueary. In the event that we could not do the test, 
which we fully agree that it is desirable to do a real test, 
DOT&E still feels, as we have reported before, that there needs 
to at least be a target fly-by to test the software that was 
changed as a result of the last test that was conducted. There 
were some changes made, and we have not actually been able to 
run an operational test.
    Ms. Tauscher. Right.
    Dr. McQueary. And a target fly-by would help gain 
information that the system does work as it is supposed to in 
target tracking.
    Ms. Tauscher. Thank you.
    Thank you, gentlemen.
    Mr. Turner.
    Mr. Turner. Thank you, Madam Chair.
    Dr. McQueary, I would like to get back to, just for a 
moment, your initial comment that you made about the news 
article that we saw where you were misquoted, and it really 
leaves the impression that the system doesn't work. And that is 
not at all your message, even in your written testimony. Can 
you speak again about your reaction to this article and the gap 
between what the article portrays as your position and your 
position?
    Dr. McQueary. I found out about this article at about 5 
minutes of 6:00 last evening, just before I was going to the 
Kennedy Center, which was not a good time to prepare oneself to 
go listen to the orchestra. But Admiral McCarthy, who had sent 
me the message letting me know what was there, I had sent a 
response back to him. And what he tells me is that his 
Blackberry died after he received my message. Now, so I will 
let you infer from that what it might have said.
    I was very disturbed because it was--I don't mind talking 
with reporters. I don't mind having discussion about what we 
do, because I try to run a very transparent organization. I do 
feel, though, that when reporters have uncertainty about how 
they are going to report what someone says, particularly if it 
is almost a direct quote, then they have the responsibility of 
making sure that that information is correct.
    And this was simply blatantly incorrect and inconsistent 
totally with what I have said in the last two annual reports 
that we have put out about where we are in the testing. We have 
consistently said that we need more modeling and simulation. 
There is nothing new in that. And we all understand why that is 
an important aspect of this.
    But we have demonstrated, through flight testing, some 
capabilities that are important. And I believe I would 
characterize it, if the North Koreans launched an attack 
against us this afternoon, we wouldn't say we need more test 
data before we decide whether we are going to launch against 
and try to intercept that. We would see how the system works, 
and we would find out.
    Mr. Turner. Excellent. And that is really the whole heart 
of, I think, all of this discussion, is the need for systems in 
place in the case where something might occur, like you just 
described.
    And I would like to deal with just a little bit of 
terminology. On page three of your testimony, you give us a 
list of where we are on a few of these. And you say the Ground-
based Midcourse Defense, GMD, demonstrated a limited capability 
to defend against simple, long-range ballistic missile threats 
launched from North Korea toward the United States. Next 
sentence, ``Terminal High-Altitude Area Defense, THAAD, 
demonstrated the capability to detect, track and engage both 
short-range, non-separating and single-separating targets.''
    Then on the Command Control Battle Management Communication 
element, you indicate again, using the word demonstrated, the 
capability to provide situational awareness to warfighters 
worldwide. And you go on.
    And I am going to focus this for just a moment on the word 
``demonstrated,'' because you are all about testing, telling us 
what we know from what we have tested. But the word 
``demonstrated,'' I believe you are not indicating to me that 
it is the limitation of the system, it is a limitation of the 
testing. In other words, the testing has demonstrated that this 
system has this capability, but it might have greater 
capability. Is that correct?
    Dr. McQueary. It might have greater capabilities. There 
might be capabilities that the system couldn't respond to as 
well in that. What I was referring to in those words was 
specifically what we showed as a consequence of the 
test that was conducted. And indeed, we did intercept, 
``kill,'' a target to demonstrate that the GMD did work in that 
particular testing that we had. So to me, that was a 
demonstration that the system has the capability to work.
    Mr. Turner. Excellent. That is why I wanted to ask the 
question, because when I read this page, I was afraid that 
someone might read it as saying, you know, it only does this, 
it only does that. But in reality, it is just it has only 
demonstrated it in the testing scenarios that we had. It might 
have greater capability. It might even perform better than what 
we have currently been able to test.
    Dr. McQueary. May I, just one point: when we talk about 
confidence, I want to be very clear that when we refer to 
confidence, you can always assume if we don't say it, we are 
talking about statistical confidence. I am not in the mode of 
saying, ``I don't have confidence in you or you.'' We are 
talking about, from a statistical standpoint, how many tests 
does one have to run in order to demonstrate mathematically, if 
you will, through data, that you have a certain level of 
confidence that the system is going to work.
    And when I speak of confidence, that is what I mean. And 
that is all I mean. I am not rendering a subjective view at 
all. I am trying to convey what we need to know. And that is 
why we say, over and over, we need the models and simulation, 
because we will never be able to run enough real tests to 
prove, with high statistical confidence, that the system can do 
what it is intended to do. But with models and simulation, 
verified by real testing, we can accomplish that objective.
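    [As a rough illustration of the statistical point, the 
sketch below uses a simple zero-failure binomial bound, an 
editorial assumption rather than DOT&E's methodology: after n 
consecutive successful intercepts, the 95-percent lower 
confidence bound on the success probability is (0.05) to the 
power 1/n, so even a long string of successful flight tests 
demonstrates only a modest success rate with high statistical 
confidence.]

        # Illustrative sketch only (zero-failure binomial bound); not DOT&E's method.
        def lower_bound(successes, confidence=0.95):
            """One-sided lower confidence bound on success probability after
            `successes` trials with no failures: p such that p**n = 1 - confidence."""
            return (1.0 - confidence) ** (1.0 / successes)

        def tests_needed(threshold, confidence=0.95):
            """Smallest number of consecutive successes showing, at the given
            confidence, that the success probability exceeds `threshold`."""
            n = 1
            while lower_bound(n, confidence) < threshold:
                n += 1
            return n

        print(round(lower_bound(10), 3))  # ~0.741 after ten straight successes
        print(tests_needed(0.90))         # 29 flawless flights for 90 percent at 95 percent confidence
        print(tests_needed(0.95))         # 59: why validated models and simulations are indispensable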
    Mr. Turner. Thank you.
    And to get back to your comment on North Korea, and this is 
my question next for the three of you, you know, currently the 
system that we have is intended to be integrated. Each of these 
individual components has unique capabilities. And as for the 
threat that it is designed against, we know that none of those 
who are developing those threats are abandoning missile 
systems. They are not abandoning the pursuit of missile 
technology or missile capability. If we abandoned a capability 
or an element of our overall integrated system, we would open 
ourselves up to a vulnerability.
    But what I am concerned about, and what I would like you 
each to speak about, is this: if we were, based on lack of 
testing or lack of completion of testing, to stop or 
discontinue the advancement in any one of these areas or 
systems, I fear that we might lose some capability, because we 
have three different components, basically. You have the 
development and innovation phase; you have the testing phase; 
you have the procurement phase.
    And if you are to stop along the way, you are going to lose 
institutional intellect. You are going to lose some industry 
capability. Could you speak for a moment to concerns that you 
might have about ceasing to progress in any one of these 
integrated systems that we are looking at to protect us, and 
what might happen if we later go back and try to reengage, what 
gap would occur?
    Dr. McQueary. Could I defer to General O'Reilly----
    Mr. Turner. Absolutely.
    Dr. McQueary [continuing]. First, since he has a broader 
view of the whole system, because he lives it every day.
    General O'Reilly. Sir, from an acquisition point of view, 
if we decided to stop developing one of these areas, you are 
exactly right, sir. The first area we would be very careful to 
look at is the industrial base and the intellect that you were 
referring to.
    The missile defense area requires a very unique set of 
developmental skills in the sciences, in the material sciences, 
in the 
production and so forth. Our country has been successful. But I 
would say that we are one of the few countries with the type of 
resources that could do what we have already accomplished. And 
to ensure that we maintain and protect that competency in 
developing missile defense interceptors, where you are not just 
worried about launching the missile, but you are more worried 
about what happens to that missile at the very end of its 
flight, is very difficult.
    So that is an area, and that is one of my greatest 
responsibilities, is to ensure that we continue to develop that 
competency in the United States in those areas, both in 
industry and in the government team. And that is why I spend so 
much time in universities and engineering schools in order to 
continue to grow that competency.
    Second is the supply chain. A lot of small companies out 
there provide very unique items that apply just to missile 
defense interceptors and missile defense systems. And we have 
to be very careful about their ability to endure a transition 
from supporting missile defense applications to having it so 
that it could apply to other commercial ventures. And a lot of 
that is a very difficult transition for them, because of the 
nature of their work.
    If we did stop one of these production lines, 
requalification is very expensive. Typically, for a line such 
as GMD, it would run on the order of over $400 million, an 
estimate assuming a line stopped for over a year before you had 
to go requalify and find new vendors. That is the second area.
    And unfortunately, this business--and I guess it is a 
strength of the United States that most of these systems 
involve almost every aerospace company in the United States, 
providing some sort of expertise or capability. So it is a far-
reaching impact that has to be carefully weighed.
    One benefit of continued testing is that it requires us to 
produce the interceptors themselves, which are the most 
difficult of these items, so that we can continue to make 
decisions and keep a warm production line until we make a final 
decision that we have enough capability, or that we have a 
capability by some other means.
    Mr. Turner. Thank you.
    And on that, Dr. McQueary, in your opinion, on the two-
stage interceptors that are proposed: it is not as if we are 
completely designing a new system; it is a modification of a 
system. And so my guess would be that, since it is a 
modification of a system, you are testing that modification and 
not having to retest everything all over again. Later this 
year, if the two-stage flight test that is planned is 
successful, would you recommend that we proceed with long-lead 
procurement?
    Dr. McQueary. We have previously said that, with a 
successful test, that we could support the idea of long-lead 
procurement for items, yes. If that is what the Congress 
decides upon.
    Mr. Turner. And that is because it is--explain to me why 
that would be the case.
    Dr. McQueary. Because there are great similarities. And we 
could show you that in detail, not easily here, but show a 
great similarity between the two-stage and three-stage 
interceptor. But nevertheless, there are changes. And 
therefore, it is important to verify that those changes did not 
introduce something anomalous in the behavior.
    In fact, we have gone on record before saying we thought we 
needed three tests, a total of three tests, in order to verify 
that the change from three-stage to two-stage was a 
satisfactory change. However, we have also said that we could, 
if someone chose to fund the operation, we would be all right. 
We would be all right in supporting the idea of going ahead 
with long-lead items, because the long-lead items, most of 
them, are usable in the three-stage vehicle in any event.
    Mr. Turner. Excellent.
    And I have just one more question, Madam Chair. Thank you 
for your tolerance of time.
    Another thing that I am concerned about, if we abandon the 
advancement in any area or the pursuit of any area, is what it 
says to our adversaries or those that are developing 
missiles.
    Gentlemen, could you comment on, do you believe our missile 
defense system has a deterrent effect, because if it does have 
a deterrent effect, then by abandoning any portion of it, we 
would lose that deterrent effect. People would see a 
vulnerability that we have, or an area in which we are 
conceding that we are not going to be seeking a defensive 
posture. Do you have thoughts as to whether the pursuit of our 
missile defense system and its deployment acts as a deterrent 
internationally?
    General O'Reilly. Well, if I may, sir, yes, I do, in two regards. One is, when you are looking at the inventories and the numbers you were talking about for ICBM defensive systems, you have to look at how many ICBMs could be launched at any one time. And you have to assume that the United States would respond some way if, in fact, ICBMs were launched against it, and the missile defense system intercepted and took out those ICBMs.
    So you have to look at what the inventory of ICBM launch points around the world is, and there are only a few of them. But there are others being built today. And so there is that operational judgment on how much we need; it is not a material developer's judgment.
    However, to the direct point of your question, I believe the most compelling way to devalue these missiles is to show that they are ineffective, because we keep intercepting them in different ways. That is a great strength of a robust test program: to keep intercepting in all the different fashions in which I believe our adversaries are looking for ways to defeat it. And testing against countermeasures and so forth, again, strengthens the deterrence. And it is welcome in our approach to testing.
    Mr. Turner. Excellent.
    Madam Chair, thank you.
    Ms. Tauscher. Thank you, Mr. Turner.
    Mr. Andrews of New Jersey.
    Mr. Andrews. Thank you, Madam Chairman.
    Ms. Tauscher. For five minutes.
    Mr. Andrews. Thank you, gentlemen, for your testimony and 
for your service to our country. I think it is very important 
that the discussion not present false choices. And the choice is not deployment or abandonment. The choice is strategic, intelligent, effective deployment versus other options. I just
want to ask anyone on the panel, has anyone here asked you to 
plan for the abandonment of the system?
    General O'Reilly. Well, as a material developer, I can tell 
you, no.
    Dr. McQueary. And as head of DOT&E, that isn't the way we 
operate. We are charged with testing the systems that the----
    Mr. Andrews. Right.
    Dr. McQueary [continuing]. Government decides to----
    Mr. Andrews. So the abandonment is sort of a metaphysical 
discussion. I would like to switch to one I think is a little 
more focused.
    In January of 2002, Secretary Rumsfeld created a new world 
in testing. And that world exempted the BMD products prior to 
Milestone C. And in the context of that world, it is 
interesting to see the result that occurs in your 2008 report, 
which we mandated, between the Aegis system and the GMD.
    The language that you used on the Aegis system: it was declared ``operationally effective and suitable.'' It got very good grades. The GMD, on the other hand--the quote is, ``GMD flight testing to date will not support a high level of confidence.'' And I understand, Doctor, what you mean by confidence; it is not a subjective term. But it will not support a high level of confidence in its limited capabilities. It is apparent the first one, on Aegis, sounds like an A. And the second one, to me, sounds like a C-minus, D-plus.
    Here is a hypothesis, Dr. McQueary, I would like you to 
consider. The main difference in the pre-Milestone C testing 
and activities between Aegis and GMD was that the testing that 
was done for Aegis looks an awful lot like the traditional 
testing that would have happened anyway without the exemption, 
the legacy testing, legacy documents; whereas the testing of 
the GMD looks very different. It looks like something that did 
not go through the same degree of rigor and scrutiny. Would you 
agree or disagree with that hypothesis?
    Dr. McQueary. I don't think I am qualified to be able to 
comment upon what you have just said, because I have not looked 
at that program from that standpoint. The GMD is clearly at an 
earlier stage of development than what the Aegis is.
    Mr. Andrews. But it is correct, isn't it, that the testing protocol to date on the GMD has not followed the orthodox, traditional path that other systems have followed. Isn't that true?
    Dr. McQueary. That is----
    Mr. Andrews. It is very different from the Aegis testing.
    Dr. McQueary. The entire program was put together that way. 
That is correct.
    Mr. Andrews. Would you characterize the testing as less 
robust than the Aegis testing?
    Dr. McQueary. I wouldn't characterize it as less robust. I would characterize it as being at a much earlier stage of development than the Aegis. And I think one of the things that we bear responsibility to do is to assure that the GMD is tested to a sufficient degree.
    Mr. Andrews. But General, there is more than just a time 
differential. Isn't there? Isn't there a qualitative 
differential in the nature of the tests that have been done on 
Aegis versus the GMD? Don't the Aegis tests look an awful lot more like the pre-Rumsfeld rule?
    General O'Reilly. One of the purposes, sir, of the test 
review that I have conducted since I came on board was in fact 
to lay out all the data that is required to be collected. My 
predecessors have shown the planning for the next two years, in two-year increments.
    Mr. Andrews. Yes.
    General O'Reilly. And it is more traditional, because what 
we are approaching now is to lay out the entire test program, 
so you can identify what is going to be tested and when, rather 
than having to assume that something is going to be tested.
    Mr. Andrews. And that looks an awful lot like, doesn't it, 
the DOT&E process that would have been followed? Doesn't it 
sort of echo that?
    General O'Reilly. As far as from a planning point of view, 
yes, sir.
    Mr. Andrews. Yes.
    General O'Reilly. Because in the DOT&E----
    Mr. Andrews. Kind of wish we had those seven years back in 
the billions and billions of dollars we spent since then? You 
don't have to answer that question. I do.
    On your testimony, you list a whole laundry list of things 
that are going to have to be done for the GMD, both in the 
Critical Engagement Conditions (CEC) and Empirical Measurement 
Events (EME) categories. If I read correctly, there are nine 
Critical Engagement Conditions and six Empirical Measurement 
Events that have to take place.
    And you list a whole laundry list of key factors that have 
to be looked at: solar and lunar backgrounds, low intercept 
altitudes, timing between salvo launches, long times of flight, 
high closing velocities for ICBM-class targets, correcting for 
varying booster burnout velocities, and responding to 
countermeasures. That is a pretty significant list.
    How long would it take to do all those things to your--
degree of confidence that we need? How long would it take?
    Ms. Tauscher. Gentleman's time has expired.
    But General O'Reilly, you can finish that answer.
    General O'Reilly. Sir, honestly, that is what I am working on right now. We have identified what we need. And what you have asked is exactly what we are doing over the next couple of months with these two agencies, and us----
    Mr. Andrews. With that, thank you. Let me just add one thing, Madam Chair. That is an answer I think the committee would really like to hear before this year's bill is written, because when we make funding priority decisions, I think it makes a big difference as to what we do.
    General O'Reilly. And, sir, our plan is to complete this 
plan by May.
    Mr. Andrews. Thank you very much.
    Ms. Tauscher. Thank you, Mr. Andrews.
    Mr. Franks of Arizona for five minutes.
    Mr. Franks. Well thank you, Madam Chair.
    Madam Chair, our ranking member here asked questions that got at what my own questions were aiming for. And I thought he did
an excellent job. So at the risk of sounding a little redundant 
here, let me just try to put some context by focusing again on 
the threat.
    And again, at the risk of being redundant, on July 4 of 
2006, North Korea tested an ICBM and our GMD system was put on 
alert. And now, of course, there have been reports that North 
Korea may be testing an advanced Taepodong-2.
    And General Cartwright, Commander of Strategic Command, said that the July 4, 2006 North Korean missile launch spurred a limited operational activation of the Ballistic Missile Defense System: ``We learned that the Ballistic Missile Defense System, procedures, and personnel performed well, and demonstrated a credible operational missile defense capability for homeland defense.''
    And I think that is a pretty profound statement on the part 
of General Cartwright. And so my first question, General 
O'Reilly, is do you share General Cartwright's level of 
confidence? Are you confident that this capability that we have 
today, that we have today, can provide a defense to the 
American people from the current North Korean threat?
    General O'Reilly. Yes, sir, based on the scenarios that we 
have tested three times, although it is limited and it is in 
the beginning, those scenarios overlay a launch from North 
Korea and a response out of Alaska. And so we have tested three 
times that scenario first for obvious reasons. And that is the 
source of my confidence.
    Second of all, our firing doctrine is that we have a significant number of interceptors, so we can put a significant number of them in the air at once. And each additional shot significantly increases the overall probability that you are going to be successful.
    Mr. Franks. So let me ask the blooming obvious question 
here. Forgive me. Do we have a system that is more mature than 
GMD to defend against the current ICBM threat? And what are the 
implications of delaying GMD production and fielding?
    General O'Reilly. Well, sir, we do have a more mature 
system now than we did then, particularly in our redundancy. 
And we have multiple redundant capabilities throughout the 
system now. And we have more interceptors. And we have learned 
in flight.
    Mr. Franks. Beyond GMD, General. I mean, do we have 
something beyond GMD that is a more mature system to defend us 
against ICBM threats currently?
    General O'Reilly. No, sir. That is the only system that has 
been tested against threats of 4,000 kilometers or greater.
    Mr. Franks. Well, thank you.
    Now, you know, I think it is clear that, even if our systems haven't given us 100 percent assurance through testing--and of course I believe in testing as much as any of you do, and I know that you would like to have more capability to do additional testing--it is the only system that we have for this particular ICBM threat at the moment.
    And to cut or delay funding and fielding, in my judgment, 
Madam Chair, would send a tremendously dangerous message to the 
North Koreans, not just in terms of the actual ICBM threat to 
us, but I think it encourages them and other nations across the world, Iran and others, to develop nuclear programs whose technology could potentially be passed along to terrorists. And I think the coincidence of jihadist
terrorism and nuclear proliferation represents one of the 
greatest dangers facing us as a free people today.
    And so, I am going to run out of time it looks like here. 
But I will try one more here with Dr. McQueary. As a current test evaluator, would you say that, just because the strategic BMD received a less-than-perfect test score, this necessarily means that it does not provide the warfighter with an operationally effective capability?
    Dr. McQueary. It does not provide the warfighter with an 
operationally effective----
    Mr. Franks. So let me ask----
    Dr. McQueary [continuing]. Capability that I can say with 
high confidence. I think it is important. Our job is to test and to deal with the facts at hand. And there has simply not been enough testing done in order to be able to state----
    Mr. Franks. But the less-than-perfect score does not 
necessarily mean that it does not provide the warfighter with 
an operationally effective capability.
    Dr. McQueary. The statement is not intended to imply that 
at all.
    Mr. Franks. Right.
    Well, Madam Chair, I just think it is important that, you 
know, I don't know of any system that we have that is proven 
100 percent effective. I am not even sure that we could say 
that about the baseball bat. But it is still pretty effective 
at close range. And so I just hope that, in the face of not being able to test all that we could and all that you gentlemen would like to test--and certainly I think we have a responsibility to facilitate that--we don't do things that would endanger our national security.
    And so with that, Madam Chair, I don't know why that yellow 
light has been on so long. But I will yield back. Thank you.
    Oh, the light is stuck. I should have taken advantage of 
that.
    Ms. Tauscher. This is not a baseball bat, but it is a 
gavel.
    Mr. Heinrich.
    Mr. Franks. Has it been tested?
    Ms. Tauscher. Mr. Heinrich of New Mexico.
    Mr. Heinrich. Thank you, Madam Chair.
    I have a couple of questions for Lieutenant General 
O'Reilly. And one is more general and one is more specific. But 
it goes back to some of the same issues that Mr. Franks has 
raised. And I will start with the more general one. And I would 
just ask, do we have any hard evidence to show that our missile 
defenses have actually deterred North Korea or Iran from deploying these missiles?
    General O'Reilly. Sir, I am not in a position to answer 
that. I think that might be more of an intel question.
    Mr. Heinrich. Okay.
    General O'Reilly. But I don't know, sir.
    Mr. Heinrich. Well, to return to the more specific issue of the GMD system, if I understand correctly, Taepodong-2 is a
liquid-fuel system, is that correct?
    General O'Reilly. Yes, sir.
    Mr. Heinrich. And most of our tests are against solid-fuel 
targets? Is there a qualitative difference in, you know, 
testing against a solid-fuel target versus a liquid-fuel 
target? And is that something that is relevant to future 
testing? Do we need to be testing something that is more 
analogous to, you know, basically to the Taepodong-2?
    General O'Reilly. Well, sir, we test both threat categories. As I said earlier in my discussion, we frequently test actual threat missiles, which are the liquid-fuel ones. Most of our liquid targets are actual missiles which we have obtained. And we have tested them off the coast. I mean, we really want to ensure we have the confidence that Dr. McQueary talks about.
    Against our longer-range threats, we have the challenge 
that these targets are almost ICBMs themselves. And so we rely 
on our fleet of ICBMs in a lot of cases, which are mainly 
solids.
    Mr. Heinrich. Right.
    General O'Reilly. But when you qualitatively compare 
between the two, the solid actually presents much more 
difficulty in intercepting, because as it burns, it actually 
produces chunks, if you will, of solid material that is 
burning. And so it clutters the scene.
    Mr. Heinrich. Which could be mistaken for--okay.
    General O'Reilly. Yes, sir. And when a kill vehicle looks 
at a liquid, it sees primarily the objects or the debris and 
the hard objects, especially if you are looking with an 
infrared camera, which most of our systems have.
    But when you are looking at a solid system, you are seeing all this other material. So it actually makes the scene more complex and, at times, more difficult for an intercept to occur.
    Mr. Heinrich. I yield back.
    Ms. Tauscher. Gentleman yields back his time.
    Mr. Lamborn of Colorado.
    Mr. Lamborn. Thank you.
    Dr. McQueary, on Ground-based Midcourse Defense, the analysis indicates the U.S. is more capable of defending itself against a single long-range threat--even though we haven't reached the level of high----
    Dr. McQueary. It is more capable, but I can't let you walk 
me into some kind of numerical amount, because we had no 
capability before. We now have capability, and we demonstrated it.
    Ms. Tauscher. Majority will stipulate that we are more 
capable than we were before we had nothing.
    Dr. McQueary. Seems like a logical way----
    Ms. Tauscher. I think so too.
    Mr. Lamborn. Can you put high confidence into a percentage 
count? Are you able to----
    Dr. McQueary. Well, we typically speak----
    Mr. Lamborn [continuing]. Is not a subjective term.
    Dr. McQueary. It is not a subjective term. But typically 
when we talk about performance of systems, we will be in the 
range of having reliabilities that might be in the range of 80 
percent to 90 percent reliability for missile systems, if you 
look at that.
    And then we talk about having a confidence generally in the 70 percent to 90 percent range as being in the high realm. And the higher the number--when you go through and examine the test data and look at various test scenarios, using modeling and simulation again--the more confidence you have in how the system will work. It is pretty straightforward.
    Mr. Lamborn. Okay, so say you have a missile interceptor that is not at the high-confidence performance level--let us say it is 70 percent. You fire that at an incoming threat, and it misses. And you fire another one, and that also has a 70 percent chance. When you put those two together, aren't you left with 91 percent?
    Dr. McQueary. That, I wouldn't attempt to do the arithmetic 
in my head. But that doesn't sound too far off. Yes, the----
    Mr. Lamborn. Or a third one would then decrease the----
    Dr. McQueary. In fact, that is why the doctrine calls for firing multiple missiles--to accomplish just what you are talking about. Yes.
    Mr. Lamborn. Okay, thank you.
    Dr. McQueary. But it can get to be an expensive proposition 
if the missiles are very expensive. That is----
    Mr. Lamborn. That is right. Of course, the alternative is 
expensive also.
    Dr. McQueary. This is true.
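    [Illustrative note: The arithmetic behind the exchange above can be sketched directly. This is an illustration only, assuming independent shots and the hypothetical 70 percent single-shot figure used by Mr. Lamborn; it is not a statement of actual firing doctrine or system performance.

        def salvo_success_probability(single_shot_pk, shots):
            # Probability that at least one of `shots` independent attempts,
            # each with the same single-shot probability of kill, succeeds.
            return 1.0 - (1.0 - single_shot_pk) ** shots

        for shots in (1, 2, 3):
            print(shots, round(salvo_success_probability(0.70, shots), 2))
        # Prints 0.7, 0.91, 0.97 -- the two-shot case is the 91 percent
        # figure cited by Mr. Lamborn.

    The same formula is why firing additional interceptors at a single threat raises the overall engagement probability, at the cost of drawing down the inventory more quickly.]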
    Mr. Lamborn. And also, I would like to shift gears to ask 
this: where do we stand with regard to the validation, 
verification--excuse me, I am going to shift to Lieutenant 
General O'Reilly at this point--where do we stand with regard 
to the validation, verification, and accreditation of the BMDS 
element level model? And when do we hope to be in a position to 
validate, verify, and accredit the element level models?
    General O'Reilly. Do you want to start?
    General Nadeau. Okay. I would start by telling you that the Operational Test Agency lead and the other services are huge fans of this three-phase test review that the Missile Defense Agency is undergoing now, which, when completed, will allow us to answer that question very, very specifically--as we were discussing, to break out of the two-year window into the entire program.
    We have all collectively put data, inputs, and work into the Phase 1 piece: identifying the number of ground and flight tests necessary to get to the point where you have the ability to validate and verify (V&V) the models and then accredit them, then start using those models for the greater good, and then also adding the flight tests that will be required to do things you can't pick up in the models.
    It will end up being a very good process. It has been a 
very good process to this date. And I applaud, in spades, the 
effort to get down this road, because from the test agency's 
perspective, this is exactly the right thing to do.
    Mr. Lamborn. Anything to add to that?
    General O'Reilly. No, sir.
    Well, one result that we actually found as we went through 
Phase 1 is there are other areas in our infrastructure and our 
modeling that need improvement and need investment. It is not 
just the testing. It is also the modeling of our threats from areas such as NASIC and others. We need investment in that area, so we can have more precise digital models of the threats, and in the hardware-in-the-loop and other infrastructure.
    So we have learned a lot from this process, I would say, 
and not just the accreditation of the models, but the entire 
infrastructure to give us the confidence that we do have the 
results that the warfighters and the combatant commanders can 
use to make a judgment on the capabilities and limitations of 
the system.
    And Dr. McQueary, you may want to address that.
    Dr. McQueary. I am with you.
    Mr. Lamborn. I thank you all for your answers.
    Ms. Tauscher. Gentleman's time has expired.
    The gentleman from South Carolina, Mr. Spratt, for five 
minutes.
    Mr. Spratt. Thank you very much, indeed.
    And General O'Reilly, we are glad to see you in the position you are in, sitting at the table, and we are grateful to have your service in this connection.
    When Mr. Reagan, in 1983, announced the Strategic Defense 
Initiative, he said its object was to ``render nuclear weapons 
impotent and obsolete.'' I think we would have to agree that we 
are a long way from that goal, are we not?
    Let the record show everyone nodded his compliance. 
[Laughter.]
    General O'Reilly. Yes, sir.
    Mr. Spratt. I am not trying to diminish what you have done. 
I am simply trying to emphasize the great difficulty of 
achieving the goal that the President set down when he launched 
this.
    General O'Reilly, you said that we need this testing, 
because among other things, you did not want to place complete 
reliance, confidence, in simulators, which have inherent 
limitations. What are the doubts or concerns you are trying to 
dispel as you undertake this testing regimen?
    General O'Reilly. Well, sir, first of all, the simulations 
do have limitations in the fact that most of our flight-test 
failures that have occurred in this agency were due to quality 
control. And quality control is not revealed through 
simulations. You need to be doing actual testing in order to 
get that confidence and quality control testing on the ground. 
So that is one thing we are addressing is a comprehensive set 
of testing--not just flight testing, but the ground testing.
    And second of all, sir, it is associated with your opening 
comment, we are often referred to as a shield. We are not 
developing a shield. We are developing more of a multi-layered 
net, I think is a much better analogy. It puts a lot of 
uncertainty into the adversary. Is he going to be successful 
with attacking? But the best we can do is get the probability 
of engagements very high. But it is not an absolute shield.
    And so we need to be addressing, in our testing program, 
multiple systems working together to, in fact, show that 
something ``doesn't fall between the seams,'' between GMD or 
Aegis, or Aegis or THAAD, or THAAD or Patriot. And so that is 
another major area we must address is, how do they work 
together? And it requires a combination, because of the 
expense, of models and simulation. But those actual flight 
tests are very critical.
    Mr. Spratt. At the same time that Mr. Reagan made his speech, Paul Nitze, I believe--and he is quoted in Mr. Coyle's testimony--laid down three ground rules for judging success, the last of which was that missile defense should be cost-effective at the margin, so that the cost of deploying one additional defensive system would be less than the cost of an offensive system that might overcome it.
    Do you think you have achieved that criterion, so that the 
cost of defense is less than the cost of offense?
    General O'Reilly. Sir----
    Mr. Spratt. Yes, sir.
    General O'Reilly. I will say that the cost of our interceptors is much higher than the cost of the threat missiles which we see. In fact, we are often surprised at how those missiles are built, and what it takes to produce a missile that could threaten not only your contiguous neighbor, but a region. There are 1,000 more missiles in the 19 countries other than the United States, Russia, and China than there were just 5 years ago.
    So they are much less expensive than our interceptors. But taking into account the area you are trying to protect and the cities you are trying to protect, I believe that might change the calculus some.
    Mr. Spratt. Mr. Coyle, noting the flight test record, says that MDA, over the past five years, has conducted just two successful GMD flight-intercept tests. MDA still must carry out
successfully about 20 more, perhaps 25, flight-intercept tests 
of different types before the system can fully demonstrate 
effectiveness in realistic operational tests. Would you agree 
with that statement? Either one? Anyone.
    Dr. McQueary. I don't know off the top of my head whether 
it is 25, or whether it is 30, or whether it is 15.
    Mr. Spratt. It is in that range.
    Dr. McQueary. I think the key element is working with 
General O'Reilly on the path he is on, on this three-phase 
program in order to ascertain what tests have to be conducted. 
And from that, we can count, at that point, and then have data 
to look at. But I don't know how to answer the question then, 
as you have posed it.
    Mr. Spratt. But it is in the range of 20 to 25 additional 
tests?
    Dr. McQueary. Well, if you want it in a simple mathematical 
sense, if you wanted to have a 90 percent probability of 
something working and have a 90 percent confidence that it 
would be what you want, we would have to run 28 successive 
tests in order to demonstrate--28 successive tests that are 
identical in nature--in order to prove that confidence level. 
So that may be the origin of the comment.
    But I believe Mr. Coyle is going to be on later. So he will 
be able to answer the question himself.
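    [Illustrative note: One conventional basis for a figure of this kind is the classical ``success run'' (zero-failure) demonstration calculation sketched below. The exact count depends on the statistical assumptions--whether any failures are tolerated, how confidence is defined, whether the tests are identical--so this sketch is illustrative only and does not necessarily reproduce the 28-test figure quoted in testimony.

        import math

        def zero_failure_tests(reliability, confidence):
            # Smallest number of consecutive successful, identical tests that
            # demonstrates `reliability` at `confidence` when no failures are
            # allowed: the smallest n with reliability**n <= 1 - confidence.
            return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

        print(zero_failure_tests(0.90, 0.90))   # 22 under this particular model
    ]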
    Ms. Tauscher. The gentleman's time has expired.
    Gentlewoman from California, Ms. Sanchez, for five minutes.
    Ms. Sanchez. Thank you, Madam Chair. It is a pleasure to be 
back on the subcommittee after taking a couple years off. 
Obviously I came back because I think this is an incredibly 
important issue for our Nation. There are lots of threats out 
there. As you know, I serve on Homeland Security and, in particular, chair the subcommittee that handles global counterterrorism for our Nation.
    So I just want to put to the record that, you know, it is 
always an issue about scarce resources. We are in a time, 
especially today, of scarce resources. There are a lot of 
demands here in Washington of what we are going to do with 
money that, quite frankly, most of us know we don't even have.
    And so, you know, I just want to put for the record that I 
never have believed that this was an issue of wanting to stop 
or not having the capability that many of us think at some 
point we do need. But it is about how do we get there? And how 
do we most effectively get there?
    So I am very, actually, very proud of the gentlewoman from 
California, my friend, who has now chaired this for a couple of 
years. And I think you have been doing a great job of talking about how we reassess what we have out there, because we know
that we put billions into this long-range missile defense 
system. And the confidence, I think, is really not there that 
it will stop something from coming in to our shores or other 
places that we want to protect.
    My question today is to General Nadeau. You described our limited capability against a launch from Northeast Asia. It seems to me, even in recent days, we have taken a look at what North Korea may or may not be doing--but most likely may be doing--in continuing to expand its missile capability. How does the Department of Defense address our limited capability at this point, when we are looking at what North Korea may be doing, for example?
    General Nadeau. Yes, ma'am. Thank you for asking that 
question. The limited nature, as we have described that 
capability, is confined only to the fact that the flight tests 
that have been run have not been over an expanded series of 
scenarios. So against a narrow set, you end up with the 
assessment of limited capability.
    And so through, whether it is continued flight testing or 
use of modeling and simulation, when you can expand the 
envelope, and you see more, limited turns into a different 
assessment of capability. So again, limited nature, in its most 
simplistic form, is only because we have looked at it only in a 
very finite window so far.
    Ms. Sanchez. So if I am getting this correct--I mean, I 
used to work for Booz Allen & Hamilton. And we used to 
calibrate, you know, we used to put together our algorithms and 
figure out what we thought would happen. And then we would 
calibrate it with some real tests.
    Sometimes we couldn't calibrate everything out. Maybe it is 
sort of like trying to land a person on the moon. You do the 
algorithm. But until you really land them, you really don't 
know whether it is all going to work out. But you try to do or 
simulate as much as you can those critical pieces, especially 
keeping your people alive, or what have you.
    So what I am understanding from you is, we can't account, in a real test manner, for every single possibility to give us 100 percent confidence that we are going to test everything that may come at us. But we do that through calculation, through algorithms, through modeling. And then, where we can, in specific areas, we do these other tests to get a better calibration of whether our system will work, or how to tweak the system so it works to get most of what is coming at us, or what we believe will come at us. Is that correct?
    General Nadeau. Yes, ma'am. I agree with that and would add 
a couple of points. You get through modeling and simulation the 
ability to more quickly go into different test scenarios and 
most certainly more economically, because of the cost involved. 
And if there is a belief, in discussion with the Missile 
Defense Agency or DOT&E, that we need to take a look at a test 
scenario that we do not have confidence in the model's ability 
to do that assessment, then the dialogue turns to an actual 
flight.
    Ms. Sanchez. So it is not that we have stopped funding this particular area, because a lot of us, I think, are interested in it and believe in it. It is that some--and I don't necessarily put myself in that category--many have said that the systems that we have built, what we actually have on hand, may be more of a facade; that we don't have the confidence that it can take out whatever may come at us. We don't know yet what they really have and how it will come at us.
    But I think most of us are just interested, or at least 
this side, under this chairwoman that I have seen, are 
interested in continuing to test and continuing to figure out 
how we make this system really work for what may come at us.
    Is that sort of putting words in your mouth, Chairwoman?
    Ms. Tauscher. I think those are words that came out of my 
mouth.
    Ms. Sanchez. So, in your opinion, how concerned should we be about what North Korea is doing, given what we currently have--given that we are not continuing to build the same system all over the place, which we think isn't necessarily handling the job, but rather want to test and improve and really build something, or add on to what we already have, that would really stop what may be happening?
    Ms. Tauscher. The gentlewoman's time has expired. But you 
can answer the question.
    General Nadeau. From the test community's perspective, one 
of the variables to consider is not concern over the 
performance of a potential adversary, if I can state it that 
way. And so where I turn our attention from the test 
perspective is to provide as much information to General 
O'Reilly and his team to be helpful to them to advance the 
performance confidence in the system.
    One of the luxuries of a test operation like mine is that I am not pressured by cost, performance, and schedule, because I can look back and deal only with the facts and not get concerned about the shortness of schedule or perhaps the absence of other resources.
    So our function is to be that independent voice to General 
O'Reilly to help him and his agency make the right decisions 
and help alleviate, perhaps, either some of those concerns, or 
the terminology from our world is risk, ma'am.
    Ms. Sanchez. Thank you.
    Ms. Tauscher. Thank you very much, gentlemen. We would like 
to get on to our second panel. We very much appreciate your 
appearance before the subcommittee. We know that we will see 
you again.
    General O'Reilly, we are very much planning to get you back, once the results of your test review are ready, to have some testimony from you. So we look forward to doing that later in the spring.
    If we could take a 60-second recess and change out our 
panels, I would appreciate it. Thank you.
    We are honored to have our second panel with us today. We 
have experts and academics. And I think we are going to begin 
with Mr. Coyle. Just for the record, I want to state that Mr. 
Coyle and his wife are former constituents of mine and old 
friends.
    But I am glad to have you back.
    And all of your testimony has been submitted for the 
record. If you could summarize in five minutes or less, I would 
appreciate that. We are expecting some votes again. So we will 
have to recess when those votes are called. But we would like 
to get through as much as we can on the panel as possible.
    So Mr. Coyle, welcome back again to the committee. Thank 
you for your service. And the floor is yours.

 STATEMENT OF HON. DR. PHILIP E. COYLE, III, FORMER DIRECTOR, 
  OPERATIONAL TEST AND EVALUATION, U.S. DEPARTMENT OF DEFENSE

    Dr. Coyle. Thank you very much, Madam Chairman. I appreciate the opportunity to be before you today to support your examination of the future of missile defense
testing. Ranking Member Turner, I appreciate being here with 
you also.
    In my view, there is a troublesome lack of clarity in 
public discourse regarding both the rationale for, and the 
technical progress toward, an effective U.S. missile defense 
network. Quite simply, the public statements made by Pentagon 
officials and contractors have often been at variance with the 
facts at hand. I am referring to the past, not under General 
O'Reilly.
    It is difficult to separate a programmatic spin from 
genuine progress. In particular in the past, the program has 
made claims that have not been demonstrated through realistic 
testing.
    In my prepared testimony I outline the steps that I believe 
the Missile Defense Agency must take. These include, first, tests to establish operational criteria, such as how good the system is--you had questions earlier about that, and, as you saw, they could not be answered; second, tests to demonstrate that the system can withstand attacks involving multiple missiles, not just one or two; third, testing to demonstrate that the system can be operationally effective in the presence of realistic decoys and countermeasures; and fourth, tests to eliminate the gaps from past flight intercept tests, including years of kicking the can down the road on basic operational questions, like can the system work at night, in bad weather, or in likely battlefield conditions?
    In my prepared testimony, I make an analogy about the different scientific and technical issues that a program can face, using an imaginary Pentagon program to demonstrate human flight. And I am not trying to be funny there. The Missile Defense Agency faces many very daunting scientific and technical problems, and they have not been addressing those questions. It appears that Lieutenant General O'Reilly is beginning the process to do that.
    Our military often observes that the enemy has a vote. In 
missile defense, this means that if the enemy is bound and 
determined to attack us, they will do whatever they can to 
overwhelm and confound our missile defenses. This means that 
the enemy may launch many missiles, not just one or two; may 
make their warheads stealthy and hard to detect and track; and 
may use decoys and other types of countermeasures to fool or 
confuse our defenses.
    They may attack us at night or in bad weather, or may use 
electronic jamming or stealth. Recently the White House said 
about National Missile Defense, the ground-based system as it 
is called now, ``The Obama-Biden Administration will support 
missile defense, but ensure that it is developed in a way that 
is pragmatic and cost-effective and, most importantly, does not 
divert resources from other national security priorities until 
we are positive the technology will protect the American 
public.''
    How will the Administration and Congress be positive that 
missile defense will protect the American public? It is going 
to take testing far beyond what we have seen to date.
    The easiest ways for an enemy to overwhelm our defenses are 
to, number one, build more missiles, more offensive missiles, 
to attack us; number two, use decoys and countermeasures to 
fool the defenses; and three, attack us in ways that our 
missile defenses are not designed to handle, such as with 
cruise missiles, or through terrorism or insurgency.
    The Missile Defense Agency does not have a charter to 
counter terrorism. But it is responsible to address the ways 
that an adversary might try to overcome or fool our missile 
defenses. The testing program must put those issues front and 
center. But that has not been happening.
    My perspective on the threat may be different from yours. 
In my view, Iran is not so suicidal as to attack Europe or the 
United States with missiles. To me, it is not credible that 
Iran would be so reckless as to attack Europe, or the United 
States for that matter, with a single missile, and also by the 
way, with no decoys or countermeasures to fool us, and then sit 
back and wait for a massive retaliation. As we know, ballistic 
missiles have return addresses.
    I don't believe that the launch of a small satellite by 
Iran earlier this month changes this situation.
    But if you believe that Iran is bound and determined to 
attack Europe or America, no matter what, then I think you also 
have to assume that Iran would do whatever it takes to 
overwhelm our missile defenses, including using decoys to fool 
the defenses, launching stealthy warheads, and launching many 
missiles, not just one or two.
    The Missile Defense Agency admits it can't handle that 
situation today. And accordingly, their testing program must 
begin to address these challenges soon.
    Ms. Tauscher. Mr. Coyle, can I ask you to sum up? We have 
about 10 minutes before we have another series of votes. And I 
would like to get the other two witnesses to give their 
statements.
    Dr. Coyle. Certainly, I can stop right there.
    [The prepared statement of Dr. Coyle can be found in the 
Appendix on page 82.]
    Ms. Tauscher. Thank you, sir.
    Mr. Francis.

    STATEMENT OF PAUL L. FRANCIS, DIRECTOR, ACQUISITION AND 
   SOURCING MANAGEMENT, U.S. GOVERNMENT ACCOUNTABILITY OFFICE

    Mr. Francis. Thank you, Madam Chair.
    Mr. Turner and members of the subcommittee, I appreciate 
your having me here to participate in the discussion of missile 
defense testing.
    Ms. Tauscher. Thank you.
    Mr. Francis. I will attempt to answer the three questions 
that I got in my invitation letter as well. The first was what 
are conclusions from our annual report that will be issued next 
month on missile defense. I would first like to recognize the 
uniqueness of missile defense testing and the challenges it 
faces in complexity, cost, safety, the fact that development 
and operational testing has to be combined, and the fact that 
modeling and simulation is really important for this program. 
So it makes every test event really important.
    Now, during fiscal year 2008, which we looked at in our review, we found that, while there were many things done well, the testing itself--particularly flight testing--was less productive than planned. None of the missile defense
elements conducted all the testing they had planned. And only 
one achieved all its key objectives. In a number of cases, 
tests were either cancelled, deferred or achieved less than 
planned. And this was particularly true for the GMD element.
    Targets have been a persistent problem across all the 
elements that are flight testing. There are a number of 
consequences, in my opinion, associated with less productive 
testing. One of those does relate to anchoring models and 
simulations, which are absolutely key for this program. And 
there was a question earlier about how many models there were.
    There are 40 models; about six of them are fully accredited, and nine have been partly accredited. That leaves 25 to be done yet before you can assess the full performance of the system. And I don't believe that will be done until 2011. So there is quite a bit of work to do there, and quite a bit of understanding yet to be gained about the fielded systems' performance against countermeasures.
    Another consequence that we have observed is that production and fielding are beginning to get ahead of testing, so that some assets are being produced and fielded before they are tested.
And finally, declarations of capability--that is when you say 
an asset is ready to be operational--some of those have been 
postponed. And some declarations have been made on the basis of 
less information than planned.
    The second question you had asked me to address was our views on the three-phase review that General O'Reilly is conducting. We think that is something that is needed, and we
welcome it. And I think General O'Reilly's experience as the 
THAAD program manager is especially relevant in this review, 
because he has kind of been through this before.
    We have only gotten initial briefings on it, but I like the 
overall approach. I think identifying those critical variables 
that are going to be in the models and simulations and cross-
walking those to testing, I think that is important and should 
close some of the gaps that we see today between modeling, 
simulation, and testing. I think the involvement of the test 
community has been very important.
    That third phase is going to be really critical. That is 
where General O'Reilly will address resourcing, the flight test 
program, and the ground test program with assets. And that gets 
to the third question that you had asked me to address: what actions do we think missile defense should take in this new approach? And I think there are five.
    One is continued involvement of the testers in the process. 
The second is the test program that emerges has to be robust in 
terms of targets, test assets, allowing time to analyze after a 
test and do post-flight reconstruction. And I think that is 
really important.
    The third thing is the fiscal year 2009 test plan is very 
ambitious now, because a lot of the fiscal year 2008 testing 
has been pushed into it. So I think that has to be looked at to 
see if it is still rational and achievable.
    The fourth thing is synchronizing, or re-synchronizing, I 
would say, production and fielding decisions with modeling and 
testing information, so that the modeling and testing comes 
before the production and fielding. And the last thing is I 
think it will take about two years for the new test plans to be 
fully implemented. So we are looking at 2010, 2011.
    So the MDA will be in a transitional period. I think that 
is going to be a time for careful management and some prudent 
decisions about production and fielding while we are waiting 
for a really sound test plan to emerge. So----
    Ms. Tauscher. Thank you, Mr. Francis.
    Mr. Francis. I am ready for any questions.
    [The prepared statement of Mr. Francis can be found in the 
Appendix on page 113.]
    Ms. Tauscher. I appreciate that very much.
    Mr. Mitchell.

 STATEMENT OF DONALD C. MITCHELL, CHIEF ENGINEER FOR BALLISTIC 
 MISSILE DEFENSE, AIR AND MISSILE DEFENSE SYSTEMS DEPARTMENT, 
      APPLIED PHYSICS LABORATORY, JOHNS HOPKINS UNIVERSITY

    Mr. Mitchell. Thank you, Chairwoman Tauscher, Ranking 
Member Turner, distinguished members of the subcommittee. Thank 
you for the privilege of appearing before you today on the 
topic of Ballistic Missile Defense.
    I have served the Missile Defense Agency since 2005, first 
as a member of the Mission Readiness Task Force, and now as 
Director for Readiness Assessment. In those assignments, I 
worked closely with the test and evaluation communities of GMD, 
Aegis BMD, and THAAD as they prepared for firing exercises in 
order to develop an independent assessment of their readiness 
to conduct those missions. The written testimony that I 
provided to the subcommittee is based upon that experience 
these last four years.
    The firing histories for those three elements indicate that 
there is a military capability against simple separating 
targets, and that upcoming flight tests will demonstrate a 
capability against more challenging threats. Though important, 
flight tests are not sufficient to accurately understand the 
operational effectiveness and operational suitability of a 
system.
    A test and evaluation plan that integrates the results from 
flight tests, ground tests, and high-fidelity models and 
simulations is required to understand the effectiveness and 
suitability of the BMDS. High-fidelity models and simulations 
are used first to predict the outcome of a flight test under 
various conditions, and second, to replicate the outcome of the 
flight test using the conditions as experienced during the 
mission.
    This technique, called anchoring, is part of the 
verification, validation, and accreditation (VV&A) for the 
models and simulations that allows one to believe the 
predictions  produced by them. Models and simulations that are 
VV&A-ed can be used to produce a believable, statistically 
significant, cost-effective estimate of the effectiveness of 
the system.
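    [Illustrative note: The anchoring idea Mr. Mitchell describes--predict, fly, then reconcile the model with what the flight actually measured--can be illustrated with a toy sketch. Every function name and number below is a hypothetical placeholder standing in for the far higher-fidelity models and telemetry MDA actually uses.

        def predicted_miss_distance_m(closing_velocity_mps, bias_m=0.0):
            # Notional pre-flight model of miss distance for a given scenario.
            return 0.002 * closing_velocity_mps + bias_m

        flown_velocity_mps = 7000.0    # scenario actually flown (hypothetical)
        pre_flight = predicted_miss_distance_m(flown_velocity_mps)

        measured_miss_m = 12.5         # post-flight reconstruction (hypothetical)

        # "Anchor" the model: choose the bias that reproduces the flight result.
        bias_m = measured_miss_m - pre_flight

        # The anchored model is then used to predict scenarios not flown.
        for velocity in (5000.0, 7000.0, 9000.0):
            print(velocity, predicted_miss_distance_m(velocity, bias_m))
    ]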
    Ground tests can be used to demonstrate the operational 
suitability by showing the deftness with which the elements of 
the BMDS interact with one another. Thoughtful planning can 
produce complementary results from flight tests and ground 
tests. By emphasizing suitability in ground tests, the simplest 
set of flight tests can be defined with which to anchor the 
models and simulations. This approach provides flexibility in 
making fielding decisions of the BMDS.
    MDA has embarked on a three-phase effort to define a set of flight tests that will anchor the high-fidelity models in use in MDA, and ground tests that will demonstrate the suitability. An
oft-asked question is, how many flight tests are necessary to 
demonstrate that a system is effective? That question is now 
properly reframed as how many flight tests are necessary to 
anchor high-fidelity models?
    The answer to that question is being developed using the 
critical engagement conditions and empirical measurement 
events. From this review, MDA will know what portions of the 
models can be anchored by measurements on the ground, and what 
portions should be anchored in flight. I look forward to a 
conclusion that presents the Director, Lieutenant General 
O'Reilly, with an efficient plan that demonstrates 
effectiveness and suitability of the BMDS.
    I would like to make a brief comment on GMD, if I may. That 
program has made significant strides in improving its test discipline and has adopted a quality improvement program that will bear fruit in the future. I respectfully request that the
subcommittee continue to support GMD in these efforts.
    I welcome the opportunity to speak with you today and will 
be pleased to answer any questions that you have.
    [The prepared statement of Mr. Mitchell can be found in the 
Appendix on page 128.]
    Ms. Tauscher. Thank you, Mr. Mitchell.
    We are about to be called for votes, but I am going to 
actually yield my time to Mr. Larsen for five minutes.
    Mr. Larsen. Thank you.
    Perhaps Mr. Francis can answer a question that I wasn't 
able to ask. Sorry I wasn't here for the whole panel. But it is 
still relevant to the panelists here, because it is a question 
that I will explore with Dr. McQueary as well.
    But from the GAO's perspective, consider the idea that General O'Reilly lays out a--call it a lifetime testing plan, or just a longer-term testing plan for us--and then does not conduct full tests each time, in order to save money on a test. That is making a determination that, on any one test, you may not have to start from the very beginning and go all the way through to the element that they want to test. That saves money.
    But in your view, will anything be lost doing it that way 
as well? The benefit is perhaps saving money on a test. But is 
there anything lost on the test from doing it that way?
    Mr. Francis. It would depend on how the plan is laid out. 
So for example, if there is a very rigorous, say, ground test 
plan that is anchoring models and simulations, I think a--I 
haven't seen the specifics of what General O'Reilly is 
proposing--but I would say then a more limited flight test 
might be okay, as long as it has that kind of a foundation.
    I think where you run into trouble is where you have a very 
success-oriented schedule set out that is predicated on 
everything going just fine. At the same time, we are producing. 
And when something happens and things don't go well, then we 
end up loading up a test, for example.
    So the current approach was what I would call a crawl-walk-
run approach.
    Mr. Larsen. Right.
    Mr. Francis. And we have got a little bit away from that. 
So instead of each test, flight test for example, demonstrating 
one new variable or one new capability, they are starting to 
load up. So I think it is Flight Test 6 that is coming up. It 
is going to have a new Exoatmospheric Kill Vehicle (EKV) in it. 
It will be the first test against a complex threat scene. And 
it is only going to be the second test of a new target.
    So that is my caveat. As long as the plan is laid out to 
anticipate some contingencies that it can react to, I think 
that would work out. If it is success-oriented, then we might 
end up loading up that flight test.
    Mr. Larsen. Yes.
    Mr. Mitchell, do you have a view on that?
    Mr. Mitchell. I have a view that----
    Mr. Larsen. On my question.
    Mr. Mitchell. Yes, I have a view very similar to Mr. 
Francis. Models and simulations represent three things: the 
functional behavior of the system; the performance attributes 
associated with the system; and the error sources within the 
system. Many of those can be adequately understood on the ground, in ground tests. And it is the purpose of a flight test to then fully anchor those things we really can't get at on the ground.
    With that approach to VV&A of the model, I don't think we 
will lose anything in these flight tests, so long as they are 
carefully planned, sequential in nature, and don't try and rush 
to a complicated, complex test that we haven't walked up to, as 
Mr. Francis suggested.
    Mr. Larsen. Yes.
    Mr. Coyle, do you have any thoughts on that? Do we lose 
anything by testing just the element that we wanted to test as 
opposed to starting from the beginning and testing through the 
element that we have added?
    Dr. Coyle. Well, I would recommend that there never be any test that you hadn't attempted to model and simulate beforehand. But there are some things that models will always be mysterious about, and it is just going to take a flight test.
    For example, Dr. McQueary, in his report to Congress, 
points out that the GMD system, the ground-based system, has 
not demonstrated its performance throughout, and I quote, ``the 
expected range of adverse natural environments.'' He is not 
talking about what the enemy might do to fool you. He is 
talking about, at night, when the sun is shining in your eyes, 
things like that.
    And so Dr. McQueary points out in his report to Congress 
that there are some issues like this that just haven't been 
addressed yet.
    Mr. Larsen. Madam Chair, that has been the main question I 
have been exploring. And so that is fine. So I yield back my 
time.
    Ms. Tauscher. Thank you, Mr. Larsen.
    Gentlemen, can we ask for your forbearance for about a half 
an hour. We have three votes, a 15, a 5, and a 15. But that 
doesn't mean that we will be gone for that entire 15 minutes. 
So if you don't mind, we will recess for about a half an hour. 
We thank you very much. Be right back.
    [Recess.]
    Ms. Tauscher. Witness line warm, as they say? [Laughter.]
    I think we want to just go and talk about the successes of 
theater missile defense, for example. DOT&E's Fiscal Year 2008 
Annual Report noted that theater missile defense systems, such 
as Aegis BMD and THAAD, continue to make significant progress, 
although the long-range GMD system continued to face 
challenges. Are there lessons we have learned from the theater 
missile defense testing that should be applied to the GMD 
system, first?
    And second, if so, what specific recommendations would you 
make to the Department of Defense? Let me start with Mr. Coyle 
and go right down the line. If you can keep your answers brief, 
I think we are going to see Mr. Turner pretty soon.
    Dr. Coyle. Thank you.
    Yes, the services do have an approach toward testing, which 
I think is very healthy. Publicly, in other forums, I have 
given the Navy credit for the approach that they have taken 
with the Aegis system. They have a tradition of doing quite 
realistic tests at sea. And that tradition has carried over to 
their missile defense work. Analogies like that could be made 
with respect to the Army in both Patriot and THAAD.
    However, I have to add, for all their good successes, I 
continue to be concerned that Congress is not fully and 
currently informed about the ways in which these tests are 
scripted. And I think that is something that probably General 
O'Reilly is going to try to change. And that will be good.
    Ms. Tauscher. Mr. Francis.
    Mr. Francis. Yes. I think some of the things--I agree 
certainly with what Mr. Coyle said about the testing. I think 
both in THAAD and Aegis, there is I think a wider range of 
targets being presented. I think the operators know less in 
advance about what is going to happen. And I think that they 
are engaging in a broader flight regime, if you will, not as 
narrow as GMD.
    If you go back, though, both Aegis and THAAD are self-
contained. They own the missiles. They own the fire control. 
They own the radar. So I think it is a little less complex. And 
GMD didn't own that.
    But having said that, I think if you turn the clock back, 
it would be interesting to ask General O'Reilly, I think THAAD 
was in this very situation in early development. And they were 
trying to do a lot of testing and gang up on a single test. And 
they had to stop and remap the test program. So I am hoping 
that is what they are doing with GMD now.
    Ms. Tauscher. It is not just a coincidence that it was 
General O'Reilly that did that.
    Mr. Francis. No. Yes.
    Ms. Tauscher. Mr. Mitchell.
    Mr. Mitchell. Yes, ma'am. The points are well-taken. But 
there is something else about Aegis and THAAD that you need to 
understand. It is the way they prepare for flight tests.
    They are very disciplined. They have a primary objective. 
They work hard to understand what the probability of success is 
against that primary objective. And they work equally hard at 
knowing what the risks are against attaining that probability 
of success in the mission. That is why they have been 
successful. And again, it is a wide variety of targets.
    GMD has adopted that precise mentality. It has caused them 
to postpone some missions: FTG-04 was canceled because of a 
problem in the telemetry system. It was very likely we would 
not get any telemetry data. And so we wouldn't be able to 
reconstruct what happened. That was, in my mind, a correct decision, given the cost of these exercises, and it represents the discipline that I am talking about.
    Ms. Tauscher. Thank you.
    I was just trying to keep the witnesses warm for you, Mr. 
Turner. I am happy to yield to you.
    Mr. Turner. That is very kind of you. Thank you.
    Thank you, all, for participating first off and for 
bringing your expertise.
    Mr. Coyle, am I pronouncing that correctly?
    Dr. Coyle. Yes, sir.
    Mr. Turner. Coyle, okay, thank you.
    One of the great aspects, I think, of any of these 
hearings, when you bring diverse views together, is how much 
you can learn from the different perspectives and the advice 
they give. Mr. Coyle, I was really interested in your testimony 
because, besides your admission that you cannot fly, there was 
some revelation there.
    And on page six, you say: ``In my view, Iran is not so 
suicidal as to attack Europe or the United States with 
missiles. To me, it is not credible that Iran would be so 
reckless as to attack Europe or the United States with a single 
missile, no decoys and the like. Similarly, North Korea isn't 
so suicidal as to attack Japan or the United States.''
    And this hearing is about testing. But I took from your 
statement a belief that there is an exaggeration of the threat, 
a view which I think is different and which I would like you to 
expound upon: that Iran and North Korea really don't pose the 
type of threat that everyone is saying they do.
    Dr. Coyle. Actually, my point is just the opposite, that if 
you believe that Iran would be suicidal enough to attack Europe 
or the U.S., then you have to also believe that they would do 
whatever they could to overwhelm our defenses. And 
that would mean firing more than one or two missiles. And as 
General O'Reilly acknowledges in his testimony, that is not 
something the Missile Defense Agency can handle.
    For example, there are only supposed to be 10 interceptors 
in Poland. And it was pointed out by the first panel that the 
doctrine is to shoot as many as five interceptors at each one. 
So if Iran launches two missiles, those 10 are all gone. And if 
Iran launches a third one, you have got no interceptors left. 
So if you want to take the threat from Iran seriously, then I 
think you have to look where they might go with it.
    Mr. Turner. Okay. And I absolutely agree with you, with 
your point that you must, in your evaluation, consider that 
they would do whatever they could. Similarly, we 
should do whatever we can. And in that, then, should I take 
your comments to be an advocacy for more deployment of missile 
defense; because if you think that what we have is going to be 
insufficient for an ever-increasing threat, wouldn't the 
logical conclusion of your testimony then be that we should 
deploy more, not less?
    Dr. Coyle. I support research and development (R&D) in 
missile defense and have for my whole life, if for no other 
reason than we should avoid technological surprise. What I 
would not support is deploying a bunch of hardware that we 
either know wouldn't work in the situations we would face, or 
which have----
    Mr. Turner. But we don't have that situation, right? 
Because no one has ever testified that we have something that 
we know does not work. No one has, I think, ever testified 
before this committee that we have things that we know don't 
work.
    Dr. Coyle. Actually, the----
    Mr. Turner. They might not work as well, or they might not 
be perfect. But no one has said we have things that don't work.
    Dr. Coyle. But what the Missile Defense Agency itself has 
said is that they cannot handle attacks with multiple missiles. 
That is an example of something they have said that they do not 
know how to do right now. It hasn't been tested, and they don't 
know how to do it. So if you believe that Iran would attack 
Europe or the United States, I think you have to take that 
seriously.
    Mr. Turner. I appreciate that.
    Mr. Francis, Mr. Mitchell, one of the things that I have 
been very impressed with in all of the testimony is the 
secondary issue that when you flight test, you obviously get a 
specific result from that. But you also get an incredible 
amount of data. And that data, in part, is used for simulation, 
modeling, and computer work, not only for improving the system, 
but for determining later how the system might perform or what 
uses it could have.
    I mean, one of the things that I think of when I hear them 
talk about that is the Aegis system. You know, we never tested 
it to take out a satellite that was falling out of the sky. No 
one would argue that we should never have pressed the button to 
have it take out a satellite falling from the sky just because 
that mission had not been tested.
    But we had computer modeling and simulation that aided us 
in determining whether or not this was something that was 
possible. And we had, obviously, a situation that we needed to 
act. And then we did.
    Could you speak for a moment about the importance in testing of 
the data that is generated in the computer simulation and 
modeling, because I am very interested in your opinion and 
thoughts on that process.
    Mr. Francis. Yes, well, I will start off and then turn over 
to Mr. Mitchell. But the data is very important, because after 
you have a flight test, they do what is called a post-test 
reconstruction, where you actually try to 
replicate what happened in the real test through the modeling. 
And there is kind of a symbiotic relationship between the two.
    If you can get your data from the flight test to make the 
model better and to anchor it in reality, then when you are 
presented with a new situation or you are about to do another 
flight test, you can run the model and get some idea as to how you 
are likely to do in a real flight test. So it is very 
important. And they build on each other.
    So when we are looking at a performance assessment, which 
is basically a way to look at how the missile defense system, 
as it is in the field, will work today, that is an aggregation 
of models. There is no one grand model that does that. So each 
one of these models, Aegis and THAAD and so forth, would get 
aggregated to give you some kind of prediction of the overall 
system. So there is, like I say, a symbiotic relationship, very 
important.
    Mr. Mitchell. In addition to what Mr. Francis said, the 
data has another very important use. And that is the system 
can behave in unexpected ways that didn't threaten the flight. 
You may have had a success. But there is something peculiar in 
that data that leads you to an investigation: what did this 
function do? Why did it do it? Why was the tracking accuracy 
the way it was, and would you expect it to be a little bit 
better than what occurred in flight?
    That data is a rich field with which to really probe, not 
only your understanding of the system, but the way it is 
physically implemented, to determine whether or not it was 
intended to be that way. That is a second-level, very 
important use of that data.
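    [The post-test reconstruction and model anchoring the 
witnesses describe can be illustrated with a minimal sketch. 
The Python code below is notional; the function name, the 
track values, and the use of a root-mean-square comparison are 
assumptions for illustration, not the program's actual 
anchoring metric.]

        import numpy as np

        def anchoring_error(flight_track, model_track):
            # Root-mean-square difference between flight-test telemetry
            # and the model's post-flight reconstruction of the same
            # engagement; smaller values suggest a better-anchored model.
            flight = np.asarray(flight_track, dtype=float)
            model = np.asarray(model_track, dtype=float)
            return float(np.sqrt(np.mean((flight - model) ** 2)))

        # Hypothetical downrange positions (km) at matching time steps.
        flight = [102.0, 118.4, 133.9, 148.7]
        reconstructed = [101.6, 118.9, 133.5, 149.2]
        print(f"RMS anchoring error: "
              f"{anchoring_error(flight, reconstructed):.2f} km")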
    Mr. Turner. Thank you very much.
    Madam Chair.
    Ms. Tauscher. I think the issue of validation of the models 
and simulations is one of the first things that General 
O'Reilly is moving vigorously to address.
    The analogy, I think, is if you have a patient appear in 
the emergency room, and you don't take down their temperature 
and their vital signs, but you decide that they have a critical 
issue where they may need surgery. Nobody is going to go into 
surgery without figuring out whether the person, you know, has 
a heartbeat that is going, and a temperature that is okay, and 
whether they can manage the anesthesia or not.
    And I think that what is clear to me is that the lack of 
verification of these models and simulations, some number north 
of 40, creates, for Dr. McQueary, this question of confidence.
    And Mr. Francis, this is a specific area that you have 
talked about in the GAO report. Could you just expand a little 
bit on the novelty of the fact that this hasn't been done, the 
fact that this is an underpinning of, not only ``fly before you 
buy,'' but the kind of assurances and surety that systems are 
meant to have?
    Mr. Francis. Sure. I do review a number of different 
programs, shipbuilding programs and Army programs. And we just 
did some work looking at testing of body armor. And in a test 
of a system like that, you can run repeated tests and get all 
the data you need, whether or not you have a model. You fully 
understand everything that vest can do.
    I think what is unique here about missile defense is that 
testing cannot achieve that. We can't know everything about the 
system, because of the physical limitations of testing as it 
relates to the BMDS. So modeling in some cases is a 
nice-to-have. But in missile defense, it is a must-have.
    And so one of the things that missile defense has been 
trying to do--and I think it is General Nadeau's 
responsibility--is to do an annual performance assessment, 
which is an attempt to use modeling and simulation to say, 
``What do we know about the system that is in the field 
today,'' because testing can't tell us enough. You have to have 
your models accredited, part of which means being anchored in 
ground and flight test to be able to say that.
    And as I had said earlier, we are quite a ways from that. 
We have 40 models right now that would have to be aggregated in 
some form to say ``Do we understand how the system in the field 
today works?'' Twenty-five have yet to be accredited.
    So the significance of that is you can't say how the 
fielded system is going to perform without the modeling and 
simulation. So it is a must-have.
    Ms. Tauscher. Thank you.
    I am not sure who came in first, Mr. Franks or Mr. Lamborn.
    Mr. Franks for five minutes.
    Mr. Franks. Well, thank you, Madam Chair.
    Madam Chair, I was just listening to the responses by Mr. 
Coyle, I guess related to the European site with the 10 
interceptors and the suggestion that, you know, as many as five 
of the interceptors might be launched against one potential 
incoming missile. Even if that scenario were true, I guess 
there are a couple of conclusions there.
    First of all, that would mean perhaps at least two cities 
would be saved. Secondarily, given what we believe is Iran's 
present amount of fissile material, there is a limit to the 
number of warheads that they might have, which might give us a 
chance to respond with greater numbers of interceptors, if that 
becomes clear.
    I think that my greatest hope is that the presence of a 
ground-based system in Poland and in the Czech Republic would, 
as has been said so many times today, devalue the Iranian 
nuclear program, in the hope that somehow many of the other 
things that we are trying to do would dissuade the program 
entirely and, again, keep that technology from the hands of 
terrorists.
    That is just a comment. But I wanted to ask you, Mr. 
Mitchell, a question. You stated in your written testimony, and 
I am going to quote. It says, ``I conclude from this evidence 
that a fundamental useful defensive capability based on an 
autonomous operation of Aegis BMD, THAAD and GMD elements is 
available to our armed forces. But I cannot state that BMDS has 
reached maturity.'' And I think that is a very erudite 
statement.
    And you raise an important point. Even though the BMDS 
hasn't reached complete maturity, it still provides a useful 
defensive capability. And you are certainly an expert in this 
area. Can you comment on a few of the factors within the system 
and the testing mechanisms, as they are now, that give you the 
confidence to say that there is a useful defensive capability 
here?
    Mr. Mitchell. Well, I have to be careful about the use of 
the word ``confidence,'' as Dr. McQueary schooled us earlier 
today. So----
    Ms. Tauscher. You can say it is subjective confidence.
    Mr. Mitchell. I am going to try and use the word only when 
I am talking about statistical significance. GMD, in its last 
three firing missions, has essentially successfully detected a 
re-entry vehicle (RV), targeted it, and the exoatmospheric 
kill vehicle (EKV) has destroyed that warhead. They were 
simple, separating targets, as General 
O'Reilly said, much like the trajectories we might have to 
engage from North Korea, using----
    If you look at that, there have been three chances. It has 
done it three times. There is something there that is useful. 
Now, I also am aware of the models that they use to predict 
their performance prior to these missions. And they are very 
detailed. And playing data back through, they were able to 
replicate what happened in flight.
    So I am beginning to believe that, if we finish the work 
started by General O'Reilly and use those models to arrive at a 
true understanding of the probability of success, we will have 
a credible defense. But that is yet to be proven. Now, that is 
solely for simple separating targets, as you----
    Mr. Franks. No, that is a great answer, Mr. Mitchell. And I 
appreciate the analogy that you use of flipping the coin. You 
described in your testimony how statistics describe 
increasing confidence as a result of more flips of the coin or 
a greater number of testing trials in the case of BMDS.
    And while we absolutely need to conduct flight and ground 
testing, you said in your testimony, ``the cost to conduct a 
firing mission makes it prohibitively expensive to develop a 
high degree of confidence for the performance of the system for 
any one scenario, much less full battle space using only live-
fire events.'' In other words, it takes a lot to do all of 
these things. It costs a lot.
    Based on that, can you discuss the importance of high-
fidelity models and simulations in order to achieve the type of 
statistical credibility and reliability in BMDS that you 
describe in your analogy? And what is the confidence level in 
high-fidelity models and simulations when testing BMDS? And I 
want you to hold that thought, if you could.
    There is one voice that I have heard in the crowd here. And 
I think it is the most compelling voice that I have heard on 
missile defense today. I think he is about seven-and-a-half 
months old. And I appreciate him being here. And I want you to 
know, my purpose for being on this panel is to make sure that 
he walks in the sunlight of freedom like the rest of us do.
    So please, I hope that he didn't distract you here.
    Mr. Mitchell. Oh no, not at all. The key to using models 
and simulations is that you have to be able to believe the 
results that you get from a model. That belief is what we have 
been calling VV&A, anchoring. We have used several terms, 
several different things that go into building a belief in the 
output.
    If you can believe that output, you can conduct any number 
of trials you want just using the computer. You can do 250 or 
1,000 runs against a scenario and get a very narrow range in which 
the real probability of success lies with a high degree of 
statistical confidence. You can then repeat that for any other 
scenario you wish to use. And by that technique, you can 
develop a sense of what the operational effectiveness of the 
system is.
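    [Mr. Mitchell's point about running many trials against a 
believable model can be illustrated with a minimal Monte Carlo 
sketch in Python. The run_engagement stand-in and the 0.9 
success probability are illustrative assumptions only; a real 
assessment would call the accredited end-to-end simulation 
rather than this placeholder.]

        import numpy as np

        rng = np.random.default_rng(seed=1)

        def run_engagement(rng):
            # Stand-in for one run of an accredited end-to-end simulation;
            # returns True if the simulated intercept succeeds. The 0.9
            # figure is illustrative only, not a real system parameter.
            return rng.random() < 0.9

        n_trials = 1000
        hits = sum(run_engagement(rng) for _ in range(n_trials))
        p_hat = hits / n_trials
        # Normal-approximation 95 percent interval on the success
        # probability; with 1,000 trials it is only a few points wide.
        half_width = 1.96 * np.sqrt(p_hat * (1.0 - p_hat) / n_trials)
        print(f"{hits}/{n_trials} successes, p about {p_hat:.3f} "
              f"+/- {half_width:.3f} at 95 percent confidence")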
    Mr. Franks. Thank you, all, very much.
    Thank you, Madam Chair.
    Ms. Tauscher. Isn't it true that the reason we can use 
stockpile stewardship, which is simulation of testing using 
high-speed computers and other means, including the largest 
laser in the world, which is in my district in California, is 
because we had 1,000 tests at the Nevada Test Range and other 
places?
    Isn't it true that one of the reasons why there is a 
question of anchoring and certifying the simulation and the 
testing and the modeling of the long-range system is that there 
hasn't been that number of tests, so that what you are 
projecting in the modeling and simulation, which, by the way, 
are projections, is not grounded in the reality of enough 
tests? Is that true? Yes.
    Mr. Mitchell. That is true. But I would like to expand upon 
the observation, if I may.
    Ms. Tauscher. Sure.
    Mr. Mitchell. Tests come in a number of different types. 
Many of the things that are represented in models can be 
verified by conducting experiments underground. You can measure 
what the drift rate in an inertial measurement unit (IMU) is 
and have that replicated in a model as a statistical draw. You 
can do a lot of that.
You will need flight tests for some things. The 1,000 
tests, if I were to draw that analogy, would encompass all of 
these things, including flight tests, and say yes, we have to 
do that. But I don't want to leave the suggestion that we have 
to conduct 1,000 flight tests or 100 flight tests.
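    [The statistical-draw idea Mr. Mitchell mentions, in which 
a ground-measured quantity such as IMU drift is replayed in the 
model as a random draw, can be sketched as follows. The drift 
values and the normal-distribution fit are hypothetical 
illustrations, not measured program data.]

        import numpy as np

        rng = np.random.default_rng(seed=2)

        # Hypothetical gyro drift rates (degrees per hour) measured in
        # repeated ground characterization of an IMU.
        measured_drift = np.array([0.012, 0.015, 0.011,
                                   0.014, 0.013, 0.016])
        mu, sigma = measured_drift.mean(), measured_drift.std(ddof=1)

        # Each Monte Carlo run of the flight model then draws its own
        # drift value from the measured distribution instead of flying
        # a new test.
        drift_draws = rng.normal(mu, sigma, size=10_000)
        print(f"measured mean {mu:.4f} deg/hr; "
              f"mean of model draws {drift_draws.mean():.4f} deg/hr")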
    Ms. Tauscher. Well, my point wasn't that we have to do 
1,000 tests. But my point is that the reason why we have such 
confidence in the science-based stockpile stewardship 
management of the weapons, particularly the weapons now, is 
that we tested 1,000 times. We have such a body of tests that 
you are not stretching the algorithm to try to get yourself to 
a place, because you have a significant body of live tests.
    Mr. Mitchell. That is correct.
    Ms. Tauscher. And I am not suggesting that the long-range 
system has to go through 1,000 tests. But I think that what 
General O'Reilly--the point that he is approaching, is that we 
have not had enough live tests to be able to certify enough of 
the models and simulations.
    We have 25 out of the 40 that have not been anchored. And 
there is no credibility, perhaps--maybe that is not the right 
word, but a significant piece of it is credibility--there is no 
credibility to the projections that these simulations and these 
models make, because you cannot ground them. You cannot 
anchor them in live tests.
    Mr. Francis.
    Mr. Francis. At this point, that is true. So you can't use 
the models to predict the performance of the system. I think 
one of the things that is different, and I will defer to my 
colleagues here on the panel, is, in the case of the long-range 
system, I don't think it is possible to physically test 
everything about it.
    Ms. Tauscher. That is right.
    Mr. Francis. And that is where the models and simulations 
are actually going to have to do things that we can't do 
physically.
    Ms. Tauscher. That is right.
    Mr. Lamborn for five minutes.
    Mr. Lamborn. Is that working?
    Ms. Tauscher. I can hear you.
    Mr. Lamborn. Thank you.
    Thank you, Madam Chairman.
    Mr. Coyle, I would like to ask you a question about your 
three over-arching points, especially point number one on page 
six. I am going to give you some scenarios. And I hope you 
would agree that we have to question your first over-arching 
point when you consider the following.
    You say that Iran or North Korea would not be suicidal and 
would not do a launch against the U.S., because ballistic 
missiles have return addresses. But I can think of--just 
sitting here--right off the top of my head, I can come up with 
four different scenarios where that would not be true.
    For instance, if they secretly armed a terrorist 
organization thinking that they could get away with it and 
leave no fingerprints; or if there was theft by some breakaway 
group within the country; or launch by rogue officials, rogue 
officers; or even accidental launch. I mean, in none of those 
four scenarios is the threat of retaliation by the U.S. an 
effective deterrent.
    So even if your point is true that they are not suicidal, 
and I am not sure I even buy that point. But even if that is 
true, these other alternative scenarios show that we should 
have some kind of protection if it is technologically possible. 
Even if the risk is slight, the consequences are so serious--it 
is a threat that has to be taken seriously.
    So when I look at it that way, Mr. Coyle, I just can't buy 
your first point. What is your response to that?
    Dr. Coyle. Well, of course, the first two scenarios you 
mention, terrorism and theft, missile defense isn't effective 
against those things. So perhaps we could put those aside; we 
would agree about those two.
    The other two that you mention, a rogue launch, accidental 
or unauthorized launch, those were exactly the criteria that 
the Clinton Administration had for missile defense.
    In those days, of course, we were talking about an 
accidental or unauthorized launch from Russia or China--not 
from North Korea or Iran--but similar. And the reason that 
President Clinton didn't decide then to go ahead with missile 
defense for that mission--and that was the only mission; it 
was not to try to stop an all-out attack of missiles----
    Mr. Lamborn. Understood.
    Dr. Coyle. It was because, when his second term was 
finished, there had only been three tests. And two of them had 
failed. So there wasn't much of an argument that the system 
would be effective.
    A good question that you are asking is, ``Okay, what has 
changed since then?'' And one of the difficulties that 
President Clinton faced was that both Russia and China do use 
decoys and countermeasures. And so the regional 
commanders, the CINCs as they were called in those days, 
advised him that chances are that our missile defense system 
wouldn't work against an accidental or unauthorized launch from 
Russia or China, because those decoys would deploy, 
countermeasures would work. There would be, you know, these 
kinds of problems.
    And so what I am trying to bring out in my testimony is 
that, one way or the other, if you think that this 
could happen, you have got to deal with the possibility of 
multiple, simultaneous launches or launches with decoys and 
countermeasures. And that has been something which, until 
General O'Reilly, the Missile Defense Agency has been kicking 
down the road.
    Mr. Lamborn. Well, and I agree with you. I think this 
testing is going to be stepped up and beefed up and made more 
comprehensive. And I am very happy about that. But you said 
that that is in the context of Russia and China. North Korea 
and Iran, I think we would agree, are not nearly as 
technologically advanced.
    And as far as your earlier point, you said that those first 
two scenarios somehow didn't apply. But my understanding is 
that, as missile technology becomes more advanced, even in 
lesser countries, lesser technological countries, like Iran and 
North Korea, they are developing a mobile capability. I mean, 
these tend to be mobile launchers.
    So acquisition by terrorists, whether it is deliberate or 
not, becomes easier the more that mobile technology for 
ballistic missiles is available. So I think all these 
scenarios are valid. And I am sure there are others that we 
haven't discussed.
    Dr. Coyle. Well, I would agree with you. The difficulty is 
that the systems that are planned and being developed, the 
systems in Alaska, for example, and proposed for Europe are 
very focused on two countries, Iran and North Korea, not on 
other countries or, you know, terrorist groups such as you are 
positing. If that is the threat we have to worry about--I hope 
it isn't, but if it is--I think that would argue for the more 
mobile kinds of systems, shorter-range and more mobile kind of 
systems, which as I understand it is where your chairwoman is 
going also.
    Mr. Lamborn. Thank you for your answers. And I think all of 
the kinds of defenses we can field are all valuable.
    Ms. Tauscher. I don't think we disagree.
    Mr. Lamborn. I would agree with you on that.
    Thank you, Madam Chairman.
    Ms. Tauscher. Thank you. Thank you, sir.
    We want to thank the panelists very, very much. We are just 
beginning our hearings. This was our first subcommittee hearing 
of this year. We specifically wanted to talk about missile 
defense and testing. We obviously believe that it is very 
important that we have the system, the suite of systems, at 
its best possible capability. We appreciate your efforts to 
illuminate the debate. And we will probably be calling on you 
again.
    Our working relationship with General O'Reilly is very, 
very good. And we expect that we are going to be hearing from 
him later in the spring as we move on toward doing the mark for 
the full committee for the defense bill.
    So this hearing is adjourned. The committee offers its 
thanks to the panelists very, very much for your hard work, for 
your patriotism and for your willingness to be before us. Good 
afternoon.
    [Whereupon, at 4:42 p.m., the subcommittee was adjourned.]

      
=======================================================================




                            A P P E N D I X

                           February 25, 2009

=======================================================================

      

      
=======================================================================


              PREPARED STATEMENTS SUBMITTED FOR THE RECORD

                           February 25, 2009

=======================================================================

      
      
    [GRAPHIC] [TIFF OMITTED] T1659.001 through T1659.096
    

      
=======================================================================


              QUESTIONS SUBMITTED BY MEMBERS POST HEARING

                           February 25, 2009

=======================================================================

      
                  QUESTIONS SUBMITTED BY MS. TAUSCHER

    Ms. Tauscher. What is your current assessment of the capability of 
the GMD system to successfully engage and destroy a long-range missile 
threat from North Korea--high, medium, or low?
    Dr. McQueary. Ground-based Midcourse Defense (GMD) has demonstrated 
a limited capability to defend against simple, long-range ballistic 
missile threats launched from North Korea toward the United States.
    As I have previously testified, my statistical confidence in the 
performance of the GMD system across the entire battle space and 
against the full range of possible threat types remains low for two 
reasons. First, the Missile Defense Agency (MDA) has conducted only 
three intercept flight tests using the operational equipment and 
software and all of these occurred within a relatively small portion of 
the threat battlespace. Second, the models and simulations used by the 
MDA to assess GMD capability over the full battlespace and threat 
scenarios have not yet been verified, validated, or accredited for use 
in these assessments.
    Notwithstanding these limitations, I believe the warfighters have 
developed tactics, techniques, and procedures that improve the 
capability of the GMD to successfully engage and destroy a long-range 
missile threat from North Korea. I defer to the operational commander 
for his assessment of his ability to defend against any specific threat 
that may be posed against the United States today.
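    [Dr. McQueary's point about low statistical confidence from 
only three intercept flight tests can be illustrated with a 
minimal sketch. The exact (Clopper-Pearson) binomial interval 
computed below shows that three successes in three trials, 
taken alone, bound the underlying single-shot probability of 
kill only to a wide range; the calculation is an illustrative 
assumption, not the assessment method used by DOT&E.]

        from scipy.stats import beta

        successes, trials = 3, 3    # three-for-three intercept tests
        alpha = 0.05                # 95 percent two-sided interval

        # Clopper-Pearson exact bounds on the single-shot probability
        # of kill implied by the flight-test record alone.
        lower = beta.ppf(alpha / 2, successes, trials - successes + 1)
        upper = 1.0 if successes == trials else beta.ppf(
            1 - alpha / 2, successes + 1, trials - successes)
        print(f"95 percent interval after {successes}/{trials} "
              f"successes: [{lower:.2f}, {upper:.2f}]")  # about [0.29, 1.00]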
    Ms. Tauscher. To what extent was the GMD system designed to be 
suitable and survivable?

      What specific steps do you believe are necessary to 
increase our confidence in the suitability and survivability of the GMD 
system?

    Dr. McQueary. The Missile Defense Agency (MDA) has imposed design 
specifications on the prime contractor that, in its opinion, balanced 
the need for the Ground-based Midcourse Defense (GMD) system's rapid 
deployment with operational suitability and survivability. The Agency 
is best suited to provide design specifics and the underlying 
rationale.
    While the GMD system did not have Reliability, Availability and 
Maintainability (RAM) requirements designed into the system, the Agency 
implemented a limited RAM program in 2006. For the last year, the 
Operational Test Agency (OTA) Team has been working with the Agency to 
collect data on the suitability of the fielded GMD components. The 
current three-phased review of testing is examining the suitability and 
survivability data that have already been gathered. As improved 
components are fielded, such as the Capability Enhancement II 
Exoatmospheric Kill Vehicle, the Agency and the OTA Team will collect 
and assess the reliability, availability, and maintainability data.
    The MDA has committed to enhance the current RAM program and to 
implement a reliability growth program for new components. Building on 
the Critical Operational Issues and measures previously developed by 
the multiservice OTA Team, the Director, Operational Test & Evaluation 
staff, and U.S. Strategic Command, the Agency and the multiservice OTA 
Team will identify and prioritize tests, venues and resources needed. 
The updated Integrated Master Test Plan (IMTP) will incorporate the 
results of this three-phase review. Execution of this updated IMTP will 
provide the necessary confidence in the operational suitability and 
survivability of the GMD system.
    Ms. Tauscher. As MDA develops its revised testing program, what 
testing work remains to be done and how would you prioritize the work?
    Dr. McQueary. The Missile Defense Agency's (MDA) three-phased test 
review began with an agency-wide effort to identify the critical 
factors necessary to examine system capability for each element and the 
overall Ballistic Missile Defense System (BMDS). The goal is to build a 
foundation of models and simulations that will allow us to understand 
performance at the system, element, or sub-element level. In addition, 
both the developmental and operational test communities are identifying 
the other data, such as reliability and maintainability data, necessary 
to support their respective evaluations. This review has already 
highlighted common gaps across the elements such as modeling of 
threats, debris, and general environmental conditions. The focus is to 
identify the testing that the MDA needs to accomplish to validate and 
accredit the models and simulations necessary to evaluate the 
capability of the elements, such as Ground-based Midcourse Defense, as 
well as to evaluate the BMDS as an integrated system. The second phase 
is developing test strategies to capture the required data. The third 
phase will prioritize these requirements and allocate them to the test 
resources available, considering all test capabilities and limitations. 
The MDA plans to complete this effort by June 2009 and to publish a new 
Integrated Master Test Plan that will establish the priorities. The MDA 
Director and I will review and approve this plan. In general terms, I 
expect the top priority to be collecting the data to validate the basic 
system performance models within the likely operational domain of 
current threat systems. Once the basic performance models are 
validated, the Agency can expand testing to examine emerging threat 
capabilities.
    Ms. Tauscher. In the Fiscal Year 2008 Annual Report to Congress, 
DOT&E continued to raise concerns about the effectiveness, suitability, 
and survivability of the GMD system, noting that there was insufficient 
information available to make a determination.

      What specific actions do you need to see before you are 
prepared to declare GMD effective, suitable, and survivable?

    Dr. McQueary. To declare the Ground-based Midcourse Defense (GMD) 
effective, suitable, and survivable, the Missile Defense Agency (MDA) 
will need to accomplish sufficient ground and flight testing to 
successfully validate and accredit the models and simulations that we 
will use to assess GMD capability. There may also be certain events 
which are best empirically measured, such as flight tests with low 
radar cross section re-entry vehicles, high closing and separation 
velocities, and tumbling re-entry vehicles. Finally, technical analyses 
and maintenance data from fielded components will be integrated into 
analytical models to provide predictions of sustainability and 
survivability. I am confident that the MDA's three-phase test review 
will result in a test program that, if fully funded and implemented, 
will allow me to assess GMD effectiveness, suitability, and 
survivability.
    Ms. Tauscher. What multi-mission events, such as cyber attack or 
other asymmetric attacks of key assets, have been introduced during GMD 
flight testing?

      What multi-mission events are planned to be introduced in 
the future and when?

      How is MDA adjusting its overall information assurance 
plan to address these issues?

      If we have no such plans, why is this lack of threat 
realism acceptable?

    Dr. McQueary. Given the overarching safety considerations, flight 
tests are generally not an appropriate venue for the introduction of 
cyber attacks. The Missile Defense Agency (MDA) has conducted this type 
of testing on ground equipment and communication links using integrated 
and distributed ground tests. To date, cyber-attacks scenarios have 
been simulated for Ground-based Midcourse Defense during the Assured 
Response warfighter exercise, and are also planned in future Terminal 
Fury (1 scenario) and Global Thunder (10 scenarios) exercises. This is 
an appropriate method to evaluate the vulnerability and hardness of the 
Ballistic Missile Defense System (BMDS) elements and their 
communication links to cyber and asymmetric attacks. The MDA is 
coordinating the execution of its information assurance plan with the 
overall ground test plan. The on-going three-phased review will 
identify the additional testing required to validate and accredit the 
models and simulations, as well as any other testing needed to complete 
a comprehensive survivability evaluation. The MDA is developing an 
updated Integrated Master Test Plan that will identify the future 
testing needed to address any voids in the BMDS system assessment. The 
warfighter's ability to employ the BMDS under asymmetric or cyber 
attack is best assessed during ground tests or major warfighter 
exercises employing multiple attack vectors simultaneously.
    Ms. Tauscher. The Fiscal Year 2009 National Defense Authorization 
Act prevents DOD from deploying long-range missile defense interceptors 
in Europe until the Secretary of Defense, taking into account the views 
of DOT&E, certifies that the proposed interceptor will work in an 
operationally effective manner and accomplish the mission.

      From your perspective, what are the key operational 
differences associated with deploying the GMD system in Europe as 
compared to Alaska and California?

      What impact would those differences have on how you would 
structure the testing program for the European GMD deployment?

      What specific steps do you believe MDA needs to complete 
before you would recommend that the Secretary of Defense certify that 
the proposed interceptor has a high probability of working in an 
operationally effective manner and is capable of accomplishing the 
mission?

    Dr. McQueary. I see four key operational differences associated 
with employing the Ground-based Midcourse Defense (GMD) in Europe as 
compared to the current deployment in Alaska and California: the two-
stage missile; the sensors; the command, control, battle management & 
communications (C2BMC); and the mission timelines. These key 
operational differences must be considered in the testing program for 
the European GMD deployment.
    It is important to note that while the two-stage missile is an 
essential component of the European Capability, the interceptor itself 
is not necessarily unique to the European mission. There are certain 
scenarios where the employment of a two-stage interceptor from Alaska 
might offer specific operational advantages. There are numerous 
similarities between the two-stage booster, its associated launch 
hardware and software, and the existing three-stage booster. The 
Missile Defense Agency (MDA) has successful experience making this kind 
of modification. These changes can be adequately tested in the two 
flight tests currently proposed by the MDA. On the other hand, there 
are two distinct issues with the proposed two-stage interceptor: the 
interceptor itself and its performance in the European scenario. 
Testing the European mission cannot be accomplished with only one 
intercept flight test. I anticipate that the MDA will need to 
accomplish multiple intercept tests as well as numerous hardware-in-
the-loop and ground tests, replicating as closely as possible intercept 
geometries and the timelines associated with them, to validate and 
accredit the necessary models and simulations.
    Testing the sensors, C2BMC, and mission timelines--basically the 
heart of the European mission--is even more challenging. The very short 
timelines associated with threat and target locations, the sensor 
locations, and their associated intercept geometries, make 
understanding the coordination challenges and communications latencies 
of the C2BMC critical to mission success. The only way to confidently 
understand and adjust to these challenges and latencies is to ground 
test in Europe after the hardware and software have been deployed 
there. If this is not possible, all testing, not just the live 
intercept testing, must be accomplished using the current Pacific test 
bed.
    The intercept geometries, the timelines associated with them for 
both decision making and intercept, and the complex command & control 
issues must be developed, refined, and tested during both intercept 
flight tests and extensive hardware-in-the-loop ground testing while 
simulating the European architecture. To do this, there are a number of 
issues that must be resolved. How does the MDA emulate the European 
Midcourse Radar if it cannot be used for actual intercept testing in 
the Pacific test bed? How does the MDA accurately calculate and then 
replicate communications latencies in the Pacific test bed? How does 
the MDA overcome the limitations with the Pacific test bed that prevent 
realistic testing of the European Mission? Ultimately, models and 
simulations must be developed and verified, validated, and accredited 
before we can be confident in our ability to perform the European 
missile defense mission. This process must be accomplished using the 
Pacific test bed which is not an optimum solution.
    The results of the MDA's current three-phase testing review should 
provide me with a better estimate of when I will be able to recommend 
that the Secretary of Defense certify that the proposed interceptor has 
a high probability of working in an operationally effective manner and is 
capable of accomplishing the mission.
    Ms. Tauscher. Would you recommend that the GMD program establish a 
master test program similar to Aegis BMD and THAAD?

      What impact has the lack of a master test plan had on 
GMD's ability to adequately plan for the system's long-term testing?

    Dr. McQueary. Both Aegis Ballistic Missile Defense (BMD) and the 
Terminal High Altitude Area Defense (THAAD) have benefited from having 
legacy master test programs. This approach has also allowed these 
programs to efficiently and effectively verify and validate the models 
and simulations necessary to fully examine their capabilities. The 
Missile Defense Agency (MDA) Director has recognized the value of 
having a master test program. He has initiated the Agency-wide three-
phase review of the test program and directed development of a test 
plan that spans the Future Years Defense Plan (FYDP). This new approach 
will benefit all of the MDA's programs and will clearly define the 
requirements and resources necessary to accomplish this testing.
    Ms. Tauscher. Last year, the operational test authorities 
accredited the models for Aegis BMD version 3.6.

      Does the Aegis BMD do modeling and simulations 
differently from other BMDS elements?

      If so, what are the key differences?

      Are there lessons from the Aegis BMD modeling and 
simulation program that could be applied across the BMDS, particularly 
to the GMD system?

    Dr. McQueary. Aegis Ballistic Missile Defense (BMD) modeling and 
simulation differs from other Ballistic Missile Defense System (BMDS) 
elements principally due to differences in model maturity. The recently 
transitioned Aegis BMD build (Version 3.6) leveraged existing Aegis 
hardware and software, including associated models and simulations. In 
comparison to other BMDS element models and simulations (for example 
Ground-based Midcourse Defense and Terminal High Altitude Area 
Defense), the Aegis BMD models are older and have acquired significant 
anchoring data to support verification, validation, and accreditation.
    Aegis BMD employs a number of complementary, element-focused, and 
predominantly digital models and simulations. Their results are 
rigorously compared and analyzed during pre-test readiness reviews to 
gain confidence in the model estimates and to predict system 
performance. The results are then compared with the post-flight 
reconstructions.
    The MDA's actions to conduct a comprehensive three-phase test 
review and to develop verification and validation plans for their 
models and simulations (supported by anchoring data) are evidence that 
the MDA is already applying lessons-learned from the Aegis BMD modeling 
and simulation program to other BMDS elements.
    Ms. Tauscher. In its Fiscal Year 2008 Annual Report to Congress, 
DOT&E noted that theater missile defense systems (e.g., Aegis BMD, 
THAAD, and PAC-3) continued to make progress, while strategic systems 
(e.g., GMD) continue to face challenges in regards to testing.

      What are the key reasons for these differences?

      To what extent have Aegis BMD and THAAD's success been a 
result of using their original operational requirements document to 
guide their testing and development?

      Are there lessons from the Aegis BMD and THAAD programs 
that we could apply to the GMD program?

    Dr. McQueary. There are several reasons for these differences:

    1.  The Ground-based Midcourse Defense (GMD) mission (defense 
against intercontinental ballistic missiles) is a more complex task 
than the defense against short and medium range ballistic missiles. 
While there was extensive prototype testing, the current GMD system is 
still in a predominantly developmental test regime with the first 
flight of the new production Capability Enhancement II Exoatmospheric 
Kill Vehicle scheduled later this year.

    2.  The Navy and Army have long traditions of conducting 
operationally realistic testing. The active involvement of the 
respective Service Operational Test Agencies has contributed to this 
success, particularly in determining operational suitability.

    3.  Unlike the Aegis Ballistic Missile Defense (BMD) and Terminal 
High Altitude Area Defense (THAAD) programs, GMD did not originate from 
a Service program with clearly stated operational requirements. While 
it is difficult to ascribe specific benefits to the Aegis BMD and THAAD 
programs, operational requirements documents frame the evaluation 
requirements that ultimately drive rigorous testing.

    In general, Aegis BMD, THAAD, and Patriot have been executing 
rigorous, evaluation-based, traditional test programs that use Service 
best practices to demonstrate required capabilities through testing. 
Completion of the three-phase test review will give the Missile Defense 
Agency (MDA) a significantly improved and evaluation-based test 
program. Execution of this evaluation-based strategy should result in 
verified and validated models and simulations over the expected 
engagement envelope with unique capabilities demonstrated through 
empirical measurement events. The comprehensive test review is evidence 
that the MDA is applying lessons from the Aegis BMD and THAAD programs 
to other Ballistic Missile Defense System elements, including the GMD 
program.
    Ms. Tauscher. How have the missile defense elements, including 
interceptors and sensors, proven their suitability for rain, high 
winds, snow or sleet, and other severe weather conditions?
    Dr. McQueary. To date, most Ballistic Missile Defense System flight 
testing has occurred under benign conditions. This is primarily due to 
the fact that, while the Missile Defense Agency follows a combined 
operational and developmental (DT/OT) testing program, flight testing 
to date has been more developmental in nature requiring controlled test 
conditions to meet both test objectives and safety requirements. On the 
other hand, all the sensors have tracked objects during a variety of 
environmental conditions; they have just not supported intercept flight 
tests in these conditions.
    Future operational testing will occur in natural environments and 
conditions, as they are present on the day of testing, subject to range 
safety limitations. As long as safety requirements are met, the test 
will execute. However, it would be cost prohibitive to delay a test--
and all the expensive test support--waiting for specific weather 
conditions. Therefore, high fidelity ground testing, incorporating 
validated and accredited environmental models, will be the primary 
means to assess system performance under severe weather conditions. 
This includes, when and where possible, testing in climatic chambers 
such as will be done with all elements of the Terminal High Altitude 
Area Defense system connected and powered simultaneously.
    Ms. Tauscher. How was the Sea-based X-Band radar designed to be 
survivable?

      Have we tested and run exercises to understand this 
issue?

      What about other BMDS sensors?

    Dr. McQueary. The Sea-Based X-band (SBX) radar and the host 
platform were designed to support operations in the Northern Pacific 
Ocean. The 2007 Winter Shakedown period demonstrated SBX survivability 
in extreme environmental conditions. Survivability design 
considerations also included electromagnetic interference and 
compatibility; information operations; and physical security. The 
Department of Defense has developed and implemented tactics, techniques 
and procedures to address the physical security of the SBX. In 
addition, the SBX was designed to provide for future survivability 
(Nuclear, Biological, and Chemical) upgrades. The Missile Defense 
Agency (MDA) continues to conduct tests, exercises, and analyses to 
provide data to characterize SBX survivability.
    For fixed-site sensors such as the Upgraded Early Warning Radars 
and Cobra Dane, which are located on military installations, the MDA 
and the multiservice Operational Test Agency Team will leverage 
previous Service assessments (for example, physical security) wherever 
possible. Performance of these long-time operational sensors in various 
environmental conditions is well understood.
    For the AN/TPY-2, the MDA is conducting tests, exercises, and 
analyses of data from actual deployments to characterize the 
survivability of both the forward-based version and the tactical 
version of the radar system. In addition, the tactical version will 
undergo environmental testing in the climatic chamber at Eglin Air 
Force Base, Florida.
    The assessment of sensor survivability is an on-going process. 
Where data voids exist, the MDA will address them as part of the 
current three-phase test review.
    Ms. Tauscher. Due to cost pressures, MDA has removed three flight 
tests from the THAAD flight test program. In the past, DOT&E has raised 
concerns that this action has increased risk to the THAAD program.

      Does it remain your view that MDA's decision to remove 
the three flight tests from the THAAD test program has increased risk 
to the program?

      What specific steps would you recommend for reducing risk 
for the THAAD test program?

    Dr. McQueary. It is still my view that the re-baseline of the 
Terminal High Altitude Area Defense (THAAD) test program increased 
development risk to the program. The reduced number of flight tests, 
combined with the loss of data from FTT-04 as the result of the target 
failure, means fewer opportunities to demonstrate repeatability of 
performance, which raises development risk and lowers confidence in any 
assessments we will make in the future. As it stands today, any loss of 
flight test data will likely require additional flight tests to achieve 
the prescribed knowledge points for THAAD. To the Missile Defense 
Agency's (MDA) credit, when the target failed during FTT-10, the agency 
elected to repeat the flight test (FTT-10a). This decision reflects 
MDA's renewed commitment to an evaluation driven approach.
    The completion of the current three-phase review will provide 
another bottom-up review of test requirements. The key to reduced 
development risk and a successful program remains a commitment to an 
evaluation based strategy that focuses on the information needed to 
form an evaluation rather than a specific number of flight tests.
    Ms. Tauscher. Do you think that the lethality demonstration 
scheduled for 2009 will constitute proof that ABL is operationally 
effective, suitable, or survivable?

      Will additional tests and analyses be required before 
the operational effectiveness or suitability can be determined?

    Dr. McQueary. The Missile Defense Agency (MDA) is building the 
Airborne Laser (ABL) to demonstrate technology, not to demonstrate 
effectiveness, suitability, and survivability. The MDA did not 
structure the technology demonstration program to provide the data 
necessary to make such an operational assessment. Testing leading up to 
the demonstration will concentrate on preparations for achieving a 
successful shoot down.
    Additional tests and analyses will need to be conducted during the 
systems development phase. Questions of effectiveness, suitability and 
survivability are normally addressed during testing of the production 
representative equipment, in this case airframe #2. I cannot draw 
meaningful conclusions about the potential operational suitability, or 
survivability of the ABL based on the program to date and the success, 
or failure, of the demonstration shoot down. Even attempting to relate 
lessons learned from the design, development, and construction of the 
current ABL airframe to future operational effectiveness, suitability, 
and survivability would be conjecture at best. The demonstration 
airframe is strictly a prototype built to demonstrate a technology. A 
single shoot down during a very controlled, non-operational scenario 
will only give a single example of capability at one point in the 
projected operational envelope of the ABL. At the time of the 
demonstration, there will be no data to draw any conclusions about the 
suitability of the ABL. Any survivability conclusions would be 
hypothetical, as the current technology demonstrator program is not 
structured to address ABL survivability issues. One would anticipate 
that the MDA would make many changes in the first developmental ABL, 
airframe #2, in an effort to make it operationally effective, suitable, 
and survivable. I will be better able to answer these questions during 
the development and testing of the first developmental airframe, not 
the current ABL technology demonstrator.
    That said, a successful high-power laser flight demonstration would 
be a major program milestone and could, with additional relevant 
testing, validate the feasibility of employing the current platform in 
support of high energy laser adjunct missions.
    Ms. Tauscher. The Director of Operational Test and Evaluation has 
indicated in its fiscal year 2008 annual report, that testing for the 
MDA is not yielding enough data to support certification of the 
elements at an individual level and at the integrated system level.

      How does MDA plan to ensure that the BMDS is fully 
tested--including operationally effective and suitable--prior to 
continuing production?

      How is MDA working with DOT&E to improve the data that 
DOT&E needs to certify the BMDS and its elements?

      Will MDA continue the approach of concurrent testing and 
fielding under the new block structure?

      Is that approach still necessary given that the 
Presidential directive to field an early capability has been met?

    Dr. McQueary. A combination of flight and ground testing together 
with validated and accredited models and simulations is needed to 
ensure that the Ballistic Missile Defense System (BMDS) is fully tested 
and demonstrated to be operationally effective and suitable. An 
integrated approach that leverages combined developmental and 
operational testing to the maximum extent feasible is essential.
    Based upon the on-going three-phase review, the Missile Defense 
Agency (MDA) is developing a revised Integrated Master Test Plan (IMTP) 
to document test requirements and ensure that they fully accomplish all 
required BMDS testing through the Future Years Defense Plan. The plan, 
once executed, should also provide all the necessary validation data to 
anchor the models and simulations. My staff is working closely with the 
MDA and the multiservice Operational Test Agency Team to ensure that 
the IMTP addresses our data requirements for certifying the BMDS and 
its elements.
    The MDA has sought to balance developmental maturity and production 
stability, technical risks, and costs, to provide a capability to the 
warfighter where none existed. I will recommend certification after the 
system has demonstrated a high probability of accomplishing its mission 
in an operationally effective manner. The decision as to whether or not 
to continue with concurrent testing and fielding of part or all of the 
BMDS is a matter of policy best considered after advice from the 
Chairman of the Joint Chiefs of Staff and the Combatant Commanders. My 
commitment is to provide the Congress and the Secretary of Defense (and 
ultimately through the latter, the warfighter) with the best available 
information upon which to make their decisions.
    Ms. Tauscher. If a missile defense system has the ``technical 
capability'' to shoot down an incoming ballistic missile target, does 
that mean the system is operationally effective, suitable, or 
survivable and has the ability to accomplish the mission?

      What are the differences between technical capability and 
effectiveness, suitability, and survivability?

    Dr. McQueary. Even though a missile defense system may have the 
``technical capability'' to shoot down an incoming ballistic missile, 
it does not necessarily mean that the system is operationally 
effective, suitable, or survivable.
    ``Operational Effectiveness'' is the overall degree of mission 
accomplishment of a system when used by representative personnel in the 
environment planned or expected for operational employment of the 
system considering organization, doctrine, survivability, 
vulnerability, and threat.
    ``Operational Suitability'' is the degree to which a system can be 
satisfactorily placed in field use, with consideration given to 
availability, compatibility, transportability, interoperability, 
reliability, wartime usage rates, maintainability, safety, human 
factors, manpower supportability, logistics supportability, 
documentation, and training requirements.
    ``Survivability'' encompasses the susceptibility and vulnerability of a 
system to a threat and the ability to repair the system following 
threat-induced damage.
    Technical capability is a system's ability to perform a specific 
function or accomplish a specific mission, for example the ability to 
intercept a particular threat in a given flight regime. While not all 
technically capable systems are effective, all effective systems are 
technically capable. Operational effectiveness implies that the system 
will perform as desired across the full battle space against the full 
spectrum of intended threat systems. Similarly, an operationally 
suitable system will perform satisfactorily under the full range of 
conditions, not just under a certain demonstrated subset.
    Ms. Tauscher. To what extent was the GMD system designed to be 
suitable and survivable?

      What specific steps do you believe are necessary to 
increase our confidence in the suitability and survivability of the GMD 
system?

    General O'Reilly. The GMD system continues to mature its 
suitability and survivability capabilities. The DOT&E has specified 
several critical operational issues to characterize suitability and 
survivability, and these are listed in their 2008 Assessment of the 
BMDS, dated January 2009. Suitability is defined in the context of BMDS 
strategic and theater missile defense operations as being (1) 
interoperable, (2) reliable, (3) available, and (4) maintainable.
    Interoperability has been ground and flight tested and the Agency 
continues to demonstrate good interoperability among BMDS sensors (AN/
TPY-2, Sea-Based X-band radar, Upgraded Early Warning Radar--UEWR, and 
the Aegis SPY-1 surveillance and tracking radar). Interoperability with 
C2BMC upgrades includes effective communications between the command 
authorities who authorize engagements and the weapon system operators at 
the fire direction centers. GMD flight test FTG-05, in December 2008, 
successfully demonstrated end-to-end testing of the BMDS system with 
excellent interoperability among all four sensors and led to the 
generation of a weapons task plan, a successful engagement, and an 
intercept.
    GMD is implementing a comprehensive but still limited Reliability, 
Availability and Maintainability (RAM) program, both to quantify 
capabilities and to increase the reliability of the system. The 
current RAM program, initiated in 2006, includes data collection and 
assessments of deployed assets, reliability growth testing of critical 
components, root cause and corrective action for failures, and a 
stockpile reliability program to assess the shelf life of selected 
components. GMD formed a Joint Reliability & Maintainability Evaluation 
Team (JRMET) with the Operational Test Agencies (OTAs) to assess the 
RAM data generated from field assets and test events.
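    For illustration only, a minimal Python sketch of how pooled field 
RAM data of the kind described above might be reduced to a mean-time-
between-failures (MTBF) estimate and an inherent-availability figure; 
the function names and data values are hypothetical assumptions, not 
GMD program data.

        # Illustrative only: hypothetical RAM roll-up, not GMD program data.

        def mtbf(operating_hours, failure_count):
            """Mean time between failures from pooled field data."""
            return float("inf") if failure_count == 0 else operating_hours / failure_count

        def inherent_availability(mtbf_hours, mttr_hours):
            """A_i = MTBF / (MTBF + MTTR), the standard steady-state form."""
            return mtbf_hours / (mtbf_hours + mttr_hours)

        # Hypothetical pooled data for a fleet of fielded components.
        hours, failures, mean_repair_hours = 120_000.0, 3, 48.0
        m = mtbf(hours, failures)
        print(f"MTBF = {m:,.0f} hours")                                     # 40,000 hours
        print(f"A_i  = {inherent_availability(m, mean_repair_hours):.4f}")  # 0.9988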
    Specific steps to enhance confidence in reliability include 
comprehensive testing of all components and implementation of a true 
growth program both to increase reliability and to extend the service 
life of the components. The GMD system has much the same reliability as 
the initial Minuteman system, whose missile is comparable to the GMD 
Ground-Based Interceptor.
    The DOT&E reported that GMD Blocks 1 and 3 are partially suitable 
for their missions based on a very limited database, and that more data 
are required to perform a comprehensive characterization of 
suitability. GMD materiel readiness has been maintained over the past 
36 months to the point that GMD components are readily available to 
meet heightened Readiness Conditions (REDCON) requirements. In 
addition, readiness rates have consistently exceeded GMD program 
material readiness goals and are consistent with legacy missile 
systems.
    Survivability is the degree to which the system can withstand a 
conventional attack and continue to operate in its intended operating 
environment. Therefore, the survivability of the GMD 
system reflects the security of its primary operating locations at 
Schriever AFB, Vandenberg AFB, and Ft. Greely. Additionally, these 
sites have been augmented to meet the security requirements established 
by USSTRATCOM.
    In the area of improvements to Physical Security, GMD is in the 
process of upgrading the Integrated Electronic Security System and 
overall security capabilities. GMD is working with Space & Missile 
Defense Command (SMDC) to definitize requirements that achieve 
necessary system effectiveness to secure the Fort Greely, Alaska site. 
In 2008/2009 GMD installed Ground Surveillance Radars to improve 
detection and plans to increase access delay and denial capabilities in 
2010-2012. As part of this overall upgrade we are also exploring the 
options to harden existing facilities to make them better able to 
withstand direct and indirect attack.
    Survivability of network communications is currently achieved via 
multiple diverse and redundant communications paths provided by both 
satellite and fiber optic links. Ground Systems [GMD Fire Control 
(GFC), GMD Communications Network (GCN), Command Launch Equipment 
(CLE), and In-Flight Interceptor Communications System (IFICS) Data 
Terminal (IDT)] currently support survivability through multiple 
computer processors, communications diversity, and geographic 
redundancy.
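    For illustration only, the availability benefit of diverse, 
redundant communications paths like those described above can be 
sketched with the standard parallel-redundancy relation; the path 
availabilities below are hypothetical assumptions, not measured values 
for any GMD link.

        # Illustrative only: hypothetical path availabilities, not measured values.

        def parallel_availability(path_availabilities):
            """Combined availability of independent redundant paths: 1 - prod(1 - A_i)."""
            unavailability = 1.0
            for a in path_availabilities:
                unavailability *= (1.0 - a)
            return 1.0 - unavailability

        # Assume a satellite path and a fiber path, each 99.5% available and independent.
        print(f"{parallel_availability([0.995, 0.995]):.6f}")  # -> 0.999975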
    GMD is improving the survivability of its interceptors through 
implementation of the Fleet Avionics Upgrade/Obsolescence Program, 
which enhances our current capability to operate in the natural 
operational environment.
    Ms. Tauscher. What is your current level of confidence in the 
ability of the GMD system to successfully intercept a potential long-
range missile launched from North Korea--high, medium, or low?
    General O'Reilly. Our confidence in the GMD system's ability to 
successfully intercept a long-range missile launched from North Korea 
at the United States is high. Our confidence is based on several 
factors: (1) testing of the GMD system, (2) sufficient weapons (ground-
based interceptors) to counter the expected threat, and (3) sufficient 
warfighter interaction with the system to develop effective tactics, 
techniques, and procedures.
    Although limited to North Korean scenarios, GMD flight testing has 
been successful, as evidenced by three GMD flight tests (FTG-02 in 
September 2006, FTG-03a in September 2007, and FTG-05 in December 2008) 
in which we intercepted threat-representative targets on each occasion. 
However, GMD has not tested the battlespace beyond North Korean 
scenarios representing simplistic threats. Our analysis strongly 
suggests countermeasures can be managed during an engagement, and a 
series of countermeasure tests will begin with the next flight test, 
FTG-06, in 4QFY09.
    The warfighting community has participated and continues to 
participate in ground and flight tests. Warfighters also use wargames 
and exercises as opportunities to develop and hone their tactics, 
techniques, and procedures to maximize their ability to prosecute the 
missile defense mission. They are trained and certified with the most 
recent BMDS 
configuration available, and have demonstrated the ability on several 
occasions to activate the system when needed and posture for credible 
and effective operational missile defense of the homeland.
    Ms. Tauscher. What percentage of the currently deployed GMD 
interceptors in Alaska and California would you rate as fully mission 
capable for combat operations at any given time?
    General O'Reilly. [The information referred to is classified and 
retained in the committee files.]
    Ms. Tauscher. Recent GAO reports state the GMD element has 
experienced the same anomaly during each of its flight tests since 
2001. According to the GAO, while the anomaly has not yet prevented the 
program from achieving any of its primary test objectives, GMD has been 
unable to determine the anomaly's source or root cause.

      Please provide an update on the assessment of the 
anomaly, including its potential for causing the interceptor to miss 
its target.

      Please detail how the GMD element has mitigated the 
anomaly and whether all the mitigations have been flight tested and the 
data analysis completed.

      Does this anomaly reduce the confidence in the 
reliability of the emplaced interceptors?

      And has the shot doctrine been changed to provide a 
higher probability of kill?

    General O'Reilly. [The information referred to is classified and 
retained in the committee files.]
    Ms. Tauscher. Given its strategic mission to intercept potential 
nuclear armed long-range ballistic missiles, why weren't GMD 
interceptors designed to operate in nuclear environments?

      Do you have plans to retrofit GMD interceptors to operate 
in nuclear environments?

      What are the costs associated with such an upgrade?

      Has MDA planned and programmed for this?

    General O'Reilly. [The information referred to is classified and 
retained in the committee files.]
    Ms. Tauscher. To what extent has the GMD been tested in harsh 
weather environments (e.g., rain, snow, fog, etc.)?

      What information can be learned from such testing?

      If not, what are your plans to do so?

    General O'Reilly. The GMD system operates in benign environments 
because it is located at fixed sites on U.S. military bases. The 
interceptors are comprehensively verified after emplacement in silos 
whose environments are carefully monitored and controlled. However, the 
GMD system-level components have been tested under harsh environments 
per MIL-STD-1540. These environments include analysis and testing for 
vibration, shock, thermal balance and climatic conditions. This testing 
has provided high confidence in the components' abilities to perform in 
the widest range of harsh environments expected. Using models and 
simulations, the Agency has conducted system-level ground testing 
against threats in conditions of rain, high winds, snow, sleet, and 
other weather conditions. Results indicate that the GMD system will 
meet its requirements.
    The amount of useful information from ground testing and analysis 
is sufficient to characterize system performance across the spectrum of 
conditions expected.
    Ms. Tauscher. What multi-mission events, such as cyber attack or 
other asymmetric attacks of key assets, have been introduced during GMD 
flight testing?

      What multi-mission events are planned to be introduced in 
the future and when?

      How is MDA adjusting its overall information assurance 
plan to address these issues?

      If we have no such plans, why is this lack of threat 
realism acceptable?

    General O'Reilly. All aspects and operating conditions of the GMD 
system undergo intense scrutiny by multiple Department of Defense 
review and test teams to ensure the system is protected from cyber and 
other asymmetric attacks in all operating phases, including analysis of 
performance during flight tests. MDA, in coordination with COCOMs, JTF-
GNO, and NSA, continuously conducts network monitoring and defense in 
order to protect the BMDS.
    Cyber attack simulations or other asymmetric attacks on key BMDS 
assets are not expressly included during developmental flight testing. 
Introducing an anomaly such as a cyber attack rendering a portion of 
the BMDS inoperable during a developmental flight test would pose an 
unacceptable risk of corrupting the test objectives. However, for an 
operational test of military utility, or as 
part of a warfighter's rehearsal or operational readiness drill, 
simulating an attack on the infrastructure would be entirely necessary 
and appropriate.
    To date, cyber-attack scenarios have been simulated for GMD during 
the Assured Response warfighter exercise, and are also planned in 
future Terminal Fury (1 scenario) and Global Thunder (10 scenarios) 
exercises. These scenarios exercise responsiveness to simulated cyber-
attacks. Penetration tests are regularly performed immediately 
following ground-test runs for record. Penetration tests now in the 
planning stages will incorporate defensive operations and procedures in 
response to realistic cyber-attacks.
    As part of developing and fielding BMDS capabilities, MDA performs 
Information Assurance (IA) compliance validation tests to make sure 
BMDS capabilities are IA compliant with DoD standards and can operate 
in a cyber threat environment.
    As part of the ground test program, while the system is still in 
the test configuration, MDA performs penetration testing to determine 
if there are any IA weaknesses that could be exploited by potential 
adversaries.
    During normal day-to-day operations, Blue teams are scheduled to 
perform cyber attacks on selected key assets to determine likely threat 
vectors that could be used against BMDS capabilities.
    The Agency continues to plan for and expand testing to address 
emerging threats consistent with the intent of OSD procedures for OT&E 
of Information Assurance in Acquisition Programs, and we are moving 
towards compliance as our penetration testing capabilities increase. 
Our overall information assurance plan provides for a risk-based 
implementation of procedures and countermeasures. The cyber-threat is 
monitored and analyzed, and those results are made available to GMD and 
other elements through a variety of mechanisms including daily 
summaries and, for GMD, presentations at the quarterly GMD System 
Protection Working Group.
    In addition to the simulated threat, it is worth noting that, from 
a threat mitigation perspective, the Agency works closely with our 
Intelligence Community partners and service counterparts to identify 
the foreign threat to all of our tests--this includes cyber. The exact 
details are classified, but we generally ensure that safeguards are in 
place to identify, and where possible counter, every level of threat, 
including technical.
    Ms. Tauscher. Do you plan to fly the CE1 version of the GMD EKV 
against a target with countermeasures in an intercept?

      If not, why?

      If we don't conduct such a test, how will we have 
confidence that the system will work in a real-time combat situation?

    General O'Reilly. MDA is currently reexamining its flight testing 
program and expects to include additional flight testing of the 
Capability Enhancement CE-I exoatmospheric kill vehicle (EKV). A BMDS 
test review is now underway to determine the complete body of data 
necessary to validate the BMDS models and simulations and the data 
needed to validate operational effectiveness, suitability and 
survivability. The Integrated Master Test Plan will be revised 
following the BMDS test review, and it is expected that testing of the 
CE-I EKV will be accomplished and will include the specific objective 
of discriminating and intercepting a dynamic lethal object from an 
operationally realistic target scene with countermeasures.
    Ms. Tauscher. In fiscal year 2008, due to technical challenges, the 
GMD program was unable to conduct any intercept tests, despite the fact 
that Congress had authorized and appropriated more than $200 million to 
conduct such tests.

      Did the Missile Defense Agency provide the prime 
contractor (Boeing) an award fee for its fiscal year 2008 performance?

      If so, how much and what was the justification for such 
an award?

    General O'Reilly. Boeing was awarded $182.48M (66%) out of a 
potential fee pool of $276.45M for its performance during fiscal year 
2008.
    Boeing's less-than-satisfactory performance during fiscal year 2008 
resulted in the removal of $95.08M from the award fee pool. More 
specifically, the lost fee opportunities were attributed to:

      Failure to achieve any flight test intercepts during 
fiscal year 2008;

      Missed commitments to deploy up to six new interceptors;

      Delayed deployment of new capability to the warfighter;

      Programmatic and budgetary impact within GMD and the 
Agency due to restructures of the integrated ground and flight test 
program; and

      Failure to provide a joint product for establishing a 
common architecture for the Common Avionics Module.

    $33.57M of the $95.08M lost fee opportunity in fiscal year 2008 was 
authorized to be carried over to the fiscal year 2009 award fee period. 
$25M is authorized to be applied to GMD Flight Test-05 (FTG-05), $5M to 
Distributed Ground Test-03 (GTD-03), $3.57M to Sea-based X-Band 
shipyard performance parameters, Simultaneous Test and Operations long 
haul communications and safety certification, and Upgraded Early 
Warning Radar documentation for Transition and Transfer.
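    As a purely arithmetic cross-check of the figures reported above 
(the dollar amounts are those stated in the response; only the check 
itself is added here as an illustrative sketch):

        # Consistency check of the reported award fee figures ($M).
        awarded, pool = 182.48, 276.45
        carryover_parts = [25.0, 5.0, 3.57]   # FTG-05, GTD-03, remaining items

        print(f"{awarded / pool:.0%}")         # -> 66%, matching the stated percentage
        print(f"{sum(carryover_parts):.2f}")   # -> 33.57, matching the stated carryover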
    The Boeing Company significantly contributed to the BMDS mission in 
the following areas:

      Excellent job planning and conducting ground test events 
GTD-02 and GTI-03;

      Exceptional planning and execution of the BMDS system-
level Sensor Characterization Flight Test (FTX-03) and associated data 
analysis even though the target flew an off-nominal trajectory; and

      Noteworthy support of real-world events such as Operation 
Fast Shield.

    Ms. Tauscher. The Director of Operational Test and Evaluation has 
indicated in the fiscal year 2008 annual report that testing for the 
MDA is not yielding enough data to support certification of the 
elements at an individual level and at the integrated system level.

      How does MDA plan to ensure that the BMDS is fully 
tested--including operationally effective and suitable--prior to 
continuing production?

      How is MDA working with DOT&E to improve the data that 
DOT&E needs to certify the BMDS and its elements?

      Will MDA continue the approach of concurrent testing and 
fielding under the new block structure?

      Is that approach still necessary given that the 
Presidential directive to field an early capability has been met?

    General O'Reilly. A. How does MDA plan to ensure that the BMDS is 
fully tested--including operationally effective and suitable--prior to 
continuing production?
    In the on-going three-phase test review, MDA, the DOT&E and the 
BMDS Operational Test Agency Team are defining how operational testing 
attributes can be incorporated within the BMDS test program. As part of 
the review, critical operational issues are driving future test events, 
to include multiple simultaneous engagements, salvo launches, and more 
complex target presentations. MDA is developing detailed test planning 
requirements for meeting a more robust system assessment, with inputs 
from the BMDS OTA Team and the DOT&E. The review participants are 
planning tests with verifiable, quantifiable results, which will take 
place over the next three to four years. The BMDS Integrated Master 
Test Plan will be approved by MDA, the DOT&E and the BMDS Operational 
Test Agency Team and delivered at the end of May.
    MDA works with USSTRATCOM, DOT&E and the Military Departments to 
ensure adequate integrated development and operational testing. MDA has 
sought an appropriate balance between developmental maturity and 
production stability, technical risks and costs, to provide a 
capability to the warfighter where none existed. The goal is to add 
capabilities with demonstrated military utility, as they mature.
    B. How is MDA working with DOT&E to improve the data that DOT&E 
needs to certify the BMDS and its elements?
    One of MDA's highest priorities is to refocus the BMDS test and 
evaluation program to determine what data are needed to validate our 
models and simulations, so that our warfighter commanders, the DOT&E, 
the BMDS OTA Team, and other decision-makers on the Missile Defense 
Executive Board have confidence in the predicted performance of the 
BMDS. The results of the on-going three phase test review will be a 
top-down-driven, event-oriented plan that extends until the collection 
of all identified data is complete.
    The BMDS test review to date confirms our need to significantly 
improve the rigor of the BMDS digital models and simulations of threat 
missiles, the phenomenology, and operational environments. The BMDS 
Integrated Master Test Plan (IMTP) will define the test program that 
will produce the data needed by DOT&E and the BMDS OTA Team to assess 
the BMDS capabilities, and will be signed by the DOT&E and the BMDS OTA 
Team.
    In order to assure close working relationships, the great majority 
of the BMDS OTA Team members are collocated with the MDA testing staff 
in Huntsville, and the MDA Director for Test meets on a bi-weekly basis 
with his counterpart in the Office of the Director, Operational Test 
and Evaluation.
    C. Will MDA continue the approach of concurrent testing and 
fielding under the new block structure?
    No. MDA intends to complete DT/OT prior to development programs 
being considered for fielding and operational acceptance decisions by 
the Services. However, when a contingency need arises (such as 
protection of the U.S. from long-range North Korean missiles) the 
appropriate COCOM Commanders and Joint Chiefs of Staff consider the 
capability and limitations of our developmental systems. If ordered, 
MDA will employ components of the BMDS on a contingency basis.
    D. Is that approach still necessary given that the Presidential 
directive to field an early capability has been met?
    MDA uses a capability-based acquisition process that allows MDA to 
address emerging, real-world threats as expeditiously as possible. Our 
process is based on collaboration with the warfighter community 
throughout development, testing and fielding. The priorities of the 
warfighter are based on the need to respond to real-world threats. The 
results of the Joint Capability Mix Phase II study are evidence that 
DoD supports the importance of responding to the threat quickly.
    In some cases, such as GMD, we fielded limited capabilities to 
protect the Nation where portions of the system performance had been 
demonstrated in early tests. MDA fielded parts of GMD to provide a 
limited capability, and we continue to test in parallel for a full 
capability. In other cases, such as Aegis BMD 3.6.1, we have fielded an 
operational capability that has been tested and evaluated by the Navy's 
COMOPTEVFOR, and continue to field additional capabilities for optimum 
BMDS integration and multi-area-of-responsibility use. With THAAD, we 
developed an initial capability that we have demonstrated against most 
short range threats, but have just begun our test campaigns to address 
medium range threats. MDA plans to continue this approach, to provide 
critical capability in increments to the warfighter based on their 
priorities.
    Ms. Tauscher. To what extent have the BMD sensors been tested in 
harsh weather environments (e.g., rain, snow, fog, etc.)?

      What information can be learned from such testing?

      If you have not conducted such testing, when do you plan 
to initiate such testing?

    General O'Reilly. Sea-Based X-Band (SBX) radar and AN/TPY-2 radars:
    A. SBX equipment and procedures were thoroughly tested in a 
detailed Winter Shakedown test from 3 Jan through 20 Feb 07 in the 
harsh northern Pacific Ocean, to include wave heights up to 50 feet and 
sustained winds of 60 knots, gusting to 102 knots. The assessment 
demonstrated payload performance in Alaska environments; safety at sea; 
vessel navigation; sustainment operations; and COCOM and external 
agency interoperability.
    B. The AN/TPY-2 radar has been performing very well in austere 
environments in both Japan (since 2006) and Israel (since 2008).
    C. Cobra Dane and Upgraded Early Warning Radars were designed, 
built, and tested by the Air Force. The operating frequencies of UEWR 
(UHF) and CDU (L-Band) are minimally affected by weather environments 
(e.g., rain, snow, fog). The UEWR and CDU use the same external 
facilities (e.g., array face, structure) that housed the Early Warning 
radars and COBRA DANE. The facilities have been in place for more than 
20 years and have successfully operated and survived in all 
environments during that period. Therefore, specific weather related 
testing is unnecessary for UEWR and CDU.
    Ms. Tauscher. To what extent was the Sea-based X-Band radar 
designed to be survivable?

      Have you tested and run exercises to understand this 
issue?

      What about other BMDS sensors?

    General O'Reilly. [The information referred to is classified and 
retained in the committee files.]
    Ms. Tauscher. DOT&E's latest report indicated that target 
reliability was a continuing problem in 2008. For example, in two 
recent flight tests, FTX-03 and FTG-05, target missiles did not 
successfully deploy the planned countermeasures, which prevented the 
elements from developing algorithms needed for advanced discrimination. 
DOT&E reported that until these target problems are solved, they pose 
a risk to future flight tests using countermeasures.

      Please provide a status on developing advanced algorithms 
for discrimination.

      What capability does MDA currently possess for 
discrimination? How was this capability verified?

      Will additional flight tests need to be scheduled to 
understand the discrimination capabilities of the currently fielded 
interceptors?

    General O'Reilly. [The information referred to is classified and 
retained in the committee files.]
    Ms. Tauscher. Last year, the operational test authorities 
accredited the models for Aegis BMD version 3.6.

      Does the Aegis BMD do modeling and simulations 
differently from other BMDS elements?

      If so, what are the key differences?

      Are there lessons from the Aegis BMD modeling and 
simulation program that could be applied across the BMDS, particularly 
to the GMD system?

    General O'Reilly. The primary difference in accreditation status 
arose when the Navy's Commander, Operational Test and Evaluation 
Force (COMOPTEVFOR) accredited Aegis element models, primarily MEDUSA, 
for Navy operational effectiveness. In the MDA system-level 
performance simulation venue, Performance Assessment 2007 (PA07), 
EADSIM was used as the Aegis 3.6 representation, which the MDA 
Operational Test Agency (OTA) did not accredit for BMDS system 
performance, primarily due to model limitations.
    In Performance Assessment 2009 (PA09), the Navy is using MEDUSA as 
the Aegis representation; PA09 will be the first opportunity for the 
OTA to evaluate the performance of the Aegis MEDUSA model during an 
MDA system-level performance event.
    Aegis BMD has a successful test program that provides numerous 
opportunities to collect test data to validate its M&S 
representations. This is its main advantage, and one that can be shared 
with GMD. The lesson learned is that a lack of test data, especially 
from flight tests, does not allow the OTA to accredit the 
representations. I have placed personal emphasis and scrutiny on 
tightly linking test events for the elements and the BMD system to the 
validation of MDA models and simulations.
    Ms. Tauscher. System-level performance assessments are a 
comprehensive means to fully understand the performance capabilities 
and limitations of the BMDS. In order to have high confidence in 
system-level models and simulations, MDA relies on an independent 
entity, the BMDS Operational Test Agency, to provide an accreditation. 
This organization depends on the verification and validation work 
performed by MDA's elements. Accreditation is an official decision of 
how much confidence there is in a model or simulation used in the 
performance assessment. Currently, the BMDS Operational Test Agency has 
fully accredited 6 out of 40 models and simulations, which are used for 
annual performance assessments. MDA intends to complete Performance 
Assessment 2009 by the end of the calendar year, but it is highly 
unlikely that this performance assessment will be fully accredited.

      What is MDA doing to make progress in validating models 
and simulations?

      When do you anticipate that MDA will have a fully-
accredited, system-level performance assessment?

    General O'Reilly. The Missile Defense Agency (MDA) has implemented 
a Modeling and Simulation (M&S) System Post Flight Reconstruction 
(SPFR) program to better leverage the performance data gained through 
flight testing for model validation. During SPFR assessments, 
BMDS M&S representations are exercised under day-of-flight conditions 
to compare model performance to actual system performance. For calendar 
year 2009, MDA will implement SPFR validation assessments for system 
level flight tests in both Hardware-in-the-Loop (HWIL) and end-to-end 
digital performance assessment representations.
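    To make the idea of anchoring models to flight data concrete, the 
minimal sketch below compares a model-predicted track against day-of-
flight telemetry by computing a root-mean-square residual, the kind of 
quantitative agreement an SPFR-style assessment would examine. The data 
and names are hypothetical illustrations, not MDA tools or results.

        # Illustrative only: comparing predicted vs. observed states.
        import math

        def rms_residual(predicted, observed):
            """Root-mean-square difference between model output and telemetry."""
            assert len(predicted) == len(observed)
            return math.sqrt(
                sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(predicted)
            )

        # Hypothetical downrange positions (km) at matching time steps.
        model_track     = [10.0, 42.5, 96.1, 170.4]
        telemetry_track = [10.2, 42.1, 96.9, 169.8]

        print(f"RMS residual: {rms_residual(model_track, telemetry_track):.2f} km")  # ~0.55 km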
    The BMDS is a capability-based, continuously evolving architecture--
a spiral development process. Each delivery of a new missile defense 
capability requires the delivery of new models and simulations. The 
delivery of new models and simulations requires additional 
accreditation. Thus, as new versions of components emerge, their 
modeling and simulation representations must be anchored back to real-
world events and data. Utilizing the SPFR program, and through analysis 
of the models and simulations database, the level of accreditation and 
confidence in the representation of BMDS performance will continue to 
increase. As we complete the Performance Assessment 2009 (PA09) effort 
in late Calendar Year 2009, we will complete accreditation review, 
based on OTA criteria, of the models that represent the December 2009 
BMDS configuration. Any model structure or real-world validation data 
shortcomings identified in this process will be addressed through 
anchoring back to real-world events and data when available. Validation 
data requirements will be provided to the test planning process. The 
date for completing accreditation of models of the December 2009 BMDS 
configuration will not be precisely known until this accreditation 
review is complete. The PA09 model ensemble, which represents the BMDS 
December 2009 configuration, will be maintained and improved to address 
any shortcomings identified in the accreditation process. The Agency is 
restructuring the test program to provide data for Modeling & 
Simulation (M&S) Validation. The M&S Verification and Test Design 
Process will allow for collection of data parameters through flight and 
ground tests. As part of this on-going effort, the system level 
simulations, Digital Simulation Architecture (DSA) and the Single 
Stimulation Framework (SSF), will provide a fully capable 
representation of the fielded 2010 BMDS configuration in October of 
2010. The data to support Verification, Validation & Accreditation 
(VV&A) of the DSA and SSF is being addressed as part of the scheduling 
activity during Phase III of the M&S Test Verification and Design 
Process. The product of the Phase III activity is a revised BMDS 
Integrated Master Test Plan (IMTP) which identifies the test events 
providing M&S validation data. The schedule for completing BMDS 
Block Validation and Accreditation will be finalized in conjunction 
with the revised IMTP.
    Ms. Tauscher. In its Fiscal Year 2008 Annual Report to Congress, 
DOT&E noted that theater missile defense systems (e.g., Aegis BMD, 
THAAD, and PAC-3) continued to make progress, while strategic systems 
(e.g., GMD) continue to face challenges with regard to testing.

      What are the key reasons for these differences?

      To what extent has the success of Aegis BMD and THAAD been a 
result of using their original operational requirements documents to 
guide their testing and development?

      Are there lessons from the Aegis BMD and THAAD programs 
that we could apply to the GMD program?

    General O'Reilly. What are the key reasons for these differences?
    The relative level of maturity between the programs is the key 
reason for differences noted in the DOT&E Annual Report. GMD was an 
advanced concept program in 2002, when National Security Presidential 
Directive-23 directed MDA to deploy a set of initial missile defense 
capabilities beginning in 2004. GMD early development assets were 
placed into operational service to provide this initial capability. 
Continuing the spiral development process, while at the same time 
responding to real world demands, has slowed some of GMD's planned 
progress. For example, as a first priority, the initial GMD test program 
focused on establishing confidence that the system would in fact meet 
the challenges of the early threat.
    In contrast, the first Aegis ship was commissioned in 1983. 
Starting in 2002, MDA developed the necessary modifications to add 
Aegis Ballistic Missile Defense capability into an already existing 
Aegis fleet. Sound systems engineering in support of performance 
cornerstones was and remains essential to how the Aegis project, and 
now Aegis BMD, organizes and executes the ballistic missile defense 
mission. A combination of development, system engineering, integration, 
testing, training, logistics, technical support, operations and 
sustainment has been operating successfully for close to forty years. 
Leadership, to include communication, responsibility, authority and 
accountability, is a hallmark of the Aegis BMD program.
    THAAD was defined as a program in 1992 and went through an eight-
year concept definition phase before entering full-scale development in 
2000, and is expected to deliver its first operational assets later 
this year.
    Both Aegis and THAAD were significantly more mature programs at the 
time MDA was created and given the mission to provide Limited Defensive 
Capability through accelerated development, testing and deployment of 
the GMD system.
    To what extent has the success of Aegis BMD and THAAD been a result 
of using their original operational requirements documents to guide 
their testing and development?
    Greater system maturity, not the existence of operational 
requirements documents, accounts for the greater success of the Aegis 
and THAAD test programs. The Secretary of Defense cancelled all missile 
defense Operational Requirements Documents in 2002. Since then, MDA 
specification documents and test plans have guided development and 
testing for GMD, Aegis BMD and THAAD. For each element, testing under 
operationally realistic conditions is an important part of maturing the 
BMDS. The MDA has been fielding test assets in operational 
configurations in order to conduct increasingly complex, end-to-end 
tests of the system. Comprehensive ground tests of the elements and 
components precede each flight test. MDA increasingly introduces 
operational realism into BMDS flight tests, bound only by consideration 
of and compliance with environmental and safety regulations.
    Aegis BMD uses a number of different BMDS and Aegis BMD documents 
for testing and development. However, system maturity and the Aegis BMD 
test program philosophy drive their success rate. Throughout its 
development, Aegis BMD has employed a deliberate, rigorous and 
disciplined technical approach to testing. There is tight coupling of 
modeling and simulation, ground testing and flight tests. Modeling and 
simulation are anchored with ground and flight test data. The Aegis BMD 
philosophy of ``test as we fight'' institutes operational realism in 
all flight tests. Aegis BMD involves the operational test agent and 
warfighter in the early planning and conduct of each mission. Following 
each mission, critical Fleet feedback is provided to engineering 
development.
    The THAAD Project Office had a JROC approved Operational 
Requirements Document (ORD) on 1 May 2000. The ORD was the principal 
tool to guide the THAAD Project Office through the design phase. The 
ORD was used to conduct requirement trades for the System Preliminary 
Design Review in 2002 and was used as a guide for System Critical 
Design Review (CDR) in 2003. The ORD was not used to write critical 
operational issues and criteria for use in current test designs and 
operational assessments.
    Are there lessons from the Aegis BMD and THAAD programs that we 
could apply to the GMD program?
    GMD has drawn some lessons learned from Aegis BMD. In 2005 there 
were two early GMD flight test failures attributable to flaws, first in 
the software, and then with a fixture in a test silo. A mission 
readiness task force was established to set standards for rigor in test 
reviews. Drawing on lessons learned from the Aegis test program, these 
standards were applied not just to GMD, but implemented throughout MDA, 
and have resulted in successful flight tests since that point. MDA 
encourages staff at all levels to collaborate and apply lessons 
learned both within their elements and across the board to improve 
mission success.
    Ms. Tauscher. In 2009, MDA plans to demonstrate the ABL during a 
lethality demonstration in which the system will attempt to shoot down 
a short-range ballistic missile. The KEI element also has a key 
decision point--a booster flight test--scheduled for 2009. In 
preparation for this test, the program conducted static fire tests and 
wind tunnel tests in fiscal year 2007 to better assess booster 
performance. Upon completion of KEI's 2009 booster flight test and 
ABL's 2009 lethality demonstration, MDA will compare the progress of 
the two programs and decide their futures.
      Do you believe that the lethality demonstration scheduled 
for 2009 will constitute proof that an operational ABL is feasible and 
should be acquired for the boost phase system?
    General O'Reilly. No. The lethality demonstration is necessary but 
not sufficient to determine if the ABL should be acquired. A successful 
lethality demonstration by the ABL will answer two vital questions. First, 
is the technology ready? A successful demonstration will prove the 
technology is available to engage and destroy a ballistic missile 
during a missile's most vulnerable phase before a payload can be 
employed or countermeasures can be deployed. Second, is the lethality 
concept feasible? A successful demonstration will increase the value of 
a layered missile defense while reducing the viability and 
effectiveness of enemy ballistic missiles. DoD's intent is to continue 
to test and comprehensively assess the current prototype ABL as a 
research test bed while refining the design prior to a Tail 2 
production decision in the future.
    Ms. Tauscher. The 2009 lethality demonstration is a key knowledge 
point for the ABL. Upon completion of the demonstration, MDA will 
decide the future of the program. Even with a successful demonstration, 
MDA will need to determine whether an operationally effective and 
suitable ABL can be developed.
      Given that the 2009 lethality demonstration is successful 
and the ABL continues through development and into fielding, how does 
MDA plan to proceed with the development of the system's unique 
operations and support requirements?
      Starting with the lethality demonstration, please lay out 
the key questions that must be answered on ABL in order for it to be 
considered technically practical (that is, it can do its job reliably), 
operationally practical (that is, it can actually be where it needs to 
be and when), and practically supportable (that is, its maintenance can 
be performed by military personnel, its maintenance and support is 
affordable, and the basing/support operation is feasible in terms of 
human safety)?

      How does MDA plan to minimize the difficulty of 
relocation and unique handling difficulties associated with the ABL?

      What safety concerns currently exist with the ABL?

    General O'Reilly. Q1) How does MDA plan to proceed with the 
development of the system's unique operations and support requirements?
    A1) MDA will work closely with Air Combat Command (ACC) to refine 
the Concept of Operations (CONOPS). ACC developed the current ABL 
CONOPs (January 2007) and has been instrumental in providing user 
requirements into the ABL element of BMDS.
    The ABL aircraft is a Boeing 747-400F that requires minimal ABL 
airframe-unique maintenance and support. These aircraft are in service 
throughout the world with a robust support structure. The weapon system 
utilizes commercially available chemicals (hydrogen peroxide, salt, 
chlorine, ammonia, iodine and helium) that are available globally. ABL 
has developed a prototype deployable/transportable chemical mix 
facility that will allow the manufacturing of laser fuel at any Forward 
Operating Location. A demonstration of ABL's ability to deploy will 
occur after the lethality demonstration. ABL will continue to mature the life-
cycle operation and support plans through continued tests, studies and 
user inputs. ABL will meet warfighter operational and support 
requirements and provide a globally deployable capability.
    Q2) What are the key questions that must be answered in order for 
ABL to be considered technically practical (that is, it can do its job 
reliably)?
    Key questions include: Is the ABL lethal against all classes of 
ballistic missiles? Can ABL detect and track ballistic missile 
threats? Can ABL compensate for atmospheric effects 
between ABL and the target? Does the ABL provide a capability that 
meets warfighter needs?
    ABL will address the most critical question of lethality during 
ABL's lethality test/demonstration against a threat-representative 
ballistic missile in Fall of 2009. Data from this demonstration can 
then be used to anchor models to predict lethality against different 
types/classes of missiles. Successful completion of this test will also 
demonstrate ABL's ability to detect and track ballistic missiles.
    The most technically challenging requirement for ABL has been 
compensating for the atmospheric effects between ABL and the target. 
ABL repeatedly demonstrated its ability to perform this critical 
function in 2006 during dozens of flight tests against special aircraft 
designed to assess ABL's atmospheric compensation capability.
    After the lethality demonstration, MDA will continue to generate knowledge 
concerning ABL by addressing military utility issues. These follow-on 
ABL efforts will develop and demonstrate more robust capability within 
the ABL design and address key warfighter requirements such as: BMDS 
interoperability, weapon system maneuverability, ABL deployability, 
survivability, Reliability, Maintainability and Sustainability (RM&S). 
Key technology areas that will be addressed are improving performance 
of the optics, optimum chemical utilization, increased laser power, 
modularization of the onboard chemical supply system, and increased 
beam quality. In combination with the existing RM&S program, these 
efforts will ensure that future ABLs provide a revolutionary 
warfighting capability.
    Q3) What are the key questions that must be answered in order for 
ABL to be considered operationally practical (that is, it can actually 
be where it needs to be and when)?
    Key questions include: Is ABL deployable? Is ABL maintainable at a 
Forward Operating Location? Are the laser chemicals available at 
Forward Operating Locations?
    Actions to address questions: According to the ABL Concept of 
Operations, operational ABLs will be primarily based in the continental 
United States. Operational ABLs will be able to deploy to Forward 
Operating Locations worldwide. The deployment of ABL is dependent on 
the threat. ACC and STRATCOM will utilize current intelligence to ensure 
ABL, as a critical component of the BMDS, is in the appropriate theater 
of operations to engage targets to defend U.S. interests, and to 
provide critical tactical information to other BMDS components. ABL 
plans to demonstrate its deployment capability after the lethality 
demonstration. Deployment requirements have been documented, a 
prototype chemical mix facility has been built and some unique support 
equipment is available to support deployment. ABL will utilize common 
support equipment available at bases that already support heavy 
aircraft. This will reduce the size of the deployment package. During 
deployments, required maintenance will be accomplished on the flight 
line or in maintenance facilities located just off the flight line. 
Deployed ABLs will have technical support and supply reach-back to the 
home bases to ensure operational availability. Continued collection and 
analysis of ABL sustainment data will ensure refinement of operational 
support needs. ABL will have the ability to deploy, with its entire 
support package, anywhere in the world within 72 hours.
    Q4) What are the key questions that must be answered in order for 
ABL to be considered practically supportable (that is, its maintenance 
can be performed by military personnel)? Are its maintenance and 
support affordable? Is the basing/support operation feasible in terms 
of human safety?
    Key questions to determine whether ABL is practically supportable 
by military personnel include: Are ABL maintenance requirements 
significantly different from those of other complex weapon systems? 
Are the maintenance tasks required for normal ABL operations 
within the capability of Air Force maintenance personnel?
    ABL maintenance by military personnel: ABL, like other complex 
weapon systems, will have multiple levels of maintenance. Future ABL 
logistics support will be a mix of contractor support (depot and 
complex repairs) and organic Air Force support. As the program 
continues to gain knowledge via the existing ABL, we will be better 
able to determine the appropriate level of support required by both Air 
Force and contractor support. For deployed ABLs, Air Force maintenance 
personnel will perform aircraft maintenance and basic weapon system 
maintenance. ABL, with the support of ACC, has performed initial 
studies on the various skills required to operate and maintain the 
weapon system. ACC will continue to be actively involved in development 
of ABL maintenance. ABL will further refine maintenance requirements 
during the Characterization and Capabilities Demonstration period 
following the lethality demonstration.
    Key questions to determine whether ABL's maintenance and support 
are affordable: Does the ABL program have a Reliability, Maintainability 
and Sustainability program in place? Does the RM&S program address 
life-cycle cost of maintenance and associated support? Are efforts in 
place or planned to reduce cost drivers?
    ABL maintenance and support affordability: During the Characterization 
and Capabilities Demonstration period, ABL will perform affordability 
studies to include maintenance and support of ABL within the BMDS 
layered defense environment. The key components that will be addressed 
during this period are the life-cycle cost drivers. The studies 
undertaken by the ABL System Program Office and industry partners will 
utilize the existing Reliability, Maintainability and Sustainability 
(RM&S) program to evaluate factors that contribute to life-cycle costs. 
Management of these life-cycle cost drivers will provide the efficient 
and effective support of future ABLs.
    Key question to determine whether the ABL basing/support operation is 
feasible in terms of human safety: Does the ABL safety program 
adequately mitigate potential personnel hazards associated with 
operation and maintenance of ABL?
    Safe ABL operation and maintenance: At Edwards AFB, ABL has 
successfully demonstrated the ability to safely support all ABL 
operations since testing of the high power laser in 2004. This 
excellent safety record is a result of the ABL safety program. We will 
utilize this safety program wherever ABL is located. ABL has performed 
initial deployment studies addressing issues related to safe operation 
and maintenance at forward operating locations. ABL has planned a 
deployment demonstration to show its ability to conduct safe 
operations at various U.S. and foreign bases.
    Q5) How does MDA plan to minimize the difficulty of relocation and 
unique handling difficulties associated with the ABL?
    A5) Operational ABLs will be able to deploy to Forward Operating 
Locations worldwide. These locations can be at any operational 
location where heavy cargo aircraft operate. Runways, taxiways, and 
instrument approach requirements are similar to those for other heavy 
aircraft. ABL support equipment comprises typical military and 
commercial aircraft ground support equipment (air conditioning, 
auxiliary power, etc.) and equipment that is particular to ABL. ABL 
will utilize common support equipment available at bases in-theater 
wherever possible to reduce the size of the deployment package. ABL 
will require a limited amount of specialized support equipment to 
service the laser weapon and mix laser chemicals. ABL has developed and 
demonstrated a prototype deployable chemical mixing facility. The 
chemicals required to operate the High Energy Laser are globally 
available in commercial markets (chlorine, hydrogen, ammonia, iodine 
and helium). Pre-positioning ABL laser fuels at pre-designated FOLs 
will ensure the quality and immediate availability of chemicals upon 
arrival of ABL aircraft. The laser fuel 
chemicals have an unlimited shelf-life prior to mixing. Air mobility or 
sea transportation can then be used to replenish those laser fuels not 
locally available. Transport of specialized equipment will require 
military airlift. The amount of deployed equipment will depend on the 
operational scenario, but will be smaller than the footprint of the 
actual 747 aircraft.
    Q6) What safety concerns currently exist with the ABL?
    A6) Current ABL safety concerns are categorized into ten hazard 
areas--these include chemical containment on and off the aircraft, fire 
suppression on and off the aircraft, degradation of critical structures 
and critical systems, degradation of flying qualities, solar avoidance, 
incorrect pointing, and beam containment. The program utilizes a 
rigorous four-pronged safety approach to identify, assess, and mitigate all 
safety concerns. First, the Program Office identifies the level of risk 
associated with each hazard area, prior to each test series. These 
risks are then accepted at the proper level within the MDA. Second, the 
Air Force Flight Test Center at Edwards AFB identifies, documents, and 
accepts any risks associated with testing the system safely. Third, the 
Boeing Commercial Aircraft Group assesses the system to determine the 
safety implications from modifications made to the `green aircraft' and 
subsequently issues a `safe-to-fly' letter prior to each flight test 
series. Lastly, an Executive Independent Review Team assesses ABL's 
compliance with airworthiness standards and assesses the safety of 
flight test risks and hazard mitigations; they also provide a `safe-to-
fly' recommendation prior to each flight test series.
                                 ______
                                 
                  QUESTIONS SUBMITTED BY MR. HEINRICH

    Mr. Heinrich. BMDS test schedules are driven by the costs and 
availability of targets. MDA has also experienced a number of failures 
with targets over the past several years. The lack of affordable threat 
representative targets is seriously impacting the adequacy of 
operationally realistic flight testing.

      What is MDA doing to ensure affordable targets are 
available to support adequate developmental and operational test 
objectives?

    General O'Reilly. MDA is taking a number of steps beginning this 
year to ensure targets are more affordable and available, including:

      Identifying cost drivers in requirements and challenging 
their need or identifying other/cheaper ways to obtain data (e.g., 
range sensors already participating in the test).

      Reviewing and refining the acquisition strategy, 
identifying industry capabilities and considering alternative 
approaches for supporting the test program. In the industry request for 
information released in January, we emphasized that we will focus on 
target reliability, affordability, flexibility, and threat 
representation. Based on the results of industry input and the 
performance of current target development programs, MDA will determine 
this summer if recompeting current target contracts is warranted.

      Evaluating test campaigns, grouping by threat and by 
range, and improving efficiencies in mission planning and execution.

      Increasing quantity buys for economies of scale.

      Increasing the availability of targets by providing a 
rolling spare for each mission.

      Improving the long-term requirements development process 
to allow better target planning across the Future Years Defense 
Program to reduce perturbations in target requirements (a major target 
cost/schedule driver).

    Mr. Heinrich. A recent study indicated that the Army and Sandia 
National Laboratory provided MDA with targets on time and at a 
reasonable cost before the target management and procurement efforts 
were moved to MDA.

      Has MDA considered giving the targets development and 
acquisition program back to the Army Targets Office and Sandia National 
Laboratory?

      What are the pros and cons of giving responsibility for 
development and acquisition of missile targets program back to the Army 
Targets Office?

    General O'Reilly. Target production is an integral part of the MDA 
test mission, and there are no ongoing discussions with the Army 
about its assuming this function. Targets procured from the Army 
came primarily through a Missile Defense Targets Office whose personnel 
were directly funded by MDA and which used Space and Missile Defense 
Command contract vehicles. Those target vehicles were shorter range, 
lower fidelity targets and did not represent threat capabilities as 
accurately as the target configurations currently under development by 
MDA. Lower fidelity targets are, by their very nature, less costly to 
develop or manufacture. With the establishment of an MDA targets 
office, Army personnel have either become MDA employees or found other 
employment, and the Army contract vehicles have lapsed or were 
transferred to MDA. In addition, targets from Sandia National 
Laboratories (SNL) have come under increased scrutiny with the failure 
of the last two SNL targets (FTG-05 and FTX-03) to deliver associated 
objects. As a result, not all critical test objectives were met, even 
though most FTX-03 test objectives were achieved and FTG-05 resulted 
in a successful intercept.
    MDA is assessing recent inputs from industry and other sources in 
response to an MDA Request for Information to determine the need to 
recompete target delivery contracts. All respondents, including SNL, 
are being considered.
    Mr. Heinrich. Given the greater complexity, difficulty, and risk of 
intercepting missiles during the midcourse phase, how important do you 
think it is to invest in technologies focused on boost phase?
    General O'Reilly. [The information referred to is classified and 
retained in the committee files.]