HR001122S0039 Amendment 01
PART I: OVERVIEW INFORMATION
Federal Agency Name – Defense Advanced Research Projects Agency (DARPA),
Information Innovation Office (I2O)
Funding Opportunity Title – Assured Neuro Symbolic Learning and Reasoning
(ANSR)
Announcement Type – Initial Announcement
Funding Opportunity Number – HR001122S0039
Catalog of Federal Domestic Assistance Numbers (CFDA) – 12.910 Research and
Technology Development
Dates
o Posting Date: June 1, 2022
o Proposers Day: June 1, 2022
o Abstract Due Date and Time: June 13, 2022, 12:00 noon (ET)
o Questions Due: June 24, 2022, 12:00 noon (ET)
o Proposal Due Date and Time: July 26, 2022, 12:00 noon (ET)
Agency contact
o Points of Contact
The BAA Coordinator for this effort can be reached at:
Email: ANSR@darpa.mil
DARPA/I2O
ATTN: HR001122S0039
675 North Randolph Street
Arlington, VA 22203-2114
PART II: FULL TEXT OF ANNOUNCEMENT
I. Funding Opportunity Description
The Defense Advanced Research Projects Agency (DARPA) is soliciting innovative proposals in
the following areas of interest: Artificial Intelligence (AI) algorithms and architectures that
deeply integrate symbolic reasoning with data-driven machine learning to create robust, assured,
and, therefore, trustworthy AI-based systems. Proposed research should investigate innovative
approaches that enable revolutionary advances in science, devices, or systems. Specifically
excluded is research that primarily results in evolutionary improvements to the existing state of
practice.
Autonomy and highly autonomous systems are a desired capability for many Department of
Defense (DoD) missions – Intelligence, Surveillance and Reconnaissance (ISR), Logistics,
Planning, Command and Control among others. The purported benefits are many, including – (1)
improved operational tempo and mission speeds; (2) reduced cognitive demands on the warfighter in the
operation and supervision of autonomous systems; and (3) increased standoff for improved
warfighter safety. A crucial desideratum associated with autonomy is the need for
trustworthiness and trust, as emphasized by the 2016 Defense Science Board (DSB) Report on
Autonomy1. Informally, trust is an expression of confidence in an autonomous system’s ability to
perform an underspecified task. Assuring that autonomous systems will operate safely and
perform as intended is integral to trust, which is key to DoD’s success in adoption of autonomy.
In the six years since the publication of the DSB report on Autonomy, significant improvements
have been made in machine learning (ML) algorithms that are central to achieving autonomy.
Simultaneously, innovations in assurance technologies have delivered mechanisms to assess the
correctness, safety, and trustworthiness of systems at design time and to provide resilience at operation
time. In spite of this progress, high levels of autonomy remain elusive, which we attribute to
fundamental limitations of data-driven ML (discussed below), motivating new thinking and
approaches that will take ML beyond data-driven pattern recognition and augment it with
knowledge-driven reasoning that includes context, physics, and other background information.
The last decade witnessed tremendous progress in applications of data-driven ML, fueled by
growth in compute power and data, in areas that span a wide spectrum ranging from board games
to protein folding, language translation to medical image analysis. In several of these
applications, ML and related techniques have demonstrated performance that rivals, and
occasionally surpasses, human capability with respect to a set of narrowly curated metrics.
However, despite these apparent successes, there are a number of concerns associated with state-
of-the-art (SOTA) ML algorithms. It is well known, for example, that SOTA ML algorithms do
not generalize well2, lack transparency and interpretability, and are not robust to environmental3
and adversarial perturbations. Some of the limitations, such as a lack of robustness to adversarial
examples, have been theoretically determined to be fundamental4 in nature.
The prevailing trend in industrial ML research is towards scaling up to Giga- and Tera-scale
models (hundreds of billions of parameters) as a means to improve accuracies and performance.
These trends are not sustainable because of the extremely high computational5 and data needs for
training such models, as well as scaling laws6. These trends are also not responsive to the needs
of DoD applications, which are typically data- and compute-starved with limited access to cloud-
scale compute resources. Furthermore, DoD applications are safety and mission-critical, need to
operate in unseen environments, need to be auditable, and need to be trustable by human
operators. In sum, the prevailing trends in ML research are not conducive to the assurability and
trustworthiness needs of DoD applications.
The traditional approaches to building intelligent applications and autonomous systems7,8 rely
heavily on knowledge representations and symbolic reasoning. For example, complex decision-
making in these approaches is often implemented with programmed condition-based rules,
stateful logic encoded in finite state machines, and physics-based dynamics of environments and
objects represented using ordinary differential equations. There are numerous advantages of
these classical techniques:
they use rich abstractions that are grounded in domain theories and associated
formalisms and that are supported by advanced tools and methods (Statecharts,
Stateflow, Simulink, etc.);
they can be modular and composable in ways supported by software engineering
practices that promote reuse, precision, and automated analyses; and
they can be analyzable and assurable in ways supported by formal specification and
verification technologies that have been demonstrated in hardening mission and
safety critical systems against cyber attacks9.
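To make the flavor of such classical encodings concrete, the following is a minimal illustrative sketch (not drawn from this BAA) of stateful decision logic captured as an explicit finite state machine; the modes, events, and transitions are hypothetical.

```python
# Minimal sketch: stateful decision logic encoded as an explicit finite state
# machine, in the spirit of the classical approaches described above.
# States, events, and transitions are hypothetical examples.
from enum import Enum, auto

class Mode(Enum):
    LOITER = auto()
    SCAN = auto()
    AVOID = auto()
    RETURN_TO_BASE = auto()

# Transition table: (current mode, event) -> next mode.
TRANSITIONS = {
    (Mode.LOITER, "area_assigned"): Mode.SCAN,
    (Mode.SCAN, "obstacle_detected"): Mode.AVOID,
    (Mode.AVOID, "obstacle_cleared"): Mode.SCAN,
    (Mode.SCAN, "low_fuel"): Mode.RETURN_TO_BASE,
    (Mode.LOITER, "low_fuel"): Mode.RETURN_TO_BASE,
}

def step(mode: Mode, event: str) -> Mode:
    """Return the next mode; undefined events leave the mode unchanged."""
    return TRANSITIONS.get((mode, event), mode)

# Because the transition table is an explicit, finite artifact, properties such as
# "low_fuel always leads to RETURN_TO_BASE from LOITER or SCAN" can be checked exhaustively.
assert all(step(m, "low_fuel") == Mode.RETURN_TO_BASE for m in (Mode.LOITER, Mode.SCAN))
```

It is exactly this explicitness that makes such representations amenable to the automated analysis and assurance tooling mentioned above.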
2 Recht et al, “Do ImageNet Classifiers Generalize to ImageNet?,” arXiv, 2019, https://arxiv.org/abs/1902.10811.
3 Zhao et al, “Sim-to-Real Transfer in Deep Reinforcement Learning for Robotics: a Survey,” arXiv 2021,
https://arxiv.org/abs/2009.13303.
4 Shafahi et al, “Are Adversarial Examples Inevitable?”, arXiv, 2020, https://arxiv.org/abs/1809.02104.
5 https://cset.georgetown.edu/wp-content/uploads/AI-and-Compute-How-Much-Longer-Can-Computing-Power-
Drive-Artificial-Intelligence-Progress.pdf.
6 Thompson et al, “Deep Learning’s Diminishing Returns”, IEEE Spectrum, September 2021, Part of special report
on The Great AI Reckoning - Deep learning has built a brave new world—but now the cracks are showing
https://spectrum.ieee.org/deep-learning-computational-cost.
7 Schrage et al, “Software-enabled control for intelligent UAVs,” Proceedings of the 1999 IEEE International
Symposium on Computer Aided Control System Design (Cat. No.99TH8404), 1999, pp. 528-532, doi:
10.1109/CACSD.1999.808703.
8 Venugopalan et al, “Autonomous landing of an Unmanned Aerial Vehicle on an autonomous marine vehicle,”
http://loonwerks.com/publications/cofer2021defcon.html.
However, these approaches also have limitations when used in real-world autonomy
applications. They fare poorly when dealing with real-world uncertainty and high-dimensional
sensory data, which are integral to perception and situation-understanding applications.10 The rule
sets and stateful logic in these decision-making applications are often incomplete and insufficient
when exposed to unanticipated situations. Further, it is well understood that common-sense
knowledge is intractable to codify. For example, the Cyc11 knowledge base includes millions of
concepts and tens of millions of rules and yet is inadequate for many real-world tasks.
The challenge of assuring cyber physical systems (CPS) with ML components has been an active
area of research supported by DARPA’s ongoing Assured Autonomy program as well as other
research initiatives. Specifically, in Assured Autonomy, the assurance approach developed by the
program has resulted in: (1) formal and simulation-based verification tools that can
comprehensively explore the behavior of a CPS; (2) monitoring tools that can detect deviations
of ML components from expected inputs and behavior; resilience and recovery strategies to
avoid worst-case safety consequences; and (3) an assurance case framework that enables
structured argumentation backed by evidence in support of the claim that major safety hazards
have been identified and their root causes have been adequately mitigated.
The ANSR program seeks breakthrough innovations in the form of new hybrid AI algorithms
that deeply integrate symbolic reasoning with data-driven learning to create robust, assured, and
therefore trustworthy systems. We define a system as trustworthy if it is: (a) robust to domain-
informed and adversarial perturbations; (b) supported by an assurance framework that creates
and analyzes heterogeneous evidence towards safety and risk assessments; and (c) predictable
with respect to some specification and models of "fitness."
We hypothesize that several of the limitations in ML today are a consequence of (1) the inability
to incorporate contextual and background knowledge; and (2) treating each data set as an
independent, uncorrelated input. In the real world, observations are often correlated and a product
of an underlying causal mechanism, which can be modeled and understood. We posit that hybrid
AI algorithms capable of acquiring and integrating symbolic knowledge and performing
symbolic reasoning at scale will deliver robust inference, generalize to new situations, and
provide evidence for assurance and trust.
10 Davies, Alex, “An Oral History of the DARPA Grand Challenge, the Grueling Robot Race That Launched the
Self-Driving Car,” Wired Magazine, https://www.wired.com/story/darpa-grand-challenge-2004-oral-history/.
11 Lenat, D., “CYC: a large-scale investment in knowledge infrastructure,” Commun. ACM 38, 11 (Nov. 1995), 33–38,
https://doi.org/10.1145/219717.219745.
12 Dreossi et al, “Counterexample-Guided Data Augmentation,” in 27th International Joint Conference on Artificial
Intelligence (IJCAI), 2018.
We envision modifying both the training and inference procedures to interleave symbolic and
neural representations for iterative inference and mutual adaptation of the representations to
exploit the benefits and reduce the limitations of each representation. The modified training
procedure will result in representations that are grounded in domain-specific symbols, essentially
a symbolic equivalent of the Neural Network’s (NN) implicit data representation. The modified
inference procedure iteratively converges to a response that is conformant to both the symbolic
and neural representations. The symbolic representation can explicitly include prior knowledge
and domain-specific rules and constraints and enables verification against specification and
construction of assurance arguments.
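As a purely notional sketch of the interleaved inference described above (one possible reading, not a prescribed ANSR architecture), the following fragment lets a stand-in neural scorer propose hypotheses while a symbolic constraint step vetoes those inconsistent with prior knowledge; the model, constraints, and data are placeholders.

```python
# Notional sketch only: a single pass of neuro-symbolic inference in which the
# symbolic layer constrains the neural hypotheses. A full system would iterate,
# feeding the constrained result back to adapt the neural representation.
import numpy as np

def neural_predict(x, weights):
    """Stand-in for an NN forward pass producing class scores."""
    return x @ weights

def project_onto_constraints(scores, forbidden):
    """Symbolic step: rule out hypotheses that contradict prior knowledge
    (e.g., 'this entity type cannot appear in this region')."""
    scores = scores.copy()
    scores[list(forbidden)] = -np.inf
    return scores

def hybrid_inference(x, weights, forbidden):
    """Return the hypothesis consistent with both the neural scores and the constraints."""
    scores = neural_predict(x, weights)
    scores = project_onto_constraints(scores, forbidden)
    return int(np.argmax(scores))

rng = np.random.default_rng(0)
x, weights = rng.normal(size=4), rng.normal(size=(4, 3))
print(hybrid_inference(x, weights, forbidden={2}))
```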
Some recent results for specific applications provide the basis for confidence. For example, a
recent study13 prototyped a hybrid reinforcement learning (RL) architecture that acquires a set of
symbolic policies through data-driven learning. The symbolic policies are in the form of a small
program that is interpretable and verifiable. The approach demonstrably inherits the best of both
worlds: it learns policies that are highly performant in a known environment, and it generalizes
well by remaining safe (crash-free) in an unknown environment. Another recent approach14 uses
symbolic reasoning to fix errors made by an NN in estimating object poses in a scene, and it achieves
substantially higher (30-40% above baseline) accuracy in several cases.
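For illustration only, a "symbolic policy in the form of a small program" might look like the following sketch; the state fields, thresholds, and actions are invented here and are not taken from the cited work.

```python
# Illustrative only: a symbolic policy expressed as a small, readable program,
# in the spirit of the cited neurosymbolic RL result. Fields and thresholds are invented.
from dataclasses import dataclass

@dataclass
class State:
    distance_to_obstacle: float  # meters
    speed: float                 # m/s

def symbolic_policy(s: State) -> str:
    """A policy this small can be read, audited, and verified: every state with
    distance_to_obstacle < 5.0 maps to 'brake', which a checker can establish."""
    if s.distance_to_obstacle < 5.0:
        return "brake"
    if s.speed < 2.0:
        return "accelerate"
    return "cruise"

# Exhaustive check over a coarse grid of states (a stand-in for formal verification).
assert all(symbolic_policy(State(d, v)) == "brake"
           for d in [0.0, 1.0, 4.9] for v in [0.0, 5.0, 10.0])
```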
The hybrid AI techniques developed by the program will enable new mission capabilities. The
program intends to demonstrate assured execution of an unaided ISR mission to develop a
Common Operating Picture (COP) of a highly dynamic dense urban environment. The
autonomous system performing the ISR mission will carry an effects payload to reduce sensor-
to-effects delivery time. While the delivery of effects is gated by a human on the loop, an effects-
carrying system is quintessentially a safety- and mission-critical system and, therefore, requires
strong guarantees of collision avoidance and mission performance. The capabilities required of
the autonomous system in terms of deep situational understanding and decision-making are not
achievable by SOTA machine learning or standalone symbolic reasoning systems. The training
data is sparse, further motivating the use of hybrid AI methods.
B. Program Description
The overarching goal of the ANSR program is to advance hybrid AI algorithms and develop
evidence-based techniques that support confident assurance judgments for these algorithms. The
program intends to explore diverse hybrid architectures that can be seeded with prior knowledge,
acquire both statistical and symbolic knowledge through learning, and adapt learned
representations. The program intends to demonstrate and evaluate hybrid AI techniques through
DoD mission-relevant use-cases where assurance and autonomy are mission-critical.
The program envisions a new take on representation learning and inference to lead towards
hybrid AI. The SOTA ML, specifically NNs, can be viewed as learning a low-dimensional
representation of a high-dimensional data set.
13 Anderson, G. et al., “Neurosymbolic Reinforcement Learning with Formally Verified Exploration,” NeurIPS, 2020.
14 Gothoskar et al., “3DP3: 3D Scene Perception via Probabilistic Programming,” arXiv, 2021, https://arxiv.org/abs/2111.00312.
Figure 1 provides a grossly oversimplified
rendering of the training process. The gray hill is a depiction of an objective function (loss
function) with respect to the parameters (or weights) of the NN. Each dot on the gray hill is a
point in this parametric space and represents the value of the objective function with respect to
the current parametric configuration of the NN.
Standalone neural machine learning entails a climb up the gradient to optimize the objective
function. The optimal configuration represents a best fit for the training data. The representation,
while being a good fit to the underlying training data, remains agnostic about the causality or the
underlying mechanisms that produced the data. In the absence of any knowledge about the
underlying mechanisms, inference tasks remain bound by the distribution of the training data and
are unable to generalize beyond the training data distribution.
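As a minimal sketch of the gradient-based fitting described above, the following fragment fits a stand-in linear model by repeatedly stepping along the gradient of a squared-error objective; the data and learning rate are illustrative.

```python
# Minimal sketch of gradient-based fitting: parameters are nudged along the
# gradient of an objective until a (local) optimum is reached. A linear model
# with a squared-error loss stands in for the NN.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

w = np.zeros(3)
lr = 0.05
for _ in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
    w -= lr * grad                           # one step along the (negative) gradient

# The fitted w is a best fit to this training data, but nothing in the procedure
# captures the mechanism that generated the data.
print(np.round(w, 2))
```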
It is worth noting that the approach articulated above is only a notional approach towards
learning hybrid neural and symbolic representations, and it is not intended as a prescription for a
solution. The program anticipates many architectures – best suited for the specific application
task – that combine neural and symbolic representations, with different approaches to acquiring,
optimizing, and using the closely coupled neural and symbolic representations in inference.
The development in the program will be orchestrated in four technical areas (TAs) summarized
below:
TA1. Algorithms and Architecture – The goal of TA1 is to develop and model new AI
algorithms and architectures that deeply integrate symbolic reasoning with data-driven machine
learning. TA1 will explore and evaluate a range of possible algorithms and architecture patterns
that are suitable for different tasks.
TA2. Specification and Assurance – The goal of TA2 is to develop an assurance framework and
methods for deriving and integrating evidence of correctness and quantifying mission-specific
risks. TA2 will establish a pipeline that abstracts the hybrid neuro-symbolic representations into
formally analyzable representations and analyzes them with respect to a set of mission-dependent
specifications. TA2 will also explore techniques to estimate and quantify mission-specific risks.
TA3. Platforms and Capability Demonstration – The goal of TA3 is to develop use-cases and an
architecture for engineering mission-relevant applications of hybrid AI algorithms suitable for
the demonstration and evaluation of robust and assured performance. Specifically, the program
intends to pursue demonstration through assured execution of an unaided ISR mission to develop
a Common Operating Picture (COP) of a highly dynamic dense urban environment.
TA4. Assurance Assays and Evaluation – The goal of TA4 is to 1) develop an assurance test
harness with adversarial AI; and 2) evaluate the technologies in individual technical areas and
their compositions in systems. TA4 will act as a red team that probes the validity of assurance
claims through adversarial evaluations. TA4 will also refine the proposed program metrics and
define measures to characterize the trustworthiness of the system. TA4 will need to assess
robustness, generalizability, and assurance claims through adversarial evaluation that employs
confounding perturbations and quantify the loss of system performance.
The guiding challenge for the program will be the assured execution of an unaided ISR mission
in a highly dynamic dense urban environment. At present, ISR missions are conducted by
warfighters either through forward presence, or through teleoperated ISR assets such as drones.
The ISR asset in these cases simply provides a video feed to the warfighter, who then has to
process and analyze video feeds. The warfighter needs to distinguish adversaries from non-
combatants, understand adversarial activities, analyze the scene to identify additional scan paths
and focus areas, and maneuver the ISR asset to maximize stealth and safety. These are
challenging activities that currently impose high cognitive burden on the warfighter and require
them to continuously be in-the-loop. These activities cannot be realized with direct applications
of SOTA ML, and they will require substantial use of context and background knowledge,
interleaved reasoning with learning, and rigorous analyses and evidence to derive assurance
claims about safety and mission success.
The program will seek to demonstrate a fully autonomous ISR mission with the warfighter only
identifying the area of interest (AOI). The autonomous system provisioned with ANSR
technologies will need to develop a comprehensive situation understanding that encompasses
navigability and characterizes potential threat actors. The autonomous system will need to make
maneuver decisions that maximize the situation understanding while maintaining safety. The
system will autonomously and incrementally generate a comprehensive, timely, and accurate
COP and deliver insights that help characterize friendly, adversary, and neutral entities, the
operating environment, and threat and safe corridors. The program anticipates multiple
demonstrations beginning with simulation-based experiments (SIMexp) conducted in multi-
player role-playing gaming environments and concluding with live experiments at DoD facilities.
Phase 1 of the program will develop and prove out high-risk technology components – situation
understanding, activity recognition, and safe maneuver decision making. The Phase 1
experiments will be multiple partial threads conducted in gaming environments and through
SIMexp: (thread 1) demonstrate safe and assured decision making for maneuvers, while
assuming perfect perception; (thread 2) demonstrate both activity recognition and situation
understanding, while assuming human-guided safe maneuvers; and (thread 3) demonstrate COP
development, insights, and analytics, while assuming perfect perception and human guided safe
maneuvers. The evaluation will be performed against mission capability metrics as well as
against SOTA baselines.
Phase 2 of the program will integrate these individual threads and demonstrate close-loop
situation understanding, assured and safe maneuvers decision-making, COP building and
analytics for end-to-end demonstration of unaided ISR missions. The evaluation will be done
with respect to mission capability metrics (e.g., COP completeness, accuracy, and timeliness,
scan efficiency, human effort required, sensor-to-effect timeline), and technology metrics (e.g.,
assurance validity, robustness, generalizability, and accuracy).
Phase 3 of the program will demonstrate an end-to-end ISR mission with live exercises in DoD
facilities. The evaluation will include after-action reviews and soldier feedback in addition to the
mission capability and technology metrics.
C. Technical Areas
There are multiple points of essential collaboration among TAs, and the Government expects all
performers to collaborate effectively. Proposers should read the descriptions of all TAs and the
Program Assessments/Schedule section to ensure a full understanding of the program context,
structure, and anticipated relationships required among performers. To facilitate the open
exchange of information, all program performers will have Associate Contractor Agreement
(ACA) language included in their award. See Section VIII for further information. Proposers
may submit a single TA proposal or an integrated TA1 and TA2 proposal. No additional
combination of proposals will be accepted.
TA1: Algorithms and Architecture
The applications for hybrid AI algorithms may include deep situational understanding, activity
recognition, and developing safe and optimal maneuvers. Situational understanding in these
applications may entail not just a labeling of the entities in a scene, but more complex
attributions such as trajectories, capabilities, and even intent. The tasks are expected to require
the fusion of diverse knowledge sources and modalities in addition to perceptual data alone.
The deep integration of interleaved symbolic and NN elements at multiple scales should enable
vastly superior performance in terms of accuracy, robustness, and generalizability over the
SOTA of either type of method – symbolic alone or data-driven alone – in applications of
interest in this BAA such as perception, planning, and control. The feedback between symbolic
and neural components should enable performance improvements at all scales and quantities of
interest, for example, in perception, from entity identification and resolution to more complex
situational understanding, as well as in timeliness for decision making in closed-loop systems
such as real-time control.
It is desirable that the hybrid AI algorithms be generalizable to diverse symbolic domains, both
for incorporating available prior knowledge and for inter-working with
the domain tools for assurance reasoning (see TA2). The underlying knowledge representation
for the application domains of interest may include logic systems (e.g., first-order logic, linear
temporal logic), dynamical systems (e.g., differential equations), finite state machines, graphs,
visual concepts, etc.
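As one hedged illustration of the symbolic forms listed above, the sketch below checks a linear-temporal-logic-style safety property ("always maintain separation of at least 10 m") and a reachability property over a recorded trace; the trace values are invented.

```python
# Illustration only: temporal-logic-style properties checked over a recorded trajectory.
def always(predicate, trace):
    """G(p): the predicate must hold at every step of the trace."""
    return all(predicate(step) for step in trace)

def eventually(predicate, trace):
    """F(p): the predicate must hold at some step of the trace."""
    return any(predicate(step) for step in trace)

# Invented trace: separation to the nearest obstacle shrinks as the goal is approached.
trace = [{"separation_m": s, "at_goal": s < 12} for s in (40, 30, 22, 15, 11)]

safe = always(lambda st: st["separation_m"] >= 10, trace)
reaches_goal = eventually(lambda st: st["at_goal"], trace)
print(safe, reaches_goal)  # True True for this made-up trace
```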
The program seeks hybrid AI algorithms that are application-specific (algorithms for perception,
algorithms for planning, algorithms for control, etc.), as well as architectures in which many such
algorithms could be composed in a federated or hierarchical configuration. TA1 proposers
should present approaches for monitoring and regulating individual hybrid AI algorithms in such
a federated architecture.
Research challenges in TA1 include, but are not limited to: (a) symbolic knowledge extraction
from the latent representations of NNs so that such extracted knowledge can be pooled with other
available symbolic knowledge to support reasoning, derive context, and enable adaptation; (b)
cueing the NN for continual adaptation (parameters, architectures) based on contextual
knowledge for hypothesis resolution and better performance; and (c) integrated and iterative
reasoning over symbolic and neural representations.
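One well-known route to challenge (a), offered here only as an illustrative sketch and not as an ANSR-prescribed method, is to distill a trained network's behavior into a shallow decision tree whose branches read as symbolic rules; the example below uses scikit-learn and synthetic data for brevity.

```python
# Sketch: distill a trained network's behavior into a small decision tree whose
# paths are human-readable rules that can be pooled with other symbolic knowledge.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=500, n_features=4, random_state=0)

# Stand-in for the data-driven component.
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)

# Train a shallow surrogate tree to imitate the network's predictions.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, net.predict(X))
print(export_text(surrogate, feature_names=[f"x{i}" for i in range(4)]))
```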
These challenges are important and illustrative, but are not exhaustive, so proposals should
clearly articulate other challenges considered in the proposal and the methods to address them.
The BAA considers the deep integration of symbolic reasoning with data-driven ML sought here
to be a fertile ground for new ideas. Proposals may bring to bear multiple methods (for example,
imitation learning and mirror descent, hierarchical predictive processing, generative models,
neural network adaptation, etc.) or propose entirely new approaches. A single TA1 proposal is
not required to address all the challenges or all the tasks relevant to the mission context discussed
in TA3; however, stronger proposals will address multiple challenges and tasks. In all cases, the
proposals should explain, presenting any preliminary results as evidence, how the proposed
approaches are likely to be successful in terms of the desired algorithmic capabilities and the
ability to meet the program metrics.
TA2: Specification and Assurance
The assurance challenges for the ANSR program include dealing with (a) composed hybrid
neural and heterogeneous symbolic representations; (b) dynamic mission-dependent
specifications; and (c) the integration of heterogeneous evidence to satisfy assurance claims and
quantify residual risk. The methods should be applicable to different symbolic representations
such as state machines or differential equations and across applications such as perception,
planning, and control.
The algorithmic technical challenges include: (a) mapping the composed hybrid neural and
symbolic representations into representations suitable for formal analysis, and (b) developing
methods for analysis to find proofs through evidence and claims and to characterize ‘assured
regions’ in the behavior space as well as ‘regions of no-assurance’ for given specifications. With
respect to assurance, a fundamental difference between ANSR and assurance in traditional
systems is the inherent uncertainty involved in learning systems, which has to be considered by
the algorithms, along with possible effects of feedback loops in deep neuro-symbolic integration.
The proposed methods should ensure that the ‘assured region’ so derived is maximal relative to
the ‘expected mission region’ implied by the specifications. Proposals should also address
methods for analyzing the fundamental tradeoffs involved between the specifications
(introducing relaxation/restrictions) and the ‘assured region’ (expansion/contraction), and they
should develop methods to change the specification and/or the neuro-symbolic parameters to
maximize the assurance envelope. Further, proposals should propose methods to quantify the risk
involved in regions of no-assurance and methods to minimize the risk.
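As a toy illustration of the 'assured region' notion (not a prescribed method), the sketch below sweeps a bounded grid of states, records where a stand-in learned controller provably meets a stopping-distance specification, and reports the fraction of the examined envelope that is assured; the controller, dynamics, and bounds are invented.

```python
# Toy sketch of characterizing an 'assured region': states where a stand-in
# learned controller satisfies the specification vs. regions of no-assurance.
import numpy as np

def controller(speed, gap):
    """Stand-in for a learned braking policy: commanded deceleration in m/s^2 (capped at 6)."""
    return min(6.0, 0.5 * speed ** 2 / max(gap, 0.1))

def satisfies_spec(speed, gap):
    """Specification: the commanded deceleration stops the vehicle within the available gap."""
    decel = controller(speed, gap)
    stopping_distance = speed ** 2 / (2 * decel) if decel > 0 else float("inf")
    return stopping_distance <= gap + 1e-9

speeds = np.linspace(1, 30, 30)   # m/s
gaps = np.linspace(5, 100, 40)    # m
assured = [(v, g) for v in speeds for g in gaps if satisfies_spec(v, g)]
coverage = len(assured) / (len(speeds) * len(gaps))
print(f"assured region covers {coverage:.0%} of the examined state envelope")
```

The complement of the assured set in this toy envelope is the 'region of no-assurance' whose residual risk TA2 methods would need to quantify.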
The BAA considers the assurance methods for deeply-integrated symbolic reasoning and data-
driven machine learning sought here to be a fertile ground for new ideas. Proposals may bring to
bear a combination of multiple methods (for example, probabilistic and hybrid model checking,
dynamic assurance case technology, runtime risk assessment, etc.) or propose entirely new
approaches. In all cases, the proposals should describe, presenting any preliminary results, how
the proposed approaches are likely to be successful in terms of the assurance capabilities desired
here and the ability to meet the program metrics.
TA3: Platforms and Capability Demonstration
TA3 proposers should propose an autonomy platform on which ANSR technologies could be
deployed. The platform should be provisioned with a baseline autonomy stack that enables
autonomous navigation using a suite of perceptual sensors. TA3 proposers will need to develop
challenge problems and tasks for TA1 performers. TA3 proposers will also need to provide the
background knowledge and datasets necessary for developing the hybrid AI algorithms for these
challenge problems – which may include, but are not limited to, deep situational understanding,
activity recognition, maneuver planning, and COP construction and analytics – capabilities
integral to executing an unaided ISR mission. Further, each TA3-proposed platform should be
able to support simulation-based experiments as well as be used in eventual live
experiments. Simulation-based experiments would require an embodiment of the platform
including the necessary sensors and actuators, high-fidelity dynamics, and an encapsulation of
the baseline autonomy stack in a simulation environment such as Gazebo or Airsim.
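As a minimal sketch of the kind of simulation interaction a TA3 SIMexp harness might perform, the fragment below exercises a simulated multirotor through the AirSim Python API, assuming an AirSim simulation is already running and the airsim package is installed; the waypoint, camera name, and velocity are illustrative.

```python
# Minimal sketch (assumes a running AirSim simulation and the `airsim` package):
# take off, fly a short scan leg over a hypothetical area of interest, and grab
# one scene image of the kind the perception components would consume.
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)

client.takeoffAsync().join()
client.moveToPositionAsync(40, 0, -20, velocity=5).join()  # NED coordinates, meters

png_bytes = client.simGetImage("0", airsim.ImageType.Scene)
print(f"captured {len(png_bytes)} bytes of imagery")

client.landAsync().join()
client.armDisarm(False)
client.enableApiControl(False)
```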
The mission and technology challenges for such an integrated platform include, but are not
limited to, the abilities to: (a) generate a comprehensive, timely, and accurate COP; (b) demonstrate
generalizability and adaptability to different urban environments and scenarios; (c) provide
activity and scene understanding with different sensory data; (d) ingest and reason with symbolic
knowledge, for example, maps, entity dynamics, mission specification, and rules of engagement;
and (e) provide assurance and quantify risk to mission performance to establish trustworthiness.
The TA3 performers should work closely and support TA1 and TA2 performers to address these
mission and technology challenges.
TA4: Assurance Assays and Evaluation
The goal of TA4 is to (a) develop an assurance test harness that includes adversarial AI, and (b)
evaluate ANSR technologies both in individual technical areas and in their compositions into
complete systems, especially for the use case described in TA3. TA4 proposals should address
both of these goals. TA4 proposals should clearly describe an approach to achieving these goals
and articulate the unique benefits of the proposed approach in comparison to other possible
approaches.
Assurance test harness with adversarial AI: TA4 will act as a red team that probes the validity of
assurance claims and assurance cases through adversarial evaluations. Proposals must present
the approaches to generate adversarial methods for anticipated hybrid AI algorithms (to be
developed by TA1) and methods for spoofing the ML, also considering the full range of the
‘expected mission region’ (see TA2 description). Proposals must present a technical rationale and
any preliminary evidence they have for the potential success of their approach.
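As one standard example of an adversarial method a red team might employ (offered for illustration, not as a required technique), the sketch below applies the fast gradient sign method (FGSM) to a stand-in logistic-regression "perception" model; the model, input, and perturbation budget are invented.

```python
# Illustrative FGSM attack against a stand-in logistic-regression classifier.
import numpy as np

rng = np.random.default_rng(0)
w, b = rng.normal(size=16), 0.0              # stand-in trained classifier weights
x = w / np.linalg.norm(w)                    # an input the model scores confidently as class 1
y = 1.0

def predict(x):
    """P(class = 1) under the logistic-regression model."""
    return 1.0 / (1.0 + np.exp(-(w @ x + b)))

# For this model, the gradient of the cross-entropy loss w.r.t. the input is (p - y) * w.
grad_x = (predict(x) - y) * w

# FGSM: step in the sign of the input gradient to increase the loss.
epsilon = 0.5
x_adv = x + epsilon * np.sign(grad_x)

print(f"clean score {predict(x):.2f} -> adversarial score {predict(x_adv):.2f}")
```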
Evaluate individual technologies and composed systems: The program seeks to characterize the
performance of individual hybrid AI components with respect to accuracy metrics using
benchmarks and the performance of composed systems using mission-specific metrics. TA4
proposals must present approaches to (a) develop benchmarks for individual technologies in the
domains of perception, planning, and control; (b) develop and refine mission-specific metrics for
the use case of unaided dense-urban ISR described in the TA3 section; and (c) develop a test
harness to characterize the performance of said systems in all program phases including field
experiments.
TA4 will also refine the proposed program metrics (listed in Section E), identify baselines, and
evaluate the program developed technologies (encompassing TA1, TA2, and TA3) with respect
to the proposed metrics. Trustworthiness is challenging to quantify, and measurement will be a
topic of research for TA4 performers. The program seeks to characterize trustworthiness along
two dimensions: (1) ability to operate with acceptable performance even when diverging from
the planned scenario; and (2) transparency and predictability in the behavior of the system
obtained by assurance argumentation and evidence. TA4 proposals should present in detail the
Test and Evaluation (T&E) approaches for these two dimensions. They are further encouraged to
present other compelling ways to perform T&E of the use case described in the TA3 section.
D. Program Assessments/Schedule
Figure 2: Schedule
The Government will specify the locations for Principal Investigator (PI) meetings during
program performance. There will be two PI meetings in Phase 1, held approximately 6 months
and 12 months after the kick-off meeting. There will be one PI meeting in both Phase 2 and
Phase 3, held roughly 7.5 months after the beginning of each phase. Each phase will end with a
demonstration workshop. PI meeting locations are likely to be spread across performer locations,
and the proposers should plan to host at least one PI meeting over the duration of the program.
The goals of the PI meetings will be to present new research findings and accomplishments,
review plans for the next period, discuss implementation milestones, and resolve any
programmatic, budgeting, or logistics issues.
In addition to these program-wide events, the Government team will conduct site visits and will
hold monthly teleconference meetings with each PI to enhance communications with the
Government team.
For travel planning and costing, assume eight (8) trips during the entire three (3) phases (2022-
2026) per the program schedule shown above, alternating between Washington, DC and San
Diego, CA, with each trip requiring 3 days and 2 nights.
E. Metrics
For the Government to evaluate the effectiveness of a proposed solution in achieving the stated
program objectives, proposers should note that the Government hereby promulgates the
following program metrics (Table 1 below) that may serve as the basis for determining whether
satisfactory progress is being made to warrant continued funding of the program. Although the
following program metrics are specified, proposers should also note that the Government has
identified these goals with the intention of bounding the scope of effort, while affording the
maximum flexibility, creativity, and innovation in proposing solutions to the stated problem.
Trustworthiness is challenging to quantify, and measurement remains a topic of research on its
own. Consistent with the definition offered earlier, the program will characterize trustworthiness
along two dimensions: (1) ability to operate with acceptable performance even when diverging
from the planned scenario; and (2) transparency and predictability in the behavior of the system
obtained by assurance argumentation and evidence. The program will characterize performance
of individual hybrid AI components with respect to accuracy metrics using benchmarks and
performance of composed systems using mission-specific metrics (e.g., completeness and
accuracy of COP), and target an order of magnitude improvement over current baselines. The
program will assess robustness, generalizability, and assurance claims by an adversarial
evaluation (akin to red teaming for cyber assurance) that employs semantic variations (for
example changing movement and types of entities in a scene) from baseline (training) scenarios
and quantify the loss of system performance (target degradation below 5%).
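As a simple sketch of the degradation computation implied by this target, the fragment below compares a mission metric on baseline scenarios against the same metric under semantic variations and tests it against the 5% threshold; the metric names and scores are illustrative.

```python
# Sketch: relative performance degradation under semantic variations, checked
# against the program's < 5% degradation target. Scores are illustrative.
baseline_scores = {"cop_completeness": 0.92, "object_recognition": 0.88}
perturbed_scores = {"cop_completeness": 0.89, "object_recognition": 0.85}

def degradation(baseline, perturbed):
    """Relative loss of performance, per metric."""
    return {k: (baseline[k] - perturbed[k]) / baseline[k] for k in baseline}

TARGET = 0.05  # target: degradation below 5%
for metric, loss in degradation(baseline_scores, perturbed_scores).items():
    status = "meets target" if loss < TARGET else "exceeds target"
    print(f"{metric}: {loss:.1%} degradation ({status})")
```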
Table 1 notes:
1 Includes object classification, geolocation, and activity recognition (from a defined set).
2 SOTA COPter (name given to the ISR platform for developing the COP) will be teleoperation with SOTA ML. Training data will be from exercises and synthetic. The projected SOTA accuracy for object recognition from video/images is 60% with moderately sized training data; SOTA activity recognition accuracy is only 30-40% or less in the 2020 ActivityNet benchmarks (https://aiindex.stanford.edu/wp-content/uploads/2021/03/2021-AI-Index-Report-_Chapter-2.pdf).
F. Deliverables
All performers will be required to provide the following deliverables:
Technical papers covering work funded by the ANSR program;
Source code, build scripts, and any toolchains required to compile code, algorithms and
interface description documents, user guides, other necessary data, and documentation,
assumptions, and limitations for all software developed under this program;
Slide Presentations. Annotated slide presentations shall be submitted within one month
after the program kick-off meeting and after each program event (program reviews, PI
meetings, and technical interchange meetings);
Quarterly Progress Reports. A quarterly progress report describing technical progress
made, resources expended, major risks, planned activities, trip summaries, changes to key
personnel, and any potential issues or problem areas that require the attention of the
Government team shall be provided within 15 days after the end of each quarter;
Monthly Progress Reports. A monthly progress report in the form of a PowerPoint
document describing technical progress, planned activities for the next month, and any
technical, financial, and programmatic issues shall be provided and presented in a
teleconference with DARPA;
Monthly financial status reports;
A final phase report after each program phase. The final phase report shall concisely
summarize the effort conducted within that phase; and
Final Technical Report.
G. Intellectual Property
The program will emphasize creating and leveraging open source technology and architectures.
Proposers are strongly encouraged to align any asserted intellectual property rights with
open source regimes.
A key goal of the program is to establish an open architecture and algorithms that allow for
integrating symbolic knowledge and domain-specific representations with data-driven machine
learning. This goal includes the ability to easily add, remove, substitute, and modify the type of
symbolic representation as well as the type of data.
by providing a base for future users or developers of program technologies and deliverables to
build upon and rapidly customize and reuse for different AI applications in both DoD and non-
DoD use-cases. Therefore, it is desired that all noncommercial software (including source code),
software documentation, hardware designs and documentation, and technical data generated
under the program be provided as deliverables to the Government, with a minimum of
Government Purpose Rights (GPR), as lesser rights may adversely impact the lifecycle costs of
affected items, components, or processes.
Multiple awards are anticipated. The amount of resources made available under this BAA will
depend on the quality of the proposals received and the availability of funds.
The Government reserves the right to select for negotiation all, some, one, or none of the
proposals received in response to this solicitation and to make awards without discussions with
proposers. The Government also reserves the right to conduct discussions if it is later determined
to be necessary. Additionally, DARPA reserves the right to accept proposals in their entirety or
to select only portions of proposals for award. In the event that DARPA desires to award only
portions of a proposal, negotiations may be opened with that proposer. The Government reserves
the right to fund proposals in phases with options for continued work, as applicable.
The Government reserves the right to request any additional, necessary documentation once it
makes the award instrument determination. Such additional information may include but is not
limited to Representations and Certifications (see Section VI.B.2., “Representations and
Certifications”). The Government reserves the right to remove proposers from award
consideration should the parties fail to reach agreement on award terms, conditions, and/or
cost/price within a reasonable time, and the proposer fails to timely provide requested additional
information. Proposals identified for negotiation may result in a procurement contract, grant,
cooperative agreement, or other transaction, depending upon the nature of the work proposed, the
required degree of interaction between parties, whether or not the research is classified as
Fundamental Research, and other factors.
In accordance with 10 U.S.C. § 4003(f), the Government may award a follow-on production
contract or Other Transaction (OT) for any OT awarded under this solicitation if: (1) that
participant in the OT, or a recognized successor in interest to the OT, successfully completed the
entire prototype project provided for in the OT, as modified; and (2) the OT provides for the
award of a follow-on production contract or OT to the participant, or a recognized successor in
interest to the OT.
In all cases, the Government contracting officer shall have sole discretion to select award
instrument type, regardless of instrument type proposed, and to negotiate all instrument terms
and conditions with selectees. DARPA will apply publication or other restrictions, as necessary,
if it determines that the research resulting from the proposed effort will present a high likelihood
of disclosing performance characteristics of military systems or manufacturing technologies that
are unique and critical to defense. Any award resulting from such a determination will include a
requirement for DARPA permission before publishing any information or results on the
program. For more information on publication restrictions, see the section below on Fundamental
Research.
B. Fundamental Research
It is DoD policy that the publication of products of fundamental research will remain unrestricted
to the maximum extent possible. National Security Decision Directive (NSDD) 189 defines
fundamental research as follows:
‘Fundamental research’ means basic and applied research in science and engineering, the
results of which ordinarily are published and shared broadly within the scientific
community, as distinguished from proprietary research and from industrial development,
design, production, and product utilization, the results of which ordinarily are restricted
for proprietary or national security reasons.
As of the date of publication of this solicitation, the Government expects that program goals as
described herein may be met by proposed efforts for fundamental research and non-fundamental
research. Some proposed research may present a high likelihood of disclosing performance
characteristics of military systems or manufacturing technologies that are unique and critical to
defense. Based on the anticipated type of proposer (e.g., university or industry) and the nature of
the solicited work, the Government expects that some awards will include restrictions on the
resultant research that will require the awardee to seek DARPA permission before publishing
any information or results relative to the program.
University or non-profit research institution performance under this solicitation may include
effort categorized as fundamental research. In addition to Government support for free and open
scientific exchanges and dissemination of research results in a broad and unrestricted manner, the
academic or non-profit research performer or recipient, regardless of tier, acknowledges that
such research may have implications that are important to U.S. national interests and must be
protected against foreign influence and exploitation. As such, the academic or non-profit
research performer or recipient agrees to comply with the following requirements:
(a) The University or non-profit research institution performer or recipient must establish
and maintain an internal process or procedure to address foreign talent programs,
conflicts of commitment, conflicts of interest, and research integrity. The academic or
non-profit research performer or recipient must also utilize due diligence to identify
Foreign Components or participation by Senior/Key Personnel in Foreign Government
Talent Recruitment Programs and agree to share such information with the Government
upon request.
i. The above described information will be provided to the Government as part of
the proposal response to the solicitation and will be reviewed and assessed prior
to award. Generally, this information will be included in the Research and Related
Senior/Key Personnel Profile (Expanded) form (SF-424) required as part of the
proposer’s submission through Grants.gov.
1. Instructions regarding how to fill out the SF-424 and its biographical
sketch can be found through Grants.gov.
ii. In accordance with USD(R&E) direction to mitigate undue foreign influence in
DoD-funded science and technology, DARPA will assess all Senior/Key
Personnel proposed to support DARPA grants and cooperative agreements for
potential undue foreign influence risk factors relating to professional and financial
activities. This will be done by evaluating information provided via the SF-424,
and any accompanying or referenced documents, in order to identify and assess
any associations or affiliations the Senior/Key Personnel may have with foreign
strategic competitors or countries that have a history of intellectual property theft,
research misconduct, or history of targeting U.S. technology for unauthorized
transfer. DARPA’s evaluation takes into consideration the entirety of the
Senior/Key Personnel’s SF-424, current and pending support, and biographical
sketch, placing the most weight on the Senior/Key Person’s professional and
financial activities over the last 4 years. The majority of foreign entities lists used
to make these determinations are publicly available. The DARPA Countering
Foreign Influence Program (CFIP) “Senior/Key Personnel Foreign Influence Risk
Rubric” details the various risk ratings and factors. The rubric can be seen at the
following link:
https://www.darpa.mil/attachments/092021DARPACFIPRubric.pdf
iii. Examples of lists that DARPA leverages to assess potential undue foreign
influence factors include, but are not limited to:
1. Executive Order 13959 “Addressing the Threat From Securities
Investments That Finance Communist Chinese Military Companies”:
https://www.govinfo.gov/content/pkg/FR-2020-11-17/pdf/2020-25459.pdf
2. The U.S. Department of Education’s College Foreign Gift and Contract
Report: College Foreign Gift Reporting (ed.gov)
3. The U.S. Department of Commerce, Bureau of Industry and Security, List
of Parties of Concern: https://www.bis.doc.gov/index.php/policy-
guidance/lists-of-parties-of-concern
4. Georgetown University’s Center for Security and Emerging Technology
(CSET) Chinese Talent Program Tracker:
https://chinatalenttracker.cset.tech
5. Director of National Intelligence (DNI) “World Wide Threat Assessment
of the US Intelligence Community”: 2021 Annual Threat Assessment of
the U.S. Intelligence Community (dni.gov)
6. Various Defense Counterintelligence and Security Agency (DCSA)
products regarding targeting of US technologies, adversary targeting of
academia, and the exploitation of academic experts: https://www.dcsa.mil/
DARPA’s analysis and assessment of affiliations and associations of Senior/Key
Personnel is compliant with Title VI of the Civil Rights Act of 1964. Information
regarding race, color, or national origin is not collected and does not have bearing
in DARPA’s assessment.
University or non-profit research institutions with proposals selected for
negotiation that have been assessed as having high or very high undue foreign
influence risk will be given an opportunity during the negotiation process to
mitigate the risk. DARPA reserves the right to request any follow-up information
needed to assess risk or mitigation strategies.
iv. Upon conclusion of the negotiations, if DARPA determines that, despite any proposed
mitigation terms (e.g., mitigation plan, alternative research personnel), the
participation of any Senior/Key Research Personnel still represents a high risk to
the program, or that the proposed mitigation affects the Government’s confidence in the
proposer’s capability to successfully complete the research (e.g., less qualified
Senior/Key Research Personnel), the Government may determine not to award the
proposed effort. Any decision not to award will be predicated upon reasonable
disclosure of the pertinent facts and reasonable discussion of any possible
alternatives while balancing program award timeline requirements.
(b) Failure of the academic or non-profit research performer or recipient to reasonably
exercise due diligence to discover or ensure that neither it nor any of its Senior/Key
Research Personnel involved in the subject award are participating in a Foreign
Government Talent Program or have a Foreign Component with a strategic competitor
or country with a history of targeting U.S. technology for unauthorized transfer may
result in the Government exercising remedies in accordance with federal law and
regulation.
i. If, at any time, during performance of this research award, the academic or non-
profit research performer or recipient should learn that it, its Senior/Key Research
Personnel, or applicable team members or subtier performers on this award are or
are believed to be participants in a Foreign Government Talent Program or have
Foreign Components with a strategic competitor or country with a history of
targeting U.S. technology for unauthorized transfer, the performer or recipient
will notify the Government Contracting Officer or Agreements Officer within 5
business days.
1. This disclosure must include specific information as to the personnel
involved and the nature of the situation and relationship. The Government
will have 30 business days to review this information and conduct any
necessary fact-finding or discussion with the performer or recipient.
2. The Government’s timely determination and response to this disclosure
may range anywhere from acceptance, to mitigation, to termination of this
award at the Government’s discretion.
3. If the University receives no response from the Government to its
disclosure within 30 business days, it may presume that the Government
has determined the disclosure does not represent a threat.
ii. The performer or recipient must flow down this provision to any subtier contracts
or agreements involving direct participation in the performance of the research.
(c) Definitions
i. Senior/Key Research Personnel
1. This definition would include the Principal Investigator or
Program/Project Director and other individuals who contribute to the
scientific development or execution of a project in a substantive,
measurable way, whether or not they receive salaries or compensation
under the award. These include individuals whose absence from the
project would be expected to impact the approved scope of the project.
2. Most often, these individuals will have a doctorate or other professional
degrees, although other individuals may be included within this definition
on occasion.
ii. Foreign Associations/Affiliations
1. Association is defined as collaboration, coordination or interrelation,
professionally or personally, with a foreign government-connected entity
where no direct monetary or non-monetary reward is involved.
2. Affiliation is defined as collaboration, coordination, or interrelation,
professionally or personally, with a foreign government-connected entity
where direct monetary or non-monetary reward is involved.
iii. Foreign Government Talent Recruitment Programs
1. In general, these programs will include any foreign-state-sponsored
attempt to acquire U.S. scientific-funded research or technology through
foreign government-run or funded recruitment programs that target
scientists, engineers, academics, researchers, and entrepreneurs of all
nationalities working and educated in the U.S.
2. Distinguishing features of a Foreign Government Talent Recruitment
Program may include:
a. Compensation, either monetary or in-kind, provided by the foreign
state to the targeted individual in exchange for the individual
transferring their knowledge and expertise to the foreign country.
b. In-kind compensation may include honorific titles, career
advancement opportunities, promised future compensation or other
types of remuneration or compensation.
c. Recruitment, in this context, refers to the foreign-state-sponsor’s
active engagement in attracting the targeted individual to join the
foreign-sponsored program and transfer their knowledge and
expertise to the foreign state. The targeted individual may be
employed and located in the U.S. or in the foreign state.
d. Contracts for participation in some programs that create conflicts
of commitment and/or conflicts of interest for researchers. These
contracts include, but are not limited to, requirements to attribute
awards, patents, and projects to the foreign institution (even if
conducted under U.S. funding), to recruit or train other talent
recruitment plan members, to circumvent merit-based processes,
and to replicate or transfer U.S.-funded work in another country.
e. Many, but not all, of these programs aim to incentivize the targeted
individual to physically relocate to the foreign state. Of particular
concern are those programs that allow for continued employment
at U.S. research facilities or receipt of U.S. Government research
funding while concurrently receiving compensation from the
foreign state.
3. Foreign Government Talent Recruitment Programs DO NOT include:
a. Research agreements between the University and a foreign entity,
unless that agreement includes provisions that create situations of
concern addressed elsewhere in this section,
b. Agreements for the provision of goods or services by commercial
vendors, or
c. Invitations to attend or present at conferences.
iv. Conflict of Interest
1. A situation in which an individual, or the individual’s spouse or dependent
children, has a financial interest or financial relationship that could
directly and significantly affect the design, conduct, reporting, or funding
of research.
v. Conflict of Commitment
1. A situation in which an individual accepts or incurs conflicting obligations
between or among multiple employers or other entities.
2. Common conflicts of commitment involve conflicting commitments of
time and effort, including obligations to dedicate time in excess of
institutional or funding agency policies or commitments. Other types of
conflicting obligations, including obligations to improperly share
information with, or withhold information from, an employer or funding
agency, can also threaten research security and integrity and are an
element of a broader concept of conflicts of commitment.
vi. Foreign Component
1. Performance of any significant scientific element or segment of a program
or project outside of the U.S., either by the University or by a researcher
employed by a foreign organization, whether or not U.S. government
funds are expended.
2. Activities that would meet this definition include, but are not limited to:
a. Involvement of human subjects or animals;
b. Extensive foreign travel by University research program or project
staff for the purpose of data collection, surveying, sampling, and
similar activities;
c. Collaborations with investigators at a foreign site anticipated to
result in co-authorship;
d. Use of facilities or instrumentation at a foreign site;
e. Receipt of financial support or resources from a foreign entity; or
f. Any activity of the University that may have an impact on U.S.
foreign policy through involvement in the affairs or environment
of a foreign country.
3. Foreign travel is not considered a Foreign Component.
vii. Strategic Competitor
1. A nation, or nation-state, that engages in diplomatic, economic or
technological rivalry with the United States where the fundamental
strategic interests of the U.S. are under threat.
Proposers should indicate in their proposal whether they believe the scope of the research
included in their proposal is fundamental or not. While proposers should clearly explain the
intended results of their research, the Government shall have sole discretion to determine
whether the proposed research shall be considered fundamental and to select the award
instrument type. Appropriate language will be included in resultant awards for non-fundamental
research to prescribe publication requirements and other restrictions, as appropriate. This
language can be found at http://www.darpa.mil/work-with-us/additional-baa.
For certain research projects, it may be possible that although the research to be performed by a
potential awardee is non-fundamental research, its proposed subawardee’s effort may be
fundamental research. It is also possible that the research performed by a potential awardee is
fundamental research while its proposed subawardee’s effort may be non-fundamental research.
In all cases, it is the potential awardee’s responsibility to explain in its proposal which proposed
efforts are fundamental research and why the proposed efforts should be considered fundamental
research.
A. Eligible Applicants
All responsible sources capable of satisfying the Government's needs may submit a proposal that
shall be considered by DARPA.
a) FFRDCs
FFRDCs are subject to applicable direct competition limitations and cannot propose to this
solicitation in any capacity unless they meet the following conditions. (1) FFRDCs must clearly
demonstrate that the proposed work is not otherwise available from the private sector. (2)
FFRDCs must provide a letter, on official letterhead from their sponsoring organization, that (a)
cites the specific authority establishing their eligibility to propose to Government solicitations
and compete with industry, and (b) certifies the FFRDC’s compliance with the associated
FFRDC sponsor agreement’s terms and conditions. These conditions are a requirement for
FFRDCs proposing to be awardees or subawardees.
b) Government Entities
Government Entities (e.g., Government/National laboratories, military educational institutions,
etc.) are subject to applicable direct competition limitations. Government Entities must clearly
demonstrate that the work is not otherwise available from the private sector and provide written
documentation citing the specific statutory authority and contractual authority, if relevant,
establishing their ability to propose to Government solicitations and compete with industry. This
information is required for Government Entities proposing to be awardees or subawardees.
2. Other Applicants
Non-U.S. organizations and/or individuals may participate to the extent that such participants
comply with any necessary nondisclosure agreements, security regulations, export control laws,
and other governing statutes applicable under the circumstances.
B. Organizational Conflicts of Interest
In accordance with FAR 9.5, proposers are required to identify and disclose all facts relevant to
potential OCIs involving the proposer’s organization and any proposed team member
(subawardee, consultant). Under this Section, the proposer is responsible for providing this
disclosure with each proposal submitted to the solicitation. The disclosure must include the
proposer’s, and as applicable, proposed team member’s OCI mitigation plan. The OCI mitigation
plan must include a description of the actions the proposer has taken, or intends to take, to
prevent the existence of conflicting roles that might bias the proposer’s judgment and to prevent
the proposer from having unfair competitive advantage. The OCI mitigation plan will
specifically discuss the disclosed OCI in the context of each of the OCI limitations outlined in
FAR 9.505-1 through FAR 9.505-4.
In addition, DARPA has a supplemental OCI policy that prohibits contractors/performers from
concurrently providing Scientific Engineering Technical Assistance (SETA), Advisory and
Assistance Services (A&AS) or similar support services and being a technical performer.
Therefore, as part of the FAR 9.5 disclosure requirement above, a proposer must affirm whether
the proposer or any proposed team member (subawardee, consultant) is providing SETA, A&AS,
or similar support to any DARPA office(s) under: (a) a current award or subaward; or (b) a past
award or subaward that ended within one calendar year prior to the proposal’s submission date.
If SETA, A&AS, or similar support is being or was provided to any DARPA office(s), the
proposal must include:
Identification of proposed team member (subawardee, consultant) providing the support; and
Government Procedures
In accordance with FAR 9.503, 9.504 and 9.506, the Government will evaluate OCI mitigation
plans to avoid, neutralize or mitigate potential OCI issues before award and to determine whether
it is in the Government’s interest to grant a waiver. The Government will only evaluate OCI
mitigation plans for proposals that are determined selectable under the solicitation evaluation
criteria and funding availability.
The Government may require proposers to provide additional information to assist the
Government in evaluating the proposer’s OCI mitigation plan.
If the Government determines that a proposer failed to fully disclose an OCI; or failed to provide
the affirmation of DARPA support as described above; or failed to reasonably provide additional
information requested by the Government to assist in evaluating the proposer’s OCI mitigation
plan, the Government may reject the proposal and withdraw it from consideration for award.
C. Cost Sharing/Matching
Cost sharing is not required; however, it will be carefully considered where there is an applicable
statutory condition relating to the selected funding instrument. Cost sharing is encouraged where
there is a reasonable probability of a potential commercial application related to the proposed
research and development effort.
For more information on potential cost sharing requirements for Other Transactions for
Prototype, see http://www.darpa.mil/work-with-us/contract-management#OtherTransactions.
The Government discourages proposers from submitting multiple prime contractor proposals for
the same TA. However, where appropriate, the Government does not discourage subsidiaries of a
prime contractor from submitting a prime contractor proposal for any technical area.
While proposers may submit proposals for all four TAs, proposers selected for TA3 or TA4 as a
prime contractor cannot be selected for any portion of another TA unless there is a clear
deconfliction between the proposing teams. This policy is to avoid OCI situations between the
TAs and to ensure objective test and evaluation results. The decision as to which proposal to
consider for award is at the discretion of the Government.
This announcement, any attachments, and any references to external websites herein constitute
the total solicitation. If proposers cannot access the referenced material posted in the
announcement found at www.darpa.mil, contact the BAA Coordinator listed herein.
All submissions, including abstracts and proposals, must be written in English with type not
smaller than 12 point font. Smaller font may be used for figures, tables, and charts. Copies of all
documents submitted must be clearly labeled with the DARPA BAA number, proposer
organization, and proposal title/proposal short title. All monetary references shall be in U.S.
Dollars.
1. Abstracts Format
Proposers are strongly encouraged to submit an abstract in advance of a full proposal. The
abstract is a concise version of the proposal comprising a maximum of 5 pages, including all
figures, tables, and charts. The required cover sheet and optional submission letter, table of
contents, or appendices are not included in the page count.
A. Cover Sheet (required): Include the administrative and technical points of contact
(title, name, address, phone, e-mail, lead organization). Also include the BAA
number, title of the proposed project (not the BAA title), Technical Area,
subcontractors, estimated cost, duration of the project, and the label “ABSTRACT.”
B. Executive Summary: Clearly describe what is being proposed and what difference it
will make (qualitatively and quantitatively).
C. Technical Plan: Outline and address all technical challenges inherent in the approach
and possible solutions for overcoming potential problems. Describe milestones and
how they will be achieved.
D. Management and Capabilities Plan: Identify the principal investigator, provide a brief
summary of the expertise of the team, including subcontractors and key personnel, and
highlight expertise relevant to developing AI algorithms and architectures that deeply
integrate symbolic reasoning with data-driven machine learning to create robust,
assured, and therefore trustworthy AI-based systems.
E. Cost and Schedule: Provide a cost estimate for resources over the proposed timeline
of the project, broken down by phase and major cost items (e.g., labor, materials,
etc.). Include cost estimates for each potential subcontractor (these may be rough
order-of-magnitude estimates).
F. Executive Summary Slide: The slide template is provided as Appendix 1 to the BAA
posted at https://SAM.gov.
2. Proposals Format
All proposals should be in the format given below. The typical proposal should express a
consolidated effort in support of one or more related technical concepts or ideas. Disjointed
efforts should not be combined into a single proposal. Proposals shall consist of two volumes: 1)
Volume I, Technical and Management Proposal (composed of 3 parts), and 2) Volume II, Cost
Proposal. The maximum page count for Volume I is 25 pages for a proposal addressing a single
technical area and 30 pages for a proposal addressing multiple technical areas (i.e., TA1 and
TA2), and excludes the cover page, summary slide, official transmittal letter, and any table of
contents or appendices, but does include figures, tables, and charts.
NOTE: Non-conforming submissions that do not follow the instructions herein may be rejected
without further review.
A. {10 pages [15 pages if addressing multiple TAs]} Detailed technical approach enhancing and
completing the Summary of Proposal.
B. {1 page} Comparison with other ongoing research indicating advantages and disadvantages
of the proposed effort.
C. {2 pages} A clearly defined organization chart for the program team which includes, as
applicable: (1) the programmatic relationship of team members; (2) the unique capabilities of
team members; (3) the task responsibilities of team members; (4) the teaming strategy
among the team members; and (5) the key personnel along with the amount of effort to be
expended by each person during each year.
Note: It is recommended that the SOW be developed so that each Phase of the program is
separately defined.
(5) Proposer’s reference number (if any);
(6) Other team members (if applicable) and type of organization
for each;
(7) Proposal title;
(8) Technical point of contact to include: salutation, last name, first
name, street address, city, state, zip code, telephone, fax (if
available), electronic mail (if available);
(9) Administrative point of contact to include: salutation, last
name, first name, street address, city, state, zip code, telephone, fax
(if available), and electronic mail (if available);
(10) Award instrument requested: cost-plus-fixed-fee (CPFF),
cost contract (no fee), cost-sharing contract (no fee), or other type
of procurement contract (specify), grant, cooperative agreement, or
Other Transaction;
(11) Place(s) and period(s) of performance;
(12) Total proposed cost separated by basic award and option(s) (if
any);
(13) Name, address, and telephone number of the proposer’s
cognizant Defense Contract Management Agency (DCMA)
administration office (if known);
(14) Name, address, and telephone number of the proposer’s
cognizant Defense Contract Audit Agency (DCAA) audit office (if
known);
(15) Date proposal was prepared;
(16) Data Universal Numbering System (DUNS) number;
(17) Taxpayer Identification Number (TIN);
(18) Commercial and Government Entity (CAGE) Code;
(19) Subawardee information; and
(20) Proposal validity period.
Documentation supporting the reasonableness of the proposed equipment costs
(vendor quotes, past purchase orders/purchase history, detailed engineering
estimates, etc.) shall be provided.
An itemization of any information technology (IT) purchase, as defined by FAR
2.101 – Documentation supporting the reasonableness of the proposed equipment
costs (vendor quotes, past purchase orders/purchase history, detailed engineering
estimates, etc.) shall be provided, including a letter stating why the proposer
cannot provide the requested resources from its own funding for prime and all
sub-awardees.
A summary of projected funding requirements by month
The source, nature, and amount of any industry cost-sharing
Identification of pricing assumptions that may require incorporation into the
resulting award instrument (e.g., use of Government Furnished
Property/Facilities/Information, access to Government subject matter experts,
etc.)
Tables included in the cost proposal in editable (e.g., MS Excel) format with
calculation formulas intact. NOTE: If PDF submissions differ from the Excel
submission, the PDF will take precedence.
The Government strongly encourages proposers to use the provided MS Excel™ DARPA
Standard Cost Proposal Spreadsheet in the development of their cost proposals. A customized
cost proposal spreadsheet may be an attachment to this solicitation. If not, the spreadsheet can be
found on the DARPA website at http://www.darpa.mil/work-with-us/contract-management
(under “Resources” on the right-hand side of the webpage). All tabs and tables in the cost
proposal spreadsheet should be developed in an editable format with calculation formulas intact
to allow traceability of the cost proposal. This cost proposal spreadsheet should be used by the
prime organization and all subcontractors. In addition to using the cost proposal spreadsheet, the
cost proposal still must include all other items required in this announcement that are not covered
by the editable spreadsheet. Subcontractor cost proposal spreadsheets may be submitted directly
to the Government by the proposed subcontractor via e-mail to the address in Part I of this
solicitation. Using the provided cost proposal spreadsheet will assist the Government in a
rapid analysis of your proposed costs and, if your proposal is selected for a potential
award, speed up the negotiation and award execution process.
NOTE: The cost proposal spreadsheet is a supplement to, and not a substitution for, the Cost
Volume. The Cost Volume should be submitted as previously outlined.
Per FAR 15.403-4, certified cost or pricing data shall be required if the proposer is seeking a
procurement contract award per the referenced threshold, unless the proposer requests and is
granted an exception from the requirement to submit cost or pricing data. Certified cost or
pricing data is not required if the proposer proposes an award instrument other than a
procurement contract (e.g., a grant, cooperative agreement, or other transaction).
The proposer is responsible for compiling and providing all subaward proposals for the
Procuring Contracting Officer (PCO)/Grants Officer (GO)/Agreements Officer (AO), as
applicable. Subaward proposals should include Interdivisional Work Transfer Agreements
(ITWA) or similar arrangements. Where the effort consists of multiple portions which could
reasonably be partitioned for purposes of funding, these should be identified as options with
separate cost estimates for each.
All proprietary subaward proposal documentation, prepared at the same level of detail as that
required of the proposer’s proposal and which cannot be uploaded with the proposal, shall be
provided to the Government either by the proposer or by the subawardee organization when the
proposal is submitted. Subaward proposals submitted to the Government directly by the proposed
subawardee should be sent electronically to ANSR@darpa.mil; the proposed prime awardee will
not be allowed to view them. The subawardee must provide the same number of electronic copies to
the PCO/GO/AO as is required of the awardee. See Section IV.B.5.b. of this BAA for proposal
submission information.
a) Proprietary Markings
Proposers are responsible for clearly identifying proprietary information. Submissions containing
proprietary information must have the cover page and each page containing such information
clearly marked with a label such as “Proprietary.” NOTE: “Confidential” is a classification
marking used to control the dissemination of U.S. Government National Security Information as
dictated in Executive Order 13526 and should not be used to identify proprietary business
information.
b) Security Information
(1) Program Security Information
(a) Program Security (as applicable)
Proposers should include with their proposal any proposed solution(s) to program security
requirements unique to this program. Common program security requirements include, but are
not limited to: operational security (OPSEC) contracting/sub-contracting plans; foreign
participation or materials utilization plans; program protection plans, which may entail the
following: manufacturing and integration plans; range utilization and support plans (air, sea,
land, space, and cyber); data dissemination plans; asset transportation plans; classified test
activity plans; disaster recovery plans; classified material/asset disposition plans;
and public affairs/communications plans.
c) Disclosure of Information and Compliance with
Safeguarding Covered Defense Information Controls
The following solicitation provision and contract clauses apply to all solicitations and contracts;
however, the definition of “controlled technical information” clearly exempts work considered
fundamental research; therefore, even though included in the contract, these clauses will not
apply if the work is fundamental research.
DFARS 252.204-7000, “Disclosure of Information”
DFARS 252.204-7008, “Compliance with Safeguarding Covered Defense Information Controls”
DFARS 252.204-7012, “Safeguarding Covered Defense Information and Cyber Incident
Reporting”
The full text of the above solicitation provision and contract clauses can be found at
http://www.darpa.mil/work-with-us/additional-baa#NPRPAC.
Compliance with the above requirements includes the mandate for proposers to implement the
security requirements specified by National Institute of Standards and Technology (NIST)
Special Publication (SP) 800-171, “Protecting Controlled Unclassified Information in Nonfederal
Information Systems and Organizations” (see
https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-171r2.pdf) and DoDI
8582.01 that are in effect at the time the solicitation is issued.
For awards where the work is considered fundamental research, the contractor will not have to
implement the aforementioned requirements and safeguards. However, should the nature of the
work change during performance of the award, work not considered fundamental research will
be subject to these requirements.
In accordance with FAR 4.1102 and 4.1201, proposers requesting a procurement contract must
complete electronic annual representations and certifications at https://www.sam.gov/.
In addition, for all award instrument types, all proposers are required to submit supplementary
DARPA-specific representations and certifications at the time of proposal submission. See
http://www.darpa.mil/work-with-us/reps-certs for further information on required representation
and certification depending on your requested award instrument.
Proposers that anticipate involving human subjects or animals in the proposed research must
comply with the approval procedures detailed at http://www.darpa.mil/work-with-us/additional-
baa, to include providing the information specified therein as required for proposal submission.
Proposers that do not have a Cost Accounting Standards (CAS) compliant accounting system
considered adequate for determining accurate costs and that are negotiating a cost-type procurement
contract must complete a Standard Form 1408 (SF 1408). For more information on CAS compliance,
see http://www.dcaa.mil. To facilitate this process, proposers should complete the SF 1408
found at http://www.gsa.gov/portal/forms/download/115778 and submit the completed form with
the proposal.
Pursuant to Section 8(d) of the Small Business Act (15 U.S.C. § 637(d)) and FAR 19.702(a)(1),
each proposer who submits a proposal for a procurement contract and includes subcontractors
may be required to submit a subcontracting plan with their proposal. The plan format is
outlined in FAR 19.704.
All electronic and information technology acquired or created through this BAA must satisfy the
accessibility requirements of Section 508 of the Rehabilitation Act (29 U.S.C. § 794d)/FAR 39.2.
i) Intellectual Property
All proposers must provide a good faith representation that the proposer either owns or possesses
the appropriate licensing rights to all intellectual property that will be utilized under the proposed
effort.
(1) For Procurement Contracts
Proposers responding to this BAA requesting procurement contracts will need to complete the
certifications at Defense Federal Acquisition Regulation Supplement (DFARS) 252.227-7017.
See http://www.darpa.mil/work-with-us/additional-baa for further information. If no restrictions
are intended, the proposer should state “none.” The table below captures the requested
information:
j) System for Award Management (SAM) and Universal
Identifier Requirements
All proposers must be registered in SAM unless exempt per FAR 4.1102. FAR 52.204-7,
“System for Award Management” and FAR 52.204-13, “System for Award Management
Maintenance” are incorporated into this solicitation. See http://www.darpa.mil/work-with-
us/additional-baa for further information.
International entities can register in SAM by following the instructions in this link:
https://www.fsd.gov/sys_attachment.do?sys_id=c08b64ab1b4434109ac5ddb6bc4bcbb8.
4. Submission Information
For abstract and proposal submission dates, see Part 1, Overview Information. Submissions
received after these dates and times may not be reviewed.
Abstracts must be received via DARPA's BAA Website (https://baa.darpa.mil) on or before the
submission date stated in Part 1, Overview Information.
The proposal must be received via DARPA's BAA Website (https://baa.darpa.mil) on or before
the submission date stated in Part 1, Overview Information, in order to be considered during
the initial round of selections. Proposals received after this deadline may be received and
evaluated up to 6 months (180 calendar days) from the date of posting on the SAM Contract
Opportunities website (https://SAM.gov). Proposals submitted after the due date specified in the
BAA, but before the solicitation closing date, may be selected. Proposers are warned that the
likelihood of available funding is greatly reduced for proposals submitted after the initial closing
date deadline.
DARPA will acknowledge receipt of all submissions and assign an identifying control number
that should be used in all further correspondence regarding the submission. DARPA intends to
use electronic mail correspondence regarding HR001122S0039. Submissions may not be
submitted by fax or e-mail; any submission received through fax or e-mail will be disregarded.
Submissions will not be returned. An electronic copy of each submission received will be
retained at DARPA and all other non-required copies destroyed. A certification of destruction
may be requested, provided the formal request is received by DARPA within five (5) business
days after notification that a proposal was not selected.
Proposers may encounter heavy traffic on the web server; it is highly recommended that
proposers not wait until the day proposals are due to request an account and/or upload a
submission. Full proposals should not be submitted via e-mail. Any full proposals submitted by
e-mail will not be accepted or evaluated.
a) Abstract Submission
b) Proposal Submission
Refer to Section VI.A.2 for how DARPA will notify proposers as to whether or not their
proposal has been selected for potential award.
Proposers requesting cooperative agreements must submit proposals through one of the
following methods: (1) electronic upload per the instructions at
https://www.grants.gov/applicants/apply-for-grants.html (DARPA-preferred); or (2) hard-copy
mailed directly to DARPA. If proposers intend to use Grants.gov as their means of submission,
then they must submit their entire proposal through Grants.gov; applications cannot be submitted
in part to Grants.gov and in part as a hard-copy. Proposers using Grants.gov do not submit hard-
copy proposals in addition to the Grants.gov electronic submission.
o Education and Training.
o Research and Professional Experience.
o Collaborations and Affiliations (for conflict of interest).
o Publications and Synergistic Activities.
Current and Pending Support: Mandatory for all Senior/Key Personnel including the
PD/PI. This attachment should include the following information:
o A list of all current projects the individual is working on, in addition to any future
support the individual has applied to receive, regardless of the source.
o Title and objectives of the other research projects.
o The percentage per year to be devoted to the other projects.
o The total amount of support the individual is receiving in connection to each of
the other research projects or will receive if other proposals are awarded.
o Name and address of the agencies and/or other parties supporting the other
research projects.
o Period of performance for the other research projects.
Additional senior/key persons can be added by selecting the “Next Person” button at the bottom
of the form. Note that, although applications without this information completed may pass
Grants.gov edit checks, if DARPA receives an application without the required information,
DARPA may determine that the application is incomplete, which may cause your submission to be
rejected and eliminated from further review and consideration under the solicitation. DARPA
reserves the right to request further details from the applicant before making a final
determination on funding the effort.
Form 3: Research and Related Personal Data, available on the Grants.gov website at
https://apply07.grants.gov/apply/forms/sample/RR_PersonalData_1_2-V1.2.pdf. Each applicant
must complete the name field of this form; however, provision of the demographic information is
voluntary. Regardless of whether the demographic fields are completed or not, this form must be
submitted with at least the applicant’s name completed.
Unclassified full proposals sent in response to this BAA may be submitted via DARPA's BAA
Website (https://baa.darpa.mil). Note: If an account has recently been created for the DARPA
BAA Website, this account may be reused. Accounts are typically disabled and eventually
deleted following 75-90 days of inactivity – if you are unsure when the account was last used, it
is recommended that you create a new account. If no account currently exists for the DARPA
BAA Website, visit the website to complete the two-step registration process. Submitters will
need to register for an Extranet account (via the form at the URL listed above) and wait for two
separate e-mails containing a username and temporary password. The “Password Reset” option
at the URL listed above can be used if the password is not received in a timely fashion. After
accessing the Extranet, submitters may then create an account for the DARPA BAA website (via
the "Register your Organization" link along the left side of the homepage), view submission
instructions, and upload/finalize the proposal. Note: Even if a submitter’s organization has an
existing registration, each user submitting a proposal must create their own Organization
Registration.
All unclassified proposals submitted electronically through DARPA’s BAA Website must be
uploaded as zip archives (i.e., files with a .zip or .zipx extension). The final zip archive should be
no greater than 100 MB in size. Only one zip archive will be accepted per submission –
subsequent uploads for the same submission will overwrite previous uploads, and submissions
not uploaded as zip archives will be rejected by DARPA.
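For proposers who want a quick sanity check before uploading, the short Python sketch below packages a proposal directory into a single zip archive and verifies that it stays within the 100 MB limit described above. This is only an illustrative convenience under the stated assumptions, not a DARPA-provided tool or requirement; the folder and file names shown are hypothetical placeholders.

# Illustrative sketch only (not a DARPA tool): package a proposal folder as a
# single .zip archive and confirm it is within the 100 MB upload limit noted above.
import zipfile
from pathlib import Path

MAX_SIZE_BYTES = 100 * 1024 * 1024  # 100 MB limit for DARPA BAA Website uploads

def package_proposal(src_dir: str, archive_path: str) -> int:
    """Zip every file under src_dir into archive_path; return the archive size in bytes."""
    src = Path(src_dir)
    with zipfile.ZipFile(archive_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for file in sorted(src.rglob("*")):
            if file.is_file():
                # Store paths relative to the proposal folder so the archive unpacks cleanly.
                zf.write(file, arcname=file.relative_to(src))
    return Path(archive_path).stat().st_size

if __name__ == "__main__":
    # Hypothetical folder and file names, for illustration only.
    size = package_proposal("ANSR_proposal", "HR001122S0039_proposal.zip")
    print(f"Archive size: {size / (1024 * 1024):.1f} MB")
    if size > MAX_SIZE_BYTES:
        print("WARNING: archive exceeds the 100 MB limit; reduce file sizes before uploading.")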
Classified submissions and proposals requesting grants or cooperative agreements should NOT
be submitted through DARPA's BAA Website (https://baa.darpa.mil), though proposers will
likely still need to visit https://baa.darpa.mil to register their organization (or verify an existing
registration) to ensure the BAA office can verify and finalize their submission. Proposal abstracts
will not be accepted if submitted via Grants.gov.
Proposers using the DARPA BAA Website may encounter heavy traffic on the submission
deadline date; proposers should start this process as early as possible. Technical support for
DARPA's BAA Website may be reached at BAAT_Support@darpa.mil, and is typically
available during regular business hours (9:00 AM – 5:00 PM Eastern Time).
5. Funding Restrictions
Not Applicable.
DARPA will post a consolidated Frequently Asked Questions (FAQ) document. To access the
posting go to: http://www.darpa.mil/work-with-us/opportunities. Under the HR001122S0039
summary will be a link to the FAQ. Submit your question(s) by e-mail to ANSR@darpa.mil.
Questions must be received by the FAQ/Questions due date listed in Part 1, Overview
Information.
Not Applicable.
A. Evaluation Criteria
Proposals will be evaluated using the following criteria, listed in descending order of importance:
1. Overall Scientific and Technical Merit
The proposed technical team has the expertise and experience to accomplish the proposed tasks.
Task descriptions and associated technical elements provided are complete and in a logical
sequence with all proposed deliverables clearly defined such that a final outcome that achieves
the goal can be expected as a result of award. The proposal identifies major technical risks and
planned mitigation efforts are clearly defined and feasible.
The proposal clearly explains the technical approach(es) that will be employed to meet or exceed
each program goal and metric listed in Section I.B. and provides ample justification as to why
the approach(es) is feasible. The Government will also consider the structure, clarity, and
responsiveness to the Statement of Work; the quality of proposed deliverables; and the linkage of
the Statement of Work, technical approach(es), risk mitigation plans, costs, and deliverables of
the prime awardee and all subawardees through a logical, well structured, and traceable technical
plan.
2. Potential Contribution and Relevance to the DARPA Mission
The potential contributions of the proposed effort are relevant to the national technology base.
Specifically, DARPA’s mission is to make pivotal early technology investments that create or
prevent strategic surprise for U.S. National Security.
3. Cost Realism
The proposed costs are realistic for the technical and management approach and accurately
reflect the technical goals and objectives of the solicitation. The proposed costs are consistent
with the proposer's Statement of Work and reflect a sufficient understanding of the costs and
level of effort needed to successfully accomplish the proposed technical approach. The costs for
the prime proposer and proposed subawardees are substantiated by the details provided in the
proposal (e.g., the type and number of labor hours proposed per task, the types and quantities of
materials, equipment and fabrication costs, travel and any other applicable costs and the basis for
the estimates).
B. Review of Proposals
1. Review Process
DARPA will conduct a scientific/technical review of each conforming proposal. Conforming
proposals comply with all requirements detailed in this solicitation; proposals that fail to do so
may be deemed non-conforming and may be removed from consideration. Proposals will not be
evaluated against each other since they are not submitted in accordance with a common work
statement. DARPA’s intent is to review proposals as soon as possible after they arrive; however,
proposals may be reviewed periodically for administrative reasons.
Award(s) will be made to proposers whose proposals are determined to be the most
advantageous to the Government, consistent with instructions and evaluation criteria specified in
the BAA herein, and availability of funding.
DARPA policy is to treat all submissions as source selection information (see FAR 2.101 and
3.104), and to disclose their contents only for the purpose of evaluation. Restrictive notices
notwithstanding, during the evaluation process, submissions may be handled by support
contractors for administrative purposes and/or to assist with technical evaluation. All DARPA
support contractors performing this role are expressly prohibited from performing DARPA-
sponsored technical research and are bound by appropriate nondisclosure agreements.
Subject to the restrictions set forth in FAR 37.203(d), input on technical aspects of the proposals
may be solicited by DARPA from non-Government consultants/experts who are strictly bound
by the appropriate non-disclosure requirements.
Per 41 U.S.C. § 2313, as implemented by FAR 9.103 and 2 C.F.R. § 200.205, prior to making an
award above the simplified acquisition threshold, DARPA is required to review and consider any
information available through the designated integrity and performance system (currently the
Federal Awardee Performance and Integrity Information System, FAPIIS). Awardees have the
opportunity to comment on any information about themselves
entered in the database, and DARPA will consider any comments, along with other information
in FAPIIS or other systems prior to making an award.
DARPA's Countering Foreign Influence Program (CFIP) is an adaptive risk management security program designed to help protect the
critical technology and performer intellectual property associated with DARPA’s research
projects by identifying the possible vectors of undue foreign influence. The CFIP team will
create risk assessments of all proposed Senior/Key Personnel selected for negotiation of a
fundamental research grant or cooperative agreement award. The CFIP risk assessment process
will be conducted separately from the DARPA scientific review process and adjudicated prior to
final award.
A. Selection Notices and Notifications
1. Abstracts
DARPA will respond to abstracts with a statement as to whether DARPA is interested in the
idea. If DARPA does not recommend the proposer submit a full proposal, DARPA will provide
feedback to the proposer regarding the rationale for this decision. Regardless of DARPA’s
response to an abstract, proposers may submit a full proposal. DARPA will review all
conforming full proposals using the published evaluation criteria and without regard to any
comments resulting from the review of an abstract.
2. Proposals
As soon as the evaluation of a proposal is complete, the proposer will be notified that (1) the
proposal has been selected for funding pending award negotiations, in whole or in part, or (2) the
proposal has not been selected. These official notifications will be sent via e-mail to the
Technical Point of Contact (POC) and/or Administrative POC identified on the proposal
coversheet.
There will be a program kickoff meeting and all key participants are required to attend.
Performers should also anticipate regular program-wide PI Meetings and periodic site visits at
the Program Manager’s discretion.
Solicitation clauses in the FAR and DFARS relevant to procurement contracts and FAR and
DFARS clauses that may be included in any resultant procurement contracts are incorporated
herein and can be found at http://www.darpa.mil/work-with-us/additional-baa.
Grant and cooperative agreement awards will include general terms and conditions as well as DARPA-
specific terms and conditions, available at http://www.darpa.mil/work-with-us/contract-
management#GrantsCooperativeAgreements.
C. Reporting
The number and types of reports will be specified in the award document, but will include at a
minimum monthly technical and financial status reports. The reports shall be prepared and
submitted in accordance with the procedures contained in the award document and mutually
agreed on before award. A final report that summarizes the project and tasks will be required at
the conclusion of the period of performance for the award.
D. Electronic Systems
1. Wide Area Workflow (WAWF)
Performers will be required to submit invoices for payment directly to the Procurement Integrated
Enterprise Environment (https://piee.eb.mil/), unless an exception applies. Performers must
register in Wide Area Workflow (WAWF) prior to any award under this BAA.
2. i-Edison
The award document for each proposal selected for funding will contain a mandatory
requirement for patent reports and notifications to be submitted electronically through i-Edison
(https://public.era.nih.gov/iedison).
Awardees, pursuant to this solicitation, may be eligible to participate in the DARPA Embedded
Entrepreneurship Initiative (EEI) during the award’s period of performance. EEI is a limited
scope program offered by DARPA, at DARPA’s discretion, to a small subset of awardees. The
goal of DARPA’s EEI is to increase the likelihood that DARPA-funded technologies take root in
the U.S. and provide new capabilities for national defense. EEI supports DARPA’s mission “to
make pivotal investments in breakthrough technologies and capabilities for national security” by
accelerating the transition of innovations out of the lab and into new capabilities for the
Department of Defense (DoD). EEI investment supports development of a robust and deliberate
Go-to-Market strategy for selling technology products to the government and commercial markets
and positions DARPA awardees to attract U.S. investment. The following is for informational
and planning purposes only and does not constitute solicitation of proposals to the EEI.
There are three elements to DARPA’s EEI: (1) A Senior Commercialization Advisor (SCA)
from DARPA who works with the Program Manager (PM) to examine the business case for the
awardee’s technology and uses commercial methodologies to identify steps toward achieving a
successful transition of technology to the government and commercial markets; (2) Connections
to potential industry and investor partners via EEI’s Investor Working Groups; and (3)
Additional funding on an awardee’s contract for the awardee to hire an embedded entrepreneur
to achieve specific milestones in a Go-to-Market strategy for transitioning the technology to
products that serve both defense and commercial markets. This embedded entrepreneur’s
qualifications should include business experience within the target industries of interest,
experience in commercializing early stage technology, and the ability to communicate and
interact with technical and non-technical stakeholders. Funding for EEI is typically no more than
$250,000 per awardee over the duration of the award. An awardee may apportion EEI funding to
hire more than one embedded entrepreneur, if achieving the milestones requires different
expertise that can be obtained without exceeding the awardee’s total EEI funding. The EEI effort
is intended to be conducted concurrent with the research program without extending the period
of performance.
DARPA Commercial Strategy will then contact the performer, assess fitness for EEI, and in
consultation with the DARPA technical office, determine whether to invite the performer to
participate in the EEI. Factors that are considered in determining fitness for EEI include
DoD/Government need for the technology; competitive approaches to enable a similar capability
or product; risks and impact of the Government’s being unable to access the technology from a
sustainable source; Government and commercial markets for the technology; cost and
affordability; manufacturability and scalability; supply chain requirements and barriers;
regulatory requirements and timelines; Intellectual Property and Government Use Rights; and
available funding.
Invitation to participate in EEI is at the sole discretion of DARPA and subject to program
balance and the availability of funding. EEI participants’ awards may be subsequently modified
bilaterally to amend the Statement of Work to add negotiated EEI tasks, provide funding, and
specify a milestone schedule which will include measurable steps necessary to build, refine, and
execute a Go-to-Market strategy aimed at delivering new capabilities for national defense.
Milestone examples are available at: https://www.darpa.mil/work-with-us/contract-management
Awardees under this solicitation are eligible to be considered for participation in EEI, but
selection for award under this solicitation does not imply or guarantee participation in EEI.
Administrative, technical, or contractual questions should be sent via email to ANSR@darpa.mil.
All requests must include the name, email address, and phone number of a point of contact.
Points of Contact
The BAA Coordinator for this effort may be reached at ANSR@darpa.mil.
The Technical POC for this effort is Dr. Sandeep Neema, Program Manager
DARPA/I2O
ATTN: HR001122S0039
675 North Randolph Street
Arlington, VA 22203-2114
Proposers Day
A Proposers Day for this effort will be held virtually on June 1, 2022, via Zoom conference.
The Special Notice regarding this Proposers Day can be found at
https://sam.gov/opp/ed5a2c8ff5004e5d97f1f61a575296bf/view. For further information
regarding the ANSR Proposers Day, including slides from the event, please see
http://www.darpa.mil/work-with-us/opportunities under HR001122S0039.
The following or similar language will be included in procurement contract awards made against
HR001122S0039. Awards other than FAR-based contracts will contain similar agreement
language:
(a) It is recognized that success of the ANSR research effort depends in part upon the open
exchange of information between the various Associate Contractors involved in the effort. This
language is intended to ensure that there will be appropriate coordination and integration of work
by the Associate Contractors to achieve complete compatibility and to prevent unnecessary
duplication of effort. By executing this contract, the Contractor assumes the responsibilities of an
Associate Contractor. For the purpose of this Associate Contractor Agreement (ACA), the term
Contractor includes subsidiaries, affiliates, and organizations under the control of the contractor
(e.g., subcontractors).
(b) Work under this contract may involve access to proprietary or confidential data from an
Associate Contractor. To the extent that such data is received by the Contractor from any
Associate Contractor for the performance of this contract, the Contractor hereby agrees that any
proprietary information received shall remain the property of the Associate Contractor and shall
be used solely for the purpose of the ANSR research effort. Only that information which is
received from another contractor in writing and which is clearly identified as proprietary or
confidential shall be protected in accordance with this provision. The obligation to retain such
information in confidence will be satisfied if the Contractor receiving such information utilizes
the same controls as it employs to avoid disclosure, publication, or dissemination of its own
proprietary information. The receiving Contractor agrees to hold such information in confidence
as provided herein so long as such information is of a proprietary/confidential or limited rights
nature.
(c) The Contractor hereby agrees to closely cooperate as an Associate Contractor with the other
Associate Contractors on this research effort. This involves, at a minimum:
(2) maintenance of a free and open information network with all Government-identified
Associate Contractors;
(4) entering into a written agreement with the other Associate Contractors setting forth
the substance and procedures relating to the foregoing, and promptly providing the
Agreements Officer/Procuring Contracting Officer with a copy of same; and,
(5) receipt of proprietary information from the Associate Contractor and transmittal of
Contractor proprietary information to the Associate Contractors subject to any applicable
proprietary information exchange agreements between associate contractors when, in
either case, those actions are necessary for the performance of either contract.
(d) In the event that the Contractor and the Associate Contractor are unable to agree upon any
such interface matter of substance, or if the technical data identified is not provided as scheduled,
the Contractor shall promptly notify the DARPA ANSR Program Manager. The
Government will determine the appropriate corrective action and will issue guidance to the
affected Contractor.
(e) The Contractor agrees to insert in all subcontracts hereunder which require access to
proprietary information belonging to the Associate Contractor, a provision which shall conform
substantially to the language of this ACA, including this paragraph (e).
(f) Associate Contractors for the ANSR research effort include:
Contractor Technical Area
IX. APPENDIX 1 – PROPOSAL SUMMARY SLIDE