The proposals must address the organisation of a technological challenge on autonomous drone navigation in non-permissive environments based on the preliminary evaluation plan provided as part of the call documents (cf. Annex 4a). This includes the collection of data recorded by the participating teams during field tests, the annotation of this data, and the sharing of the resulting databases.
Scope:
The proposals should address the organisation of a technological challenge on autonomous drone navigation in non-permissive environments based on the preliminary evaluation plan provided as part of the call documents (cf. Annex 4a). This includes the collection, annotation and distribution of data, and the writing of the evaluation plans.
Types of activities
The following table lists the types of activities which are eligible for this topic, and whether they are mandatory or optional (see Article 10(3) EDF Regulation):
Types of activities (Article 10(3) EDF Regulation) | Eligible?
(a) Activities that aim to create, underpin and improve knowledge, products and technologies, including disruptive technologies, which can achieve significant effects in the area of defence (generating knowledge) | Yes (optional)
(b) Activities that aim to increase interoperability and resilience, including secured production and exchange of data, to master critical defence technologies, to strengthen the security of supply or to enable the effective exploitation of results for defence products and technologies (integrating knowledge) | Yes (mandatory)
(c) Studies, such as feasibility studies to explore the feasibility of new or upgraded products, technologies, processes, services and solutions | Yes (optional)
(d) Design of a defence product, tangible or intangible component or technology as well as the definition of the technical specifications on which such a design has been developed, including any partial test for risk reduction in an industrial or representative environment | Yes (optional)
(e) System prototyping of a defence product, tangible or intangible component or technology | No
(f) Testing of a defence product, tangible or intangible component or technology | No
(g) Qualification of a defence product, tangible or intangible component or technology | No
(h) Certification of a defence product, tangible or intangible component or technology | No
(i) Development of technologies or assets increasing efficiency across the life cycle of defence products and technologies | No
Accordingly, the proposals must cover at least the following tasks as part of mandatory activities:
- Integrating knowledge:
  - Setting up of the hardware and software infrastructures for testing autonomous navigation technologies in the framework of the technological challenge.
  - Collection of sensor data from the participating teams, labelling/annotation of the data with the expected outputs against which the system outputs will be evaluated (“ground truth”) or establishment of such expected outputs as needed, and quality assessment, distribution, and curation of databases.
  - Organisation of the evaluation campaigns, and in particular:
    - coordination of the exchanges with the other stakeholders on the evaluation plans and elaboration of these plans;
    - management of the field and data-based test campaigns and of the objective measurements of the performances of the systems submitted to the tests by the participating teams according to the protocols and metrics described in the evaluation plans;
    - organisation of the debriefing workshops.
The proposals should substantiate synergies and complementarities with foreseen, ongoing or completed activities for the objective and comparative evaluation of the performances of autonomous navigation technologies.
Functional requirements
The proposed solutions should make it possible to measure the performances of the tested systems according to detailed evaluation plans based on the preliminary evaluation plan provided as part of the call documents (see Annex 4a). Key aspects of the foreseen detailed evaluation plans and associated data management should be described in the proposals.
Proposals should in particular describe:
- the scenarios, nature and size of the test ranges, and the environmental conditions;
- the set-up for establishing the reference positions of drones during field tests and the expected positioning accuracy;
- the nature and volume of data annotation;
- the quality control of the annotations;
- the framework for trusted sharing of data;
- the detailed programme of the data-based and field test campaigns;
- the evaluation procedures (rules and tools to implement the metrics) and significance tests to be performed on measurements.
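For illustration only, the sketch below (in Python, not prescribed by the call) shows one possible way to implement a positioning-error metric against reference positions and a paired bootstrap estimate of the significance of a difference between two systems. The data format, error measure and number of resamples are assumptions made for this sketch; the binding procedures remain those of the detailed evaluation plans based on Annex 4a.

```python
# Illustrative sketch only: one possible positioning-error metric and a paired
# bootstrap significance test. The actual metrics, data formats and statistical
# procedures will be defined in the detailed evaluation plans (cf. Annex 4a);
# every name below is a placeholder.
import numpy as np

def horizontal_errors(est: np.ndarray, ref: np.ndarray) -> np.ndarray:
    """Per-timestamp horizontal error (m) between estimated and reference
    positions, both given as (N, 2) arrays of east/north coordinates
    already matched on the same timestamps."""
    return np.linalg.norm(est - ref, axis=1)

def rmse(errors: np.ndarray) -> float:
    """Root-mean-square error over one run."""
    return float(np.sqrt(np.mean(errors ** 2)))

def paired_bootstrap(err_a: np.ndarray, err_b: np.ndarray,
                     n_resamples: int = 10_000, seed: int = 0) -> float:
    """Estimate the probability that system A outperforms system B (lower
    RMSE) when the per-timestamp errors of both systems are resampled
    jointly. A value close to 0.5 indicates no significant difference."""
    rng = np.random.default_rng(seed)
    n = len(err_a)
    wins = 0
    for _ in range(n_resamples):
        idx = rng.integers(0, n, size=n)          # same resample for both systems
        if rmse(err_a[idx]) < rmse(err_b[idx]):
            wins += 1
    return wins / n_resamples

# Hypothetical usage with synthetic data standing in for two submitted systems:
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    ref = rng.uniform(0, 100, size=(500, 2))          # reference trajectory
    sys_a = ref + rng.normal(0, 1.0, size=ref.shape)  # system A: ~1 m error
    sys_b = ref + rng.normal(0, 1.5, size=ref.shape)  # system B: ~1.5 m error
    err_a, err_b = horizontal_errors(sys_a, ref), horizontal_errors(sys_b, ref)
    print(f"RMSE A = {rmse(err_a):.2f} m, RMSE B = {rmse(err_b):.2f} m")
    print(f"P(A better than B) ~ {paired_bootstrap(err_a, err_b):.3f}")
```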
The proposed scenarios should be representative of a wide range of situations encountered in military operations, including communications and GNSS loss, possibly due to jamming and spoofing attacks.
Trust in the quality of the data annotation should be ensured. Part of the data should be subject to double annotation by two independent annotators, and the inter-annotator agreement should be analysed. The statistical significance of the measured results should be estimated.
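As a purely illustrative example of how the inter-annotator agreement on the doubly annotated subset could be analysed, the following sketch computes Cohen's kappa for two independent annotators assigning categorical labels. The label set and data format are hypothetical; the detailed evaluation plans will define the actual agreement analysis.

```python
# Illustrative sketch only: Cohen's kappa for two independent annotators on the
# doubly annotated subset. The label set and data format are hypothetical; the
# detailed evaluation plans will fix the actual procedure.
from collections import Counter

def cohens_kappa(labels_a: list[str], labels_b: list[str]) -> float:
    """Chance-corrected agreement between two annotators over the same items."""
    assert len(labels_a) == len(labels_b) and labels_a, "need paired annotations"
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (observed - expected) / (1 - expected) if expected < 1 else 1.0

# Hypothetical usage: per-frame obstacle labels from two annotators.
if __name__ == "__main__":
    ann_1 = ["obstacle", "clear", "clear", "obstacle", "clear", "obstacle"]
    ann_2 = ["obstacle", "clear", "obstacle", "obstacle", "clear", "clear"]
    print(f"Cohen's kappa = {cohens_kappa(ann_1, ann_2):.2f}")
```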
The detailed programme of the field test campaigns should be based on the hypothesis that at least four teams will participate. The possibility of accommodating additional participants beyond this baseline, and the impact on the field test programme, should be described in the proposals.
During the challenge, drafts of the detailed evaluation plans should be submitted for discussion to the participating teams and to any stakeholder designated by the funding authority, early enough for the feedback to be taken into account in the actual evaluation campaigns. Any evolution of the evaluation plans should take into account several factors: technical possibilities and costs, the scientific relevance of the measurements, and the representativeness of the metrics and protocols with respect to military needs. The justification for any change that is not subject to a consensus should be documented.
Expected Impact:
The outcome should contribute to:
- collaboration, knowledge sharing, and new partnerships that drive collective progress in autonomous drone navigation at the EU level;
- improved knowledge and understanding of the capabilities of European industry to integrate sensors in UAS;
- improved technologies for autonomous navigation of drone swarms, and more generally improved performance of combat drones;
- certification of technologies for autonomous drone navigation;
- improved capabilities of the armed forces of the European Member States to prepare for the use of drones in difficult environments involving GNSS jamming, communications jamming, and various obstacles.