Expected Outcome
Project results are expected to contribute to the following expected outcomes:
- Novel AI-enabled Cloud and Edge management solutions tailored for the processing needs of AI workloads across the cognitive cloud-edge-IoT continuum.
- Strategic industrial cooperation across the Cloud-Edge-IoT cognitive computing continuum to support future hyper-distributed AI applications.
- Seamless and trustworthy integration and interoperability across diverse computing and data environments spanning from core cloud (including HPC) to edge to IoT and across different technology stacks.
- Enhanced openness and open strategic autonomy in the evolving data and AI economies across the computing continuum, validated through key business and societal sectors.
- A guaranteed minimum level of interoperability and portability, facilitating European access to foreign markets.
Scope
The Cloud-to-Edge Continuum needs to provide seamless and trustworthy integration of diverse computing and data environments, spanning from core cloud to edge to IoT, and support the enormous data volumes, processing needs, and new resource types brought by next-generation AI technologies.
Different types of AI processes pose different requirements that compute infrastructures need to meet to execute them. The state of the art in generative AI and large language models relies heavily on high-performance processing and very large AI models. The cutting-edge hardware accelerators that power these processing systems are scarce on the market and available only in highly specialised, high-performance infrastructures in certain cloud and HPC environments, at considerable cost. At the same time, the requirement to gather, process, and transmit massive amounts of data to a central data-processing environment remains a barrier for many AI applications. All these factors call for efficient tools and mechanisms that distribute AI training and inference processes throughout the computing continuum.
Empowering next-generation AI technologies with on-demand, agile, and situation-aware infrastructure that brings data and computing power to where and when they are needed will let end users exploit Artificial Intelligence across the computing continuum without compromising security and trust, while optimising energy use. These challenges span various aspects of the continuum, including on-device data processing, data orchestration and sharing, AI integration, decentralised intelligent management, decentralised and global optimisation, support for energy and resource heterogeneity, data management, security and privacy, and synergies with 5G/6G. Addressing these challenges is crucial for realising the vision of a cognitive cloud-to-edge continuum as a key enabler of emerging trends such as AI and generative AI.
The Cognitive Computing Continuum could eventually be extended to include other computational resources, such as HPC, and provide abstraction layers to maximise the benefits of available hardware.
Addressing all of the above complexities calls for innovative research. The aim is to develop generic, AI-enabled cloud-edge technologies encompassing the whole computing continuum to empower the development of AI and generative AI technologies and applications. Proposals should demonstrate the generic applicability of the proposed technological solutions across various application domains, such as, but not limited to, manufacturing, healthcare, robotics, transportation, and smart cities.
The following (one or more) research areas should be addressed:
- Development of novel mechanisms for the efficient development, deployment, and operation of AI workflows across heterogeneous and distributed infrastructures along the Edge-to-Cloud-to-HPC continuum that optimise training times, model accuracy, and data management while factoring in performance metrics such as memory usage, energy efficiency, application-processing and data-transfer latency, and network overheads. These should incorporate virtualisation and orchestration techniques that seamlessly integrate heterogeneous processor architectures and cater for the explainability of the applied cognitive optimisations (an illustrative placement sketch follows this list).
- Decentralised and federated computing continuum tools and mechanisms to enable distributed AI architectures. These include scheduling, orchestration, and placement mechanisms that leverage the wide range of Edge computing environments available in the compute continuum, including the on-device edge. Tools and mechanisms should take into consideration, where appropriate, data security and privacy aspects. The focus is on enhancing AI process execution through techniques such as model, data, and hybrid parallelism; data compression; gossip, swarm, and federated training; or conditional computing (a minimal federated-averaging sketch also follows this list).
- Cloud and edge processing tools and techniques to reduce AI processing power usage and emissions across the cognitive computing continuum, relying on hardware efficiency (for example, through special-purpose accelerators and heterogeneous processor architectures) and energy-optimisation techniques such as hardware and software approximation.
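As a purely illustrative aside (not part of the official call text), the latency, energy, and memory trade-offs behind the placement and orchestration mechanisms named above can be sketched as a toy heuristic. The node tiers, attribute names, weights, and workload figures below are invented assumptions, not metrics prescribed by the call.

from dataclasses import dataclass

@dataclass
class Node:
    name: str
    tier: str                # "edge", "cloud", or "hpc" (hypothetical labels)
    free_mem_gb: float       # memory available on the node
    latency_ms: float        # round-trip latency to the data source
    energy_per_gflop: float  # relative energy cost of compute on this node

@dataclass
class Workload:
    name: str
    mem_gb: float            # memory the AI workload needs
    gflops: float            # compute demand of the workload

def score(node: Node, wl: Workload, w_lat: float = 0.5, w_energy: float = 0.5) -> float:
    """Lower is better; placements that do not fit in memory score as infinity."""
    if node.free_mem_gb < wl.mem_gb:
        return float("inf")
    return w_lat * node.latency_ms + w_energy * node.energy_per_gflop * wl.gflops

def place(nodes: list[Node], wl: Workload) -> Node:
    """Pick the feasible node with the best latency/energy trade-off."""
    return min(nodes, key=lambda n: score(n, wl))

if __name__ == "__main__":
    nodes = [
        Node("edge-gw-1", "edge", free_mem_gb=4, latency_ms=5, energy_per_gflop=3.0),
        Node("regional-dc", "cloud", free_mem_gb=64, latency_ms=40, energy_per_gflop=1.0),
        Node("hpc-partition", "hpc", free_mem_gb=512, latency_ms=80, energy_per_gflop=0.6),
    ]
    inference = Workload("vision-inference", mem_gb=2, gflops=5)
    training = Workload("model-training", mem_gb=128, gflops=5000)
    print(place(nodes, inference).name)  # latency-sensitive inference stays at the edge
    print(place(nodes, training).name)   # the large training job only fits on the HPC partition

Research in this area would of course replace such a static score with learned, situation-aware policies and explainable optimisation, as the call requires; the sketch only makes the trade-off concrete.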
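Similarly, as a hedged sketch of the federated training mentioned in the second research area, a single federated-averaging (FedAvg) round can be written in a few lines. The synthetic clients, linear model, and learning rate below are assumptions chosen only to keep the example self-contained.

import numpy as np

rng = np.random.default_rng(0)

def local_sgd(weights, X, y, lr=0.1, epochs=5):
    """Plain gradient descent on a client's local least-squares objective."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg_round(global_w, clients):
    """Each client trains locally; the server averages weights by sample count."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_sgd(global_w, X, y))
        sizes.append(len(y))
    return np.average(np.stack(updates), axis=0, weights=np.array(sizes, dtype=float))

if __name__ == "__main__":
    true_w = np.array([2.0, -1.0])
    clients = []                      # three synthetic "edge" clients of different sizes
    for n in (20, 50, 100):
        X = rng.normal(size=(n, 2))
        y = X @ true_w + 0.01 * rng.normal(size=n)
        clients.append((X, y))
    w = np.zeros(2)
    for _ in range(20):               # a handful of communication rounds
        w = fedavg_round(w, clients)
    print(np.round(w, 2))             # converges towards [ 2. -1.]

Real decentralised training across the continuum would add the compression, privacy, and scheduling concerns the call highlights; this sketch shows only the averaging step itself.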
This topic implements the co-programmed European Partnership on AI, Data, and Robotics.
Projects are expected to develop synergies and relate to activities and outcomes of the Digital Europe Programme (DEP) and any existing or emerging Important Projects of Common European Interest (IPCEI) initiative, such as IPCEI-CIS.
All proposals are expected to share communicable results with the European R&D community through the AI-on-demand platform and, where necessary, other relevant digital resource platforms, in order to enhance the European AI, Data and Robotics ecosystem through the sharing of results and best practices.