
Multimodal Extreme Scale Data Analytics for Smart Cities Environments - MARVEL


"Smart city" is a broad description of a city that employs wide-ranging digital solutions to become more efficient and to improve its residents' quality of life and safety. As digital solutions proliferate, the city itself becomes a producer of vast amounts of data from diverse sensors and devices. These extreme-scale, heterogeneous datasets pose the challenge of extracting valuable information from the data, but they also provide an unprecedented opportunity to shift from traditional methodologies to new, extreme-scale, real-time data analytics.

The MARVEL project aims to converge a set of technologies in the areas of AI, analytics, multimodal perception, software engineering and Edge-Fog-Cloud computing in order to support data-driven real-time application workflows and decision making on large-scale datasets, ultimately empowering smart city authorities to address societal challenges effectively. It pursues this by addressing the rising challenges along the Big Data Value chain (data acquisition, data analysis and processing, data storage and curation, and data visualisation); by prioritising and strengthening open science and open data through a shared Data Corpus that drives research and innovation in multimodal processing and analytics; and by engaging citizens to promote breakthrough innovation.

MARVEL proposes a comprehensive framework for extreme-scale multimodal AI-based analytics in smart city environments, achieving multimodal perception and intelligence for audio-visual scene recognition, event detection and situational awareness. The framework utilises innovative Big Data technologies along the complete data processing chain, including:

  • privacy-aware multimodal AI tools and methods
  • multimodal audio-visual data capture and processing
  • co-design of edge and fog ML/DL models through federated learning
  • extreme-scale multimodal analytics for real-time decision making at all infrastructure levels (Edge, Fog, Cloud)
  • continuously optimised, resource-adapted edge and fog ML/DL deployment
  • advanced visualisation techniques, including text-annotated audio-visual attention maps, enabled by multimodal analytics and content-oriented processing.
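To make the federated learning item above concrete: in that setting, edge and fog nodes train models on their local sensor data and only share model parameters with a server, which aggregates them, so raw audio-visual data never leaves the node. The sketch below illustrates the standard federated averaging (FedAvg) scheme on a toy linear model with simulated clients; it is a minimal illustration of the general technique, not MARVEL's actual implementation, and all names and parameters here are hypothetical.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: a few gradient steps on a linear model."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_average(client_weights, client_sizes):
    """Server step: average client models, weighted by local dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Three simulated edge nodes, each holding its own private local data
clients = []
for n in (50, 80, 120):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.05 * rng.normal(size=n)
    clients.append((X, y))

# Federated rounds: broadcast global model, train locally, aggregate
global_w = np.zeros(2)
for _ in range(20):
    local_models = [local_update(global_w, X, y) for X, y in clients]
    global_w = federated_average(local_models, [len(y) for _, y in clients])

print(global_w)  # converges towards true_w without pooling any raw data
```

The key privacy property is visible in the loop: only `local_models` (parameter vectors) cross the network, never the per-client `X` and `y`.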

The MARVEL project gathers 17 partners from 12 European countries, and its solutions will be tested in the cities of Trento and Malta and on the campus of the University of Novi Sad. The Aarhus University team focuses on providing novel and efficient multimodal (audio-visual) data analysis methodologies.

You can find more information and news about the MARVEL project on its website: https://www.marvel-project.eu/