Reliable environment perception is a crucial task for autonomous driving, especially in dense traffic areas. Recent improvements and breakthroughs in scene understanding for intelligent transportation systems are mainly based on deep learning and the fusion of different modalities. In this context, we introduce OLIMP: A heterOgeneous Multimodal Dataset for Advanced EnvIronMent Perception. This is the first public, multimodal and synchronized dataset that includes UWB radar data, acoustic data, narrow-band radar data and images. OLIMP comprises 407 scenes and 47,354 synchronized frames, covering four object categories: pedestrian, cyclist, car and tram. The dataset includes various challenges related to dense urban traffic, such as cluttered environments and varying weather conditions. To demonstrate the usefulness of the introduced dataset, we propose a fusion framework that combines the four modalities for multi-object detection. The obtained results are promising and encourage future research.
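The abstract mentions a framework that fuses detections from four modalities for multi-object detection without detailing its mechanics. As a rough illustration only, the sketch below shows one generic form of decision-level (late) fusion: per-modality detections that agree on class and overlap spatially are grouped, and their confidence scores averaged. This is not the authors' framework; the `Detection` structure, the greedy grouping, and the averaging rule are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    box: tuple    # axis-aligned box (x1, y1, x2, y2) - illustrative representation
    label: str    # object category, e.g. "pedestrian", "cyclist", "car", "tram"
    score: float  # detector confidence in [0, 1]

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def late_fusion(per_modality, iou_thr=0.5):
    """Generic decision-level fusion sketch (not the paper's method):
    greedily group cross-modality detections of the same class that
    overlap, and average their confidence scores."""
    pool = sorted((d for dets in per_modality for d in dets),
                  key=lambda d: d.score, reverse=True)
    fused = []
    while pool:
        best = pool.pop(0)
        group = [best] + [d for d in pool
                          if d.label == best.label
                          and iou(d.box, best.box) >= iou_thr]
        pool = [d for d in pool if d not in group]
        fused.append(Detection(best.box, best.label,
                               sum(d.score for d in group) / len(group)))
    return fused

# Hypothetical usage: camera and radar detectors both report a pedestrian
cam = [Detection((10, 10, 50, 80), "pedestrian", 0.9)]
radar = [Detection((12, 11, 52, 78), "pedestrian", 0.7)]
fused = late_fusion([cam, radar])
print(fused)
```

In practice, fusion over such heterogeneous sensors (UWB radar, acoustic, narrow-band radar, camera) first requires projecting each modality's output into a common reference frame; the sketch assumes that step has already been done.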
Amira Mimouna; Ihsen Alouani; Anouar Ben Khalifa; Yassin El Hillali; Abdelmalik Taleb-Ahmed; Atika Menhaj; Abdeldjalil Ouahabi; Najoua Essoukri Ben Amara. OLIMP: A Heterogeneous Multimodal Dataset for Advanced Environment Perception. Electronics 2020, 9(4), 560.