ENM-MCL: Improving Indoor Localization Accuracy by Using an Efficient Implicit Neural Map Representation
Haofei Kuang · Yue Pan · Xingguang Zhong · Louis Wiesmann · Jens Behley · Cyrill Stachniss
Accurate and efficient indoor global localization with ENM-MCL in the lab of the University of Bonn.
Globally localizing a mobile robot in a known map is often a foundation for enabling robots to navigate and operate autonomously. In indoor environments, traditional Monte Carlo localization based on occupancy grid maps is considered the gold standard, but its accuracy is limited by the representation capabilities of the occupancy grid map. In this paper, we address the problem of building an effective map representation that allows us to accurately perform probabilistic global localization. To this end, we propose an implicit neural map representation that captures positional and directional geometric features from 2D LiDAR scans to efficiently represent the environment, and we learn a neural network that predicts both the non-projective signed distance and a direction-aware projective distance for an arbitrary point in the mapped environment. This combination of a neural map representation with a lightweight neural network allows us to design an efficient observation model within a conventional Monte Carlo localization framework for real-time pose estimation of a robot. We evaluated our approach to indoor localization on a publicly available dataset for global localization, and the experimental results indicate that our approach localizes a mobile robot more accurately than other localization approaches employing occupancy or existing neural map representations. In contrast to other approaches employing an implicit neural map representation for 2D LiDAR localization, our approach allows us to perform real-time pose tracking after convergence and near real-time global localization.

The code was tested with Ubuntu 22.04 with:
- Python 3.10
- PyTorch 2.6.0 with CUDA 11.8
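The observation model described in the abstract weights each particle by how well the map's predicted distances match the measured 2D LiDAR ranges. As a rough intuition, here is a minimal, illustrative weight update with a per-beam Gaussian likelihood, written in plain Python. This is a simplified sketch, not the repository's implementation: the `predict_range` callback (standing in for a neural map query) and the noise parameter `sigma` are assumptions.

```python
import math

def beam_likelihood(predicted, measured, sigma=0.1):
    """Gaussian likelihood of one measured range given the map's prediction."""
    return math.exp(-0.5 * ((measured - predicted) / sigma) ** 2)

def update_weights(particles, weights, scan, predict_range, sigma=0.1):
    """Reweight particles by the likelihood of a 2D LiDAR scan.

    particles:     list of (x, y, theta) pose hypotheses
    scan:          list of (beam_angle, measured_range) pairs
    predict_range: stand-in for querying the map at a pose and beam angle
                   (e.g. a neural map returning a projective distance)
    """
    new_weights = []
    for pose, w in zip(particles, weights):
        p = w
        for beam_angle, measured in scan:
            p *= beam_likelihood(predict_range(pose, beam_angle), measured, sigma)
        new_weights.append(p)
    total = sum(new_weights)
    # Normalize; fall back to uniform weights if all likelihoods underflow.
    if total > 0.0:
        return [w / total for w in new_weights]
    return [1.0 / len(new_weights)] * len(new_weights)
```

A particle whose predicted ranges agree with the scan keeps nearly all of the normalized weight, while inconsistent hypotheses are suppressed before resampling.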
We recommend using Conda to install the dependencies:

```shell
cd ~ && git clone https://github.com/PRBonn/enm-mcl.git
cd ~/enm-mcl
conda env create -f environment.yml
conda activate enmmcl
```
Or install manually:

```shell
conda create --name enmmcl python=3.10
conda activate enmmcl
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
pip install matplotlib scipy open3d opencv-python
pip install evo --upgrade --no-binary evo
conda install -c conda-forge pybind11
```
Follow the instructions below to compile the motion model and resampling modules:

```shell
cd ~/enm-mcl/localization/mcl/ && conda activate enmmcl
make -j4
```
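The compiled modules implement MCL's motion model and particle resampling. For intuition about the resampling step, here is the standard low-variance (systematic) resampling algorithm in plain Python. It is only an illustration of the general technique, not the repository's compiled pybind11 implementation.

```python
import random

def systematic_resample(particles, weights):
    """Low-variance (systematic) resampling.

    Draws a single random offset and selects particles at evenly spaced
    positions in the cumulative weight distribution, which favors
    high-weight particles while keeping the variance of the draw low.
    """
    n = len(particles)
    # Evenly spaced sample positions in (0, 1], shifted by one random offset.
    positions = [(random.random() + i) / n for i in range(n)]
    cumulative = []
    total = 0.0
    for w in weights:
        total += w
        cumulative.append(total)
    resampled = []
    i = 0
    for pos in positions:
        # Advance to the first particle whose cumulative weight covers pos.
        while i < n - 1 and cumulative[i] < pos * total:
            i += 1
        resampled.append(particles[i])
    return resampled
```

After this step, particle weights are typically reset to uniform, since the selection frequency now encodes the weight information.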
Please refer to PREPARE_DATA to prepare the datasets.
- Run ENM training on the mapping sequence:

```shell
cd ~/enm-mcl && conda activate enmmcl
python run_mapping.py --config_file configs/mapping_enm.yaml
```
- Evaluate the trained ENM model:

```shell
cd ~/enm-mcl && conda activate enmmcl
python eval_map.py --config_file configs/mapping_enm.yaml
```
- Run ENM-MCL on a test sequence:

```shell
cd ~/enm-mcl && conda activate enmmcl
python run_localization.py --config_file configs/global_localization/loc_config_test1.yaml
```

Results will be saved to the `results/` folder.
- Download the pre-trained ENM model from here and save it to the `results/` folder:

```shell
cd ~/enm-mcl && mkdir -p results && cd results
wget https://www.ipb.uni-bonn.de/html/projects/kuang2025icra/enm_map.pth
```
- Run the following command to evaluate ENM-MCL on the test sequences:

```shell
cd ~/enm-mcl && conda activate enmmcl
./run_loc.sh
```
If you use this library for any academic work, please cite our original paper:
```bibtex
@inproceedings{kuang2025icra,
  author = {H. Kuang and Y. Pan and X. Zhong and L. Wiesmann and J. Behley and C. Stachniss},
  title = {{Improving Indoor Localization Accuracy by Using an Efficient Implicit Neural Map Representation}},
  booktitle = {Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)},
  year = {2025}
}
```
If you have any questions, please contact:
- Haofei Kuang {[email protected]}
- IR-MCL (RA-L 2023): Implicit Representation for Monte Carlo Localization
- LocNDF (RA-L 2023): Neural Distance Field Mapping for Robot Localization
- SHINE-Mapping (ICRA 2023): Large-Scale 3D Mapping Using Sparse Hierarchical Implicit Neural Representations
This work has partially been funded by:
- the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy, EXC-2070 -- 390732324 -- PhenoRob,
- and by the German Federal Ministry of Education and Research (BMBF) in the project "Robotics Institute Germany" under grant No. 16ME0999.