
sair-lab/AirLoc


AirLoc: Object-based Indoor Relocalization

License: BSD 3-Clause Air Series

Introduction

Indoor relocalization is vital both for robotic tasks such as autonomous exploration and for civil applications such as navigating a shopping mall with a cell phone. Some previous approaches adopt geometric information such as key-point features or local textures to carry out indoor relocalization, but they either fail easily in environments with visually similar scenes or require many database images. Inspired by the fact that humans often remember places by recognizing unique landmarks, we resort to objects, which are more informative than geometric elements. In this work, we propose a simple yet effective object-based indoor relocalization approach, dubbed AirLoc. To overcome the critical challenges of object reidentification and remembering object relationships, we extract object-wise appearance embeddings and inter-object geometric relationships. The geometry and appearance features are integrated to generate cumulative scene features. This results in a robust, accurate, and portable indoor relocalization system, which outperforms state-of-the-art methods in room-level relocalization by 12% in PR-AUC and 8% in accuracy. In addition, AirLoc is robust to challenges such as severe occlusion, perceptual aliasing, viewpoint shift, deformation, and scale transformation.
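The fusion of appearance and geometry cues described above can be sketched roughly as follows. Everything here is illustrative: the matching strategy, the fixed weighting `alpha`, and the toy embeddings are assumptions for exposition, not AirLoc's actual learned modules.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def room_score(query_objs, db_objs, geom_sim, alpha=0.5):
    """Fuse per-object appearance similarity with a room-level geometric
    similarity into one cumulative matching score (hypothetical weighting)."""
    # Appearance cue: each query object takes its best match in the database room.
    app_sim = sum(max(cosine(q, d) for d in db_objs) for q in query_objs)
    app_sim /= len(query_objs)
    # Weighted fusion of appearance and geometry cues.
    return alpha * app_sim + (1 - alpha) * geom_sim

# Example: one query object with a perfect appearance and geometry match.
print(room_score([[1.0, 0.0]], [[1.0, 0.0], [0.0, 1.0]], geom_sim=1.0))  # → 1.0
```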

Live relocalization demo


Dependencies

Simply run the following commands:

git clone https://github.com/sair-lab/AirLoc.git
conda create --channel conda-forge --name airloc --file ./AirLoc/conda_requirements.txt
conda activate airloc
pip install torch==1.8.0+cu111 torchvision==0.9.0+cu111 torchaudio==0.8.0 -f https://download.pytorch.org/whl/torch_stable.html
pip install pyyaml opencv-python scipy tqdm tensorboard
pip install kornia==0.5.0

Data

For data loading, we use the dataloaders in the datasets folder. They support preprocessed .pkl files from the Reloc110 scenes.

Please download data.zip (preprocessed queries) and database_raw.zip (database).

Note: Data preprocessing is not required if you download the preprocessed dataset from the links above. To preprocess the dataset from scratch, please refer to Preprocessing.

The expected directory structure after preprocessing (or directly downloading preprocessed data):

data_collection/
   data/
      RPmz2sHmrrY.pkl
      S9hNv5qa7GM.pkl
      ...
   database_raw/
      mp3d/
         RPmz2sHmrrY/
            rooms/
         S9hNv5qa7GM/
         ...

Pre-trained Models for Inference

For inference, please download the models.zip file:

Expected directory structure:

models/
   netvlad_model.pth
   gcn_model.pth
   ...

Indoor Relocalization Evaluation

Accuracy

Please modify the eval_Airloc.yaml config file to test for different methods and datasets.

  • method: the method you want to generate test results for

  • scenes: the Reloc110 scenes you want test results for

  • base_dir: path to the data folder

  • db_raw_path: path to the database_raw folder

  • db_path: an empty folder for saving the preprocessed database

The preprocessed database is saved at db_path on the first run to save time on subsequent runs.
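With these options, a minimal eval_Airloc.yaml might look like the sketch below. The keys follow the list above, but the values (method name, path layout) are assumptions and may differ from the shipped config file:

```yaml
# Hypothetical sketch; keys follow the options above, values are assumptions.
method: AirLoc
scenes: [RPmz2sHmrrY, S9hNv5qa7GM]
base_dir: ./data_collection/data
db_raw_path: ./data_collection/database_raw
db_path: ./data_collection/db_preprocessed
```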

python eval_airloc.py -c config/eval_Airloc.yaml

PR-AUC

Please modify the eval_Airloc_prauc.yaml config file to test for different methods and datasets.

python eval_airloc_prauc.py -c config/eval_airloc_prauc.yaml
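PR-AUC summarizes the precision-recall trade-off across match thresholds. As a self-contained illustration of the metric (not the repository's implementation), average precision can be computed from match scores and binary ground-truth labels like this:

```python
def average_precision(scores, labels):
    """Average precision (area under the precision-recall curve) for a list
    of match scores and binary ground-truth labels."""
    # Rank candidates by descending score, as a retrieval system would.
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    true_pos, ap = 0, 0.0
    for rank, i in enumerate(order, start=1):
        if labels[i]:  # precision is accumulated only at positive hits
            true_pos += 1
            ap += true_pos / rank
    return ap / sum(labels)

# Both correct matches ranked above the wrong ones gives a perfect score.
print(average_precision([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0]))  # → 1.0
```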

Training

To train AirLoc Geometry Module: (Please refer to train_airloc.yaml)

python train/train_airloc_geometry.py -c config/train_airloc.yaml

Watch Video

Publication

@article{aryan2023airloc,
  title = {AirLoc: Object-based Indoor Relocalization},
  author = {Aryan and Li, Bowen and Scherer, Sebastian and Lin, Yun-Jou and Wang, Chen},
  journal = {arXiv preprint arXiv:2304.00954},
  year = {2023},
}

You may also download this paper.
