This repository contains resources and research artifacts for the paper "SiTAR: Situated Trajectory Analysis for In-the-Wild Pose Error Estimation" that appeared in Proceedings of IEEE ISMAR 2023. It includes the code required to implement SiTAR, as well as samples of the new open-source VI-SLAM datasets we created to evaluate our pose error estimation method.
To create our new VI-SLAM datasets we used our previously published game engine-based emulator, Virtual-Inertial SLAM. For more information on this tool, implementation code and instructions, and examples of the types of projects it can support, please visit the Virtual-Inertial SLAM GitHub repository.
Our SiTAR system provides situated visualizations of device pose error estimates on real AR devices (implemented here for ARCore). Our code facilitates three types of pose error visualizations, illustrated in the image below: 1) trajectory-only (left); 2) trajectory + exclamation points (middle); 3) trajectory + warning signs (right):
The system architecture for SiTAR is shown below. The system frontend, which generates situated trajectory visualizations, runs on the user AR device; the system backend, which generates pose error estimates, runs on a server and one or more playback AR devices. The backend server can be either an edge or a cloud server.
Below is a short demo video of our SiTAR system in action, using an edge-based architecture. A Google Pixel 7 Pro is used as the user AR device, an Apple MacBook Pro as the server, and a Google Pixel 7 as the playback AR device. The video shows the following steps:
- Creation of a trajectory on the user AR device ('Trajectory creation').
- Replaying of the visual and inertial input data for that trajectory on the playback AR device to obtain multiple trajectory estimates ('Sequence playback').
- Situated visualization of the trajectory on the user AR device before pose error estimates are added ('Trajectory visualization without error estimates').
- Our uncertainty-based pose error estimation running on the server ('Uncertainty-based error estimation').
- Situated visualization of the trajectory on the user AR device once pose error estimates are added, with high pose error associated with the blank wall highlighted using our 'trajectory + exclamation points' visualization ('Trajectory visualization with error estimates').
Our implementation code and associated resources for SiTAR are provided in three parts: for the user AR device, the server, and the playback AR device, respectively. The code for each can be found in the repository folders named 'user-AR-device', 'server', and 'playback-AR-device'. The implementation resources consist of the following:
User AR device: the C# script DrawTrajectory.cs, which implements the 'Trajectory creation' and 'Trajectory visualization' modules in SiTAR; Unity prefabs for base trajectory visualization (Start.prefab, Stop.prefab, Cylinder.prefab, Joint.prefab, and Frustum.prefab); and Unity prefabs and materials for pose error visualizations (ErrorAreaHigh.prefab, ErrorAreaMedium.prefab, ErrorPatchHigh.prefab, ErrorPatchMedium.prefab, ErrorHigh.mat, and ErrorMedium.mat).
Server: a Python script SiTAR-Server.py, which implements the 'Sequence assignment' and 'Uncertainty-based error estimation' modules in SiTAR (a sketch of the kind of trajectory comparison this module builds on follows this list).
Playback AR device: a C# script TrajectoryPlayback.cs, which implements the 'Sequence playback' module in SiTAR.
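The actual uncertainty-based error estimation method is implemented in SiTAR-Server.py and described in the paper. As background, the minimal sketch below shows how two trajectory estimates can be compared with the evo package that the server depends on; the file paths are hypothetical and this is an illustration only, not the SiTAR method itself.

```python
# Minimal sketch: comparing two trajectory estimates with evo (a SiTAR
# server dependency). File paths are hypothetical; the actual uncertainty-
# based error estimation is implemented in SiTAR-Server.py.
from evo.core import metrics, sync
from evo.tools import file_interface

# Load two estimates of the same trajectory, in TUM format
# (timestamp x y z qx qy qz qw per line).
traj_a = file_interface.read_tum_trajectory_file("trajectories/estimate_a.txt")
traj_b = file_interface.read_tum_trajectory_file("trajectories/estimate_b.txt")

# Associate poses by timestamp, then compute translational APE statistics.
traj_a, traj_b = sync.associate_trajectories(traj_a, traj_b)
ape = metrics.APE(metrics.PoseRelation.translation_part)
ape.process_data((traj_a, traj_b))
print("APE RMSE (m):", ape.get_statistic(metrics.StatisticsType.rmse))
```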
Prerequisites: 2 or more Android devices running ARCore v1.3 or above; a server with Python 3.8 or above, the evo (https://github.com/MichaelGrupp/evo), FastAPI (https://fastapi.tiangolo.com/) and uvicorn (https://www.uvicorn.org/) Python packages installed, and Android SDK Platform Tools installed (https://developer.android.com/tools/releases/platform-tools). For building the necessary apps to AR devices, Unity 2021.3 or later is required, with the AR Foundation framework v4.2 or later and the ARCore Extensions v1.36 or later packages installed.
Tested with Google Pixel 7 and Google Pixel 7 Pro devices running ARCore v1.31, and an Apple MacBook Pro as the edge server (Python 3.8).
User AR device:
- Create a Unity project with the AR Foundation template. Make sure ARCore Extensions is fully set up by following the instructions here: https://developers.google.com/ar/develop/unity-arf/getting-started-extensions.
- Add the DrawTrajectory.cs script (in the user-AR-device folder) to the AR Session Origin GameObject.
- Drag the AR Camera GameObject to the 'Camera Manager' and 'Camera' slots in the Draw Trajectory inspector panel.
- Add the Start.prefab, Stop.prefab, Cylinder.prefab, Joint.prefab and Frustum.prefab files (in the user-AR-device folder) to your Assets folder, and drag them to the 'Start Prefab', 'Stop Prefab', 'Cylinder Prefab', 'Joint Prefab' and 'Frustum Prefab' slots in the Draw Trajectory inspector panel.
- (Optional) If using the exclamation points or warning signs visualizations, add the ErrorAreaHigh.prefab, ErrorAreaMedium.prefab, ErrorPatchHigh.prefab, and ErrorPatchMedium.prefab files (in the user-AR-device folder) to your Assets folder, and drag them to the 'Error Area High Prefab', 'Error Area Medium Prefab', 'Error Patch High Prefab', and 'Error Patch Medium Prefab' slots in the Draw Trajectory inspector panel.
- Add the ErrorHigh.mat and ErrorMedium.mat files (in the user-AR-device folder) to your Assets folder, and drag them to the 'Error High' and 'Error Medium' slots in the Draw Trajectory inspector panel.
- Add Start and Stop UI buttons, drag them to the 'Start Button' and 'Stop Button' slots in the Draw Trajectory inspector panel, and set their OnClick actions to 'DrawTrajectory.HandleStartClick' and 'DrawTrajectory.HandleStopClick' respectively.
- Either hardcode your server IP address on line 481 of DrawTrajectory.cs, or add a UI panel with a text field to capture it from the user.
- (Optional) Add UI text objects to display the SiTAR status, trajectory duration, trajectory length, and average environment depth, and drag them to the 'Status', 'Trajectory Duration', 'Trajectory Length' and 'Trajectory Depth' slots in the Draw Trajectory inspector panel.
- (Optional) Add audio clips to notify the user when error estimates are ready, when an image is captured, and when all regions have been captured, and drag them to the 'Audio Results', 'Audio Capture' and 'Audio Complete' slots in the Draw Trajectory inspector panel.
- Set the Build platform to Android, select your device under Run device, and click Build and Run.
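Once the backend is running (see the Server section below), you can verify from any machine on the same network that the server is reachable before building the app. This check relies on two defaults: FastAPI serves its interactive docs at /docs, and uvicorn listens on port 8000 unless configured otherwise. The IP address below is a placeholder for your server's actual address.

```python
# Quick reachability check for the SiTAR backend. FastAPI exposes
# interactive docs at /docs by default, and uvicorn defaults to port 8000.
# Replace the placeholder IP with your server's actual address.
import urllib.request

resp = urllib.request.urlopen("http://192.168.0.10:8000/docs", timeout=5)
print("Server reachable, HTTP status:", resp.status)
```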
Server:
- Create a folder on the server where SiTAR files will be located. Add an additional sub-folder named 'trajectories'.
- Download the server folder in the repository to your SiTAR folder.
- Open the SiTAR-Server.py file in the server folder, complete the required configuration parameters on lines 20-29, and save.
- In Terminal or Command Prompt, navigate to your SiTAR folder.
- Start the server using the following command:
uvicorn server.SiTAR-Server:app --host 0.0.0.0
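For orientation, the skeleton below shows the structure such a server script follows: uvicorn imports the module and serves the FastAPI instance named app. The endpoint and configuration names here are illustrative assumptions, not the real ones; the actual configuration parameters are on lines 20-29 of SiTAR-Server.py.

```python
# Illustrative skeleton of a FastAPI server exposed as `app`, which is the
# object `uvicorn server.SiTAR-Server:app` loads. Endpoint and parameter
# names are hypothetical; see SiTAR-Server.py for the real ones.
from fastapi import FastAPI

# --- configuration (hypothetical; the real parameters are on lines 20-29) ---
TRAJECTORY_DIR = "trajectories"  # where received trajectory data is stored

app = FastAPI()

@app.post("/trajectory")  # hypothetical endpoint name
async def receive_trajectory(payload: dict):
    # In SiTAR, this is the point where sequence assignment and
    # uncertainty-based error estimation would be triggered.
    return {"status": "received", "poses": len(payload.get("poses", []))}
```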
Playback AR device:
- Create a Unity project with the AR Foundation template. Make sure ARCore Extensions is fully set up by following the instructions here: https://developers.google.com/ar/develop/unity-arf/getting-started-extensions.
- (Optional) Add the AR Plane Manager and AR Point Cloud Manager scripts (included in AR Foundation) to the AR Session Origin GameObject if you wish to visualize planes and feature points during playback.
- Add the TrajectoryPlayback.cs script (in the playback-AR-device folder) to the AR Session GameObject.
- Create a UI text object to display log messages, and drag it to the 'Log' slot in the Trajectory Playback inspector panel.
- Drag the AR Camera GameObject to the 'Camera Manager' and 'Camera' slots in the Trajectory Playback inspector panel.
- Set the Build platform to Android, select your device under Run device, and click Build and Run.
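The prerequisites list Android SDK Platform Tools on the server, which suggests the backend drives the playback AR device(s) over adb. The sketch below only illustrates that pattern from Python via subprocess; the device serial and app package/activity names are hypothetical, and the actual coordination logic is defined in SiTAR-Server.py.

```python
# Illustrative sketch of launching the playback app on a device over adb
# (Android SDK Platform Tools). The device serial and package/activity
# names are hypothetical; SiTAR-Server.py defines the real coordination.
import subprocess

DEVICE_SERIAL = "1A2B3C4D5E"                          # hypothetical serial
APP_ACTIVITY = "com.example.playback/.MainActivity"   # hypothetical package

# List connected devices, then start the playback app on the chosen one.
subprocess.run(["adb", "devices"], check=True)
subprocess.run(["adb", "-s", DEVICE_SERIAL, "shell", "am", "start",
                "-n", APP_ACTIVITY], check=True)
```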
Our Hall and LivingRoom VI-SLAM datasets, which we created to evaluate our uncertainty-based pose error estimation method, can be downloaded here: https://drive.google.com/drive/folders/1VwAgcCly0RDUmyME4MHDrcBfkXRbitpC?usp=sharing.
Each dataset is contained in a separate archive (e.g., Hall.zip), which contains sub-folders for each sequence, along with the required ORB-SLAM3 configuration file, config.yaml (containing camera intrinsics and extrinsics, IMU noise parameters, ORB extractor parameters, and visualization settings). Each sequence folder contains the following (formatted to streamline execution in ORB-SLAM3):
- groundtruth folder, containing the formatted ground truth poses for the sequence (data.csv), plus sensor characteristics from the original SenseTime dataset (sensor.yaml).
- mav0 folder, containing a cam0/data folder with the camera images, and an imu0 folder with the formatted IMU data (data.csv) plus sensor characteristics from the original SenseTime dataset (sensor.yaml).
- sequence_name.txt file (e.g., A1.txt), containing the list of camera image timestamps (in the format required by ORB-SLAM3).
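As a starting point for working with a downloaded sequence, the minimal sketch below loads the ground truth poses and image timestamps, assuming the layout described above. The dataset and sequence names shown (Hall, A1) follow the examples given, and the paths assume the archive has been extracted to the current directory.

```python
# Minimal sketch: loading one sequence from the Hall dataset, assuming the
# folder layout described above and extraction to the current directory.
import csv

# Ground truth poses (groundtruth/data.csv); '#'-prefixed header rows are
# skipped, matching the EuRoC-style CSV convention used by mav0 layouts.
with open("Hall/A1/groundtruth/data.csv") as f:
    ground_truth = [row for row in csv.reader(f) if not row[0].startswith("#")]

# Camera image timestamps, one per line (the format required by ORB-SLAM3).
with open("Hall/A1/A1.txt") as f:
    timestamps = [line.strip() for line in f if line.strip()]

print(f"{len(ground_truth)} ground truth rows, {len(timestamps)} image timestamps")
```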
If you use SiTAR in an academic work, please cite:
@inproceedings{SiTAR,
  title={SiTAR: Situated trajectory analysis for in-the-wild pose error estimation},
  author={Scargill, Tim and Chen, Ying and Hu, Tianyi and Gorlatova, Maria},
  booktitle={Proceedings of IEEE ISMAR 2023},
  year={2023}
}
The authors of this repository are Tim Scargill and Maria Gorlatova. Contact information of the authors:
- Tim Scargill (timothyjames.scargill AT duke.edu)
- Maria Gorlatova (maria.gorlatova AT duke.edu)
This work was supported in part by NSF grants CSR-1903136, CNS-1908051, and CNS-2112562, NSF CAREER Award IIS-2046072, a Meta Research Award, and a Cisco Research Award.