Remote 6 DoF Simultaneous Localization and Mapping for Network-Enabled Low-Compute Devices

Proceedings of the 3rd International Academic Conference on Research in Engineering and Technology

Year: 2024

Piotr Wójcik


ABSTRACT:

This paper presents a 6 DoF real-time Simultaneous Localization and Mapping (SLAM) system that operates entirely outside of the localized device, developed by 1000 realities as part of the “Edge Realities 2.0” project. The main novelty is the ability of our SLAM system to provide visual or visual-inertial inside-out tracking and mapping to a device from an entirely external server, based solely on the raw input from the device’s single RGB camera and IMU transmitted over a network. The output is a 6 DoF pose that is transmitted back to the device in real time. This mode of operation enables a wide range of low-compute, network-enabled devices to gain capabilities derived from SLAM (e.g. autonomous navigation, augmented reality), and provides significant compute offload and battery-life extension for devices already capable of running SLAM onboard. Our experiments indicate that the system operates robustly and reliably over commercial networks with a wide variety of devices, provides state-of-the-art accuracy, maps and tracks environments of unprecedented scale, and delivers accurate shared SLAM for multiple devices out of the box.

Keywords: SLAM, cloud, edge, navigation, AR
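
To make the data flow described in the abstract concrete, the sketch below (in Python) illustrates a minimal client loop: the device streams raw RGB frames and IMU samples to an external SLAM server and reads back a 6 DoF pose for each frame. The server address, message framing, field layout, and the one-IMU-sample-per-frame interleaving are purely illustrative assumptions; the paper does not disclose the actual wire protocol of Edge Realities 2.0.

import socket
import struct

# Hypothetical server endpoint; not the actual Edge Realities 2.0 service.
SERVER_ADDR = ("slam.example.com", 9000)

def send_message(sock, msg_type, payload):
    # Simple length-prefixed framing: 1-byte message type, 4-byte payload length.
    sock.sendall(struct.pack("!BI", msg_type, len(payload)) + payload)

def recv_exact(sock, n):
    # Read exactly n bytes from the socket.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("server closed connection")
        buf += chunk
    return buf

def stream_and_track(frames, imu_samples):
    # frames: objects with .timestamp (float) and .jpeg (compressed RGB bytes).
    # imu_samples: objects with .timestamp, .accel (3 floats), .gyro (3 floats).
    # Both are assumed inputs for this sketch; yields one 6 DoF pose per frame.
    with socket.create_connection(SERVER_ADDR) as sock:
        for frame, imu in zip(frames, imu_samples):
            # Raw camera frame: timestamp followed by the image payload.
            send_message(sock, 0x01, struct.pack("!d", frame.timestamp) + frame.jpeg)
            # Raw IMU sample: timestamp, 3-axis accelerometer, 3-axis gyroscope.
            send_message(sock, 0x02,
                         struct.pack("!d6f", imu.timestamp, *imu.accel, *imu.gyro))
            # The server answers with the current 6 DoF pose: translation
            # (x, y, z) and orientation quaternion (qw, qx, qy, qz).
            tx, ty, tz, qw, qx, qy, qz = struct.unpack("!7f", recv_exact(sock, 28))
            yield (tx, ty, tz), (qw, qx, qy, qz)

Because only raw sensor data and a compact pose cross the network in this arrangement, the device-side work reduces to capture, compression, and transmission, which is what allows low-compute hardware to benefit and onboard-capable devices to save compute and battery, as the abstract argues.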