NavigationNet

Shanghai Jiao Tong University, Machine Vision and Intelligence Group (MVIG)

Introduction

NavigationNet is a large-scale interactive indoor navigation dataset. It contains 15 real-world scenes. Each scene consists of a map and thousands of images taken from the corresponding real-world scene. To describe each scene comprehensively, we built a smart robot that captures views in eight different directions from hundreds of positions in the scene (with a granularity of ~20 cm). Using any scene in NavigationNet, a system can move among the positions and change its viewpoint, virtually walking around the scene.
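The scene layout described above (grid positions, each with eight directional views) can be sketched as a small in-memory model. This is a hypothetical illustration, not the dataset's actual API; the names `Scene`, `Position`, and `views` are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Position:
    x: int
    y: int
    views: list  # eight images, one per 45-degree direction (assumed indexed 0-7)

class Scene:
    """Hypothetical in-memory model of one NavigationNet scene."""

    def __init__(self, positions):
        # map from (x, y) grid coordinate (~20 cm granularity) to a Position
        self.positions = positions

    def view(self, x, y, direction):
        # return the image captured at (x, y) facing one of the 8 directions
        return self.positions[(x, y)].views[direction % 8]
```

For example, `scene.view(0, 0, 3)` would return the fourth directional image stored at grid position (0, 0).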

Developed by He Huang, Yujing Shen, Jiankai Sun, and Cewu Lu (corresponding author)

Data Structure

Robot Movement

Functionality

  • Real-world indoor navigation dataset
  • Data output: real-world images from indoor scenes
  • Control Panel: MOVE FORWARD, MOVE BACKWARD, MOVE LEFT, MOVE RIGHT, TURN LEFT, TURN RIGHT
  • Stereo vision available
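One plausible reading of the control panel above is a discrete step function on a grid of positions with eight 45-degree headings. The sketch below is an assumption about the action semantics (strafing for MOVE LEFT/RIGHT, 45-degree turns), not the dataset's documented behavior.

```python
# Unit grid offsets for the 8 headings, starting north (0), clockwise.
OFFSETS = [(0, 1), (1, 1), (1, 0), (1, -1),
           (0, -1), (-1, -1), (-1, 0), (-1, 1)]

def step(state, action):
    """Apply one control-panel action to an agent state (x, y, heading).

    Translation moves one grid cell (~20 cm); turning rotates the heading
    by 45 degrees. Action names mirror the control panel; the exact
    movement semantics here are illustrative assumptions.
    """
    x, y, h = state
    if action == "TURN LEFT":
        return (x, y, (h - 1) % 8)
    if action == "TURN RIGHT":
        return (x, y, (h + 1) % 8)
    # Translation actions pick an offset relative to the current heading.
    moves = {"MOVE FORWARD": h,
             "MOVE BACKWARD": (h + 4) % 8,
             "MOVE LEFT": (h - 2) % 8,
             "MOVE RIGHT": (h + 2) % 8}
    dx, dy = OFFSETS[moves[action]]
    return (x + dx, y + dy, h)
```

Under this sketch, an agent facing north at (0, 0) that issues MOVE FORWARD ends at (0, 1) with its heading unchanged.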

Demo

Dataset and Papers

Our dataset is available on OneDrive, and our paper can be downloaded from arXiv.

Bibtex

Please cite this paper if you use NavigationNet:

@article{huang2018navigationnet,
  author  = {He Huang and Yujing Shen and Jiankai Sun and Cewu Lu},
  title   = {NavigationNet: A Large-scale Interactive Indoor Navigation Dataset},
  journal = {ArXiv e-prints},
  eprint  = {1808.08374},
  year    = {2018}
}