October 25, 2020

Mixed Reality and Robotics – Tutorial @ IROS 2020

Location: On-demand Virtual


Tutorial Contents

We cover the “big picture” of Mixed Reality and how we envision it transforming the way we interact with robots, along with technical details on several approaches to colocalization that allow any Mixed or Augmented Reality device to share a coordinate frame with a robot. Finally, a practical portion introduces the tools needed to create full Mixed Reality experiences with robots. This takes the form of several demos that attendees can build and run on their own, and adapt to use with their own robots.
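At its core, sharing a coordinate frame means both platforms can express poses relative to a common reference. As a minimal sketch (not taken from the tutorial code; all names and numbers are illustrative), here is how a shared anchor observed by both a robot and an MR device yields the transform between them, using plain 4x4 homogeneous transforms:

```python
# Minimal sketch of anchor-based colocalization with 4x4 homogeneous
# transforms in pure Python. Names and numbers are illustrative only.

def mat_mul(a, b):
    """Multiply two 4x4 matrices (row-major lists of lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(t):
    """Invert a rigid transform: R^T for rotation, -R^T p for translation."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]  # rotation transpose
    p = [t[i][3] for i in range(3)]
    inv_p = [-sum(r[i][k] * p[k] for k in range(3)) for i in range(3)]
    return [r[0] + [inv_p[0]], r[1] + [inv_p[1]], r[2] + [inv_p[2]],
            [0.0, 0.0, 0.0, 1.0]]

def translation(x, y, z):
    """Rigid transform with identity rotation, for a simple example."""
    return [[1.0, 0.0, 0.0, x],
            [0.0, 1.0, 0.0, y],
            [0.0, 0.0, 1.0, z],
            [0.0, 0.0, 0.0, 1.0]]

# Pose of the same shared anchor as observed by each platform.
T_robot_anchor = translation(2.0, 0.0, 0.0)   # anchor in the robot's frame
T_device_anchor = translation(0.5, 1.0, 0.0)  # anchor in the device's frame

# Transform that maps robot-frame coordinates into the device frame.
T_device_robot = mat_mul(T_device_anchor, invert_rigid(T_robot_anchor))

# The robot's origin, expressed in the device's frame:
origin = [0.0, 0.0, 0.0, 1.0]
p = [sum(T_device_robot[i][k] * origin[k] for k in range(4)) for i in range(4)]
print(p[:3])  # [-1.5, 1.0, 0.0]
```

The same composition works whichever colocalization method supplies the anchor observations, whether an AR tag, a vision pipeline, or a shared map.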

The tutorial features five videos on the IROS 2020 streaming site:

  1. Introduction to MR and Robotics [Direct Link]
  2. Interaction [Direct Link]
    • Mixed Reality as an intuitive bridge between robots and humans
    • MR, AR, VR: a brief overview of the differences and sample devices
    • Modes of interaction in MR
  3. Colocalization [Direct Link]
    • Colocalization with Mixed Reality devices
      • AR-tag-based
      • Vision-based
      • Shared-map-based
    • Azure Spatial Anchors
      • Technical introduction
      • How to use ASA to colocalize different devices
  4. Demo 1: Interaction [Direct Link] [Source Code]
    • Writing and deploying phone and HoloLens apps
      • Unity
      • ROS# and rosbridge for interfacing with ROS
    • Interacting with a virtual robot through AR and MR
  5. Demo 2: Colocalization [Direct Link] [Source Code]
    • Azure Spatial Anchors SDK for localization of robots and MR devices
    • Creating and querying spatial anchors using sample data
    • How to use this code with your own camera
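The bridge between a Unity app and ROS in item 4 is the rosbridge protocol: plain JSON messages over a WebSocket, which is what ROS# sends under the hood. The sketch below only constructs such messages (the topic name and values are illustrative, not from the demo code); actually sending them requires a running rosbridge server:

```python
# Sketch of the JSON messages in the rosbridge protocol, which ROS# and
# other clients exchange with a rosbridge server over a WebSocket.
# Topic name and values are illustrative.
import json

# A topic is advertised before publishing on it.
advertise = json.dumps({
    "op": "advertise",
    "topic": "/cmd_vel",
    "type": "geometry_msgs/Twist",
})

# A publish operation carrying a Twist: drive forward, turning slightly.
publish = json.dumps({
    "op": "publish",
    "topic": "/cmd_vel",
    "msg": {
        "linear": {"x": 0.2, "y": 0.0, "z": 0.0},
        "angular": {"x": 0.0, "y": 0.0, "z": 0.1},
    },
})

print(json.loads(publish)["msg"]["linear"]["x"])  # 0.2
```

Because the protocol is just JSON, the same messages work from a phone app, a HoloLens app, or a desktop script, which is what makes rosbridge a convenient common interface for MR clients.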

Demo Materials

Demo 1 – Interaction

Sample code for the exercises in this demo can be found here: https://github.com/microsoft/mixed-reality-robot-interaction-demo

This repo contains an extensive wiki with instructions on how to run the demo with pre-built apps and Docker containers, how to set up your system to develop and deploy MR apps, and how to adapt the sample code to your own robot.

Demo 2 – Colocalization

This demo focuses on a special research-only software package: the Azure Spatial Anchors Linux SDK. Instructions for obtaining the closed-source binaries and the open-source ROS wrapper can be found on the wrapper’s GitHub page: https://github.com/microsoft/azure_spatial_anchors_ros

The wiki in this repo contains instructions for running the demo on sample datasets, an overview of the structure of the ASA interface and the features of the ROS node, as well as tips for using ASA with a live camera.
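Conceptually, the anchor workflow the demo walks through has two phases: one session creates an anchor from live observations and receives an ID, and a later session (possibly on a different device) queries that ID to localize against the same anchor. The class below is a toy stand-in written for this sketch, not the real ASA SDK API, just to show the shape of that flow:

```python
# Two-phase create-then-query anchor workflow. AnchorStore is a toy
# stand-in defined here for illustration; the real ASA Linux SDK API
# differs and requires a cloud session.
import uuid

class AnchorStore:
    """Toy stand-in for a cloud anchor service."""
    def __init__(self):
        self._anchors = {}

    def create_anchor(self, pose):
        """Phase 1: register an anchor at a pose, returning its ID."""
        anchor_id = str(uuid.uuid4())
        self._anchors[anchor_id] = pose
        return anchor_id

    def query_anchor(self, anchor_id):
        """Phase 2: look the anchor up later, e.g. from another device."""
        return self._anchors.get(anchor_id)

store = AnchorStore()

# Robot session: create an anchor at its current pose estimate.
anchor_id = store.create_anchor({"x": 1.0, "y": 2.0, "z": 0.0})

# MR-device session: query the same ID to recover the shared anchor pose.
pose = store.query_anchor(anchor_id)
print(pose)  # {'x': 1.0, 'y': 2.0, 'z': 0.0}
```

In the real system, the anchor ID is the only thing that needs to be exchanged between the robot and the MR device; everything else is recovered from the service.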

Conclusion

We hope this information and these tools help you incorporate Mixed Reality into your robotics projects, whether for colocalization, human-robot interaction, or both. We encourage you to send us feedback on your experience with the tutorial. Please engage with us on GitHub by filing issues (for questions or problems not covered in the wikis) or contributing to the two repositories.

As a reminder, please register on the event registration site! This helps us estimate how many people will use the course materials and lets us share more information with attendees.