Yu Hang He Tesla 2019 Journal

Author: Yu Hang He, Email: hey6@unlv.nevada.edu
Date Last Modified: 10/16/2019

Week 11

New Person

This week, I had the opportunity to meet Andrew Payne. He did his PhD at Harvard and is an expert in designing microelectronics and integrating them into larger computer-based systems. Matt and Andrew seem to have known each other for a while, and Matt brought Andrew onto the team as a contractor to help develop more payloads for Spot. Andrew will focus on developing the electronic system that allows payloads to connect to the payload port on Spot's back. I very much appreciate the skill set Andrew brings to the team, since I do not have the experience or confidence to design an electrical system to be integrated with Spot. Andrew has already drafted a prototype electrical system, and I hope to hear good news from him.

What I Learned about Myself

I have always been interested in the topic of SLAM, since it is such an important area of research in the robotics field. According to Boston Dynamics, Spot has visual SLAM capability; however, it will not be released to developers until the next update (promised for near the end of October). In the meantime, I started reading research papers on visual odometry and visual SLAM. It is fascinating to learn about these new concepts, and I hope to apply them with the RGB-D camera.

Project Status

This week I focused on integrating the Ricoh Theta V camera with Spot. The Ricoh Theta V is a popular 360 camera on the market. Matt is hoping to test the Ricoh Theta V mounted on Spot for construction site monitoring. He was inspired by a company called HoloBuilder that developed a workflow for using 360 cameras to monitor and document construction progress. This Thursday, they held a demo with Boston Dynamics at the San Francisco Airport's new terminal. I could not attend the demo myself, but one of my colleagues, Bahir, was there to check out their system. His general impression was that their system could potentially be integrated into Tesla's BIM development. Their roadmap looks promising; however, a lot of work would be required.

I started looking into the SDK for the Ricoh Theta V. I was hoping it would include a framework for wireless live streaming so that an operator could have 360-degree awareness of the surroundings while operating Spot. However, their live streaming solution seems to work only with online streaming services like YouTube and Facebook. Since the Spot network does not have internet access and online streaming is prohibited inside the factory, I must find another way to live stream.
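One workaround I am considering: the Theta V also exposes Ricoh's web API (based on the Open Spherical Camera spec) over the camera's own Wi-Fi, and its camera.getLivePreview command returns a motion-JPEG stream that could be relayed to the operator entirely on the local network, with no internet required. A minimal sketch of building such a request is below; the default address and endpoint follow the public OSC spec, and I have not yet tested this against the hardware.

```python
import json

# Default address when the Theta hosts its own Wi-Fi access point (per the
# public OSC spec); commands are POSTed as JSON to a fixed endpoint.
THETA_BASE = "http://192.168.1.1"
EXECUTE_ENDPOINT = THETA_BASE + "/osc/commands/execute"

def build_command(name, parameters=None):
    """Build the JSON body for an OSC command such as camera.getLivePreview."""
    body = {"name": name}
    if parameters:
        body["parameters"] = parameters
    return json.dumps(body)

# camera.getLivePreview responds with a multipart motion-JPEG stream, which
# could be read frame by frame and forwarded to the Spot GUI over the local
# network instead of an online streaming service.
preview_request = build_command("camera.getLivePreview")
```

If the preview stream works, the remaining work would be decoding the multipart JPEG frames and displaying them in the operator GUI.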

Besides the 360 camera, I looked into the SDK for the Intel RealSense D435 camera. The D435 uses an active stereo technique to create a depth map in addition to its color stream. The depth map can be treated as a very low resolution point cloud. The idea behind the RGB-D camera is to use it as an additional camera for computer vision and visual SLAM. I spent some time learning basic concepts of visual odometry and visual SLAM, and I found an open source VSLAM algorithm called ORB-SLAM. I am working on integrating the ORB-SLAM algorithm with the D435.
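To make the "depth map as a point cloud" idea concrete: each depth pixel can be back-projected into a 3D point in the camera frame using the standard pinhole model and the camera intrinsics, which the RealSense SDK reports per stream (librealsense provides rs2_deproject_pixel_to_point for this same computation). A minimal sketch with illustrative intrinsics, not the D435's actual calibration:

```python
def deproject_pixel(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth in meters to a 3D camera-frame point.

    Pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy, where fx, fy
    are focal lengths in pixels and (cx, cy) is the principal point.
    """
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# Illustrative values only -- the real intrinsics come from the device calibration.
fx = fy = 600.0
cx, cy = 320.0, 240.0

# The principal point maps straight down the optical axis: (0, 0, z).
point = deproject_pixel(320, 240, 1.5, fx, fy, cx, cy)
```

Applying this to every valid depth pixel yields the low-resolution point cloud; ORB-SLAM's RGB-D mode instead consumes the registered color and depth images directly.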

Project Agenda

The goal for next week is to complete the integration between the Ricoh Theta V and the Spot GUI. In addition, I want to find a solution for wirelessly live streaming the 360 camera feed to the operator. Although Spot has cameras mounted on each side, they are dedicated to Spot's obstacle detection algorithm; a 360 camera view can give the operator a better understanding of the surroundings. Finally, I want to test the performance of ORB-SLAM with the D435 camera.