
HEBI Robotic Arm VR Teleoperation

Author: Yu Hang He Email: yuhangheedu@gmail.com

Date: Last modified on 12/21/20

Keywords: HEBI robotic arm, VR, Teleoperation, Unity, Mixed Reality Toolkit 2, C#, ROS, RealSense D435

This tutorial will demonstrate how to create a VR teleoperation system for the HEBI 6-DoF robotic arm. The system consists of a VR application programmed in Unity using the Mixed Reality Toolkit and a ROS program that controls the HEBI arm. This tutorial takes approximately 5 hours to complete.

Motivation and Audience

This tutorial's motivation is to demonstrate how to create a VR teleoperation system for controlling a manipulator. This tutorial assumes the reader is already familiar with Unity, the Mixed Reality Toolkit, and ROS (see Prerequisites below).

The rest of this tutorial is presented as follows: Prerequisites; Import and Apply Kinematics to the Robot Model; VR Teleoperation; Setup ROS Communication; Publish JointTrajectory Message through ROS#; RealSense Camera Feed; Gripper Control; Demonstration; and Final Words.

Prerequisites

The author assumes that the reader has already set up the ROS environment for the HEBI arm on a separate Linux machine. If not, follow the instructions from HEBI Robotics and try the Arm Node example. You can also check out the tutorial by Jason Kreitz on the HEBI arm.

The author also assumes that the reader is familiar with Unity and the Mixed Reality Toolkit before starting this tutorial. Unity provides some beginner-friendly tutorials to get you started: Unity Tutorials. You can also learn about Microsoft's Mixed Reality Toolkit from these tutorials.

Import and Apply Kinematics to the Robot Model

1. Obtain a URDF of the manipulator

2. Add Bio IK resources to the Unity project

3. Add ROS# resources to the Unity project

4. Add URDF resources to the Unity project

5. Import the robot model into Unity from the URDF using ROS#

6. Add a Bio IK component to the imported manipulator GameObject

7. Specify the joints in Bio IK

8. Create a GameObject as the child of the last joint and name it End_effector

9. Add Position and Orientation objectives to the End_effector in Bio IK

10. Create a Sphere GameObject and name it Target

11. To test whether Bio IK is working properly, play the scene and move the Target around; the arm should follow the position and orientation of the Target. A test script that moves the Target automatically is sketched after this list

12. Apply the joint limits for each individual joint in Bio IK.

13. In Bio IK options, change Motion Type to Realistic

14. Set the home position of the gripper by editing the joint position
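As mentioned in step 11, a quick way to exercise the IK solver is to move the Target automatically instead of dragging it by hand. The following is a minimal sketch of a hypothetical helper script, attached to the Target; the class name and all numeric values are illustrative and assume the arm's base sits near the scene origin, so adjust them to stay inside the arm's workspace.

| TargetOrbitTest.cs
using UnityEngine;

// Hypothetical test helper: attach to the Target sphere to move it in a
// horizontal circle, letting Bio IK continuously solve for the new pose.
public class TargetOrbitTest : MonoBehaviour
{
    public float radius = 0.2f;       // circle radius in meters
    public float height = 0.3f;       // height above the arm base in meters
    public float angularSpeed = 0.5f; // radians per second

    private void Update()
    {
        float t = Time.time * angularSpeed;
        transform.position = new Vector3(Mathf.Cos(t) * radius, height, Mathf.Sin(t) * radius);
    }
}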

VR Teleoperation

1. Add MRTK to the Unity project

2. Follow the instructions to properly configure MRTK for the Unity project

3. Enable Virtual Reality Support

4. Set the Target to track the controller

5. Assign custom input actions to controller buttons

6. Start and stop controller tracking based on the input action; a combined sketch of steps 4 through 6 appears after this list

7. Move the Target to the initial position of the gripper

8. Test the VR teleoperation
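Steps 4 through 6 can be combined in a single script. The following is a minimal sketch, not the author's exact code: the class name, the toggleAction field, and the choice of the right-hand controller are all assumptions for illustration. It copies the controller's 6-DoF pose onto the Target while tracking is enabled, and toggles tracking from the custom input action assigned in step 5.

| TargetControllerTracker.cs
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Hypothetical sketch: mirrors the VR controller's pose onto the Target that
// Bio IK follows, gated by a custom input action (steps 4-6).
public class TargetControllerTracker : MonoBehaviour, IMixedRealityInputActionHandler
{
    public Transform target;                     // the Target sphere from the Bio IK setup
    public MixedRealityInputAction toggleAction; // custom action assigned in step 5
    private bool tracking = false;

    private void OnEnable()
    {
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityInputActionHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealityInputActionHandler>(this);
    }

    public void OnActionStarted(BaseInputEventData eventData)
    {
        // Start/stop tracking when the assigned button is pressed (step 6)
        if (eventData.MixedRealityInputAction == toggleAction)
            tracking = !tracking;
    }

    public void OnActionEnded(BaseInputEventData eventData) { }

    private void Update()
    {
        if (!tracking)
            return;

        // Poll the detected controllers for a 6-DoF pose interaction (step 4)
        foreach (var controller in CoreServices.InputSystem.DetectedControllers)
        {
            if (controller.ControllerHandedness != Handedness.Right)
                continue;

            foreach (MixedRealityInteractionMapping mapping in controller.Interactions)
            {
                if (mapping.AxisType == AxisType.SixDof)
                {
                    target.position = mapping.PoseData.Position;
                    target.rotation = mapping.PoseData.Rotation;
                    return;
                }
            }
        }
    }
}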

Setup ROS Communication

1. Install rosbridge-suite via

 $ sudo apt-get install ros-melodic-rosbridge-server

2. Place the file_server package in the src folder of the catkin workspace, then build by running

 $ catkin_make

3. Modify the IP address and port number in ros_sharp_communication.launch in file_server/launch; the address should be the IP of the Linux machine running rosbridge, which Unity will connect to

| ros_sharp_communication.launch
<launch>

    <arg name="port" default="9090" />

    <include file="$(find rosbridge_server)/launch/rosbridge_websocket.launch">
        <arg name="port" value="$(arg port)" />
        <arg name="address" value="192.168.201.45" />
    </include>

    <node name="file_server" pkg="file_server" type="file_server" output="screen" />

</launch>

4. After sourcing setup.bash, try running

 $ roslaunch file_server ros_sharp_communication.launch

If you run into an error about No module named builtins, try

 $ pip install future

5. If the rosbridge websocket launched successfully, a confirmation message should appear in the terminal.

6. Test the websocket connection from Unity; a minimal heartbeat publisher for this check is sketched after this list.

7. On a successful connection, the Unity console should log a confirmation message.
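One simple way to exercise the connection is a minimal heartbeat publisher, sketched below following the standard ROS# UnityPublisher pattern; the class name and the topic (set on the component in the Inspector, e.g. /unity_test) are arbitrary choices for this test. Attach it next to the RosConnector, play the scene, and run rostopic echo on the chosen topic on the Linux machine.

| ConnectionTestPublisher.cs
using UnityEngine;

namespace RosSharp.RosBridgeClient
{
    // Hypothetical connectivity check: publishes a std_msgs/String once per
    // second so the websocket link can be verified with `rostopic echo`.
    public class ConnectionTestPublisher : UnityPublisher<MessageTypes.Std.String>
    {
        private float timer;

        protected override void Start()
        {
            base.Start(); // advertises the topic configured on this component
        }

        private void Update()
        {
            timer += Time.deltaTime;
            if (timer < 1.0f)
                return;
            timer = 0.0f;
            Publish(new MessageTypes.Std.String { data = "hello from Unity" });
        }
    }
}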

Publish JointTrajectory Message through ROS#

1. The HEBI ROS example code Arm Node accepts ROS messages of type trajectory_msgs/JointTrajectory published on the topic /joint_waypoints

2. The following is an example of the ROS JointTrajectory message that the example code can accept

rostopic pub /joint_waypoints trajectory_msgs/JointTrajectory "header:
  seq: 0
  stamp:
    secs: 0
    nsecs: 0
  frame_id: ''
joint_names:
- ''
points:
- positions: [0.51, 2.09439, 2.09439, 0.01, 1.5707963, 0.01]
  velocities: [0, 0, 0, 0, 0, 0]
  accelerations: [0, 0, 0, 0, 0, 0]
  effort: []
  time_from_start: {secs: 0, nsecs: 0}

3. Copied from the HEBI C++ example: “Note: for the JointTrajectory messages, you can ignore the header and the ‘names’ fields, as well as the ‘efforts’. You must fill in the ‘positions’, ‘velocities’, and ‘accelerations’ vectors for each waypoint, along with the desired time_from_start for each waypoint (these must be monotonically increasing).”

4. To simplify, each JointTrajectory message in this project is treated as a trajectory with only one point. The VR teleoperation system continuously publishes updated single-point trajectories at a high rate to achieve continuous motion.

5. Create a custom ROS# publisher that publishes a JointTrajectory message, updating the positions field with the joint angles from Bio IK.

| JointTrajectoryPublisher.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
 
namespace RosSharp.RosBridgeClient
{
    public class JointTrajectoryPublisher : UnityPublisher<MessageTypes.Trajectory.JointTrajectory>
    {
        private string FrameId = "Unity";
        private BioIK.BioIK hebi;
 
        private MessageTypes.Trajectory.JointTrajectory message;
 
        protected override void Start()
        {
            base.Start();
            hebi = this.gameObject.GetComponent<BioIK.BioIK>();
            InitializeMessage();
        }
 
        private void InitializeMessage()
        {
            // Single-waypoint trajectory for the 6-DoF arm (see step 4 above)
            message = new MessageTypes.Trajectory.JointTrajectory
            {
                header = new MessageTypes.Std.Header { frame_id = FrameId },
                points = new MessageTypes.Trajectory.JointTrajectoryPoint[1],
            };
            message.points[0] = new MessageTypes.Trajectory.JointTrajectoryPoint();
            message.points[0].positions = new double[] { 0, 0, 0, 0, 0, 0 };
            message.points[0].velocities = new double[] { 0, 0, 0, 0, 0, 0 };
            message.points[0].accelerations = new double[] { 0, 0, 0, 0, 0, 0 };
            message.points[0].effort = new double[] { 0, 0, 0, 0, 0, 0 };
            // 20 ms to reach the waypoint, matching the rate at which
            // FixedUpdate publishes new single-point trajectories
            message.points[0].time_from_start = new MessageTypes.Std.Duration { nsecs = 20000000 };
        }
 
        private void FixedUpdate()
        {
            // Walk the Bio IK segments in order and read the current value of
            // whichever axis is enabled on each joint
            int i = 0;
            foreach (BioIK.BioSegment segment in hebi.Segments)
            {
                if (segment.Joint != null)
                {
                    double angle = 0.0;
                    if (segment.Joint.X.IsEnabled())
                    {
                        angle = segment.Joint.X.GetCurrentValue();
                    }
                    else if (segment.Joint.Y.IsEnabled())
                    {
                        angle = segment.Joint.Y.GetCurrentValue();
                    }
                    else if (segment.Joint.Z.IsEnabled())
                    {
                        angle = segment.Joint.Z.GetCurrentValue();
                    }
                    // Bio IK reports rotational joint values in degrees;
                    // JointTrajectory expects radians
                    message.points[0].positions[i] = angle * Mathf.PI / 180;
                    i++;
                }
            }
            Publish(message);
        }
    }
}
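A note on timing: FixedUpdate runs at Unity's fixed timestep (0.02 s by default), so a fresh single-point trajectory is published roughly every 20 ms. The 20000000 ns time_from_start appears chosen to match this rate, so each waypoint is reached just as the next message arrives.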

6. Add the JointTrajectoryPublisher component to the model of the HEBI arm that contains the Bio IK component.

7. Test the system while only running the file_server on the Ubuntu machine controlling the HEBI arm.

 $ rostopic echo /joint_waypoints

The JointTrajectory message should be updating continuously in the terminal.

RealSense Camera Feed

1. Install the ROS wrapper for the Intel RealSense SDK following the instructions

2. Start camera node in ROS

 $ roslaunch realsense2_camera rs_camera.launch
 

3. Check that the /camera topics are being published successfully (e.g., with rostopic list)

4. Add an MRTK Slate prefab to the project

5. Add an ImageSubscriber component along with a RosConnector component; a sketch of what the ImageSubscriber does is shown after this list

6. The camera feed from the RealSense can now be seen on the Slate prefab in the VR scene
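For reference, the following sketch shows roughly what the ImageSubscriber does, modeled on the ROS# subscriber pattern rather than copied from the ROS# source: it receives sensor_msgs/CompressedImage data on a background thread, then decodes it into a texture on the main thread and applies it to the Slate's renderer. The class and field names here are illustrative assumptions.

| SlateImageSubscriber.cs
using UnityEngine;

namespace RosSharp.RosBridgeClient
{
    // Hypothetical sketch of a compressed-image subscriber: paints incoming
    // camera frames onto a MeshRenderer (e.g., the Slate's backplate quad).
    public class SlateImageSubscriber : UnitySubscriber<MessageTypes.Sensor.CompressedImage>
    {
        public MeshRenderer meshRenderer; // renderer on the Slate to paint

        private Texture2D texture2D;
        private byte[] imageData;
        private bool isMessageReceived;

        protected override void Start()
        {
            base.Start(); // subscribes to the topic configured on this component
            texture2D = new Texture2D(1, 1);
        }

        // Called on a websocket thread; Unity objects cannot be touched here,
        // so just stash the bytes and set a flag for Update to consume
        protected override void ReceiveMessage(MessageTypes.Sensor.CompressedImage image)
        {
            imageData = image.data;
            isMessageReceived = true;
        }

        private void Update()
        {
            if (!isMessageReceived)
                return;
            texture2D.LoadImage(imageData); // decodes the JPEG/PNG payload
            meshRenderer.material.mainTexture = texture2D;
            isMessageReceived = false;
        }
    }
}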

Gripper Control

1. Follow the last section of the HEBI arm tutorial to set up gripper control

2. Test gripper node

 $ rostopic pub /gripper_strength std_msgs/Float64 "data: 0.5"

3. Create a C# script in Unity to publish the proper ROS message

| GripperStrengthPublisher.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit;
 
namespace RosSharp.RosBridgeClient
{
    public class GripperStrengthPublisher : UnityPublisher<MessageTypes.Std.Float64>
    {
        private MessageTypes.Std.Float64 message;
 
        protected override void Start()
        {
            base.Start();
            InitializeMessage();
        }
 
        private void InitializeMessage()
        {
            message = new MessageTypes.Std.Float64
            {
                data = 0.0f
            };
        }
 
        private void FixedUpdate()
        {
            // Listening to all controller inputs detected by MRTK
            foreach (var controller in CoreServices.InputSystem.DetectedControllers)
            {
                // Interactions for a controller is the list of inputs that this controller exposes
                foreach (MixedRealityInteractionMapping inputMapping in controller.Interactions)
                {
                    // Isolate Thumbstick signal
                    if (inputMapping.Description == "Trackpad-Thumbstick Position")
                    {
                        // Small deadzone
                        if (inputMapping.Vector2Data.y > 0.05f)
                        {
                            // Convert input signal to gripper control signal
                            message.data = inputMapping.Vector2Data.y * 0.5;
                            Publish(message);
                            break;
                        }
                    }
 
                }
            }
        }
    }
}
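Design note: the publisher only sends a message while the thumbstick is deflected past the small deadzone, so releasing the stick simply stops publishing; the gripper node is presumably left holding the last commanded strength.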

4. Add GripperStrengthPublisher.cs as a component

5. While the gripper node is running, test the code by pushing the thumbstick up on the Index controller.

Demonstration

In this demonstration, I teleoperated the HEBI arm using HTC Vive to grab a 3D printed component.

Final Words

For questions, clarifications, etc., email yuhangheedu@gmail.com.