====== HEBI Robotic Arm VR Teleoperation ======

**Author:** Yu Hang He

**Email:** 

**Date:** Last modified on <12/21/20>

**Keywords:** HEBI robotic arm, VR, Teleoperation, Unity, Mixed Reality Toolkit 2, C#, ROS, RealSense D435

This tutorial will demonstrate how to create a VR teleoperation system for the HEBI 6 DoF robotic arm. The system consists of a VR application programmed in Unity using Mixed Reality Toolkit and a ROS program that controls the HEBI robotic arm. This tutorial takes approximately 5 hours to complete.

===== Motivation and Audience =====

This tutorial's motivation is to demonstrate how to create a VR teleoperation system for controlling a robotic manipulator. This tutorial assumes the reader has the following background and interests:

  * Familiar with forward and inverse kinematics of robotic manipulators
  * Familiar with the HEBI robotic arm
  * Familiar with C# and ROS
  * Familiar with Unity and Mixed Reality Toolkit 2

The rest of this tutorial is presented as follows:

  * [[hebi_vr_teleo#Prerequisite | Prerequisite]]
  * [[hebi_vr_teleo#Import_and_Apply_Kinematic_to_Robot_Model | Import and Apply Kinematic to Robot Model]]
  * [[hebi_vr_teleo#VR_Teleoperation | VR Teleoperation]]
  * [[hebi_vr_teleo#Setup_ROS_Communication | Setup ROS Communication]]
  * [[hebi_vr_teleo#Publish_JointTrajectory_Message_through_ROS_# | Publish JointTrajectory Message through ROS #]]
  * [[hebi_vr_teleo#Realsense_Camera_Feed | Realsense Camera Feed]]
  * [[hebi_vr_teleo#Gripper Control | Gripper Control]]
  * [[hebi_vr_teleo#Demonstration | Demonstration]]
  * [[hebi_vr_teleo#Final_Words | Final Words]]

===== Prerequisite =====

The author assumes that the reader has already set up the ROS environment for the HEBI arm on a separate Linux machine. If not, the reader should follow the instructions from [[http://wiki.ros.org/hebi_cpp_api_examples| HEBI Robotics]] and try the [[http://wiki.ros.org/hebi_cpp_api_examples/ArmNode|Arm Node]] example. You can also check out the tutorial by Jason Kreitz on the [[https://www.daslhub.org/unlv/wiki/doku.php?id=hebi_arm_tutorial| HEBI arm]].

The author also assumes that the reader is familiar with Unity and Mixed Reality Toolkit before starting this tutorial. Unity provides some very beginner-friendly tutorials to get you started: [[https://learn.unity.com/tutorials|Unity Tutorials]]. You can also learn about Mixed Reality Toolkit by Microsoft from these [[https://docs.microsoft.com/en-us/windows/mixed-reality/develop/unity/unity-development-overview?tabs=mrtk%2Carr%2Chl2 | tutorials]].

===== Import and Apply Kinematic to Robot Model =====

1. Obtain a URDF of the manipulator
  * [[https://github.com/HebiRobotics/hebi_description | HEBI Robotics URDF]]
  * Convert the xacro file to URDF using ''rosrun xacro xacro --xacro-ns -o model.urdf model.xacro''

2. Add the Bio IK resource to the Unity project
  * [[https://assetstore.unity.com/packages/tools/animation/bio-ik-67819 | BIO IK]]

3. Add the ROS # resources to the Unity project
  * [[https://github.com/siemens/ros-sharp | ROS #]]

4. Add the URDF resources to the Unity project

5. Import the robot model into Unity through URDF using ROS #
  * Move the model.urdf file outside of the hebi_description file structure
{{ ::yuhang:hebi_vr:urdf_org.png?800 |}}
  * Select the model.urdf file within Unity's Project explorer
  * Select ''Assets -> Import Robot from URDF''
  * You should see the imported HEBI arm in Unity
{{ :yuhang:hebi_vr:hebi_imported.png?600 |}}
  * In the Urdf Robot properties of the imported model, make sure that the ''Is Kinematic'' option is enabled and the ''Use Gravity'' option is disabled

6. Add a Bio IK component to the imported manipulator GameObject

7. Specify the joints in Bio IK
  * Select the component that corresponds to a joint and click ''Add Joint''
{{ :yuhang:hebi_vr:bioik_add_joint2.png?800 |}}
  * Make sure that the joint orientation and position are appropriate
  * Enable the X, Y, or Z Motion that aligns with the axis of rotation of the joint
{{ :yuhang:hebi_vr:bioik_enable_motion.png?800 |}}
  * Repeat the same steps for all rotational joints

8. Create a GameObject as the child of the last joint and name it End_effector
{{ :yuhang:hebi_vr:end_effector.png?400 |}}

9. Add Position and Orientation objectives to the End_effector in Bio IK
{{ :yuhang:hebi_vr:objective.png?600 |}}

10. Create a Sphere GameObject and name it Target
  * Drag the Target into the ''Target Transform'' field of both the Position and Orientation objectives
{{ :yuhang:hebi_vr:target.png?600 |}}

11. To test whether Bio IK is working properly, play the scene and move the Target around. The arm should follow the position and orientation of the Target. You can also log the solved joint values from a script, as shown in the sketch at the end of this section.

12. Apply the joint limits for each individual joint in Bio IK
  * The limits can be found in the [[hebi_arm_tutorial|HEBI Arm Tutorial]]
{{ :yuhang:hebi_vr:joint_limit.png?500 |}}

13. In the Bio IK options, change ''Motion Type'' to ''Realistic''
{{ :yuhang:hebi_vr:motion_type.png?500 |}}

14. Set the home position of the arm by editing the individual joint positions
  * The home position of the 6 joints in radians: [0, 1.57, 2.14, 0, 1.58, 0]
  * J2 example
{{ :yuhang:hebi_vr:home_position.png?500 |}}
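To sanity-check the Bio IK setup from code, the following is a minimal sketch that periodically logs the current value of every Bio IK joint. It only uses the Bio IK API members (''Segments'', ''Joint.X/Y/Z.IsEnabled()'', ''GetCurrentValue()'') that the publisher script later in this tutorial relies on; the class name ''BioIKJointLogger'' is just illustrative. Attach it to the same GameObject that holds the Bio IK component.

<code csharp>
using UnityEngine;

// Minimal sanity check: logs the current value of every Bio IK joint once per second.
// Attach to the same GameObject that holds the BioIK component.
public class BioIKJointLogger : MonoBehaviour
{
    private BioIK.BioIK bioIK;
    private float timer;

    private void Start()
    {
        bioIK = GetComponent<BioIK.BioIK>();
    }

    private void Update()
    {
        timer += Time.deltaTime;
        if (timer < 1.0f) return;
        timer = 0.0f;

        int i = 0;
        foreach (BioIK.BioSegment segment in bioIK.Segments)
        {
            if (segment.Joint == null) continue;

            // Only one motion axis is enabled per joint in this setup
            double value = 0.0;
            if (segment.Joint.X.IsEnabled()) value = segment.Joint.X.GetCurrentValue();
            else if (segment.Joint.Y.IsEnabled()) value = segment.Joint.Y.GetCurrentValue();
            else if (segment.Joint.Z.IsEnabled()) value = segment.Joint.Z.GetCurrentValue();

            Debug.Log($"Joint {i}: {value} deg");
            i++;
        }
    }
}
</code>

If the logged values track the arm's motion while you drag the Target around, the joint definitions and motion axes are configured correctly.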
===== VR Teleoperation =====

1. Add MRTK to the Unity project
  * Follow the instructions here: https://github.com/microsoft/MixedRealityToolkit-Unity/releases or https://microsoft.github.io/MixedRealityToolkit-Unity/version/releases/2.5.1/Documentation/usingupm.html

2. Follow the instructions to properly configure MRTK for the Unity project
  * https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/Installation.html

3. Enable Virtual Reality Support
  * On the toolbar, click ''Edit -> Project Settings -> Player -> XR Settings -> Virtual Reality Supported''
  * Play the scene and the camera should follow the headset

4. Set the Target to track the controller
  * Add an ''Orbital'' component to the Target
  * In the ''SolverHandler'' component, set ''Tracked Target Type'' to ''Controller Ray'' and ''Tracked Handedness'' to ''Right''
  * In the ''Orbital'' component, set ''Local Offset'' to 0 for all axes
{{ :yuhang:hebi_vr:solver_setting2.png?600 |}}

5. Assign custom input actions to controller buttons
  * Create a new configuration profile by inspecting the components of the ''MixedRealityToolkit'' GameObject in Unity's Hierarchy tree
  * Make a clone of the default configuration profile and name it something else
{{ :yuhang:hebi_vr:clone_configuration.png?600 |}}
  * Similarly, clone the default input system profile
{{ :yuhang:hebi_vr:input_config_clone.png?600 |}}
  * Clone the default input actions profile
  * Under the ''Input Actions'' tab within Input Systems, ''Add a New Action''
  * Scroll down the list of Actions to find the newly added action. Edit the name to ''Manipulation'' and change the ''Axis Constraint'' to ''Digital''
{{ :yuhang:hebi_vr:input_action2.png?600 |}}
  * Under the ''Controllers'' tab in Input System, clone the controller mapping profile
  * Expand ''Controller Definitions''. Go to ''Generic Open VR Right Hand Controller'' and click on ''Edit Input Action Map''
  * Set entry 10's Action to ''Manipulation'' and KeyCode to ''Joystick Button 5'' (this corresponds to button ''A'' on the right Index controller)
{{ :yuhang:hebi_vr:controller_button_mapping2.png?600 |}}

6. Start and stop controller tracking based on the input action
  * Select the Target GameObject, disable the ''SolverHandler'' and ''Orbital'' components, and add an ''InputActionHandler'' component
  * Set the ''Input Action'' to ''Manipulation'' and uncheck the ''Is Focus Required'' option
{{ :yuhang:hebi_vr:input_action_handler2.png?600 |}}
  * Under the ''On Input Action Started'' function in ''InputActionHandler'', add two callback functions by clicking the plus sign on the bottom right corner
{{ :yuhang:hebi_vr:input_action_callback.png?600 |}}
  * Add the ''Orbital'' and the ''SolverHandler'' components
  * In the ''Function'' option, select ''bool enabled'' for the respective components. This will enable the components when the input action is triggered (the input action was assigned to a button press on the controller). Make sure the checkbox is checked, which sets the bool to true
{{ :yuhang:hebi_vr:component_enabled.png?600 |}}
  * Repeat similar steps to disable the components on button release. The same behavior can also be implemented in a single script, as shown in the sketch at the end of this section

7. Move the Target to the initial position of the gripper
  * Apply the following transforms in the ''Transform'' and ''SolverHandler'' components to match the orientation of the gripper at the start
{{ :yuhang:hebi_vr:initial_transform.png?400 |}}

8. Test the VR teleoperation
  * At this point, you should be able to control the HEBI arm model in Unity using the right Index controller while holding the ''A'' button on the controller
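As an alternative to wiring the ''InputActionHandler'' callbacks in the Inspector, the enable/disable behavior from step 6 can be implemented in one script. The following is a minimal sketch assuming MRTK 2's global input registration via ''IMixedRealityInputActionHandler''; the class name ''TargetTrackingToggle'' is illustrative. Attach it to the Target GameObject and assign the ''Manipulation'' action in the Inspector.

<code csharp>
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using UnityEngine;

// Script-based alternative to the Inspector-wired InputActionHandler callbacks.
// Enables Orbital and SolverHandler while the assigned input action is held.
public class TargetTrackingToggle : MonoBehaviour, IMixedRealityInputActionHandler
{
    [SerializeField]
    private MixedRealityInputAction manipulationAction = MixedRealityInputAction.None;

    private SolverHandler solverHandler;
    private Orbital orbital;

    private void Awake()
    {
        solverHandler = GetComponent<SolverHandler>();
        orbital = GetComponent<Orbital>();
        SetTracking(false);   // start with tracking off, as in step 6
    }

    // Register globally so the action is received without gaze focus
    private void OnEnable() =>
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityInputActionHandler>(this);

    private void OnDisable() =>
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealityInputActionHandler>(this);

    public void OnActionStarted(BaseInputEventData eventData)
    {
        if (eventData.MixedRealityInputAction == manipulationAction) SetTracking(true);
    }

    public void OnActionEnded(BaseInputEventData eventData)
    {
        if (eventData.MixedRealityInputAction == manipulationAction) SetTracking(false);
    }

    private void SetTracking(bool enable)
    {
        solverHandler.enabled = enable;
        orbital.enabled = enable;
    }
}
</code>

Either approach works; the Inspector-wired version is easier to tweak in the editor, while the script keeps the whole start/stop logic in one place.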
===== Setup ROS Communication =====

1. Install rosbridge-suite via
<code bash>
$ sudo apt-get install ros-melodic-rosbridge-server
</code>

2. Place the [[https://github.com/siemens/ros-sharp/tree/master/ROS|file_server]] package in the src folder of the catkin workspace, then build by running
<code bash>
$ catkin_make
</code>

3. Modify the IP address and port number in ros_sharp_communication.launch in the file_server/launch folder (see the example launch file at the end of this section)

4. After sourcing setup.bash, try running
<code bash>
$ roslaunch ros_sharp_communication.launch
</code>
If you run into the error ''No module named builtins'', try
<code bash>
$ pip install future
</code>

5. If the rosbridge websocket launched successfully, the following message should appear
{{ :yuhang:hebi_vr:websocket_start.png?600 |}}

6. Test the websocket connection
  * Add a ''RosConnector'' component in Unity
  * Modify the ''ROS Bridge Server Url'' with the correct IP address and port
{{ :yuhang:hebi_vr:ros_connector.png?600 |}}

7. The console in Unity should display the following on a successful connection
{{ :yuhang:hebi_vr:websocket_success.png?400 |}}
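For reference, ros_sharp_communication.launch has roughly the following shape. This is a sketch, not the literal file: the exact contents vary between ros-sharp versions, and the address and port below are placeholders you must replace with your own values.

<code xml>
<launch>
  <!-- Placeholder values: set these to your Linux machine's IP address and desired port -->
  <arg name="port" default="9090" />
  <arg name="address" default="192.168.1.100" />

  <!-- Start the rosbridge websocket that Unity's RosConnector connects to -->
  <include file="$(find rosbridge_server)/launch/rosbridge_websocket.launch">
    <arg name="port" value="$(arg port)" />
    <arg name="address" value="$(arg address)" />
  </include>
</launch>
</code>

The ''ROS Bridge Server Url'' in Unity's ''RosConnector'' must then be set to the matching websocket address, e.g. ''%%ws://192.168.1.100:9090%%''.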
===== Publish JointTrajectory Message through ROS # =====

1. The HEBI ROS example code ''Arm Node'' accepts the ROS message type ''trajectory_msgs/JointTrajectory'' published on the topic ''/joint_waypoints''

2. The following is an example of a JointTrajectory message that the example code can accept
<code bash>
rostopic pub /joint_waypoints trajectory_msgs/JointTrajectory "header:
  seq: 0
  stamp:
    secs: 0
    nsecs: 0
  frame_id: ''
joint_names:
- ''
points:
- positions: [0.51, 2.09439, 2.09439, 0.01, 1.5707963, 0.01]
  velocities: [0, 0, 0, 0, 0, 0]
  accelerations: [0, 0, 0, 0, 0, 0]
  effort: []
  time_from_start: {secs: 0, nsecs: 0}"
</code>

3. Quoted from the HEBI C++ examples: "Note: for the JointTrajectory messages, you can ignore the header and the "names" fields, as well as the "efforts". You must fill in the "positions", "velocities", and "accelerations" vectors for each waypoint, along with the desired time_from_start for each waypoint (these must be monotonically increasing)."

4. To simplify this project, each JointTrajectory message is treated as a trajectory with only one point. The VR teleoperation system continuously publishes updated trajectories at a high rate to achieve continuous motion.

5. Create a custom ROS # publisher that publishes a JointTrajectory message and updates the position parameters with the joint angles from Bio IK
<code csharp>
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

namespace RosSharp.RosBridgeClient
{
    public class JointTrajectoryPublisher : UnityPublisher<MessageTypes.Trajectory.JointTrajectory>
    {
        private string FrameId = "Unity";
        private BioIK.BioIK hebi;
        private MessageTypes.Trajectory.JointTrajectory message;

        protected override void Start()
        {
            base.Start();
            // The Bio IK component on the same GameObject provides the solved joint values
            hebi = this.gameObject.GetComponent<BioIK.BioIK>();
            InitializeMessage();
        }

        private void InitializeMessage()
        {
            // Single-point trajectory; only this point is updated every cycle
            message = new MessageTypes.Trajectory.JointTrajectory
            {
                header = new MessageTypes.Std.Header { frame_id = FrameId },
                points = new MessageTypes.Trajectory.JointTrajectoryPoint[1],
            };
            message.points[0] = new MessageTypes.Trajectory.JointTrajectoryPoint();
            message.points[0].positions = new double[] { 0, 0, 0, 0, 0, 0 };
            message.points[0].velocities = new double[] { 0, 0, 0, 0, 0, 0 };
            message.points[0].accelerations = new double[] { 0, 0, 0, 0, 0, 0 };
            message.points[0].effort = new double[] { 0, 0, 0, 0, 0, 0 };
            // 20 ms reach time for the single waypoint
            message.points[0].time_from_start = new MessageTypes.Std.Duration { nsecs = 20000000 };
        }

        private void FixedUpdate()
        {
            int i = 0;
            foreach (BioIK.BioSegment segment in hebi.Segments)
            {
                if (segment.Joint != null)
                {
                    // Each joint has exactly one motion axis enabled (see the Bio IK setup)
                    double angle = 0.0;
                    if (segment.Joint.X.IsEnabled())
                    {
                        angle = segment.Joint.X.GetCurrentValue();
                    }
                    else if (segment.Joint.Y.IsEnabled())
                    {
                        angle = segment.Joint.Y.GetCurrentValue();
                    }
                    else if (segment.Joint.Z.IsEnabled())
                    {
                        angle = segment.Joint.Z.GetCurrentValue();
                    }
                    // Bio IK reports joint values in degrees; convert to radians for ROS
                    message.points[0].positions[i] = angle * Mathf.PI / 180;
                    i++;
                }
            }
            Publish(message);
        }
    }
}
</code>

6. Add the ''JointTrajectoryPublisher'' component to the model of the HEBI arm that contains the Bio IK component
{{ :yuhang:hebi_vr:joint_waypoints.png?600 |}}

7. Test the system while only running the file_server on the Ubuntu machine controlling the HEBI arm
<code bash>
$ rostopic echo /joint_waypoints
</code>
The following ROS msg should be updating on the terminal
{{ :yuhang:hebi_vr:joint_trajectory_msg.png?600 |}}

===== Realsense Camera Feed =====

1. Install the ROS Wrapper for Intel RealSense SDK following the [[https://github.com/IntelRealSense/realsense-ros|instructions]]

2. Start the camera node in ROS
<code bash>
$ roslaunch realsense2_camera rs_camera.launch
</code>

3. Check that the /camera messages are being published successfully (see the verification commands at the end of this section)
{{ :yuhang:hebi_vr:camera_topic_list.png?600 |}}

4. Add an MRTK Slate Prefab to the project
  * Tutorials on the Slate Prefab can be found [[https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/README_Slate.html|here]]

5. Add an ''ImageSubscriber'' component along with a ''RosConnector''
  * In the ''Topic'' field, enter the ROS topic ''/camera/color/image_raw/compressed''
  * Drag the ''ContentQuad'' sub-object from the Slate Prefab into the ''Mesh Renderer'' field
{{ :yuhang:hebi_vr:image_subscriber.png?600 |}}

6. The camera feed from the RealSense can now be seen on the Slate Prefab in the VR scene
{{ :yuhang:hebi_vr:camera_feed.png?600 |}}
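Before wiring the topic into Unity, it is worth confirming with standard ROS tools that the compressed color stream is actually available and publishing at a reasonable rate:

<code bash>
# List the compressed image topics exposed by the RealSense node
$ rostopic list | grep compressed

# Check the publish rate of the color stream used by the ImageSubscriber
$ rostopic hz /camera/color/image_raw/compressed
</code>

If ''rostopic hz'' reports no messages, the Unity Slate will stay blank regardless of the ''ImageSubscriber'' configuration.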
===== Gripper Control =====

1. Follow the last section of the [[https://www.daslhub.org/unlv/wiki/doku.php?id=hebi_arm_tutorial| HEBI arm]] tutorial to set up gripper control
  * Note that the gripper node must be running in conjunction with the arm node

2. Test the gripper node
<code bash>
rostopic pub /gripper_strength std_msgs/Float64 "data: 0.5"
</code>

3. Create a C# script in Unity to send the proper ROS msg
<code csharp>
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit;

namespace RosSharp.RosBridgeClient
{
    public class GripperStrengthPublisher : UnityPublisher<MessageTypes.Std.Float64>
    {
        private MessageTypes.Std.Float64 message;

        protected override void Start()
        {
            base.Start();
            InitializeMessage();
        }

        private void InitializeMessage()
        {
            message = new MessageTypes.Std.Float64
            {
                data = 0.0
            };
        }

        private void FixedUpdate()
        {
            // Listen to all controller inputs detected by MRTK
            foreach (var controller in CoreServices.InputSystem.DetectedControllers)
            {
                // Interactions for a controller is the list of inputs that this controller exposes
                foreach (MixedRealityInteractionMapping inputMapping in controller.Interactions)
                {
                    // Isolate the thumbstick signal
                    if (inputMapping.Description == "Trackpad-Thumbstick Position")
                    {
                        // Small deadzone
                        if (inputMapping.Vector2Data.y > 0.05f)
                        {
                            // Convert the input signal to a gripper control signal
                            message.data = inputMapping.Vector2Data.y * 0.5;
                            Publish(message);
                            break;
                        }
                    }
                }
            }
        }
    }
}
</code>

4. Add GripperStrengthPublisher.cs as a component
  * Set ''/gripper_strength'' as the ''Topic''
{{ :yuhang:hebi_vr:gripper_strength.png?600 |}}

5. While the gripper node is running, test the code by moving the thumbstick on the Index controller up

===== Demonstration =====

In this demonstration, I teleoperated the HEBI arm using an HTC Vive to grab a 3D printed component.

{{ youtube>WnYZJyY4jqQ?large }}

===== Final Words =====

For questions, clarifications, etc, Email: 