
Using HTC Vive to Control Dynamixel Servo

Author: Keitaro Nishimura Email: nishik1@unlv.nevada.edu
Date: Last modified on 06/04/17
Keywords: OpenVR SDK, HTC Vive, Dynamixel, tutorial


The photo above depicts using the Vive position tracking to move a Dynamixel servo around. This tutorial shows you how to set up a Visual Studio project to use the OpenVR SDK, how to get the position data of the Vive controllers, and how to use that position data to move a Dynamixel servo.

Motivation and Audience

This tutorial's motivation is to teach the reader how to use the position tracking of the HTC Vive controllers to move a Dynamixel servo. It is not meant to be a comprehensive tutorial on the OpenVR and Dynamixel SDKs; rather, it is a starting point from which the reader can go on to learn more about the respective APIs.
Readers of this tutorial are assumed to have the following background and interests:

* Know how to follow directions
* Have experience with, or interest in, C++
* Be interested in some form of HMI
* Have completed the previous 2 tutorials


The rest of this tutorial will follow as presented:

  • Parts Lists and Sources
  • Setting up Visual Studio Environment for OpenVR
  • Getting the position Data of the Vive Controller
  • Integrating the Dynamixel SDK to move the servo
  • Final Words

Please note that this tutorial was meant to be viewed in full-screen mode. If viewed in a window less than ~5/8 of the monitor, the locations of the text, pictures, and code will move around, drastically lowering the readability of the tutorial.

Parts Lists and Sources


To complete this tutorial, you'll need the items listed below.

Part Name/Description | Vendor | Vendor URL | Qty | Notes
HTC Vive Kit | Vive or Fry's | https://www.vive.com/us/product/ or http://www.frys.com/product/8987273 | 1 | The kit comes with all the necessary hardware: 1 Vive HMD (head-mounted display), 2 Lighthouse base stations, 2 controllers, and the cables and power sources needed.
A computer with at least a GTX 1060 or equivalent | Amazon | https://www.amazon.com/ | 1 | Please reference the recommended computer specs for any computer you plan to use. The lab, however, already has a computer specifically for research with the Vive.
Tripod mounts for Lighthouse base stations | Amazon or build in-house | https://www.amazon.com/s/ref=nb_sb_noss_2?url=search-alias%3Daps&field-keywords=tripod+for+camera | 2 | Since the base stations have a standard camera mounting hole on the bottom, any standard camera tripod should work. You can also use scrap pieces of 2×4 wood to create your own mounts.
Dynamixel Servo (RX-28) | Robotis | http://www.robotis.us/dynamixel-rx-28-hn07-n101/ | 1 | You do not need to use this specific servo, but please note that some parts of this tutorial may differ depending on the servo you use.
USB to Dynamixel | Robotis | http://www.robotis.us/usb2dynamixel_int/ | 1 | This is needed to communicate between the Dynamixel and the computer.
SMPS2Dynamixel | Robotis | http://www.robotis.us/smps2dynamixel/ | 1 | This is needed to power the Dynamixel.
12V 5A power adapter | Robotis | http://www.robotis.us/smps-12v-5a-ps-10-us-110v/ | 1 | This should come together with the SMPS2Dynamixel.
Robot Cable-X4P | Robotis | http://www.robotis.us/cables/ | 2 | The actual length does not matter, but the pin type (4-pin) does.
Male-to-female USB extension cable | Amazon | https://www.amazon.com/AmazonBasics-Extension-Cable-Male-Female/dp/B00NH11R3I/ref=sr_1_2?ie=UTF8&qid=1494974321&sr=8-2&keywords=usb+extender+male+to+female | 1 | Not necessary for this tutorial, but it will make your life a lot easier in the last section (we should have a box full of these in the cabinets).


All of the items mentioned above should already be available to you in the lab. Please check with the lab manager or your lab mates for the whereabouts of the various parts.
Note: When getting a Dynamixel, a quick way to test whether it is broken is to try to move the motor while it is unconnected. If you can move it, it is probably working; if it is rock solid, it is probably broken.

Setting up Visual Studio Environment for OpenVR


In this section, we will go over how to set up the Visual Studio environment to develop with the OpenVR SDK. Although it is possible to do this with Qt as well, this tutorial will focus exclusively on Visual Studio. If you need to use Qt, please refer to the other sample code within the OpenVR folder that is written with Qt (e.g., helloworld_overlay).


1. The first step is to open up a new Visual Studio project. The picture below depicts what you will see when opening the IDE. If you have a problem with an expired license, please refer to the previous tutorial on how to fix that (if you are using a lab computer this shouldn't be a problem).

Once open, click on the New Project… link under “Start” on the left-hand side of the window. A new window will pop up prompting you to choose the project type, where to save it, and what to name it. As the picture below shows, click on the “Visual C++” option on the left-hand side and choose “Empty Project”. The name doesn't matter in this case. For this tutorial I will name it “ViveDynamixelPosTest”, but feel free to name it whatever makes sense to you. Don't worry about the save location; the default will work fine. Then click “OK”.

You should be greeted by an empty project, as depicted below.

At the top of the window, you will notice a tab with “x86” in it (if it already says “x64”, don't worry about this). Click on it and choose “x64” instead. If you don't, there will be build errors later on. I had forgotten to do this when making this tutorial, which is why there is already code in my window in the picture below.

You have now created your own Visual Studio project! We will now go through all of the settings and folder changes that you will need to make so that you can use the OpenVR SDK with it.


2. In the new project window, if you look at the right-hand side, you will notice the Solution Explorer window. Within it, you should find your project and all of its contents (right now, empty folders). Right-click on the name of your project (it should be highlighted and in bold text right after the window opens) and click on the Properties option at the bottom of the menu.

Once you have clicked on it the Property Pages window will appear. This is where the majority of changes will happen.

Under “Configuration Properties”, in “VC++ Directories”, click the field to the right of Library Directories and choose <Edit…>. In the window that pops up, enter the locations of the .lib files for OpenVR, SDL, and glew. All of these can be found inside the OpenVR SDK folder (if you don't already have it downloaded, refer to the previous tutorial). For me these paths are:

C:\openvr-master\lib\win64
C:\openvr-master\samples\thirdparty\sdl2-2.0.3\bin\win64
C:\openvr-master\samples\thirdparty\glew\glew-1.11.0\lib\Release\x64

These paths can differ depending on where you saved the OpenVR folder, but everything after openvr-master will be the same. Although you can type the path directly into the window, you can also browse for it by clicking the folder icon in the top right-hand corner next to the red “X” symbol (which deletes the highlighted path). Once entered, click OK.



Then move to the “C/C++” tab and, in “General”, click the field to the right of “Additional Include Directories” and choose <Edit…>. Then add the directories to all of the needed header files for OpenVR, SDL, and glew. For me they were:

C:\openvr-master\headers
C:\openvr-master\samples\thirdparty\sdl2-2.0.3\include
C:\openvr-master\samples\thirdparty\glew\glew-1.11.0\include

They may differ for you depending on where you saved your files, but everything after openvr-master will be identical. Once again, you can either type the paths directly or find them through the folder icon in the top right-hand corner.



Then change the option for SDL checks from “Yes” to “No”.

Then, while staying under “C/C++”, move from “General” to “Optimization” and change “Optimization” to “Disabled (/Od)” and “Enable Intrinsic Functions” to “Yes (/Oi)”.

Now, moving from “C/C++” to “Linker”, under “General” change “Enable Incremental Linking” to “No (/INCREMENTAL:NO)”, then click the field to the right of “Additional Library Directories” and choose the <Edit…> option. In the pop-up window add the same paths that you used for “Library Directories”, then click “OK”. For me the paths were:

C:\openvr-master\lib\win64
C:\openvr-master\samples\thirdparty\sdl2-2.0.3\bin\win64
C:\openvr-master\samples\thirdparty\glew\glew-1.11.0\lib\Release\x64



Now move from “General” to “Input”, click the field to the right of “Additional Dependencies”, and choose the <Edit…> option. Then in the pop-up window enter the following:

openvr_api.lib
glew32.lib
SDL2.lib
SDL2main.lib
glu32.lib
opengl32.lib

Then click “OK”.



Now move from “Input” to “System” and change “SubSystem” to “Console (/SUBSYSTEM:CONSOLE)”.

You have now finished changing the property settings; click “Apply” to confirm all of the changes.


Now, in a File Explorer window, open the “samples” folder within the OpenVR SDK. Then copy and paste the “shared” folder into your project. You can do this by dragging the folder from File Explorer into your Visual Studio project and dropping it on top of your project name (the same place you right-clicked to get the Properties option), and by copying it into the project folder where the default option saves your project. After doing this, you should be able to see the files within your project's Solution Explorer.
NOTE: In the picture with the shared folder in Visual Studio I made a mistake; the “shared” folder should be inside the “ViveDynamixelPosTest” folder shown in the picture.

Then it is time to copy the .dll files over to your project. “dll” stands for Dynamic-Link Library, and one sits at the core of each API used here (OpenVR, SDL, and glew).

The pathway to find each of the .dll files is as follows:

[path to OpenVR SDK folder]\openvr-master\bin\win64  -for Openvr_api.dll
[path to OpenVR SDK folder]\openvr-master\samples\thirdparty\glew\glew-1.11.0\bin\Release\x64  -for glew32.dll

NOTE: The SDL2.dll file isn't in the folder, but you can find it by searching your computer's drive for “sdl”. Steam uses it, so you can copy it from there (make sure to copy it, not move it).

Then copy them into your project's x64\Debug folder (for me, Documents\Visual Studio 2017\Projects\(project name)\(project name)\x64\Debug).

You have now finished prepping the project to use the OpenVR SDK! Now, to test whether everything is correct, we will copy and paste the hello world code into the project to see if it builds.


3. Go back to your open Visual Studio project and right-click the “Source Files” folder inside the Solution Explorer. Choose “Add”, then “New Item…”. In the pop-up window, inside “Visual C++”, select the “C++ File (.cpp)” option and click “Add”. You don't have to change the name, but you can if you want to (I changed it to main.cpp). You now have a new empty file to fill.

Open the hellovr_opengl example code and copy it all into your new empty file. Before testing the code, you need to connect the HTC Vive to the computer. After connecting the Vive, try to build the project by clicking the “Local Windows Debugger” button at the top of the window. If everything goes well, the code should start to run (after some initial setup steps; just click OK through everything).

Now if you encounter some errors here are some troubleshooting steps to take.

  • Make sure that you changed the x86 option to x64 at the top of the Visual Studio window.



  • At the top of the code, try to add includes for the files from the “shared” folder that you copied. If, when it tries to auto-complete for you, you can't find the folder or the corresponding files, try copying them over again or moving where you saved them in your project folder.



  • Go back and compare your properties settings with those of the openvr_opengl example code. Make sure that they are the same.
  • Check the GitHub Issues tab to see if anyone is getting similar errors. If not, create a new issue. (This is more of a last-ditch effort, as errors at this stage may be too broad/generic for the community to answer concisely.)


If you have persistent errors at this stage feel free to email me.
If everything went well, then you have successfully set up your own Visual Studio project to work with the OpenVR SDK! Don't worry if you can't see anything except for the controllers. Since you changed the location of the code, it just can't find the image used in the original code. In the next section, we will add position tracking and change the image shown on the cubes in the example code.

Getting the position Data of the Vive Controller

1. First, we will change the image used on all of the cubes in the example code. You can use any image as long as it is a PNG file; you can't use a JPG. Once you have found an image that you want to use, use Ctrl+F to search for ".png" in your new code. The result will be a path to the original image file. Change that path to point to the new file that you want to use. Build and test the code to see if the new image is being used. In the picture below I used the UNLV logo.

For the position tracking, I used code from the OpenVR-Tracking-Example, but I will include all of the needed code in this tutorial so you can use it as a reference. We will still download it so that we can see what it does (don't worry, you don't need to compile any code for this).

After downloading the code from GitHub and extracting it, open SteamVR, as this code, unlike the hellovr example, won't open it for you. Now go into the Binary\Release\Release folder to find the executable “HTC Lighthouse Tracking Example” (it's the one with the type “Application”). You will need to extract the second Release folder before moving forward. Shift + right-click inside the File Explorer window containing the executable and choose “Open command window here”. This opens a terminal from which you can start the executable by typing its name and pressing Enter. You can use [tab] to auto-complete, and you don't need to put “./” before the name; it will execute without it. You should now see a terminal with position values running down it. Get into your tracking area with the Vive controllers, press any button on them, and move the left controller around. You should see the values for the controller change as you move it (the values are in meters). You can do the same thing with the headset, though you won't be able to see anything through it.

This is what we want your code to be able to do as well, so we will be making the following changes.
First we need to add some new variables and two new functions to your code. Add the code between the “Shutdown()” function and the “HandleInput()” function, as shown in the picture:

vr::HmdVector3_t GetPosition(vr::HmdMatrix34_t matrix);
vr::HmdQuaternion_t GetRotation(vr::HmdMatrix34_t matrix);
 
vr::HmdQuaternion_t GetRotation(vr::HmdMatrix34_t matrix) {
	vr::HmdQuaternion_t q;
 
	q.w = sqrt(fmax(0, 1 + matrix.m[0][0] + matrix.m[1][1] + matrix.m[2][2])) / 2;
	q.x = sqrt(fmax(0, 1 + matrix.m[0][0] - matrix.m[1][1] - matrix.m[2][2])) / 2;
	q.y = sqrt(fmax(0, 1 - matrix.m[0][0] + matrix.m[1][1] - matrix.m[2][2])) / 2;
	q.z = sqrt(fmax(0, 1 - matrix.m[0][0] - matrix.m[1][1] + matrix.m[2][2])) / 2;
	q.x = copysign(q.x, matrix.m[2][1] - matrix.m[1][2]);
	q.y = copysign(q.y, matrix.m[0][2] - matrix.m[2][0]);
	q.z = copysign(q.z, matrix.m[1][0] - matrix.m[0][1]);
	return q;
}
 
vr::HmdVector3_t GetPosition(vr::HmdMatrix34_t matrix) {
	vr::HmdVector3_t vector;
 
	vector.v[0] = matrix.m[0][3];
	vector.v[1] = matrix.m[1][3];
	vector.v[2] = matrix.m[2][3];
 
	return vector;
}



Then add the following case into the “ProcessVREvent” function right after the last “break;” shown in the picture below.

case vr::VREvent_ButtonPress:
	{
		for (vr::TrackedDeviceIndex_t unDevice = 0; unDevice < vr::k_unMaxTrackedDeviceCount; unDevice++)
	{
		// if not connected just skip the rest of the routine
		if (!m_pHMD->IsTrackedDeviceConnected(unDevice))
			continue;
 
		vr::TrackedDevicePose_t trackedDevicePose;
		vr::TrackedDevicePose_t *devicePose = &trackedDevicePose;
 
		vr::VRControllerState_t controllerState;
		vr::VRControllerState_t *controllerState_ptr = &controllerState;
 
		vr::HmdVector3_t position;
		vr::HmdQuaternion_t quaternion;
 
		if (vr::VRSystem()->IsInputFocusCapturedByAnotherProcess()) {
			char buf[1024];
 
			sprintf_s(buf, sizeof(buf), "\nInput Focus by Another Process\n");
			printf_s(buf);
		}
 
		bool bPoseValid = trackedDevicePose.bPoseIsValid;
		vr::HmdVector3_t vVel;
		vr::HmdVector3_t vAngVel;
		vr::ETrackingResult eTrackingResult;
 
		// Get what type of device it is and work with its data
		vr::ETrackedDeviceClass trackedDeviceClass = vr::VRSystem()->GetTrackedDeviceClass(unDevice);
                switch (trackedDeviceClass) {
                case vr::ETrackedDeviceClass::TrackedDeviceClass_Controller:
 
			vr::VRSystem()->GetControllerStateWithPose(vr::TrackingUniverseStanding, unDevice, &controllerState, sizeof(controllerState), &trackedDevicePose);
 
			position = GetPosition(devicePose->mDeviceToAbsoluteTracking);
			quaternion = GetRotation(devicePose->mDeviceToAbsoluteTracking);
 
			vVel = trackedDevicePose.vVelocity;
			vAngVel = trackedDevicePose.vAngularVelocity;
			eTrackingResult = trackedDevicePose.eTrackingResult;
			bPoseValid = trackedDevicePose.bPoseIsValid;
 
			switch (vr::VRSystem()->GetControllerRoleForTrackedDeviceIndex(unDevice)) {
			case vr::TrackedControllerRole_Invalid:
				// invalid hand...
				break;
 
			//
			case vr::TrackedControllerRole_LeftHand:
 
				char buf[1024];
 
				sprintf_s(buf, sizeof(buf), "\nLeft Controller\nx: %.2f y: %.2f z: %.2f\n", position.v[0], position.v[1], position.v[2]);
				printf_s(buf);
 
				sprintf_s(buf, sizeof(buf), "qw: %.2f qx: %.2f qy: %.2f qz: %.2f\n", quaternion.w, quaternion.x, quaternion.y, quaternion.z);
				printf_s(buf);
 
				switch (eTrackingResult) {
				case vr::ETrackingResult::TrackingResult_Uninitialized:
					sprintf_s(buf, sizeof(buf), "Invalid tracking result\n");
					printf_s(buf);
					break;
				case vr::ETrackingResult::TrackingResult_Calibrating_InProgress:
					sprintf_s(buf, sizeof(buf), "Calibrating in progress\n");
					printf_s(buf);
					break;
				case vr::ETrackingResult::TrackingResult_Calibrating_OutOfRange:
					sprintf_s(buf, sizeof(buf), "Calibrating Out of range\n");
					printf_s(buf);
					break;
				case vr::ETrackingResult::TrackingResult_Running_OK:
					sprintf_s(buf, sizeof(buf), "Running OK\n");
					printf_s(buf);
					break;
				case vr::ETrackingResult::TrackingResult_Running_OutOfRange:
					sprintf_s(buf, sizeof(buf), "WARNING: Running Out of Range\n");
					printf_s(buf);
 
					break;
				default:
					sprintf_s(buf, sizeof(buf), "Default\n");
					printf_s(buf);
					break;
				}
 
				if (bPoseValid)
					sprintf_s(buf, sizeof(buf), "Valid pose\n");
				else
					sprintf_s(buf, sizeof(buf), "Invalid pose\n");
				printf_s(buf);
 
				break;
 
			case vr::TrackedControllerRole_RightHand:
				// incomplete code, look at left hand for reference
				break;
 
			case vr::TrackedDeviceClass_TrackingReference:
				// incomplete code, only here for switch reference
				sprintf_s(buf, sizeof(buf), "Camera / Base Station");
				printf_s(buf);
				break;
			}
 
			break;
		}
        }
}


This code will allow you to track the position of the left controller each time you press any button on either controller. Test it out by running the code to check whether it's working. You should not run into any build issues, but if you do, they should be easy fixes at this point. The picture below shows the position of both the HMD and the left controller; for the purpose of this tutorial I left out the HMD code, but it is identical to the controller tracking code and you can find it in the tracking example code. I also changed the image used for the boxes, as I didn't like the horror-film vibe the UNLV logo gave with the black and red.

Congratulations, you can now track the position of the left controller throughout your tracking area. Now it is time to add the Dynamixel into the equation.

Integrating the Dynamixel SDK to move the servo

1. Now it's time to set up the Dynamixel for use with your OpenVR code. If you don't already have the Dynamixel-related items with you, please reference the parts table above. Below is a picture of the items I used and an example setup picture.
Note: Though not in the picture, the power board is connected to a wall socket through the power adapter, and the USB2Dynamixel is connected to the computer via the USB extender shown.




Please make sure that the correct communication protocol is selected with the slider switch on the USB2Dynamixel for the specific Dynamixel that you are using. To find the communication protocol for your Dynamixel, find its online e-manual. For the RX-28 it is RS485. For more information on the USB2Dynamixel, visit its e-manual.
Now it is time to find out more information on the Dynamixel that you are using. It is important to find out the baud rate, ID, and COM port to correctly interface with it through the Dynamixel SDK.
Go to the Robotis downloads site and download the RoboPlus software (not the 2.0 version). This download will take some time, so use the wait to check which COM port the Dynamixel is connected to through the Device Manager. Under “Ports” it will come up as a “USB Serial Port”; for me it was COM4.


After you have finished installing it, open it and, under the “Expert” tab, open the “Dynamixel Wizard”. In the Dynamixel Wizard, select the COM port that your Dynamixel is connected to in the top left-hand corner and click the connect button. Once connected, do a basic search by clicking the “Start searching” button. When it finds your Dynamixel, it will show up on the left-hand side. If the search can't find the Dynamixel, change “Basic search” to “All search”. This search will take some time, but as long as the Dynamixel isn't broken, it will find it.

Once you have found the Dynamixel, click on it to display all of its information. Take note of the ID number and baud rate. If the baud rate is 1, that actually means 1,000,000. If you want, you can change those values here as well. You can also move the Dynamixel around from the wizard by changing the goal position (if it is in joint mode) or the moving speed (if it is in wheel mode). You can change the mode by choosing either joint or wheel under the picture of the Dynamixel.


Now, after making sure that the Dynamixel works and noting its information, we can download the Dynamixel SDK. After extracting it to an easy-to-access location, we need to go back into your project's property settings again to add the Dynamixel SDK to it.

Once you have the properties page for your project open, go to “VC++ Directories” and add the SDK to the Library Directories. The path for me was:

C:\DynamixelSDK\c++\build\win64\output

Everything after DynamixelSDK should be the same, but the part before it can differ depending on where you saved it.

Then add the header files to “C/C++”/“General”/“Additional Include Directories”. The path for me was:

C:\DynamixelSDK\c++\include


Add the same path to the “Additional Library Directories” that you used for “Library Directories”.

Add:

dxl_x64_cpp.lib

To the “Additional Dependencies” inside “Input” and click “Apply” to save the changes.

And finally, copy the Dynamixel SDK's .dll file over into your project along with the other .dll files. It is located together with the .lib file that you pointed to in your properties.
We can now move forward with adding the Dynamixel to the code. First we need to initialize some things at the top of the code. I put them underneath the header includes.

//dynamixel SDK library 
#include "dynamixel_sdk.h"
#include <conio.h>
// Control table address
 
#define ADDR_MX_TORQUE_ENABLE           24                  // Control table addresses differ between Dynamixel models, but these worked with the RX-28 that I tested the code with
#define ADDR_MX_GOAL_POSITION           30
#define ADDR_MX_PRESENT_POSITION        36
 
//Protocol version
 
#define PROTOCOL_VERSION                1.0                  
 
// Default setting
#define DXL_ID                          1                   // Dynamixel ID: 1
#define BAUDRATE                        1000000
#define DEVICENAME                      "COM4"              // Check which port is being used on your controller
// ex) Windows: "COM1"   Linux: "/dev/ttyUSB0"
 
#define TORQUE_ENABLE                   1                   // Value for enabling the torque
#define TORQUE_DISABLE                  0                   // Value for disabling the torque
#define DXL_MINIMUM_POSITION_VALUE      0                   // Dynamixel will rotate between this value
#define DXL_MAXIMUM_POSITION_VALUE      1023                // and this value check manual for range. This is the range of RX-28
#define DXL_MOVING_STATUS_THRESHOLD     10                  // Dynamixel moving status threshold note:if it is set too small the dynamixel will never get to its target point...

Make sure that the ID number, baud rate, and COM port are all correct.

Now we need to go and add/change the code we put in to get the position of the left hand controller. Add the following code into the “ProcessVREvent” function before the switch case.

	// Initialize PortHandler instance
	// Set the port path
	// Get methods and members of PortHandlerLinux or PortHandlerWindows
	dynamixel::PortHandler *portHandler = dynamixel::PortHandler::getPortHandler(DEVICENAME);
 
	// Initialize PacketHandler instance
	// Set the protocol version
	// Get methods and members of Protocol1PacketHandler or Protocol2PacketHandler
	dynamixel::PacketHandler *packetHandler = dynamixel::PacketHandler::getPacketHandler(PROTOCOL_VERSION);
 
	int index = 0;
	int dxl_comm_result = COMM_TX_FAIL;             // Communication result
	int dxl_goal_position[2] = { DXL_MINIMUM_POSITION_VALUE, DXL_MAXIMUM_POSITION_VALUE };         // Goal position
	double var_dxl_goal_pos_double = 0;						//variable goal position
	int dxl_step_size = DXL_MAXIMUM_POSITION_VALUE;                          //step size of each var movement
	int var_dxl_goal_pos_int = 0;  //converting double into int
	double dxl_con_pos_converter = 0;  //converting the position of vive con to dxl goal
 
 
	uint8_t dxl_error = 0;                          // Dynamixel error
	uint16_t dxl_present_position = 0;              // Present position
 
	//creating reference position
	double Ref_Pos_X = 0;
	double Ref_Pos_Y = 0;
	double Ref_Pos_Z = 0;
 
	//creating delta position
	double delta_pos_X = 0;
	double delta_pos_Y = 0;
	double delta_pos_Z = 0;


Then replace the code inside the for loop in the VREvent_ButtonPress case with this code.

 
			if (!m_pHMD->IsTrackedDeviceConnected(unDevice))
				continue;
 
			vr::TrackedDevicePose_t trackedDevicePose;
			vr::TrackedDevicePose_t *devicePose = &trackedDevicePose;
 
			vr::VRControllerState_t controllerState;
			vr::VRControllerState_t *controllerState_ptr = &controllerState;
 
			vr::HmdVector3_t position;
			vr::HmdQuaternion_t quaternion;
 
			if (vr::VRSystem()->IsInputFocusCapturedByAnotherProcess()) {
				char buf[1024];
 
				sprintf_s(buf, sizeof(buf), "\nInput Focus by Another Process\n");
				printf_s(buf);
			}
 
			bool bPoseValid = trackedDevicePose.bPoseIsValid;
			vr::HmdVector3_t vVel;
			vr::HmdVector3_t vAngVel;
			vr::ETrackingResult eTrackingResult;
 
			vr::ETrackedDeviceClass trackedDeviceClass = vr::VRSystem()->GetTrackedDeviceClass(unDevice);
			switch (trackedDeviceClass)
			{
			case vr::ETrackedDeviceClass::TrackedDeviceClass_Controller:
				switch (vr::VRSystem()->GetControllerRoleForTrackedDeviceIndex(unDevice)) {
				case vr::TrackedControllerRole_Invalid:
					/*	 invalid hand...*/
					break;
				case vr::TrackedControllerRole_LeftHand:
 
 
					vr::VRSystem()->GetControllerStateWithPose(vr::TrackingUniverseStanding, unDevice, &controllerState, sizeof(controllerState), &trackedDevicePose);
 
					position = GetPosition(devicePose->mDeviceToAbsoluteTracking);
					quaternion = GetRotation(devicePose->mDeviceToAbsoluteTracking);
 
					vVel = trackedDevicePose.vVelocity;
					vAngVel = trackedDevicePose.vAngularVelocity;
					eTrackingResult = trackedDevicePose.eTrackingResult;
					bPoseValid = trackedDevicePose.bPoseIsValid;
 
					//setting the reference point
					Ref_Pos_X = position.v[0];
					Ref_Pos_Y = position.v[1];
					Ref_Pos_Z = position.v[2];
 
					//buffer to put the message into. Note: this is how the example code did it, so I just followed their example.
					char buf[1024];
 
					sprintf_s(buf, sizeof(buf), "\nL-Con Ref Pos\nx: %.2f y: %.2f z: %.2f\n", Ref_Pos_X, Ref_Pos_Y, Ref_Pos_Z);
					printf_s(buf);
 
					while (1)
					{
 
						switch (vr::VRSystem()->GetControllerRoleForTrackedDeviceIndex(unDevice)) {
						case vr::TrackedControllerRole_Invalid:
							/*	 invalid hand...*/
							break;
						case vr::TrackedControllerRole_LeftHand:
 
							vr::VRSystem()->GetControllerStateWithPose(vr::TrackingUniverseStanding, unDevice, &controllerState, sizeof(controllerState), &trackedDevicePose);
 
							position = GetPosition(devicePose->mDeviceToAbsoluteTracking);
 
							// print the tracking data
							char buf[1024];
							sprintf_s(buf, sizeof(buf), "\nLeft Controller -Current Position-\nx: %.2f y: %.2f z: %.2f\n", position.v[0], position.v[1], position.v[2]);
							printf_s(buf);
							//sprintf_s(buf, sizeof(buf), "qw: %.2f qx: %.2f qy: %.2f qz: %.2f\n", quaternion.w, quaternion.x, quaternion.y, quaternion.z);
							//printf_s(buf);
 
							// calculate the delta in xyz directions from reference point
							delta_pos_X = position.v[0] - Ref_Pos_X;
							delta_pos_Y = position.v[1] - Ref_Pos_Y;
							delta_pos_Z = position.v[2] - Ref_Pos_Z;
 
							sprintf_s(buf, sizeof(buf), "\nDelta object Pos\nx: %.2f y: %.2f z: %.2f\n", delta_pos_X, delta_pos_Y, delta_pos_Z);
							printf_s(buf);
 
 
								dxl_con_pos_converter = -(delta_pos_X / 0.5) * dxl_step_size;
 
								printf("\ndxl con pos conv: %.2f\n", dxl_con_pos_converter);
 
								// Read present position
								dxl_comm_result = packetHandler->read2ByteTxRx(portHandler, DXL_ID, ADDR_MX_PRESENT_POSITION, &dxl_present_position, &dxl_error);
								if (dxl_comm_result != COMM_SUCCESS)
								{
									packetHandler->printTxRxResult(dxl_comm_result);
								}
								else if (dxl_error != 0)
								{
									packetHandler->printRxPacketError(dxl_error);
								}
 
								if (dxl_con_pos_converter > 0)
								{
									var_dxl_goal_pos_double = abs(dxl_present_position - dxl_con_pos_converter);
								}
								else
								{
									var_dxl_goal_pos_double = abs(dxl_present_position + dxl_con_pos_converter);
								}
 
								var_dxl_goal_pos_int = (int)var_dxl_goal_pos_double;
 
								sprintf_s(buf, sizeof(buf), "\ndxl var Pos double\n %.5f\n", var_dxl_goal_pos_double);
								printf_s(buf);
 
								sprintf_s(buf, sizeof(buf), "\ndxl var Pos int\n %i\n", var_dxl_goal_pos_int);
								printf_s(buf);
 
								sprintf_s(buf, sizeof(buf), "\ndxl cur pos\n %u\n", dxl_present_position);
								printf_s(buf);
 
 
								if (var_dxl_goal_pos_int > 0 && var_dxl_goal_pos_int < DXL_MAXIMUM_POSITION_VALUE)
								{
									// Open port
									if (portHandler->openPort())
									{
										printf("Succeeded to open the port!\n");
									}
									else
									{
										printf("Failed to open the port!\n");
										printf("Press any key to terminate...\n");
										getch();
										return;
									}
 
									// Set port baudrate
									if (portHandler->setBaudRate(BAUDRATE))
									{
										printf("Pre-set Baudrate was correct!\n");
									}
									else
									{
										printf("Wrong Baudrate was selected!\n");
										printf("Press any key to terminate and go back to correct code...\n");
										getch();
										return;
									}
 
									// Enable Dynamixel Torque
									dxl_comm_result = packetHandler->write1ByteTxRx(portHandler, DXL_ID, ADDR_MX_TORQUE_ENABLE, TORQUE_ENABLE, &dxl_error);
									if (dxl_comm_result != COMM_SUCCESS)
									{
										packetHandler->printTxRxResult(dxl_comm_result);
									}
									else if (dxl_error != 0)
									{
										packetHandler->printRxPacketError(dxl_error);
									}
									else
									{
										printf("Dynamixel has been successfully connected \n");
									}
 
									printf("Press a button to continue (or release it to exit!)\n");
 
									// Write goal position
									dxl_comm_result = packetHandler->write2ByteTxRx(portHandler, DXL_ID, ADDR_MX_GOAL_POSITION, var_dxl_goal_pos_int, &dxl_error);
									if (dxl_comm_result != COMM_SUCCESS)
									{
										packetHandler->printTxRxResult(dxl_comm_result);
									}
									else if (dxl_error != 0)
									{
										packetHandler->printRxPacketError(dxl_error);
									}
 
									do
									{
										// Read present position
										dxl_comm_result = packetHandler->read2ByteTxRx(portHandler, DXL_ID, ADDR_MX_PRESENT_POSITION, &dxl_present_position, &dxl_error);
										if (dxl_comm_result != COMM_SUCCESS)
										{
											packetHandler->printTxRxResult(dxl_comm_result);
										}
										else if (dxl_error != 0)
										{
											packetHandler->printRxPacketError(dxl_error);
										}
 
 
										sprintf_s(buf, sizeof(buf), "[ID:%03d] GoalPos:%03d  PresPos:%03d\n", DXL_ID, var_dxl_goal_pos_int, dxl_present_position);
										printf_s(buf);
										//printf("[ID:%03d] GoalPos:%03d  PresPos:%03d\n", DXL_ID, var_dxl_goal_pos, dxl_present_position);
 
										//outFile << buf;
 
									} while ((abs(var_dxl_goal_pos_int - dxl_present_position) > DXL_MOVING_STATUS_THRESHOLD));
 
									// Disable Dynamixel Torque
									dxl_comm_result = packetHandler->write1ByteTxRx(portHandler, DXL_ID, ADDR_MX_TORQUE_ENABLE, TORQUE_DISABLE, &dxl_error);
									if (dxl_comm_result != COMM_SUCCESS)
									{
										packetHandler->printTxRxResult(dxl_comm_result);
									}
									else if (dxl_error != 0)
									{
										packetHandler->printRxPacketError(dxl_error);
									}
 
									// Close port
									portHandler->closePort();
								}
								else
								{
									printf("Invalid position data\n");
 
								}
 
 
							// and print some more info to the user about the state of the device/pose
							switch (eTrackingResult) {
							case vr::ETrackingResult::TrackingResult_Uninitialized:
								sprintf_s(buf, sizeof(buf), "Invalid tracking result\n");
								printf_s(buf);
								break;
							case vr::ETrackingResult::TrackingResult_Calibrating_InProgress:
								sprintf_s(buf, sizeof(buf), "Calibrating in progress\n");
								printf_s(buf);
								break;
							case vr::ETrackingResult::TrackingResult_Calibrating_OutOfRange:
								sprintf_s(buf, sizeof(buf), "Calibrating Out of range\n");
								printf_s(buf);
								break;
							case vr::ETrackingResult::TrackingResult_Running_OK:
								sprintf_s(buf, sizeof(buf), "Running OK\n");
								printf_s(buf);
								break;
							case vr::ETrackingResult::TrackingResult_Running_OutOfRange:
								sprintf_s(buf, sizeof(buf), "WARNING: Running Out of Range\n");
								printf_s(buf);
 
								break;
							default:
								sprintf_s(buf, sizeof(buf), "Default\n");
								printf_s(buf);
								break;
							}
 
						}
						UpdateHMDMatrixPose();
					}
				}
				break;
			}


You should now be able to run this code and, after pressing any button, move the Dynamixel by sliding the left controller back and forth along the x-axis.

Final Words


This tutorial's objective was to teach students how to control a Dynamixel servo through position tracking of the Vive controllers. With the concepts conveyed, the reader is encouraged to enhance the code; it was written by a student with little coding experience, so it is nowhere near as clean or accurate as it could be. While experimenting with the final product, you probably noticed the servo jumping around as the controller moved. This is because the code uses position control rather than velocity control; switching to velocity control would make the movement smoother and more accurate.
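
As a starting point for that velocity-control enhancement, the displacement could be mapped to a wheel-mode speed word instead of a goal position. The sketch below assumes a Protocol 1.0 AX/MX-style servo, where speed values 0–1023 spin one direction and setting bit 10 (adding 1024) reverses it; the gain and dead-zone values are illustrative choices, not values from the tutorial.

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical tuning constants (assumptions, not from the tutorial).
static const double kGain = 2000.0;    // speed units per meter of displacement
static const double kDeadZone = 0.02;  // meters of displacement to ignore

// Map the controller's x displacement to a wheel-mode speed word:
// magnitude in 0-1023, with bit 10 selecting the rotation direction.
int SpeedWordFromDelta(double delta_x) {
	if (std::fabs(delta_x) < kDeadZone) return 0;  // hold still near the reference
	int magnitude = std::min(1023, static_cast<int>(std::fabs(delta_x) * kGain));
	return (delta_x < 0) ? (magnitude | 0x400) : magnitude;  // bit 10 = direction
}
```

The resulting word would be written to the servo's Moving Speed address each loop iteration instead of a goal position, so the servo tracks the controller continuously rather than jumping between absolute targets.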

Future work derived from this tutorial could include using this code to communicate with and control a robotic arm, or rendering an environment-awareness scene with video feedback from a camera while the user controls the arm. In the big picture, this tutorial helps solve the problem of interfacing with the HTC Vive for research.

For questions, clarifications, etc, Email:nishik1@unlv.nevada.edu

using_htc_vive_to_control_dynamixel_servo.txt · Last modified: 2017/06/04 12:34 by keitaronishimura