Re: [Video] Alternative Kinect Control (using depth images)
Posted: Sat May 17, 2014 4:58 am
Thanks for the quick reply.
This is the graph I got. It's a little different from Patrik's: roughly, /CrazyflieJoystickDriver is connected to /reconfigure_gui instead.

omwdunkley wrote:
- activating the hover mode. However you could probably remove this requirement, by always having the hover mode activated or adding a button to the GUI.

Is this the same as unchecking the "Disable Hover Mode" checkbox in the Input tab?
So, the "Automatic Mode" button plays the role here. Is it published by /joy_node to the topic /joy in the file joy_driver_pid.py?omwdunkley wrote:I guess the easiest (even if rather silly) solution would be to quickly make a node that blindly sends "fake" joystick messages at 100hz on the right channel. While you do that, you could even set the "automatic mode" button to set (1.0 I think)). Alternatively you could modify the joy_driver_pid.py file to do that for you.
omwdunkley wrote:
"/cf_xyz" is only the flie position (not rotation) as estimated by the tracker. The tracker cannot estimate the rotation. The rotation estimate comes from the flie itself (so you must be connected to it and receiving roll, pitch and yaw). The GUI program sends out a /cf_xyz->/cf0 transform with just the rotation. The static transform publisher in the guide then links /cf0->/cf_gt (ground truth), which is compared against /goal in the PID controller.

For the yaw angle, I connected to the flie and the HUD moved as I rotated my flie. The only frames I can see in rviz are the camera frames, the world frame, and the cf_xyz frame; /cf0 and /cf_gt did not appear.
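In case it helps to check that last leg of the chain, here is a rough rospy equivalent of the static transform publisher mentioned in the quote. The frame names /cf0 and /cf_gt come from the quote; the identity transform is an assumption, since I don't know what offset the guide actually uses.

Code: Select all
#!/usr/bin/env python
# Rough rospy equivalent of the guide's static transform publisher,
# just to make the /cf0 -> /cf_gt link show up in rviz / tf view_frames.
# The identity transform is an assumption; the guide may use another offset.
import rospy
import tf

def main():
    rospy.init_node('cf0_to_cf_gt')
    br = tf.TransformBroadcaster()
    rate = rospy.Rate(50)
    while not rospy.is_shutdown():
        br.sendTransform((0.0, 0.0, 0.0),       # translation (assumed zero)
                         (0.0, 0.0, 0.0, 1.0),  # identity rotation quaternion
                         rospy.Time.now(),
                         '/cf_gt',              # child frame
                         '/cf0')                # parent frame
        rate.sleep()

if __name__ == '__main__':
    main()

With that running (and the GUI publishing /cf_xyz->/cf0), "rosrun tf view_frames" should show the full /cf_xyz -> /cf0 -> /cf_gt chain if everything is connected.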