Setup Guide

From CAN bus bring-up to first teleoperated episode. Covers piper_sdk, ROS2 launch, and Meta Quest 3 VR teleoperation.

1. CAN Bus & Host Setup (~15 min)

The AgileX Piper communicates exclusively over CAN bus at 1 Mbps. You need a USB-to-CAN adapter (e.g., CANable, GS_USB) to expose a SocketCAN interface on your Linux host.

Safety first. Physically secure the arm's base to a stable surface before powering on. Keep people clear of the arm's full reach envelope (~600 mm radius) during motion.

Bring up the CAN interface

Connect the USB-to-CAN adapter, then run:

# Set bitrate and bring up the CAN interface
sudo ip link set can0 type can bitrate 1000000
sudo ip link set can0 up

# Verify the interface is active
ip link show can0
Automatic activation script. The piper_sdk repository includes a can_activate.sh helper. Run it as: bash can_activate.sh can0 1000000. This is the same script used by piper_ros.
Interface name may vary. If you have multiple USB-to-CAN adapters connected, the interface may appear as can1, can2, etc. Use ip link show to list all CAN interfaces and pass the correct name to C_PiperInterface.
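The interface check can also be scripted. A minimal sketch, assuming a Linux host with sysfs, that finds SocketCAN interfaces by their link-layer type (ARPHRD_CAN = 280 in the kernel headers):

```python
import os

ARPHRD_CAN = 280  # Linux link-layer type reported by SocketCAN interfaces


def list_can_interfaces(sys_net="/sys/class/net"):
    """Return the names of all SocketCAN interfaces (can0, can1, ...)."""
    if not os.path.isdir(sys_net):
        return []
    names = []
    for name in sorted(os.listdir(sys_net)):
        type_file = os.path.join(sys_net, name, "type")
        try:
            with open(type_file) as f:
                if int(f.read().strip()) == ARPHRD_CAN:
                    names.append(name)
        except (OSError, ValueError):
            continue  # interface vanished or file unreadable; skip it
    return names


if os.path.isdir("/sys/class/net"):
    print(list_can_interfaces())  # e.g. ['can0']
```

Pass the resulting name straight to C_PiperInterface instead of hard-coding "can0".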

OS support

Ubuntu 18.04, 20.04, and 22.04 are the officially tested platforms. Python 3.6+ is required.

2. Install piper_sdk (~20 min)

The piper_sdk Python library handles CAN framing, joint state feedback, and gripper control. Install from PyPI (recommended) or from source.

# Option A: Install from PyPI (recommended)
pip3 install piper_sdk

# Option B: Install from source
git clone https://github.com/agilexrobotics/piper_sdk.git
cd piper_sdk
pip install -e .

# Verify installation
python3 -c "import piper_sdk; print('piper_sdk OK')"

The SDK automatically installs python-can as a dependency for CAN bus communication.

Connect and enable

The C_PiperInterface class is the main entry point. After connecting, the arm must be enabled before it accepts motion commands. EnableArm(7) enables all six joints plus the gripper.

from piper_sdk import C_PiperInterface

# Initialize with the CAN interface name (default: "can0")
piper = C_PiperInterface("can0")

# Connect to the arm
piper.ConnectPort()

# Enable all joints (required before motion commands)
piper.EnableArm(7)

print("Piper connected and enabled.")
Demo scripts. The SDK ships with ready-to-run demos in piper_sdk/demo/V2/. Start with demo_joint_ctrl.py to verify basic motion before building your own control loop.
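A single EnableArm(7) call usually succeeds, but the SDK demo scripts poll the motor driver status until every joint actually reports enabled. A sketch of that pattern; the nested GetArmLowSpdInfoMsgs() field names follow the V2 demos and may differ in your SDK version:

```python
import time


def wait_until_enabled(piper, timeout=5.0):
    """Call EnableArm(7), then poll until all six joint drivers report enabled.

    `piper` is any object exposing the C_PiperInterface methods used below.
    Returns True on success, False if the timeout expires.
    """
    piper.EnableArm(7)
    deadline = time.time() + timeout
    while time.time() < deadline:
        info = piper.GetArmLowSpdInfoMsgs()
        # Field path per the V2 demo scripts; verify against your SDK version
        if all(
            getattr(info, f"motor_{i}").foc_status.driver_enable_status
            for i in range(1, 7)
        ):
            return True
        time.sleep(0.1)
    return False
```

Calling this right after ConnectPort() guards against sending motion commands to an arm that silently failed to enable.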
3. First Motion (~20 min)

With the arm connected and enabled, read joint state and send your first position command.

Read joint state

import time

# Read joint angles in a polling loop
for _ in range(10):
    joint_state = piper.GetArmJointMsgs()
    print(joint_state)
    time.sleep(0.1)

# Read end-effector pose
end_pose = piper.GetArmEndPoseMsgs()
print(end_pose)
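The raw feedback messages use integer SDK units: positions in 0.001 mm and angles in 0.001 degrees. A small helper to convert an end-effector pose to SI units (metres and radians); extracting the six raw values from the message object is left to you, since the message layout varies by SDK version:

```python
import math


def endpose_to_si(x, y, z, rx, ry, rz):
    """Convert piper_sdk end-pose units (0.001 mm / 0.001 deg) to metres/radians."""
    pos = tuple(v * 1e-6 for v in (x, y, z))                   # 0.001 mm -> m
    rot = tuple(math.radians(v * 1e-3) for v in (rx, ry, rz))  # 0.001 deg -> rad
    return pos + rot
```

For example, a reported X of 100000 corresponds to 0.1 m, and an RX of 90000 to pi/2 rad.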

Send a joint position command

# Move to a joint configuration (angles in degrees)
# Arguments: joint1, joint2, joint3, joint4, joint5, joint6
piper.MotionCtrl_2(
    0,      # joint 1
    0,      # joint 2
    90,     # joint 3
    0,      # joint 4
    0,      # joint 5
    0       # joint 6
)
time.sleep(2)  # wait for motion to complete
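Because piper_sdk's JointCtrl expects targets in 0.001-degree units, it is worth converting from degrees or radians explicitly rather than scaling inline. A minimal helper:

```python
import math


def deg_to_sdk(deg):
    """Degrees -> piper_sdk joint units (0.001 degree)."""
    return round(deg * 1000)


def rad_to_sdk(rad):
    """Radians -> piper_sdk joint units (0.001 degree)."""
    return round(math.degrees(rad) * 1000)
```

With this, a 90-degree target on joint 3 is simply deg_to_sdk(90), which keeps control code readable and avoids off-by-1000 bugs.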

Gripper control

# Open gripper: angle in 0.001 mm (70 mm stroke -> 70000),
# effort in 0.001 N·m, 0x01 = enable
piper.GripperCtrl(70 * 1000, 1000, 0x01, 0)

# Close gripper (0 = fully closed; check your gripper's range)
piper.GripperCtrl(0, 1000, 0x01, 0)

# Read gripper state
gripper_state = piper.GetArmGripperMsgs()
print(gripper_state)
Disable when done. Always disable the arm with piper.DisableArm(7) when finished. An enabled arm responds immediately to any command, including erroneous ones from bugs or dropped packets.
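The disable step can be wrapped in a small best-effort shutdown routine so the arm is disabled even if an earlier call fails. A sketch assuming the four-argument GripperCtrl signature (angle, effort, code, set_zero); adjust for your SDK version:

```python
def safe_shutdown(piper):
    """Best-effort shutdown: relax the gripper, then disable all joints.

    DisableArm(7) runs even if the gripper command raises, so the arm
    never stays enabled after this function returns or raises.
    """
    try:
        piper.GripperCtrl(0, 1000, 0x01, 0)  # close/relax the gripper
    finally:
        piper.DisableArm(7)  # always disable, even on gripper failure
```

Registering this with atexit, or calling it in a finally block around your control loop, covers crashes as well as clean exits.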

Dual-arm (master-slave) setup

For bimanual configurations, connect two Pipers on separate CAN interfaces:

piper_left  = C_PiperInterface("can0")
piper_right = C_PiperInterface("can1")

piper_left.ConnectPort()
piper_right.ConnectPort()

piper_left.EnableArm(7)
piper_right.EnableArm(7)

print("Both arms connected.")
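The per-arm connect/enable boilerplate above can be factored into one helper. A sketch with the arm constructor injected, so the same code works with C_PiperInterface on hardware or with a stub in tests:

```python
def connect_arms(interface_names, factory):
    """Create, connect, and enable one arm per CAN interface name.

    `factory` maps an interface name to an arm object (e.g. C_PiperInterface).
    Returns a dict of {interface_name: connected-and-enabled arm}.
    """
    arms = {}
    for name in interface_names:
        arm = factory(name)
        arm.ConnectPort()
        arm.EnableArm(7)
        arms[name] = arm
    return arms


# Usage with the real SDK:
# arms = connect_arms(["can0", "can1"], C_PiperInterface)
```

This scales cleanly if you later add a third interface for a mobile base or sensor node.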
4. ROS / MoveIt Integration (~60 min)

The piper_ros package provides a full ROS driver with MoveIt motion planning and Gazebo simulation; the commands below target the ROS 1 Noetic branch. It wraps piper_sdk internally and exposes standard ROS interfaces.

Install dependencies

# Install required ROS packages
sudo apt-get install -y \
  ros-noetic-moveit \
  ros-noetic-ruckig \
  ros-noetic-ompl

# Install Python CAN dependency
pip3 install python-can piper_sdk

Launch

# Step 1: Activate CAN interface
bash can_activate.sh can0 1000000

# Step 2: Launch the Piper control node
roslaunch piper start_single_piper.launch

# For dual-arm:
roslaunch piper start_double_piper.launch

MoveIt planning

# Launch MoveIt with RViz for interactive planning
roslaunch piper_moveit_config demo.launch

# Gazebo simulation (no physical arm required)
roslaunch piper piper_gazebo.launch
Firmware note. Firmware versions prior to S-V1.6-3 require the legacy piper_description_old.urdf file. Newer firmware uses the standard piper_description.urdf. Check the firmware version label on the base of the arm before loading ROS models.

See the Specs page for the full ROS topics and services table.

5. Meta Quest 3 VR Teleoperation (~90 min)

The Piper can be controlled in real time using a Meta Quest 3 headset. The architecture uses UDP over your local network: the Quest runs a Unity app that streams hand pose data, and a Python server on the robot PC translates that into Piper SDK commands.

Architecture

Meta Quest 3 (Unity app)
        |  UDP (ports 8888 / 8889)
        v
Python UDP server (host PC)
        |  piper_sdk (C_PiperInterface)
        v
AgileX Piper (CAN bus)

The Unity side (VRHandPoseSender.cs, VRGripperController.cs, VRTeleoperationManager.cs) and the UDP layer are fully reusable from xArm setups — only the robot controller module needs to be swapped out.

Setup steps

  1. Bring up the CAN interface on the host PC:
    sudo ip link set can0 type can bitrate 1000000
    sudo ip link set can0 up
  2. Create a PiperController wrapping C_PiperInterface. Replace the XArmController class in your existing teleoperation stack with a new piper_controller.py. Implement connect(), set_pose(x, y, z, roll, pitch, yaw), set_gripper(value), and emergency_stop() using piper_sdk calls.
  3. Launch the Python UDP server on the robot PC.
    python3 teleoperation_main.py --robot-type piper
    The server listens on UDP ports 8888/8889 and forwards received hand pose packets to the Piper.
  4. Launch the Unity app on the Quest 3 and connect to the PC's IP address. Adjust positionOffset, rotationOffset, and scaleFactor in Unity to match the Piper's workspace. These parameters differ from xArm due to Piper's smaller reach envelope.
Coordinate system differences. The Piper workspace is smaller than xArm's. Reduce scaleFactor in Unity to prevent the arm from hitting joint limits during teleoperation. Start with a conservative scale and increase gradually while monitoring joint angles.
Full Quest 3 guide. For complete setup instructions — installing the Unity app, pairing with the PC, and calibrating hand tracking — see the Quest 3 VR Teleoperation guide.
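The receive side of the steps above can be sketched in a few lines. The wire format here is hypothetical (one UTF-8 datagram per pose, "x,y,z,roll,pitch,yaw" in metres and degrees); your Unity sender (VRHandPoseSender.cs) defines the real format, and the workspace bounds below are placeholders to be set from your arm's actual reach and mounting:

```python
def parse_pose_packet(data: bytes):
    """Parse a pose datagram into a 6-tuple (x, y, z, roll, pitch, yaw)."""
    vals = [float(v) for v in data.decode("utf-8").split(",")]
    if len(vals) != 6:
        raise ValueError(f"expected 6 fields, got {len(vals)}")
    return tuple(vals)


def scale_and_clamp(pos, scale=0.5, lo=(-0.3, -0.3, 0.0), hi=(0.3, 0.3, 0.5)):
    """Scale a Quest-space position into the Piper workspace and clamp it.

    `scale` plays the role of Unity's scaleFactor; `lo`/`hi` are
    placeholder per-axis workspace bounds in metres.
    """
    return tuple(min(max(p * scale, l), h) for p, l, h in zip(pos, lo, hi))
```

In the real server, the parsed and clamped pose would be handed to your PiperController's set_pose(); clamping before commanding is what keeps an over-eager hand motion from driving the arm into its joint limits.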
6. Data Collection (ongoing)

Once teleoperation is working, use the SVRC platform to record, label, and export manipulation demonstrations.

  • Record teleoperated episodes via the Python UDP server or directly through piper_ros bag recording
  • Export in RLDS or LeRobot format for downstream policy training
  • Use the SVRC Platform to manage datasets, run quality checks, and train ACT or Diffusion Policy models
Tip. Use piper.GetArmJointMsgs() and piper.GetArmEndPoseMsgs() at ~50 Hz in a background thread to capture synchronized joint and end-effector state during teleoperation.
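The ~50 Hz background-thread tip can be sketched as a small logger class; `piper` stands in for a connected C_PiperInterface:

```python
import threading
import time


class StateLogger:
    """Poll joint and end-effector state at a fixed rate on a daemon thread."""

    def __init__(self, piper, rate_hz=50.0):
        self.piper = piper
        self.period = 1.0 / rate_hz
        self.samples = []  # (timestamp, joint_msg, end_pose_msg) tuples
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._loop, daemon=True)

    def _loop(self):
        while not self._stop.is_set():
            t0 = time.monotonic()
            self.samples.append(
                (t0, self.piper.GetArmJointMsgs(), self.piper.GetArmEndPoseMsgs())
            )
            # sleep off the remainder of the period to hold ~rate_hz
            time.sleep(max(0.0, self.period - (time.monotonic() - t0)))

    def start(self):
        self._thread.start()
        return self

    def stop(self):
        self._stop.set()
        self._thread.join()
        return self.samples
```

Timestamping each sample with time.monotonic() makes it straightforward to align the robot stream with camera frames when exporting to RLDS or LeRobot format.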

Need Help with Your Setup?

Our team is available for hands-on sessions at the Mountain View, CA facility.