Video: Walking with 3D Perception
Open-loop walking, concurrently running 3D perception (actual speed). See below for the integration of perception with walking control.
We designed and built the Rapid-Prototyped BiPed (RPBP) to support our research in 3D perception for bipedal locomotion on uneven terrain. The term “Rapid Prototyping” refers to the 3D-printing process we used to fabricate most of the robot’s custom ABS plastic parts.
RPBP stands about 47 cm tall and weighs about 1.7 kg. Each of its legs has six degrees of freedom. It has approximately the same kinematic link lengths and foot geometry as DARwIn-OP from the Open Platform Humanoid Project, and it uses a variant of the same Robotis Dynamixel MX-28 servos for all joints. Unlike DARwIn-OP, RPBP is not a full humanoid, but it does include a sensor head with a PrimeSense Carmine 1.09 short-range depth camera and a CH Robotics UM6 Inertial Measurement Unit (IMU). RPBP is also a remote-brain robot: it operates with a lightweight tether that supplies motor power and carries sensor data to, and control commands from, an off-board control computer.
Our Ph.D. student Dimitrios Kanoulas used RPBP in his dissertation, Curved Surface Patches for Rough Terrain Perception, which he defended in July 2014. In that work the depth images and IMU readings from the sensor head are fed to our rxkinfu system, which integrates the data into a point cloud representing the surfaces near the robot. Our imucam software acquires the sensor streams and spatially calibrates the IMU to the depth camera. Algorithms from our Surface Patch Library (SPL), which Dimitrios adapted into a soft real-time C++ implementation, are then applied to fit foot-size patches to the point cloud.
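To make the last step concrete, here is a minimal C++ sketch of fitting a foot-size patch to a point-cloud neighborhood. It is not the SPL implementation (SPL fits curved, e.g. paraboloid, patches with uncertainty); this is a simplified PCA plane fit, and all names, parameters, and the gravity-aligned-frame assumption are illustrative.

```cpp
// Hedged sketch: planar approximation of a foot-size patch around a candidate
// footfall point. Assumes the cloud is already expressed in a gravity-aligned
// frame (e.g. using the IMU-to-camera calibration from imucam).
#include <Eigen/Dense>
#include <vector>

struct PlanePatch {
  Eigen::Vector3f center;  // centroid of the neighborhood
  Eigen::Vector3f normal;  // unit surface normal
  float flatness;          // smallest-eigenvalue ratio, rough flatness measure
};

PlanePatch fitFootPatch(const std::vector<Eigen::Vector3f>& cloud,
                        const Eigen::Vector3f& seed,
                        float foot_radius /* ~half the foot length, meters */) {
  // Collect points within a foot-size radius of the seed point.
  std::vector<Eigen::Vector3f> nbhd;
  for (const auto& p : cloud)
    if ((p - seed).norm() <= foot_radius) nbhd.push_back(p);
  if (nbhd.size() < 3) return {seed, Eigen::Vector3f::UnitZ(), 0.0f};

  // Centroid and covariance of the neighborhood.
  Eigen::Vector3f c = Eigen::Vector3f::Zero();
  for (const auto& p : nbhd) c += p;
  c /= static_cast<float>(nbhd.size());

  Eigen::Matrix3f cov = Eigen::Matrix3f::Zero();
  for (const auto& p : nbhd) {
    Eigen::Vector3f d = p - c;
    cov += d * d.transpose();
  }

  // Eigenvector of the smallest eigenvalue is the least-squares plane normal.
  Eigen::SelfAdjointEigenSolver<Eigen::Matrix3f> es(cov);
  Eigen::Vector3f n = es.eigenvectors().col(0);
  if (n.z() < 0) n = -n;  // orient the normal "up" in the gravity-aligned frame

  float flatness = es.eigenvalues()(0) / (es.eigenvalues().sum() + 1e-9f);
  return {c, n, flatness};
}
```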
Video: Mapping Curved Patches
Detecting and mapping curved surface patches using real-time 3D perception (actual speed).
We have implemented two basic forms of motion control. First, we implemented an open-loop, manually tuned gait for flat terrain. Then, as an initial demonstration of integrating 3D perception with bipedal locomotion, we implemented a patch-based control strategy that compares the currently visible patches against a prepared database of known patches, each paired with a pre-defined step motion. The video below shows the robot using this patch-based 3D perception to identify rocks and step onto them. We have not yet implemented closed-loop balancing; the robot stops in a statically stable pose once the stepping foot has settled on the rock.
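The following C++ sketch illustrates the patch-matching idea only; it is not the RPBP controller. The descriptor fields, distance weights, and acceptance threshold are assumptions of ours, chosen to show how a live patch could be matched against database entries that each carry a pre-scripted step motion.

```cpp
// Hedged sketch: nearest-neighbor lookup of a fitted patch in a small database
// of known patches, each paired with a pre-defined step motion.
#include <Eigen/Dense>
#include <algorithm>
#include <cmath>
#include <limits>
#include <string>
#include <vector>

struct PatchDescriptor {
  Eigen::Vector3f center;  // patch centroid in the robot's stance-foot frame
  Eigen::Vector3f normal;  // unit surface normal (gravity-aligned frame)
  float flatness;          // flatness/curvature measure from the fit
};

struct KnownPatch {
  PatchDescriptor ref;      // descriptor recorded when the motion was scripted
  std::string step_motion;  // identifier of the pre-defined joint trajectory
};

// Weighted distance between a live patch and a database entry (weights are
// illustrative, not tuned values from the real system).
static float patchDistance(const PatchDescriptor& a, const PatchDescriptor& b) {
  float d_pos  = (a.center - b.center).norm();  // meters
  float d_ang  = std::acos(std::clamp(a.normal.dot(b.normal), -1.0f, 1.0f));
  float d_flat = std::abs(a.flatness - b.flatness);
  return d_pos + 0.1f * d_ang + 1.0f * d_flat;
}

// Returns the best-matching pre-defined step motion, or an empty string if
// nothing in the database is close enough to trust.
std::string selectStepMotion(const PatchDescriptor& live,
                             const std::vector<KnownPatch>& database,
                             float max_distance = 0.05f) {
  float best = std::numeric_limits<float>::max();
  std::string choice;
  for (const auto& k : database) {
    float d = patchDistance(live, k.ref);
    if (d < best) { best = d; choice = k.step_motion; }
  }
  return (best <= max_distance) ? choice : std::string();
}
```

Returning nothing when no entry is within the threshold reflects the open-loop nature of the demonstration: the robot only attempts a step when the observed patch closely resembles one it has a scripted motion for.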
Video: Stepping on Rocks
Stepping on rocks using 3D perception to select pre-defined motion plans (actual speed).