Motion Tracking for Soft Robotic Arms

Project Description

The Soft Robotics Control Lab at Boston University is developing large pneumatic soft robot arms for interaction with humans. To test our autonomous control systems for these robots, we need to sense the position of the robot's parts in 3D space. This REU project seeks to integrate our control system software, written in Python, with the 3D motion tracking system in the Boston University Robotics Lab. Tasks will involve electronics and software, particularly using an Arduino and Python to interface with the motion capture software and process the incoming data from reflective markers attached to the robot. The project also requires minor mechanical design work to attach the reflective markers and to build a test stand for the robot arm. Finally, if time permits, we will develop code that uses these marker positions to estimate the curvature of the robot arm.
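The Arduino-to-Python interface described above might look like the sketch below, assuming (hypothetically) that the Arduino streams one comma-separated marker reading per line over USB serial; the port name, baud rate, and line format are illustrative assumptions, and the serial read relies on the pyserial package.

```python
def parse_marker_line(line: str) -> tuple[float, float, float]:
    """Parse one "x,y,z" line (e.g. "12.5,3.0,-7.25") into floats."""
    x, y, z = (float(field) for field in line.strip().split(","))
    return (x, y, z)


def stream_markers(port_name: str = "/dev/ttyACM0", baud: int = 115200) -> None:
    """Continuously print parsed marker readings arriving over USB serial.

    Requires pyserial (pip install pyserial); the port name and baud rate
    here are assumptions and would match the Arduino sketch in practice.
    """
    import serial  # provided by the pyserial package
    with serial.Serial(port_name, baudrate=baud, timeout=1.0) as port:
        while True:
            raw = port.readline().decode("ascii", errors="ignore")
            if raw.strip():
                print(parse_marker_line(raw))
```

In a real setup, `stream_markers()` would run in its own process so that slow serial reads do not block the rest of the control software.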

Mentors

Andrew Sabelhaus (PI), Juan Pacheco Garcia

Goals for this project include the development of a framework for motion capture of individual points on a soft robot arm, as well as the position and orientation of attachment points between the inflatable bellows that make the robot move. We will calculate the position and orientation of those connection points, and use them to estimate the robot's states (the curvatures of its flexible segments) during operation. Future work will incorporate feedback controllers that use this data to position the arm in a grasping and manipulation task.
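As one concrete possibility for the state-estimation step, a constant-curvature model recovers a segment's curvature from the positions and tangent directions of its two attachment points: if the tangents differ by a bending angle theta and the points are a chord length c apart, the circular arc joining them has curvature kappa = 2*sin(theta/2)/c. The planar simplification and all function names below are assumptions for illustration, not the lab's actual code.

```python
import math


def segment_curvature(p0, p1, t0, t1):
    """Estimate the constant curvature kappa of an arc between two frames.

    p0, p1: (x, y) positions of the two attachment points.
    t0, t1: unit tangent (orientation) vectors at those points.
    Returns kappa = 2*sin(theta/2)/chord, where theta is the bending
    angle between the tangents and chord = |p1 - p0|.
    """
    chord = math.hypot(p1[0] - p0[0], p1[1] - p0[1])
    # Angle between the two unit tangents (clamped for numerical safety).
    dot = max(-1.0, min(1.0, t0[0] * t1[0] + t0[1] * t1[1]))
    theta = math.acos(dot)
    return 2.0 * math.sin(theta / 2.0) / chord


# Sanity check: a quarter-circle arc of radius 1 from (0, 0) to (1, 1),
# with tangents (1, 0) and (0, 1), should have curvature 1.
kappa = segment_curvature((0.0, 0.0), (1.0, 1.0), (1.0, 0.0), (0.0, 1.0))
```

The same chord-and-angle idea extends to 3D once the marker data provides full attachment-frame orientations.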
Students will develop skills in real-time Python programming using the Robot Operating System (ROS), as well as integration with serial communication over USB. Students will gain the ability to design and manufacture components of a soft robot that are attached after the main body is built, do not interfere with its motion, and are visible to cameras while it moves. Finally, students will develop their electronics skills, with soldering and prototyping, to make their test setup robust and reusable by others.

Timeline

Week 1: Training, construction of an example soft robot arm, use of motion capture system for existing robots.
Week 2: Set up the ROS software on the student's laptop. Write example code that outputs motion capture marker data to a text file.
Weeks 3-4: Write Python code that runs a ROS node in multiprocessing mode, added to the SRC Lab's existing software.
Week 5: Logging: use the combined software to write motion-tracker data to a file.
Week 6: Design and prototype an attachment bracket to place reflective markers on robot.
Weeks 7-8: Test robot arm motion and identify potential issues, including the presence of measurement noise.
Week 9: Test combined system with pneumatics controller, and measure the robot’s motion autonomously.
Week 10: Document results, create poster presentation.
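The logging milestones (Weeks 2 and 5) could follow a pattern like the sketch below, which appends timestamped marker positions to a plain text file. The column layout and function name are illustrative assumptions; in the real system, each row would come from a ROS subscriber callback rather than the hard-coded example readings shown here.

```python
import time
from pathlib import Path


def log_marker_row(logfile: Path, marker_id: int,
                   x: float, y: float, z: float) -> None:
    """Append one timestamped marker reading as a tab-separated row."""
    stamp = time.time()
    with logfile.open("a") as f:
        f.write(f"{stamp:.6f}\t{marker_id}\t{x:.3f}\t{y:.3f}\t{z:.3f}\n")


# Example: log two hypothetical marker readings, then read them back.
log_path = Path("marker_log.txt")
log_path.write_text("")  # start with an empty file
log_marker_row(log_path, 0, 10.0, 20.0, 30.0)
log_marker_row(log_path, 1, 11.5, 19.2, 28.7)
rows = log_path.read_text().splitlines()
```

Appending one row per reading keeps the file valid even if the logger is interrupted mid-run, which matters for long motion-capture sessions.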