In this project, I built a vision-guided grasping system that enables an Interbotix robotic arm to detect and retrieve a pen from a human hand. The arm is controlled through a ROS 2 Python interface, integrating computer vision, kinematics, and motion planning to achieve a precise grasp.
School: Northwestern University
Location: Evanston, IL
Duration: September 2025
Project Gallery
Github Link: https://github.com/ncknight-un/Interbotix_Pen_Grasping
Enclosure & CAD Designs:
Grasp Success Ex#1
Grasp Success Ex#2
Repeatability
Camera-to-Robot Calibration
Core Challenge
The primary challenge was coordinating perception with manipulation:
- Detect the pen’s position and orientation in real time
- Plan a feasible trajectory given the arm’s 5-DOF constraints
- Execute a smooth and accurate grasp
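The first step above, recovering the pen's position, can be sketched with the standard pinhole camera model: a detected pixel plus a depth reading maps to a 3D point in the camera frame. This is a minimal illustration, not the project's actual detection code; the intrinsics (fx, fy, cx, cy) and the example pixel are placeholder values, not the real RealSense calibration.

```python
# Sketch: map a pixel detection (u, v) plus a depth reading (meters) to a
# 3D point in the camera frame using the pinhole model.
# All numeric values below are placeholders, not the project's calibration.

def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) at depth depth_m to camera-frame (x, y, z)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

if __name__ == "__main__":
    # Hypothetical detection: pen centroid at pixel (400, 260), 0.5 m away.
    point = deproject(400, 260, 0.5, fx=615.0, fy=615.0, cx=320.0, cy=240.0)
    print(point)
```

In the real system the depth value would come from the RealSense depth stream at the detected pixel, and the intrinsics from the camera's factory calibration.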
System Implementation
- Python scripts interface with ROS 2 nodes
- Camera input is processed to extract target coordinates
- Motion commands are generated and sent to the robotic arm
- Kinematics and planning ensure feasible, precise motion to grasp the pen
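The pipeline above can be sketched schematically, with the ROS 2 and Interbotix calls stubbed out. `plan_approach` is a hypothetical helper, not the project's planner: it interpolates straight-line waypoints from the gripper's current position toward the detected pen.

```python
# Schematic perception-to-motion pipeline with robot I/O stubbed out.
# plan_approach() is a hypothetical helper for illustration only.

def plan_approach(start, goal, steps=5):
    """Linearly interpolate (x, y, z) waypoints from start to goal."""
    waypoints = []
    for i in range(1, steps + 1):
        t = i / steps
        waypoints.append(tuple(s + t * (g - s) for s, g in zip(start, goal)))
    return waypoints

def execute(waypoints):
    # In the real system each waypoint would be sent to the arm via the
    # Interbotix ROS 2 interface; here we just report it.
    for wp in waypoints:
        print("move to", wp)

if __name__ == "__main__":
    gripper = (0.0, 0.0, 0.2)   # current end-effector position (m), made up
    pen = (0.25, 0.05, 0.15)    # pen position from the vision pipeline, made up
    execute(plan_approach(gripper, pen))
```

A real planner would also respect the arm's 5-DOF limits and joint constraints rather than assuming every Cartesian waypoint is reachable.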
Skills Improved:
- Perception: Vision-based object detection and pose estimation using an Intel RealSense camera
- Python: Data processing, trajectory planning, and robotic control
- System Integration: Combined perception and control into a unified pipeline
- Calibration: Sensor and system calibration for accurate perception and motion
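The calibration step boils down to a rigid transform: a 4x4 homogeneous matrix maps points from the camera frame into the robot base frame so the planner can act on them. The sketch below is illustrative only; the rotation and translation values are made-up placeholders, not the calibration actually recovered in the project.

```python
# Sketch: apply a camera-to-robot-base calibration. T_base_cam is a 4x4
# homogeneous transform (row-major nested lists) from camera frame to base
# frame. All numeric values are placeholders, not the real calibration.

def transform_point(T, p):
    """Apply a 4x4 homogeneous transform to a 3D point (x, y, z)."""
    v = (p[0], p[1], p[2], 1.0)
    return tuple(sum(T[r][c] * v[c] for c in range(4)) for r in range(3))

# Example: camera rotated 180 degrees about z, offset 0.3 m along base x.
T_base_cam = [
    [-1.0,  0.0, 0.0, 0.3],
    [ 0.0, -1.0, 0.0, 0.0],
    [ 0.0,  0.0, 1.0, 0.0],
    [ 0.0,  0.0, 0.0, 1.0],
]

if __name__ == "__main__":
    # A pen point seen in the camera frame, expressed in the base frame:
    print(transform_point(T_base_cam, (0.1, 0.05, 0.4)))
```

In practice this transform would be estimated once (e.g. by a hand-eye or fiducial-based calibration routine) and then applied to every detected pen position before planning.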
Key Takeaway:
This was my first hands-on experience with ROS 2 and robotic manipulation. It provided practical insight into how robots perceive their environment and execute controlled movements.
Working on this system strengthened my understanding of vision-based perception, motion planning, and precise control, and let me demonstrate how these components come together in real-world robotics applications.