Back to Projects
PIE Project 2025

We help people
with disabilities
eat independently
with intelligent robotics.

4-DOF
Robotic Arm
30fps
Mouth Tracking
<$600
Total Cost
100%
Accuracy

Robotic assistive feeding device

Our assistive feeding device combines computer vision, robotics, and embedded systems to create an affordable solution for individuals with limited upper body mobility.

The system uses real-time mouth tracking to position food at the optimal location, giving users control over their feeding process. Built with accessibility in mind, the entire system costs under $600 and can be replicated by other teams.


01

Independent Feeding

Enable independent feeding for people with mobility impairments

02

Real-Time Tracking

Implement accurate real-time mouth tracking using computer vision

03

Safe Design

Design a safe, intuitive robotic arm with precise positioning

04

Affordable

Keep the total cost under $600 for accessibility

Watch it in action


Tracking Accuracy

Within 2cm at 30fps real-time processing

Response Time

Delivers food in approximately 6-7 seconds

Safety Features

Dual E-stops, motion limits, emergency pause

Degrees of Freedom

4 DOF with smooth servo control

Design decisions and trade-offs

We explored several design approaches before landing on our final solution. Our initial concepts included a stationary arm with rotating base, a gantry-style system, and a jointed arm similar to industrial robots.

01

Robotic Arm Configuration

OPTIONS

4-DOF articulated arm, 4-DOF SCARA-style, 2-DOF pan-tilt mechanism

OUR CHOICE

4-DOF articulated arm with base rotation, shoulder, elbow, and wrist joints

RATIONALE

We chose the 4-DOF articulated configuration for maximum workspace flexibility and natural motion. The additional wrist joint (vs. 3-DOF) allows proper utensil orientation regardless of approach angle, critical for smooth feeding. While more complex than simpler designs, the improved dexterity justified the added servos and control complexity.
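To illustrate the workspace math behind this choice, here is a minimal forward-kinematics sketch for a base-yaw arm with three pitch joints (shoulder, elbow, wrist). The link lengths are illustrative placeholders, not our actual dimensions.

```python
import math

def forward_kinematics(base, shoulder, elbow, wrist,
                       l1=0.12, l2=0.12, l3=0.08):
    """End-effector position for a 4-DOF arm: base yaw plus three
    pitch joints. Angles in radians, link lengths in metres
    (placeholder values)."""
    # Accumulate pitch angles down the chain in the vertical plane.
    a1 = shoulder
    a2 = a1 + elbow
    a3 = a2 + wrist
    # Radial reach and height within the arm's vertical plane.
    r = l1 * math.cos(a1) + l2 * math.cos(a2) + l3 * math.cos(a3)
    z = l1 * math.sin(a1) + l2 * math.sin(a2) + l3 * math.sin(a3)
    # Base yaw rotates that plane about the vertical axis.
    return (r * math.cos(base), r * math.sin(base), z)

# Fully extended, pointing straight out along x:
x, y, z = forward_kinematics(0.0, 0.0, 0.0, 0.0)
```

The extra wrist angle is what lets the utensil pitch stay level (shoulder + elbow + wrist summing to a fixed tool angle) no matter how the first two joints reach the target.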

Interactive demo: inverse kinematics for the base, shoulder, and elbow joints, solved in real time from a target position.

02

Mouth Tracking Approach

OPTIONS

MediaPipe Face Mesh, OpenCV Haar Cascades, custom CNN model, depth camera only

OUR CHOICE

MediaPipe Face Mesh with RealSense depth camera

RATIONALE

MediaPipe provides 468 facial landmarks at 30fps with excellent accuracy on a Raspberry Pi 4, and it combines well with RealSense depth data for precise 3D positioning. Pre-trained models eliminated weeks of custom ML training, and the open-source implementation made debugging straightforward. Haar Cascades were too inaccurate, and a custom CNN would have required extensive datasets we didn't have.
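Once MediaPipe returns its normalized landmarks, detecting an open mouth reduces to comparing inner-lip separation against mouth width. A minimal sketch of that check, assuming the commonly cited Face Mesh indices (13/14 for the inner lips, 61/291 for the mouth corners) and an illustrative threshold rather than our tuned value:

```python
import math

# Commonly cited MediaPipe Face Mesh indices (assumed here):
UPPER_LIP, LOWER_LIP = 13, 14        # inner lip midpoints
LEFT_CORNER, RIGHT_CORNER = 61, 291  # mouth corners

def mouth_open_ratio(landmarks):
    """landmarks: dict of index -> (x, y) in normalized image coords.
    Returns lip gap divided by mouth width, a scale-invariant measure
    that doesn't change as the user moves toward the camera."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    gap = dist(landmarks[UPPER_LIP], landmarks[LOWER_LIP])
    width = dist(landmarks[LEFT_CORNER], landmarks[RIGHT_CORNER])
    return gap / width if width > 0 else 0.0

def is_mouth_open(landmarks, threshold=0.35):  # illustrative threshold
    return mouth_open_ratio(landmarks) > threshold
```

Dividing by mouth width is the key trick: a raw pixel gap would trigger falsely whenever the face gets closer to the camera.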

Live demo: see the mouth tracking in action (uses your camera).

03

Computing Platform

OPTIONS

Raspberry Pi, Arduino + OpenCV on laptop, Jetson Nano, microcontroller with cloud processing

OUR CHOICE

Raspberry Pi 4B (4GB RAM)

RATIONALE

Raspberry Pi 4 offers the perfect balance of compute power and cost ($62 vs. $200+ for Jetson). Quad-core ARM processor handles MediaPipe at 30fps while managing servo control and safety monitoring. Built-in USB 3.0 for camera, GPIO for safety buttons, and strong community support made development smooth. Arduino lacked ML capability, and cloud processing introduced unacceptable latency for safety-critical feeding motions.


System Architecture

  • Input: Intel RealSense depth camera (RGB-D video stream over USB 3.0); primary E-stop (hard-wired to GPIO); shoulder-mounted E-stop (pluggable, GPIO); 5V 5A supply (Pi power) and 9V 3A supply (servo power), with a power switch controlling BusLinker power.
  • Processing: Raspberry Pi 4 Model B as the main processing unit, running MediaPipe face detection (mouth tracking, open/close detection), motion control (inverse kinematics, position calculation, servo commands), a safety monitor (E-stop detection, emergency pause), and serial control (USB communication, servo protocol), with a cooling fan for thermal management. A BusLinker V2.2 converts USB to the serial bus protocol and controls the four servos.
  • Output: four HiWonder LX-15D servos (ID 0 base pan, ID 1 shoulder, ID 2 elbow, ID 3 wrist) driving a 3D-printed PLA end effector: a hybrid spoon-and-fork holder with a quick-release mechanism.

Design specifications and analysis

Data and Energy Flow

  • Data flow: camera (30 fps RGB-D) → MediaPipe face detection → motion control (IK solver) → serial control (USB protocol) → BusLinker (serial bus) → servos. Frames and coordinates move at 30 Hz; angles, commands, and control run at 100 Hz.
  • Energy flow: wall power (120 V AC, ~50 W total) feeds two supplies. The 5 V @ 5 A rail (25 W) powers the Raspberry Pi (3-7 W), camera (3-5 W), and fan (2 W); the 9 V @ 3 A rail (27 W) powers the BusLinker controller (~2 W) and the four servos (15-25 W, load dependent).

Key Points

  • Data Path: Camera (30Hz) → MediaPipe → IK Solver → Servo Commands (100Hz)
  • Power Distribution: Dual isolated supplies (5V for computing, 9V for motors)
  • Safety Layer: E-stops monitored via GPIO provide instant shutdown
  • Total Power: ~50W peak with all servos under load

Electrical Design

Detailed Schematic

Detailed Circuit Diagram

Complete circuit schematic with pin assignments and component specifications

System Overview

Simplified Circuit Diagram

Simplified system overview showing main electrical connections

Microcontroller

Raspberry Pi 4 Model B (4GB RAM)

Quad-core ARM Cortex-A72 @ 1.5GHz

Depth Camera

Intel RealSense D435/D455

RGB-D at 1920×1080, 30fps

Servo Motors

4× HiWonder LX-15D

15 kg·cm torque, 240° range

Motor Controller

BusLinker V2.2

USB to Serial Bus converter

Power Supply

Dual rail: 5V @ 5A, 9V @ 3A

~50W total system power

Safety Systems

2× Emergency stop buttons

Primary + shoulder-mounted
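The BusLinker's job is translating USB traffic into the half-duplex serial bus frames the servos understand. As a sketch of what a position command looks like on the wire, here is a packet builder following the published LewanSoul/HiWonder bus-servo protocol; in practice we drive the servos through PyLX16a rather than raw bytes:

```python
SERVO_MOVE_TIME_WRITE = 1  # bus-servo command: move to position over a duration

def lx16a_move_packet(servo_id, position, time_ms):
    """Build a raw move packet. position is 0-1000 (0.24 deg per step
    across the 240 deg range); time_ms is the travel duration.
    Frame layout: 0x55 0x55, ID, length, command, params, checksum."""
    params = [position & 0xFF, (position >> 8) & 0xFF,
              time_ms & 0xFF, (time_ms >> 8) & 0xFF]
    length = len(params) + 3                  # counts length, cmd, checksum
    body = [servo_id, length, SERVO_MOVE_TIME_WRITE] + params
    checksum = (~sum(body)) & 0xFF            # ones' complement of the sum
    return bytes([0x55, 0x55] + body + [checksum])

# Move servo 1 to mid-range (position 500) over one second:
pkt = lx16a_move_packet(1, 500, 1000)
```

Every frame sums (with its checksum) to 0xFF modulo 256, which is how the servos reject corrupted bus traffic.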

Mechanical Design

Robotic Arm
Engineering

4-Degree-of-Freedom Manipulator

Custom linkage topology minimizes moment arm on the base servo for smooth, stall-free operation. Built with PLA and precision bearings for durability.

850g
Total Weight
4 DOF
Articulation
20%
PLA Infill
<2s
Tool Swap
More Pictures
Interactive Model

Explore the 3D Model


Software Design

User Experience

Teach-and-Repeat Trajectories

Hard-coding specific motions for food scooping is inefficient and brittle. Instead, we implemented a playback system.

We manually guide the arm through a scooping motion once. The system records the joint states at high frequency and saves them. During operation, the robot simply replays these states to execute the scoop, allowing us to change the serving size or approach angle without rewriting the control code.
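A stripped-down sketch of that record/replay idea follows; our real version samples the servo bus, while here the trajectory is just a list of timestamped joint tuples:

```python
import time

class TrajectoryRecorder:
    """Teach-and-repeat: record timestamped joint states once,
    then replay them through any servo-command callback."""

    def __init__(self):
        self.frames = []   # list of (time_offset, joint_angles)
        self._t0 = None

    def record(self, joint_angles, now=None):
        """Append one joint-state sample; called at high frequency
        while the arm is guided by hand."""
        now = time.monotonic() if now is None else now
        if self._t0 is None:
            self._t0 = now
        self.frames.append((now - self._t0, tuple(joint_angles)))

    def replay(self, send_command, sleep=time.sleep, speed=1.0):
        """Replay the recorded motion at `speed`x, calling
        send_command(angles) for each frame at its recorded offset."""
        prev_t = 0.0
        for t, angles in self.frames:
            sleep((t - prev_t) / speed)
            send_command(angles)
            prev_t = t
```

Changing the serving size or approach angle then means re-recording a thirty-second demonstration instead of editing control code.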

Frequency decoupling: the vision thread runs at ~30 Hz (variable) and pushes into an asynchronous RQ buffer; the control thread runs at a fixed 100 Hz, interpolating between targets.
Latency Management

Computer vision processing is computationally heavy and its frame rate fluctuates (around 30fps). If the motor controller waited on each vision frame to complete, the arm's movement would become jerky and unsafe.

Redis Queue (RQ) Implementation

We used Redis Queue to decouple the threads. The vision system pushes coordinates to the queue whenever they are ready. The control thread pulls the most recent data at a strict 100Hz loop, interpolating between points to ensure smooth motor actuation.
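In spirit, the hand-off looks like the sketch below, written with a plain in-process buffer instead of Redis so it stands alone; the control loop always takes a bounded step toward the most recent vision target:

```python
import threading

class LatestTargetBuffer:
    """Thread-safe 'most recent value wins' hand-off between the
    ~30 Hz vision thread and the 100 Hz control thread."""

    def __init__(self):
        self._lock = threading.Lock()
        self._target = None

    def push(self, coords):       # called by the vision thread
        with self._lock:
            self._target = coords

    def latest(self):             # called by the control thread
        with self._lock:
            return self._target

def step_toward(current, target, max_step=0.02):
    """One 100 Hz control tick: move each coordinate a bounded step
    toward the target, so motion stays smooth even when vision
    frames arrive late or out of rhythm."""
    return tuple(c + max(-max_step, min(max_step, t - c))
                 for c, t in zip(current, target))
```

Because the control thread only ever reads the latest target, a slow vision frame simply means a few more interpolation ticks toward the previous target, never a stall.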

Closed-Loop Control

Standard trajectory generation struggled with depth noise from the camera. To fix this, we implemented a gradient descent-based controller. It treats the target coordinate as a minimum on an error surface and iteratively drives the end-effector towards it.

q̇ = J⁺(q)(ẋ + λe)

  • q̇ — joint velocities (output command)
  • ẋ — desired task-space velocity (desired motion)
  • e — position error vector
  • λ — error gain
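For a planar two-link slice of the arm, the same closed-loop idea reduces to stepping the joints along −Jᵀe, the gradient of the squared position error. A toy sketch under that simplification (link lengths, gain, and iteration count are illustrative, not our tuned values):

```python
import math

def gd_ik_2link(target, q=(0.5, 0.5), l1=0.12, l2=0.12,
                rate=10.0, iters=1000):
    """Gradient-descent IK for a planar 2-link arm: minimise
    0.5 * ||fk(q) - target||^2 by stepping q along -J^T e."""
    q1, q2 = q
    tx, ty = target
    for _ in range(iters):
        # Forward kinematics of the two-link chain.
        x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
        y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
        ex, ey = x - tx, y - ty
        # Analytic Jacobian of (x, y) with respect to (q1, q2).
        j11 = -l1 * math.sin(q1) - l2 * math.sin(q1 + q2)
        j12 = -l2 * math.sin(q1 + q2)
        j21 = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
        j22 = l2 * math.cos(q1 + q2)
        # Gradient step: q <- q - rate * J^T e.
        q1 -= rate * (j11 * ex + j21 * ey)
        q2 -= rate * (j12 * ex + j22 * ey)
    return q1, q2
```

The attraction of this formulation for us is its tolerance to depth noise: a slightly wrong target just shifts the minimum of the error surface, and the controller keeps descending toward it rather than amplifying the noise.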


System Logic (FSM)

IDLE System standby.
SCOOPING Executes recorded trajectory.
WAIT FOR OPEN MOUTH Scans for open mouth landmarks.
TRACKING Servoing to mouth position.
PROXIMITY WAIT Triggered at <10cm distance.
RETRACT Backs off and resets to Scoop.
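The cycle above can be expressed as a small table-driven state machine. A sketch with illustrative event names (not our exact internal signals), including the e-stop override:

```python
# Transition table for the feeding cycle: (state, event) -> next state.
TRANSITIONS = {
    ("IDLE", "start"): "SCOOPING",
    ("SCOOPING", "scoop_done"): "WAIT_FOR_OPEN_MOUTH",
    ("WAIT_FOR_OPEN_MOUTH", "mouth_open"): "TRACKING",
    ("TRACKING", "within_10cm"): "PROXIMITY_WAIT",
    ("PROXIMITY_WAIT", "bite_taken"): "RETRACT",
    ("RETRACT", "retract_done"): "SCOOPING",
}

class FeedingFSM:
    def __init__(self):
        self.state = "IDLE"

    def dispatch(self, event):
        """Advance on a known (state, event) pair; ignore anything else.
        An e-stop overrides every state and returns to standby."""
        if event == "e_stop":
            self.state = "IDLE"
        else:
            self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state
```

Keeping the transitions in a flat table makes the safety property easy to audit: every state has exactly one exit event, and only the e-stop can jump the cycle.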

Safety Interrupts

Primary E-Stop GPIO_17
Shoulder Stop GPIO_27
Hardware Interrupt

Bypasses main loop.
Instant torque cut.
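The latching logic behind those interrupts is hardware-independent and can be sketched as below; on the Pi, `trip()` would be registered as an edge-detect callback on GPIO 17 and GPIO 27, which is mocked out here:

```python
import threading

class EStopLatch:
    """Latching emergency stop shared between GPIO callbacks and the
    control loop. Once tripped, torque stays cut until an explicit
    operator reset, so a momentary button press cannot 'un-stop'
    the arm on its own."""

    def __init__(self, cut_torque):
        self._lock = threading.Lock()
        self._tripped = False
        self._cut_torque = cut_torque   # e.g. disables all servo output

    def trip(self, source="unknown"):
        """Called from a GPIO edge-detect callback; idempotent, so
        both buttons can fire without cutting torque twice."""
        with self._lock:
            if not self._tripped:
                self._tripped = True
                self._cut_torque()

    def reset(self):
        with self._lock:
            self._tripped = False

    def ok_to_move(self):
        """Polled by the 100 Hz control loop before every command."""
        with self._lock:
            return not self._tripped
```

The lock matters because the callback runs on a different thread than the control loop; without it, a trip could race a command already in flight.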

External Dependencies

PyLX16a
MediaPipe
NumPy
RQ (Redis Queue)
pyrealsense2
OpenCV
Python 3.9+
Raspberry Pi OS

Bill of materials

Project cost breakdown. All costs are shown even if items were obtained for free.

Component Qty Cost Source
Computing & Vision
Raspberry Pi 4B 1 $62.39 Found
Intel RealSense Camera 1 $314.00 Found
Motors & Control
5 Servos and Control Board 1 $95.61 Bought
Thrust Bearing 2-1/4 inch 1 $13.82 Bought
Electronics & Connections
Push Down Button 1 $8.12 Found
E-Stop 1 $6.99 Found
USB - USB C Cable Low Profile 1 $10.61 Bought
90 Degree USB C Adapter 2pck 1 $5.09 Bought
USB C Extender 1ft 1 $8.49 Bought
4 Pack USB Fans 1 $8.81 Bought
Mechanical Components
eSUN White PLA Filament 1 $11.99 Bought
M2 Screws Pack 1 $9.55 Bought
Miscellaneous
Other MISC estimated cost (wires, connectors, 2x shaft bearings, power adapters, screws) 1 $40.00 Found
Total Budget Spent (Bought Items) $163.97
Total Including Found Items $595.47