Robotics/MechE
Shelvin' Cooper, 2025
Ever since I started volunteering at the Weston Public Library in eighth grade, I've dreamt of designing a robot that could shelve books. Taking the Caltech Robotics Systems course brought this dream one step closer to reality. I worked with two classmates to design and program Shelvin' Cooper, a 6-DOF robotic arm that assists with sorting and shelving books. Users place books on a table, and Shelvin' places them on the shelf in alphabetical order by the author's last name, demonstrating how robots can assist librarians by reducing re-shelving time.
Pacman Bot, 2025
In the Caltech Mobile Robots course, I worked with a classmate to program a robot to autonomously localize, plan, and navigate. The robot uses its motor encoders, gyroscope, a known map of the world, and LIDAR for localization. The LIDAR is also used to add new obstacles to the map to aid with path planning via RRT. Furthermore, we implemented game logic to enable the robot to play life-size Pacman.
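As a rough illustration of the planning side, here is a minimal 2D RRT sketch in Python. Everything in it is invented for the example (a 10 x 10 map, circular obstacles, a 0.5 step size), and edges are not collision-checked, so it is a simplified stand-in for our planner rather than the code we ran:

```python
import math
import random

def rrt(start, goal, obstacles, step=0.5, goal_bias=0.1, max_iters=2000, seed=0):
    """Grow a tree from start toward goal on a 10x10 map.
    obstacles: (cx, cy, r) circles, a stand-in for the LIDAR-derived map.
    Returns start->goal waypoints, or None on failure. Edges are not
    collision-checked here -- a simplification over a real planner."""
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}

    def collides(p):
        return any(math.dist(p, (cx, cy)) <= r for cx, cy, r in obstacles)

    for _ in range(max_iters):
        # With probability goal_bias, aim straight at the goal.
        target = goal if rng.random() < goal_bias else (rng.uniform(0, 10), rng.uniform(0, 10))
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], target))
        d = math.dist(nodes[i], target)
        if d == 0:
            continue
        # Step a fixed distance from the nearest node toward the sample.
        new = (nodes[i][0] + step * (target[0] - nodes[i][0]) / d,
               nodes[i][1] + step * (target[1] - nodes[i][1]) / d)
        if collides(new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < step:  # close enough: trace parents back
            path, k = [goal], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

path = rrt((1.0, 1.0), (9.0, 9.0), [(5.0, 5.0, 1.5)])
```

The goal-bias term occasionally steers growth straight at the goal, which is what lets RRT converge quickly through open space.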
We made it to the final round (top 6 teams out of 16). I recorded the video above for the third qualifying round. Our robot started in the top left corner of the maze. Within seconds it ate a coin (1 point), the power pellet (10 points), and caught the ghost (30 points). Our robot holds the record for the most points (53) scored in a single round. There were eight rounds and sixteen robots.
Winner of Transmission Contest, 2025
In the Caltech Design and Fabrication class, I worked in a group of six to design and construct a transmission (under $200) that couples the rotational power of a brushed DC motor to a rotating bicycle wheel.
We competed against six other teams. The score was calculated as w / T, where w is the maximum speed of the wheel in RPM and T is the time in seconds for the wheel to reach 250 RPM.
We used MATLAB to compute an optimal gear ratio, then purchased gears from SDP that maximized torque transmission while minimizing moment of inertia. We also opted for retaining rings instead of shaft collars and kept the transmission compact to reduce vibrations. (I'm being vague about our actual design since this contest has been a cornerstone of the class for decades.)
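In that spirit, here is a hedged Python sketch of the kind of trade-off the MATLAB study explored. All parameters below (stall torque, free speed, inertias) are invented, not our design values; the model is a linear DC-motor torque-speed curve with the rotor inertia reflected through the gear ratio:

```python
import math

# All parameters below are invented for illustration -- not our design values.
TAU_STALL = 0.5                    # N*m, motor stall torque
W_FREE = 2000 * 2*math.pi / 60     # rad/s, motor free speed (2000 RPM)
J_MOTOR = 1e-5                     # kg*m^2, rotor inertia
J_WHEEL = 0.15                     # kg*m^2, bicycle wheel inertia
TARGET = 250 * 2*math.pi / 60      # rad/s, the 250 RPM threshold

def time_to_target(n, dt=1e-3, t_max=60.0):
    """Euler-integrate wheel spin-up for gear ratio n (motor turns per wheel turn).
    Linear DC-motor model: torque falls off linearly with motor speed."""
    w, t = 0.0, 0.0                       # wheel speed (rad/s), elapsed time (s)
    j = J_WHEEL + n*n*J_MOTOR             # rotor inertia reflected to the wheel shaft
    while w < TARGET:
        if t > t_max:
            return None                   # this ratio never reaches 250 RPM
        tau = n * TAU_STALL * max(0.0, 1.0 - n*w/W_FREE)  # torque at the wheel
        w += tau / j * dt
        t += dt
    return t

def score(n):
    """Contest metric: (max wheel RPM) / (seconds to reach 250 RPM)."""
    t = time_to_target(n)
    if t is None:
        return 0.0
    w_max_rpm = (W_FREE / n) * 60 / (2*math.pi)  # no-load wheel speed
    return w_max_rpm / t

best = max(range(2, 21), key=score)
```

The sweep makes the trade-off explicit: a taller ratio reaches 250 RPM sooner, but it caps the wheel's top speed at the motor free speed divided by the ratio, which drags down the numerator of the score.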
Multi-Agent Temporal-Spatial Planner, 2025
In the Caltech Robotics (b) class, we learned how to use the Expansive Space Trees (EST) algorithm to plan a path for a single robot in a stationary world. For the final project, two classmates and I adapted the algorithm to plan temporal-spatial paths for multiple robots simultaneously. The video shows examples of each robot (represented by a small square) moving from its starting location to its goal location (each marked by a dot) without collisions.
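For reference, a minimal single-robot EST (the version covered in class, before our multi-agent temporal extension) might look like the sketch below. The map, obstacle, and neighborhood radius are invented for the example; the defining idea is that nodes in sparsely explored regions are preferred for expansion:

```python
import math
import random

def est(start, goal, obstacles, radius=1.0, max_iters=3000, seed=1):
    """Minimal single-robot EST on a 10x10 map.
    obstacles: (cx, cy, r) circles. Returns start->goal waypoints or None."""
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    counts = [1]  # approximate local density: nodes within `radius` of each node

    def collides(p):
        return any(math.dist(p, (cx, cy)) <= r for cx, cy, r in obstacles)

    for _ in range(max_iters):
        # Key EST idea: weight node selection by 1/density, so the tree
        # expands from sparsely explored regions first.
        i = rng.choices(range(len(nodes)), weights=[1.0 / c for c in counts])[0]
        x, y = nodes[i]
        new = (x + rng.uniform(-radius, radius), y + rng.uniform(-radius, radius))
        if not (0.0 <= new[0] <= 10.0 and 0.0 <= new[1] <= 10.0) or collides(new):
            continue
        c_new = 1
        for j in range(len(nodes)):        # update density counts incrementally
            if math.dist(new, nodes[j]) < radius:
                counts[j] += 1
                c_new += 1
        nodes.append(new)
        counts.append(c_new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < radius:  # reached the goal region: backtrack
            path, k = [goal], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

path = est((1.0, 1.0), (9.0, 9.0), [(5.0, 5.0, 1.5)])
```

Our project added a time dimension on top of this: each node carried a timestamp as well as a position, so a state collides not only with static obstacles but with where the other robots are at that moment.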
Ping Pong Ball Handler, 2024
In the Caltech Robotics (a) course, I worked with two classmates to program the Franka Emika Panda robotic arm to intercept and redirect ping pong balls with arbitrary trajectories. We optimized the interception points to minimize singularity risk and ensure smooth deceleration.
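One common way to quantify singularity risk is the Yoshikawa manipulability measure, w = sqrt(det(J J^T)), which drops to zero at singular configurations. The sketch below applies it to a toy 2-link planar arm rather than the 7-DOF Panda (whose Jacobian is more involved); the link lengths and sample trajectory are invented for illustration:

```python
import math

# Toy 2-link planar arm as a stand-in for the 7-DOF Panda; lengths are invented.
L1, L2 = 0.4, 0.4   # link lengths, m

def ik(x, y, elbow=1):
    """Closed-form 2R inverse kinematics. Returns (q1, q2), or None if (x, y)
    is outside the reachable workspace."""
    c2 = (x*x + y*y - L1*L1 - L2*L2) / (2*L1*L2)
    if abs(c2) > 1.0:
        return None
    q2 = elbow * math.acos(c2)
    q1 = math.atan2(y, x) - math.atan2(L2*math.sin(q2), L1 + L2*math.cos(q2))
    return q1, q2

def manipulability(q2):
    """Yoshikawa measure for the 2R arm: |det J| = L1*L2*|sin q2|.
    It vanishes when the arm is fully stretched or folded -- the singularities."""
    return L1 * L2 * abs(math.sin(q2))

def best_interception(candidates):
    """Pick the reachable candidate point farthest from a singular configuration."""
    scored = [(manipulability(ik(*p)[1]), p) for p in candidates if ik(*p) is not None]
    return max(scored)[1] if scored else None

# Sample points along a straight-line ball path through the workspace (invented).
traj = [(0.1 + 0.07*k, 0.6 - 0.04*k) for k in range(10)]
pick = best_interception(traj)
```

Evaluating candidates along the predicted ball path and intercepting where manipulability is highest keeps the arm well away from stretched-out configurations where joint velocities blow up.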
FIRST Robotics Competition (FRC) Team Ultraviolet #8567 - 2022 Robot
In May 2020, I was one of six students who founded my school's student-led robotics team, which competes in FRC, one of the most challenging U.S. high school robotics competitions. In our first year, I led the electrical sub-team in designing and wiring the robot's electrical control board.
The next year as a Co-Captain, I led a team of fifty through training and competitions, during which we designed and built a 3’x3’x3’ robot in ten weeks. I learned about new electrical equipment (pneumatics, sensors) as I guided the electrical team, but I was especially thrilled to dive into designing and constructing the robot’s mechanisms.
FIRST Robotics Competition (FRC) Team Ultraviolet #8567 - 2023 Robot
The following year as Co-Captain, I led the team to engineer a more advanced robot for a new game. I learned to create sophisticated CAD models by importing 3D parts from online vendors and shared this knowledge with the team, enabling us to CAD model the robot for the first time. This made us far more productive during the build sessions. I conducted extensive research to source better prototyping materials, such as pre-drilled aluminum tubes, and identified many of the essential parts to purchase throughout the build season. Additionally, I learned to operate the high school’s CNC machine, purchasing the appropriate drill bits and CAD modeling aluminum gussets for the robot. This allowed us to produce sturdy, custom components in-house, eliminating the need to outsource gusset manufacturing.
2022 Robot
2022 FIRST Robotics Competition Rookie All-Star Award at District Event. Honors the rookie team exemplifying strong leadership, vision, spirit, and partnership between school and sponsors, while inspiring students to learn STEM.
2022 FIRST Robotics Competition Rookie Inspiration Award at our 2nd District Event. The Rookie Inspiration Award is the second-highest award a rookie team can win; we were ineligible for the top award (the Rookie All-Star Award) again because we had won it at our previous event.
2023 Robot
2023 FIRST Robotics Competition District Quarter-finalist
2023 FIRST Robotics Competition District Sustainability Award.
2021 MIT Beaver Works Summer Institute Autonomous Underwater Vehicle (AUV) Challenge
My team coded an AUV able to swim through gates formed by red and green buoys in the MIT pool. Code: github.com/Baygulls/Baygulls. I wrote Image_Processor.py, which detects the pixel positions of the centers of the largest red and green buoys by analyzing the red and green channels of an RGB image taken by the AUV, then calculates each buoy's angle relative to the AUV. I also helped code AUV_Controller.py (which has a function returning how much the AUV should turn its rudder) and BWSI_BackSeat.py (which initializes AUV parameters, calls the AUV_Controller function, and sends commands to a lower-level interface), and spent many hours debugging. In our second test run, the AUV cleared the first and final gates; the attached video shows the photos the AUV took (sound effects were added for effect). Analyzing the footage, I discovered the AUV often mistook the pool floor for the largest green buoy, so I modified Image_Processor.py to ignore objects larger than a certain size.