
2024 RoboCup Humanoid World Champions!
Model Optimization
For an autonomous soccer-playing robot, the perception system is the foundation on which all other processes depend. It provides the essential data for decision-making and control, making the balance between speed and accuracy critical to high performance in dynamic, fast-paced environments. I profiled and optimized each part of the perception stack: I deployed our detection models with NVIDIA's TensorRT deep learning SDK, applied multi-threading techniques, and optimized YOLOv8 layers, improving processing efficiency by nearly 200% and significantly reducing cycle time for real-time performance. These optimizations allowed ARTEMIS to perform real-time perception with exceptional accuracy and speed, significantly enhancing its competitive performance at RoboCup.
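To illustrate the multi-threading side of these optimizations, here is a minimal Python sketch of a pipelined producer/consumer design, where frame capture and inference overlap instead of running sequentially. The stage names and simulated latencies are illustrative only; in the real stack the consumer stage would be the TensorRT-accelerated detector:

```python
import queue
import threading
import time

def capture_frames(frame_q, n_frames):
    # Producer: stands in for camera capture
    for i in range(n_frames):
        time.sleep(0.005)  # simulate capture latency
        frame_q.put(i)
    frame_q.put(None)  # sentinel signals end of stream

def run_inference(frame_q, results):
    # Consumer: stands in for the TensorRT-accelerated detector
    while True:
        frame = frame_q.get()
        if frame is None:
            break
        time.sleep(0.005)  # simulate inference latency
        results.append(frame * 2)  # placeholder "detection"

def pipelined(n_frames=10):
    frame_q = queue.Queue(maxsize=4)  # bounded queue caps worst-case latency
    results = []
    producer = threading.Thread(target=capture_frames, args=(frame_q, n_frames))
    consumer = threading.Thread(target=run_inference, args=(frame_q, results))
    producer.start()
    consumer.start()
    producer.join()
    consumer.join()
    return results
```

Because the two stages run concurrently, total cycle time approaches the slower stage's latency rather than the sum of both, which is where much of the throughput gain comes from.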
Error Filtering
I developed an error-filtering pipeline to reject false detections of balls and landmarks across varying environments. The system integrates color masking, height-based outlier rejection, and median filtering to reduce noise in object detections and 3D pose data. A Kalman filter provides temporal consistency for the ball, and a cost function selects the 6 most reliable landmarks based on object confidence scores and distance metrics; integration testing showed that localization is highly accurate with 6 landmarks, while additional landmarks increase localization cycle time with diminishing returns in accuracy. This filtering system compensated for a significant lack of diverse training data and limited onboard computing power. The extra layers of filtering gave us a decisive edge on Eindhoven's highly variable field, which was crucial to our RoboCup World Championship victory.
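A minimal sketch of two of these stages, assuming hypothetical weights and data shapes (the actual cost function and tuning are the team's; the names, weights, and tuple layout here are illustrative):

```python
import statistics

def landmark_cost(confidence, distance, w_conf=1.0, w_dist=0.1):
    # Lower cost = more reliable; weights are illustrative, not the tuned values
    return w_dist * distance - w_conf * confidence

def select_landmarks(landmarks, k=6):
    # landmarks: list of (name, confidence, distance_m) tuples;
    # keep the k cheapest (most reliable) for localization
    ranked = sorted(landmarks, key=lambda lm: landmark_cost(lm[1], lm[2]))
    return [lm[0] for lm in ranked[:k]]

def median_filter(series, window=3):
    # Smooths a 1-D stream of measurements (e.g. ball distance estimates),
    # suppressing single-frame spikes from false detections
    half = window // 2
    return [statistics.median(series[max(0, i - half): i + half + 1])
            for i in range(len(series))]
```

For example, a low-confidence landmark seen far across the field scores a high cost and is dropped before localization ever sees it, which is how the pipeline trades a little information for a large gain in robustness.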
Proximity
The proximity system was a critical innovation developed for RoboCup 2024 that addressed a fundamental challenge in robot soccer: preventing collisions. Operating as a safety net for the vision system, it combines ZED camera depth maps with Connected Component Analysis and a Field-of-View algorithm to rapidly detect nearby obstacles. This data is invaluable when the vision system fails to detect an obstacle, or when an obstacle blinds ARTEMIS. The proximity algorithm processes depth data in under 2 ms, filtering out expected depth readings and flagging unexpected objects within the proximity field. This capability proved essential to ARTEMIS's success: it enabled safe navigation at speeds up to 1.2 m/s on a crowded field with multiple moving robots, referees, and human handlers. The approach emerged from real match experience and uniquely positioned ARTEMIS to maintain both high-speed movement and collision avoidance, giving us an edge over other teams and contributing significantly to the championship victory.
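The core idea can be sketched as follows: subtract the expected field depth from the measured depth map, threshold the difference, then group the flagged pixels with connected component analysis so isolated noise pixels are discarded. This is a simplified pure-Python illustration on tiny grids; the margin, minimum blob size, and data layout are assumptions, and the real system operates on full-resolution ZED depth maps:

```python
from collections import deque

def connected_components(mask):
    # mask: 2-D list of 0/1 where 1 marks a closer-than-expected depth reading.
    # BFS groups 4-connected flagged pixels into blobs.
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    components = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                comp, q = [], deque([(r, c)])
                seen[r][c] = True
                while q:
                    y, x = q.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
                components.append(comp)
    return components

def flag_obstacles(depth, expected_depth, margin=0.3, min_pixels=2):
    # Mark pixels significantly closer than the expected field depth,
    # then keep only blobs large enough to be real obstacles
    mask = [[1 if expected_depth[r][c] - d > margin else 0
             for c, d in enumerate(row)] for r, row in enumerate(depth)]
    return [c for c in connected_components(mask) if len(c) >= min_pixels]
```

Filtering by blob size is what keeps single-pixel depth noise from triggering false obstacle alerts while a genuine robot or handler in the proximity field still produces a large connected blob.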

A seamless demonstration of integrated systems at work: ARTEMIS leverages its precise perception system to track the ball's trajectory and locate landmarks, uses robust localization to maintain field awareness, and executes complex path planning to position itself optimally for both the self-pass and the scoring shot

Experience real-time perception through ARTEMIS's visual system as it rapidly processes and filters sensor data, maintaining reliable object detection and pose estimation while executing high-speed maneuvers in a dynamic soccer environment

Visual representation of the Field-of-View algorithm, which creates a proximity field that filters out expected depth readings and flags alien objects