11/06/2025 Mallory Lindahl
The Breakdown
- SPOT helps robots understand their surroundings using 3D camera data.
- The system assists robots as they determine what objects to move and where to place them to reach a goal.
- SPOT enables intuitive, goal-driven robotic planning similar to how humans organize or clean a space.
***
Researchers at Carnegie Mellon University’s Robotics Institute (RI) have developed a new system that helps robots operate effectively in cluttered, unpredictable environments like kitchens, classrooms and offices — one of the biggest challenges in robotics and a fundamental step toward making robots more capable in everyday settings.
The system, called Search over Point cloud Object Transformations (SPOT), allows robots to better understand their surroundings and determine how to move objects to achieve a specific goal.
SPOT emerged from a collaboration between students advised by RI Associate Professor David Held and Professor Maxim Likhachev. The team set out to help robots plan and coordinate multi-object movements, like putting away dishes or placing items on a shelf.
Planning a sequence of movements is essential for robots to rearrange objects successfully while avoiding collisions and potential damage to items. Before stacking a plate on a stack of plates, for example, any bowls or cups on top of those plates need to be removed and set aside. Similarly, setting objects on a shelf can be challenging because the exact placement of each item determines whether everything will fit and stay put.
Instead of breaking the world into symbolic descriptions, an approach that requires everything in the environment to be explicitly described with symbols or rules before the robot can act, SPOT uses 3D data to reason about the shapes and spatial relationships of the items in a scene. It can decide which object to move, where to place it and in what order to move objects to best achieve its goals. With these skills, robots can make progress toward safely and accurately organizing household items, assisting with chores or fetching objects for users.
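The ordering constraint described above, that anything resting on an item must be cleared away before the item itself can be moved, can be illustrated with a toy sketch. This is not SPOT's planner; it is a minimal, hypothetical model in which the scene is summarized as a "rests on" relation and move order falls out of a depth-first traversal:

```python
from collections import defaultdict

def move_order(resting_on):
    """Return an order in which objects can be picked up, moving
    anything that rests on an object before the object itself.
    `resting_on` maps each object to the object beneath it (or None).
    Toy model for illustration only."""
    supports = defaultdict(list)          # base object -> items sitting on it
    for top, base in resting_on.items():
        if base is not None:
            supports[base].append(top)

    order, moved = [], set()

    def clear(obj):
        # Recursively move everything stacked on `obj` first.
        for top in supports[obj]:
            if top not in moved:
                clear(top)
        moved.add(obj)
        order.append(obj)

    for obj in resting_on:
        if obj not in moved:
            clear(obj)
    return order

# Example: a cup sits on a bowl, which sits on a plate.
print(move_order({"cup": "bowl", "bowl": "plate", "plate": None}))
# -> ['cup', 'bowl', 'plate']
```

In practice a planner like SPOT must also reason about geometry and collisions, not just a precomputed relation, but the example shows why ordering matters at all.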
“SPOT operates directly in the point cloud space with raw sensory input from one camera and needs no additional information about the scene or the objects,” said Amber Li, a Ph.D. student in the RI and co-lead researcher on the project. “In other words, it sees the world in 3D. Point clouds give the robot a detailed view of object shapes and positions, allowing it to plan how to move them even in cluttered or partially visible environments.”
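To give a flavor of what "reasoning in point cloud space" can mean, here is a deliberately simplified sketch, not drawn from the SPOT paper, in which a scene is just a list of 3D points and a candidate placement is accepted only if no existing points fall inside the object's footprint:

```python
def placement_free(scene_points, center, radius, height):
    """Check whether a cylindrical footprint at `center` (x, y) with the
    given radius and height is clear of existing scene points.
    All names and the cylinder model are illustrative assumptions."""
    cx, cy = center
    for x, y, z in scene_points:
        # A point below the object's height and inside its radius
        # means the candidate placement would collide.
        if z <= height and (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
            return False
    return True

# Two points observed by the camera: one near the origin, one off to the side.
scene = [(0.0, 0.0, 0.02), (0.3, 0.3, 0.05)]
print(placement_free(scene, (0.0, 0.0), 0.05, 0.1))   # occupied
print(placement_free(scene, (0.15, 0.0), 0.05, 0.1))  # free
```

Real point clouds contain thousands of points and the objects are not simple cylinders, but the same principle applies: the raw 3D points themselves, rather than symbolic labels, carry the shape and position information the planner needs.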
Using a Franka robotic arm and a set of plastic dishes, the team conducted several experiments to evaluate SPOT's capabilities. With SPOT, the robotic arm successfully rearranged the dishes into several different configurations and knew which objects to move first to achieve the desired outcome. In these tests, SPOT outperformed traditional planning methods.
“When humans organize our homes, we don’t have a set of rules in our minds that we follow before rearranging objects,” said Kallol Saha, a master’s student in robotics and a co-lead researcher on SPOT. “We just look, plan, then act. SPOT brings that kind of intuitive decision-making to robots, allowing them to plan complex movements directly from what they see.”
SPOT was accepted into the 2025 Conference on Robotic Learning in Seoul, South Korea, where the team presented their research earlier this fall. The work was funded by the Toyota Research Institute and the Office of Naval Research. To learn more about SPOT, visit the project website.
For More Information: Aaron Aupperlee | 412-268-9068 | aaupperlee@cmu.edu