Stony Brook University has welcomed a new research collaborator to its Interacting Robotic Systems Laboratory (IRSL): a programmable humanoid robot on loan from robotics startup DL-RL. The 4.5-foot, 77-pound unit is designed to study how embodied AI can learn physical tasks through direct human guidance rather than prewritten code. It will remain on campus until July 2026, giving mechanical engineering researchers a long runway to explore advanced manipulation and mobility skills.
The robot focuses on tasks tied to home support and caregiving: pouring a drink, wiping a spill, or transporting items. These motions may appear simple, but they combine balance, force control, orientation, and situational awareness. Rather than programming those actions line by line, Stony Brook researchers rely on “programming by demonstration,” where a caregiver or researcher physically walks the robot through the full movement. The robot’s sensors and internal model then generalize the demonstration so it can repeat the action independently, even when objects, positions, or heights vary.
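The idea of generalizing a demonstrated motion to new positions or heights can be sketched in a few lines. The following is a hypothetical, simplified illustration, not the IRSL system: a one-dimensional demonstrated trajectory (say, cup heights during a pour) is retargeted to new start and goal positions by rescaling its shape relative to the original endpoints.

```python
# Hypothetical sketch of trajectory generalization in programming by
# demonstration. Not the lab's actual method: a simple endpoint-based
# rescaling of a recorded 1-D motion.

def retarget(demo, new_start, new_goal):
    """Map a demonstrated trajectory onto new start/goal positions,
    preserving the shape of the motion."""
    d0, d1 = demo[0], demo[-1]
    scale = (new_goal - new_start) / (d1 - d0)
    return [new_start + (p - d0) * scale for p in demo]

# Heights (cm) recorded while a researcher physically guides the arm:
demo = [30.0, 32.5, 36.0, 38.0, 38.5]

# Replay the same motion shape for a taller container:
adapted = retarget(demo, new_start=40.0, new_goal=52.0)
print(adapted)  # starts at 40.0, ends at 52.0
```

Real systems use richer representations (full joint trajectories, force profiles, learned dynamics), but the core idea is the same: one physical demonstration yields a reusable, adaptable skill rather than a single fixed replay.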
Associate professor Nilanjan Chakraborty says the new method developed at Stony Brook aims to make manipulation training far more approachable. Instead of requiring specialized teleoperation gear, a researcher moves the robot’s arm or hand directly, and the system maps those real-world movements into a usable control sequence. Graduate students are now extending the work so the robot can navigate and carry out multi-step routines without remote-control support.
Early demonstrations show the robot adapting to container shape, grip location, and pour angle, all critical in assistive settings. While public concern often centers on robots replacing human roles, the team stresses the opposite. Caregivers still provide personal connection and judgment; the robot handles repetitive moments that can reduce fatigue or strain over long shifts.
The DL-RL partnership also broadens student access to commercial-grade robotics hardware. As the robot gains new capabilities, researchers hope to share methods that reduce the time and complexity needed to train embodied AI systems. That could help household-task robots mature from experimental prototypes toward practical deployment in support roles.
Embodied AI research is gaining traction, and universities now play a visible role in developing training methods that make robots more adaptable in real-world settings.