Imagine a scenario in which an individual experiences a sudden tap on their shoulder. In response, their eyes may widen, or they may instinctively flinch. Such stimuli naturally provoke automatic reactions in humans, though these responses tend to diminish or evolve with repeated exposure over time. Building on this understanding of human emotional dynamics, researchers have developed an innovative robotic system capable of simulating adaptive emotional responses based on stimulus frequency and context. This advancement aims to enhance emotional engagement in social and companion robots.
Professor Hui Sung Lee and his team from the Department of Design at UNIST announced the development of this adaptive robot technology, which expresses emotions through changes in eye shape, color, and movement, with responses that evolve dynamically over time.

In the accompanying image, the robot expresses happiness when gently touched.
The robot is capable of displaying six distinct emotions by combining variations in eye appearance, color, and movement patterns. Interactions are initiated via physical touch on the robot's head: stroking represents a positive stimulus, while tapping represents a negative one. For instance, a sudden tap causes the robot's eyes to enlarge and turn blue while its body leans backward, conveying surprise. Crucially, when the same stimulus is repeated, the robot does not simply replicate its initial response. Rather, its emotional expression adapts according to its previous emotional state and the history of stimuli it has received.
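The interaction loop described above can be pictured with a minimal sketch. This is not the team's implementation; the stimulus labels, the five-second window, the habituation constant, and the expression fields below are all assumptions made purely for illustration of how a repeated stimulus could yield a damped rather than identical response.

```python
import time

# Hypothetical stimulus labels; the article describes stroking as a positive
# stimulus and tapping as a negative one, but the labels and thresholds here
# are assumptions made for this example.
POSITIVE, NEGATIVE = "stroke", "tap"

class TouchSession:
    """Tracks recent stimuli so a repeated touch produces an adapted, damped response."""

    def __init__(self, habituation=0.3):
        self.history = []          # list of (timestamp, stimulus) pairs
        self.habituation = habituation

    def respond(self, stimulus):
        now = time.time()
        # Count how often the same stimulus arrived within the last few seconds.
        recent = [s for t, s in self.history if now - t < 5.0 and s == stimulus]
        self.history.append((now, stimulus))

        # A first tap triggers full surprise (enlarged blue eyes, leaning back);
        # repeated taps are expressed with reduced intensity rather than replayed.
        intensity = max(0.0, 1.0 - self.habituation * len(recent))
        if stimulus == NEGATIVE:
            return {"eyes": "enlarged", "color": "blue",
                    "motion": "lean_back", "intensity": intensity}
        return {"eyes": "curved", "color": "warm",
                "motion": "nod", "intensity": intensity}

robot = TouchSession()
print(robot.respond("tap"))   # full surprise response
print(robot.respond("tap"))   # same stimulus shortly after, softened response
```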
This approach allows the robot to mimic human-like emotional dynamics. In user evaluations, participants observed that "the robot's responses vary subtly depending on the context, even with identical stimuli, making its reactions feel less mechanical and more natural," and over 80% of participants described its emotional expressions as "lifelike and vibrant."
The research team modeled emotions as dynamic vectors that change over time, rather than static states. Strong stimuli rapidly increase the magnitude of the robot’s emotional vector, while weaker stimuli induce more gradual changes. This methodology enables the robot to exhibit nuanced and realistic emotional behaviors.
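The article does not give the team's equations, but the behavior it describes, a vector whose magnitude jumps sharply for strong stimuli, drifts for weak ones, and relaxes over time, can be sketched as a simple leaky integrator. The two-dimensional valence-arousal axes, the decay constant, and every name below are assumptions for illustration, not the published model.

```python
import numpy as np

class EmotionState:
    """Toy dynamic emotion vector: stimuli push the state, which then relaxes toward neutral."""

    def __init__(self, decay=0.95):
        self.vector = np.zeros(2)   # e.g. (valence, arousal); the choice of axes is an assumption
        self.decay = decay

    def apply_stimulus(self, direction, strength):
        # A strong stimulus produces a large, rapid jump in the vector's magnitude;
        # a weak stimulus nudges the state only slightly.
        direction = np.asarray(direction, dtype=float)
        norm = np.linalg.norm(direction)
        if norm > 0:
            direction = direction / norm
        self.vector += strength * direction

    def step(self):
        # Between stimuli the emotion decays gradually back toward the neutral state.
        self.vector *= self.decay
        return self.vector

state = EmotionState()
state.apply_stimulus(direction=(-0.5, 1.0), strength=1.5)  # sharp tap: abrupt shift toward "surprise"
for _ in range(3):
    print(state.step())                                    # magnitude fades without new input
```

Under this kind of scheme, the robot's current expression would be read off the vector's direction and magnitude, so identical stimuli arriving in different emotional contexts naturally produce different outward responses.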
Professor Lee remarked, “Unlike traditional robots that display predetermined responses, our model captures the flow of emotional change, making the robot feel more like a living entity.” He added, “This has significant implications for applications such as companion robots and emotional support systems.”
The study was led by Haeun Park, a doctoral student and first author of the publication. The research was accepted for presentation at the 2025 IEEE International Conference on Robotics and Automation (ICRA), the world’s premier event dedicated to advancing the field of robotics, held in Atlanta, USA, on May 21, 2025. Funding was provided by the Ministry of Trade, Industry and Energy (MOTIE).
Reference
Haeun Park, Jiyeon Lee, and Hui Sung Lee, “Adaptive Emotional Expression in Social Robots: A Multimodal Approach to Dynamic Emotion Modeling,” 2025 IEEE International Conference on Robotics and Automation (ICRA 2025), May 2025.