The Flower
date. 2025
media. 3D-printed parts / electronic components / custom circuitry and software
type. interactive installation / robotics / real-time machine learning
The Flower explores human social interaction and emotional states such as loneliness, recognition, and social pressure. By embedding a psychological model in an artificial organism, I seek to evoke deep emotional engagement, allowing audiences to form empathic connections with a mechanical entity powered by artificial intelligence. The piece responds to audience presence and emotion through facial-expression detection: the flower may become excited and bloom happily, or grow nervous, its “breathing” quickening, as people interact with it. Left alone, it withers in solitude.
Technically, the viewer’s facial expressions are captured by a camera and classified by a deep neural network. The classification results feed a human psychological emotion model (PAD: Pleasure–Arousal–Dominance), which simulates the artificial organism’s emotional responses. These synthetic emotions drive a motion synthesizer that controls 15 motors, producing complex, emotionally expressive movements. The system is written in Python on the ROS platform and uses several computer vision libraries, including OpenCV, Dlib, and DeepFace, for its visual tasks. Five microcontrollers running custom firmware handle low-level motion control to keep the movements smooth.
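The core of this pipeline, mapping classifier output into PAD space and letting the state drift toward loneliness when no one is present, can be sketched as below. This is a minimal illustration, not the installation's actual code: the PAD anchor coordinates are illustrative values loosely following the emotion-psychology literature, and the loneliness baseline and smoothing factor are assumptions.

```python
# Sketch: blend facial-emotion classifier scores into a PAD
# (Pleasure–Arousal–Dominance) state that drives the motors.

# Illustrative PAD anchors per detected emotion (assumed values).
EMOTION_TO_PAD = {
    "happy":   ( 0.8,  0.5,  0.4),
    "sad":     (-0.6, -0.3, -0.3),
    "angry":   (-0.5,  0.6,  0.3),
    "fear":    (-0.6,  0.6, -0.4),
    "neutral": ( 0.0,  0.0,  0.0),
}

# State the flower drifts toward when no face is detected (assumed).
LONELY_BASELINE = (-0.4, -0.5, -0.3)

def pad_from_scores(scores):
    """Confidence-weighted average of PAD anchors for one camera frame."""
    total = sum(scores.values()) or 1.0
    p = a = d = 0.0
    for emotion, weight in scores.items():
        ep, ea, ed = EMOTION_TO_PAD.get(emotion, (0.0, 0.0, 0.0))
        p += weight * ep
        a += weight * ea
        d += weight * ed
    return (p / total, a / total, d / total)

def step(state, scores, alpha=0.1):
    """Ease the internal PAD state toward the frame's target; with no
    detected face, decay toward the loneliness baseline instead."""
    target = pad_from_scores(scores) if scores else LONELY_BASELINE
    return tuple(s + alpha * (t - s) for s, t in zip(state, target))
```

In a real loop, `scores` would come from a per-frame emotion classifier (e.g. DeepFace's emotion scores) and the smoothed PAD triple would be handed to the motion synthesizer, so the motors never jump between emotional states.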
Inspired by animism and theories of plant perception and consciousness, the artwork challenges traditional notions of sentience and interaction. It invites viewers to reflect on the complexities of human relationships, the emotional capacities of artificial life, and the way emotion shapes our perception of both synthetic and real-world connections.


