2025
Perceptron// EX TIME
Living computer that perceives Time
Our installation is about how humans perceive life, time, and themselves within it—and how technology can serve as a mediator in this process.

Our central image and conceptual foundation is a “Living Neural Network” that forms a biological computer. We recreated Rosenblatt’s Perceptron, the pioneering neural-network technology, in which the nodes contain not mechanical mathematical units but living beings: humans and slime molds.

The neural network, the Perceptron, perceives and recognizes biological events from the surrounding world and makes decisions according to a specific algorithm. Rosenblatt was originally inspired by the workings of our nervous system: he developed a mathematical model patterned after the function of neurons in the human brain. This gave rise to the intellectual landscape in which we live today, where the image of the computer and the metaphor of computability have become intertwined with our understanding of living forms. Moreover, when explaining aspects of human behavior, motivation, or life, we reach for machine-based analogies drawn from cybernetics, computer science, neuroscience, and artificial intelligence. Our aim is to invert this narrative and challenge that metaphor.

In our Perceptron’s layers there are no standard mathematical models; instead, there are slime molds and people. Together they form the input and transformation layers of the data, creating a rhizome of nodes and strands. Slime molds feed the rhythms of their biochemical processes into the Perceptron, while humans contribute their pulses. All of this flows into the nodes, where our algorithm decides when an Event occurs. Accumulated small events lead to larger ones, which the algorithm detects. It then relays this information to a human interface, voicing and vibrating these impulses like the asynchronous beating of the Machine’s Heart.

Our work draws on Henri Bergson’s concept of the living and of time, as well as Gilbert Simondon’s idea that machines can serve as media for human contact with the world, shaping their own perspectives and language. Humanity, in turn, must learn to discern this language, grant technology more space and trust, allow it to manifest its otherness, and redefine how humans and machines interact.

In our interpretation, the Perceptron perceives living time, the time present within living creatures. Its operation reflects temporality and change, and its interaction with humans is conveyed at the bodily level (through vibration, sound, and media visualization that resonate with the events), enabling one to bypass purely rational, computational interpretation and enter a meditative, intuitive plane of perception.

To experience this, we invite participants not only to observe how the slime molds transmit their impulses to the system at several stations, but also to pick up one of the several small hearts, each of which uses a photoplethysmographic sensor to send the participant’s pulse into the system. Finally, participants can immerse themselves inside the large machine-heart, where all the data converge and decisions are made, and listen to and feel this beating of time that the machine transmits to humans. To sense living time.

At the same time, we make the technology visible and endow it with agency and the ability to perceive time and the living intuitively. With the help of AI, we teach the Machine to predict the next event, and we render this expectation as vibration inside the big heart. Thus, the person inside hears both the time outside and, from within, the machine’s own intuitive expectation and its attempt to understand that time.
Software
Our system uses a layered signal modulation algorithm to detect moments of resonance and synchronization. Modulators compare incoming signals to a median threshold, registering a signal as "1" when the threshold is exceeded. Values update every 10 ms, and state changes are transmitted via the OSC protocol. Synchronization amplifies signals, while dissonance weakens them. Detected events drive the machine’s pulse generation, the ambient sound, and the visualization. The system learns from past events, developing an “intuition” that anticipates new ones.
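The paragraph above maps onto a small firmware loop. The sketch below is illustrative only, assuming an ESP32 modulator using the CNMAT OSCMessage Arduino library; the pin, network addresses, OSC path, and the cheap streaming-median estimator are our assumptions, not the installation’s actual code.

```cpp
// Hedged modulator sketch (ESP32/Arduino): read one analog signal, threshold
// it against a running median estimate, and send the binary state over OSC
// only when it changes. All constants below are illustrative assumptions.
#include <WiFi.h>
#include <WiFiUdp.h>
#include <OSCMessage.h>

WiFiUDP udp;
const IPAddress host(192, 168, 0, 10);   // receiving machine (assumed)
const uint16_t port = 9000;              // OSC port (assumed)
const int sensorPin = 34;                // ADC pin (assumed)

float median = 2048.0f;                  // running median estimate (12-bit ADC midpoint)
int lastState = 0;

void setup() {
  WiFi.begin("network", "password");     // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(100);
  udp.begin(port);
}

void loop() {
  int raw = analogRead(sensorPin);

  // Cheap streaming median: nudge the estimate toward each new sample.
  median += (raw > median) ? 1.0f : -1.0f;

  int state = (raw > median) ? 1 : 0;    // register "1" above the median threshold
  if (state != lastState) {              // transmit state changes only
    OSCMessage msg("/modulator/state");  // hypothetical OSC address
    msg.add((int32_t)state);
    udp.beginPacket(host, port);
    msg.send(udp);
    udp.endPacket();
    msg.empty();
    lastState = state;
  }
  delay(10);                             // values update every 10 ms
}
```

Sending only on state changes keeps the 10 ms update loop from flooding the network, which is consistent with the description above.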

Use of AI
A “machine time intuition” algorithm predicts event occurrences by analyzing past data. Our perceptron, with human and myxomycete modulators as neurons, processes signals based on biological rhythms, forming nonlinear, subjective time instead of fixed machine time. This approach explores subjective AI, where machines gain an adaptive perception of time.
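The text does not specify the prediction model, so the following is a minimal, hedged reading of “machine time intuition”: keep a smoothed estimate of the interval between past events and project the next event from it. The exponential smoothing and all constants are illustrative choices, not the installation’s actual model.

```cpp
// Hedged sketch of a "machine time intuition": estimate the typical interval
// between past events and predict when the next one should occur.
#include <cstdio>

struct TimeIntuition {
  double meanInterval = 0.0;  // smoothed inter-event interval, seconds
  double lastEvent = -1.0;    // timestamp of the previous event
  double alpha = 0.2;         // smoothing factor: higher = faster adaptation

  void observe(double t) {
    if (lastEvent >= 0.0) {
      double interval = t - lastEvent;
      meanInterval = (meanInterval == 0.0)
                         ? interval  // first interval seeds the estimate
                         : alpha * interval + (1.0 - alpha) * meanInterval;
    }
    lastEvent = t;
  }

  // Expected timestamp of the next event; this is what would drive the
  // anticipatory vibration inside the big heart.
  double predictNext() const { return lastEvent + meanInterval; }
};

int main() {
  TimeIntuition intuition;
  const double events[] = {0.0, 52.0, 110.0, 161.0, 215.0};  // toy timestamps
  for (double t : events) intuition.observe(t);
  std::printf("next event expected near t = %.1f s\n", intuition.predictNext());
}
```

Because the estimate adapts to whatever rhythm the living modulators produce, the predicted next event drifts with them rather than following a fixed machine clock.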

Hardware
The ESP32-based modulators process signals from humans and myxomycetes. Human modulators are 3D-printed biodegradable hearts with pulse sensors. Myxomycete modulators consist of glass chambers with petri dishes that record electrical activity through electrodes. Myxomycetes pulse every 45–120 seconds, altering their membrane potential through Ca²⁺ consumption. All pulses are transmitted via Wi-Fi, triggering an LED visualization. The machine pulse, synthesized from this data, manifests as ambient sound, projection, and fabric vibrations inside the heart-machine.
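As a hedged sketch of a myxomycete modulator’s firmware: track a very slow baseline of the electrode signal so that pulses on the 45–120 s scale stand out, then flag each excursion above the baseline as a pulse. Pins, thresholds, and the sampling rate are assumptions; Wi-Fi transmission would follow the same OSC pattern as the sketch in the Software section.

```cpp
// Hedged firmware sketch for a myxomycete modulator (ESP32): sample the
// electrode pair, detect slow membrane-potential pulses (every 45-120 s),
// and flash an LED on each detected pulse. Constants are illustrative.
#include <Arduino.h>

const int electrodePin = 35;     // ADC pin wired to the petri-dish electrodes (assumed)
const int ledPin = 2;            // LED visualization pin (assumed)

float baseline = 0.0f;           // slow-moving baseline of electrical activity
bool inPulse = false;

void setup() {
  pinMode(ledPin, OUTPUT);
  baseline = analogRead(electrodePin);
}

void loop() {
  float v = analogRead(electrodePin);

  // Track the baseline very slowly so that 45-120 s oscillations stand out.
  baseline += 0.001f * (v - baseline);

  bool above = (v - baseline) > 50.0f;   // pulse threshold in ADC counts (assumed)
  if (above && !inPulse) {
    inPulse = true;
    digitalWrite(ledPin, HIGH);          // visualize the pulse
    // ...transmit the pulse over Wi-Fi here (see the OSC sketch above)...
  } else if (!above && inPulse) {
    inPulse = false;
    digitalWrite(ledPin, LOW);
  }
  delay(100);                            // 10 Hz sampling is ample for such slow rhythms
}
```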