MIMESIS

REAL-TIME A/V INSTALLATION
2020

Mimesis is a large-scale installation that takes inspiration from the biological evolution of life on the planet while investigating the influence of human action on Earth. It explores a positive vision of what can be preserved, changed, or improved in our ecosystem, and the consequences of our practices on different habitats. Since the beginning of our history, we as humans have tried to imitate and reproduce aspects of reality in order to create a meaningful representation of everything that surrounds us. How can we represent the scenario ahead of us, our future planet? Through this work we try to reveal how the future will be shaped by the way we interact with the environment. To do this, we created a generative system that collects and processes data from the local environment where the installation is staged. These data constantly affect the visual and sonic outcome, creating an audio-visual representation of the future environment we will inhabit.

Mimesis, here, is understood not merely as reproduction or an illusory image of nature, but as a manifestation of it. Through the real-time integration of atmospheric measurement data, nature appears in the digital image intensified in its processuality and unpredictability. Thus, Mimesis not only offers an unprecedented view of aspects of nature that usually escape sight, but also raises awareness of the effects of human action on the environment.

 

PROCESS

The main theme around which the narration develops is primary ecological succession: the process that occurs with the formation of new land and the exposure of a new habitat that can be colonized for the first time, as happens after volcanic eruptions or violent floods. The soil changes as organic matter is deposited, and the number and diversity of species in the ecosystem gradually increase.

The succession is often subdivided into stages in which different species establish themselves according to the particular conditions: first come pioneer species such as mosses, algae, fungi, and lichens, followed by herbaceous plants, then grasses and shrubs, which precede the appearance of more robust, shade-tolerant trees. At each stage, new species move into an area, often because of changes to the environment made by the preceding species, and may replace their predecessors. At some point, the community may reach a relatively stable state and stop changing in composition. We can consider this process a gradual growth of an ecosystem over a long period of time.

In Mimesis this concept is addressed and investigated through a digital reinterpretation. We outlined five chapters, each corresponding to a phase of the succession, which together condense the ecological succession into a one-hour process that tries to create a new form of life from the simplest possible conditions.
These conditions change continuously throughout the day, since they mirror the climatic conditions of the city where the installation is staged. The information retrieved from the sensors is used to create a biome, a large naturally occurring community of flora and fauna occupying a major habitat. Both the visual and the audio part of each chapter are generated in real time from the live-extracted data. The initial configuration of the local climate conditions leads the ecological process down different paths, while the unpredictability of the weather constantly modifies the audiovisual outcome.
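fuse* has not published the details of this data pipeline; as a minimal sketch of the idea, the following assumes a hypothetical weather endpoint and illustrative field names and ranges, and shows how live readings could be normalized into parameters that steer the generative biome.

```python
# Sketch only: endpoint, field names, and parameter ranges are
# illustrative assumptions, not fuse*'s actual pipeline.
import requests

WEATHER_URL = "https://api.example.com/current"  # hypothetical endpoint

def read_weather(lat: float, lon: float) -> dict:
    """Fetch current conditions for the city where the installation is staged."""
    resp = requests.get(WEATHER_URL, params={"lat": lat, "lon": lon}, timeout=10)
    resp.raise_for_status()
    return resp.json()  # e.g. {"temp_c": 21.4, "humidity": 0.62, "wind_ms": 3.1}

def normalize(value: float, lo: float, hi: float) -> float:
    """Clamp and scale a reading into the 0..1 range used by the generators."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def biome_parameters(weather: dict) -> dict:
    """Map raw readings onto controls that drive the visuals and the sound."""
    return {
        "growth_rate": normalize(weather["temp_c"], -10, 40),
        "vegetation":  normalize(weather["humidity"], 0.0, 1.0),
        "turbulence":  normalize(weather["wind_ms"], 0, 25),
    }
```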


The digital simulations try to emulate different natural phenomena and patterns; to do so, several algorithms are used in combination with one another. The visual outcome is also shaped by software built around Generative Adversarial Networks (GANs), a machine learning architecture particularly suited to generating synthetic images. A dedicated GAN is used to generate new artificial imagery for each theme of the ecological succession. The result of this process is fed into the software and constantly affects the final visual result.
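The text does not specify the generator architecture or how its output is sequenced; a common way to feed a pretrained GAN into a real-time pipeline is a slow latent-space walk, so the synthetic imagery drifts rather than jumps. The checkpoint name and latent size below are assumptions for illustration.

```python
# Sketch: sample a pretrained generator along a drifting latent path.
import torch

G = torch.jit.load("generator_desert.pt")  # hypothetical per-theme checkpoint
G.eval()

LATENT_DIM = 512  # assumed latent size

def latent_walk(steps: int, drift: float = 0.05):
    """Yield a slowly drifting latent vector for smooth visual evolution."""
    z = torch.randn(1, LATENT_DIM)
    for _ in range(steps):
        z = (1.0 - drift) * z + drift * torch.randn(1, LATENT_DIM)
        yield z

with torch.no_grad():
    for z in latent_walk(steps=10):
        frame = G(z)  # image tensor handed on to the visual engine
```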
The generation of new imagery is only one part of the software that produces the visual outcome. The same weather data fuse* collects also influences the audiovisual result by changing the way the digital landscapes evolve throughout the day.

The meteorological data also influence the content on a longer time scale over the year: each phase of the ecological succession is represented by different scenarios for every season. As an example, in the first phase a series of inhospitable terrains is generated by the AI software; during summer the generation revolves around desert landscapes, which transform into canyons, glaciers, and rocks through the different seasons. All the smaller changes in the parameters during the day affect the result at a lower hierarchical level. The same concept is then applied to every other chapter of the succession. For this reason the visual outcome differs greatly from one season to the other, a choice that also reflects the fact that life can emerge from a diverse set of conditions.
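The seasonal layer can be pictured as a simple lookup from season to terrain theme. Only the summer/desert pairing is stated in the text; the other season-theme assignments below are assumptions made to complete the example.

```python
# Illustrative season-to-theme mapping for the first chapter.
from datetime import date

PHASE_ONE_THEMES = {
    "spring": "canyons",   # assumed
    "summer": "deserts",   # stated in the text
    "autumn": "rocks",     # assumed
    "winter": "glaciers",  # assumed
}

def season_of(day: date) -> str:
    """Rough northern-hemisphere season derived from the month."""
    return {12: "winter", 1: "winter", 2: "winter",
            3: "spring", 4: "spring", 5: "spring",
            6: "summer", 7: "summer", 8: "summer",
            9: "autumn", 10: "autumn", 11: "autumn"}[day.month]

theme = PHASE_ONE_THEMES[season_of(date.today())]
```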

Mimesis - Seasonal Variation

Mimesis - Digital Ecological Succession

 

SOUNDTRACK

The musical component of the installation is a generative system created with Max/MSP and integrated into Ableton Live. For each phase of the digital ecological succession, a soundscape was created by modeling, in real time, sound recordings of natural elements. Note generation starts with the selection of the scale and tonality, which can be ordered by a criterion of color/expressivity, from the darkest texture to the most brilliant one; every mode has its own color. The selection of the scales changes with the season, starting from lighter tones in spring and summer and moving toward darker, colder colors in autumn and winter. Every note in the list has its own weight, i.e. its probability of being played; this percentage can vary over time or change based on the collected data, and the octave in which the note is played follows the same principle, as in the sketch below.
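The actual weights live inside the Max/MSP patch; as a minimal sketch, the scale, the weight values, and the data-driven reweighting rule below are illustrative assumptions showing how weighted note and octave selection can work.

```python
# Sketch of weighted note/octave selection driven by collected data.
import random

scale = ["D", "E", "F", "G", "A", "B", "C"]            # example mode (D dorian)
note_weights = [0.25, 0.10, 0.15, 0.20, 0.15, 0.05, 0.10]
octave_weights = {3: 0.2, 4: 0.5, 5: 0.3}

def reweight(weights, humidity):
    """Example of data-driven variation: humid air favors lower degrees."""
    biased = [w * (1.5 - i / len(weights) * humidity)
              for i, w in enumerate(weights)]
    total = sum(biased)
    return [w / total for w in biased]

def next_note(humidity: float) -> str:
    degree = random.choices(scale, weights=reweight(note_weights, humidity))[0]
    octave = random.choices(list(octave_weights),
                            weights=list(octave_weights.values()))[0]
    return f"{degree}{octave}"

print([next_note(humidity=0.6) for _ in range(8)])
```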
The generated notes control the intonation of the filter of a physical modeling instrument, thus shaping the original sound source. Physical modeling synthesis refers to methods in which the waveform of the generated sound is computed using a mathematical model: a set of equations and algorithms capable of simulating a physical sound source, usually a musical instrument. The parameters used for the modeling can change over time based on the collected data or on values extracted from the visual generative process.
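The text does not say which physical model fuse* uses; as a minimal illustration of the technique itself, here is Karplus-Strong plucked-string synthesis, a classic model in which the waveform is computed by a delay line with an averaging lowpass filter in its feedback path. A parameter such as the damping could be the kind of value modulated by the collected data.

```python
# Karplus-Strong plucked string: a minimal physical modeling example.
import numpy as np

def karplus_strong(freq: float, duration: float, sr: int = 44100,
                   damping: float = 0.996) -> np.ndarray:
    """Synthesize a plucked-string tone at `freq` Hz."""
    n = int(sr * duration)
    delay = int(sr / freq)                 # delay-line length sets the pitch
    buf = np.random.uniform(-1, 1, delay)  # noise burst = the "pluck"
    out = np.empty(n)
    for i in range(n):
        out[i] = buf[i % delay]
        # averaging two adjacent samples models the string's energy loss
        buf[i % delay] = damping * 0.5 * (buf[i % delay] + buf[(i + 1) % delay])
    return out

tone = karplus_strong(freq=220.0, duration=2.0)  # data could modulate damping
```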

Production: fuse*
First Commission: JL Design
Artistic Direction: Mattia Carretti, Luca Camellini
Concept: Mattia Carretti, Luca Camellini, Samuel Pietri, Riccardo Bazzoni
Software Artists: Luca Camellini, Samuel Pietri
Sound Design: Riccardo Bazzoni
Hardware Engineering: Matteo Mestucci
Support to Concept Writing: Saverio Macrì

 

Selected Exhibitions
Transart Festival / Bozen, IT
Fjord Oslo Light Art Festival / Oslo, NO