Quassine is an artist based in Bordeaux whose work is influenced by Fluxus. She ventures into every area of multimedia expression with an experimental, discovery-driven approach that lets her map new territories, explored through machines of her own making. The interlacing of analog and digital is always at the center of her research, an attempt to reconcile the Nature that nourishes her with the technological world that devours her. To that end she creates visual and sound interfaces with sometimes unpredictable results, giving electronic circuits the central place in the creative process. Her performances stage these devices in a constant oscillation between immediacy, pre-recorded tracks and repeated loops. The intention is one of trance, introspection, dizziness and déjà vu: a spontaneous experiment painted on a canvas woven from the threads of memory.
Website:
Proposal:
I have always been drawn to technology and computers as a way to express parts of my psyche that are otherwise unreachable to me. Growing up on films like eXistenZ and Ghost in the Shell, I always hoped for a relationship with machines and AI in which we would let them acquire a kind of sensibility drastically different from our own. In my work I put the circuit in the central place: my role as a performer is first to choose the types of interaction and select parameters, then to let the machine run its course. My project is to create an instrument for controlling sound and image that would be a silicon-based « lifeform » with emotions and moods of its own, influencing the outcome of the art produced through its interactions with the user.

Proof of concept

I have a prototype of a MIDI controller integrating different types of sensors: force-sensitive resistors, joysticks, temperature, etc. I use velostat to make custom force-sensitive sensors, covered with foam and wool to give them a soft feel. The controller is based on a Teensy LC microcontroller that smooths the incoming sensor data and translates it into MIDI control changes. These are routed into Praxis Live ( https://www.praxislive.org/ ), which lets me change parameters on a sound script and a video script with a webcam input. The same MIDI messages could be used in other software (Ableton Live, TouchDesigner...).

Work in progress

The finished controller will be covered with silicone and resemble a lifeform more than a machine. Since the sensors are not clearly visible, the user will have to experiment with them and accept that the machine will not always give the expected response. Instead of controlling the machine per se, the user discovers it, learning its ways in order to improvise with it.
I plan to experiment with graphite-loaded silicone to create stretchable resistive sensors that would run through the soft machine like a limbic system. Depending on the results, this could open up the possibility of a wearable, glove-like version. I would also like to produce software dedicated to the instrument, probably using Processing (or perhaps Unity), to create a unique experience for the user. Every session would be different due to the nature of the controller and the influence of data gathered from the environment (temperature, humidity, barometric pressure...).

Further explorations

My final goal is to produce a controller in the form of a large cylinder or a hole into which the user can insert their hand and arm. Most instruments are meant to be held, surrounded, even slightly entered (the mouthpiece of a saxophone, for example). This instrument would be played by penetration. I enjoy feeling pressure on my body and I feel it would be a great way to make music and video art. Working with disabled persons in my day job, I also care deeply about accessibility, and I would like the controller to be usable by people with very limited motor skills.
Comments