Imagine if you could turn your memories and emotions into compelling, abstract paintings. It’s basically every artist’s dream.
A London-based creative technology studio, random quark, has found a way to visually represent emotions by scanning people’s brain activity to create awe-inspiring paintings.
In a dimly lit, quiet room, the company fits participants with commercial EEG headsets and asks them to close their eyes and think of an emotionally charged memory, happy or sad.
As the device scans the electrical activity of the left and right sides of the brain, it produces a dataset that constitutes a unique snapshot of the individual's memory and mood at that moment.
But this deluge of information then needs to be translated onto the canvas as a unique piece of art.
Random quark’s Theodoros Papatheodorou and Tom Chambers adopted a technique from generative art: a flocking algorithm inspired by the movement of bird flocks and schools of fish.
“Some rules of the swarm rely on stochastic/random decisions and therefore are unique,” Papatheodorou said. “At the same time, keeping some rules the same we managed to create a uniform visual output. Basically, all look like they were made by the same painter, on a different day.”
Flocking algorithms also make it possible to use the massive amount of data that comes out of the brain during the EEG session, without reducing it to a few inputs.
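Random quark has not published its code, but the classic flocking ("boids") rules the studio alludes to are well known: each particle steers toward the flock's centre (cohesion), matches its neighbours' heading (alignment) and pushes away from very close neighbours (separation), plus a stochastic term that makes every run unique. A minimal sketch, with all parameter values chosen for illustration only:

```python
import numpy as np

def flock_step(pos, vel, cohesion=0.01, alignment=0.05,
               separation=0.05, noise=0.1, rng=None):
    """One update of a minimal boids-style swarm.

    pos, vel: (N, 2) arrays of particle positions and velocities.
    The stochastic `noise` term is what makes each run unique,
    echoing the "random decisions" Papatheodorou describes.
    """
    if rng is None:
        rng = np.random.default_rng()
    centre = pos.mean(axis=0)        # cohesion: steer toward the flock centre
    mean_vel = vel.mean(axis=0)      # alignment: match the flock's mean heading
    diff = pos[:, None, :] - pos[None, :, :]
    dist = np.linalg.norm(diff, axis=-1) + 1e-9
    repel = (diff / dist[..., None] ** 2).sum(axis=1)  # separation: push off close neighbours
    vel = (vel
           + cohesion * (centre - pos)
           + alignment * (mean_vel - vel)
           + separation * repel
           + noise * rng.standard_normal(pos.shape))
    return pos + vel, vel
```

Calling `flock_step` once per frame and drawing each particle at its new position is what turns individual random trails into coherent, brush-stroke-like marks.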
To determine what people are feeling, random quark relies on a theory called lateralization of emotion, which holds that activity in the left side of the brain is associated with positive feelings, while increased activity on the right is linked to negative ones.
“We measure the asymmetry between left/right hemispheres as well as the overall activation of the brain (alpha/beta/gamma waves) and we plot this data in a 2D valence-activation graph which is known as the Geneva Emotions wheel where all the human emotions are plotted,” Papatheodorou said.
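The studio's exact formula is not public, but the asymmetry measure Papatheodorou describes is commonly computed as a log-ratio of band power between hemispheres. A hedged sketch of one plausible mapping from band powers to a (valence, activation) point, with the weights purely illustrative:

```python
import math

def emotion_coordinates(left_alpha, right_alpha, band_powers):
    """Map EEG band powers to a (valence, activation) point.

    An illustrative sketch, not random quark's published formula.
    Alpha power is inversely related to cortical activity, so more
    right-hemisphere alpha implies less right activity, i.e. a lean
    toward positive affect under the lateralization theory.
    """
    # log-ratio asymmetry: > 0 leans positive, < 0 leans negative
    valence = math.log(right_alpha) - math.log(left_alpha)
    # crude overall activation index: faster bands weighted more
    activation = (band_powers["alpha"]
                  + 2 * band_powers["beta"]
                  + 3 * band_powers["gamma"])
    return valence, activation
```

The resulting point can then be placed on a 2D valence-activation chart such as the Geneva Emotion Wheel the quote refers to.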
For the purpose of the experiment, random quark filtered and reduced the emotions to seven major ones (joy, sadness, anger, love, disgust, fear and surprise) and measures only those, giving each a score.
The feelings are ranked by intensity, with a confidence level for each one. The system then picks only the top two emotions and proportionally assigns a unique shade of colour to each particle in the system.
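The selection step above can be sketched in a few lines. The palette below is an assumption based on the common symbolism the article mentions (red for anger, blue for sadness and so on); the studio's actual colours and blending may differ:

```python
import random

# Hypothetical palette based on common colour symbolism,
# not random quark's actual choices.
PALETTE = {"joy": (255, 200, 0), "sadness": (40, 80, 200),
           "anger": (220, 30, 30), "love": (230, 80, 160),
           "disgust": (90, 160, 60), "fear": (90, 40, 120),
           "surprise": (250, 140, 40)}

def assign_colours(scores, n_particles, rng=None):
    """Keep only the two highest-scoring emotions and colour the
    particle population in proportion to their confidence scores."""
    if rng is None:
        rng = random.Random()
    top_two = sorted(scores, key=scores.get, reverse=True)[:2]
    total = sum(scores[e] for e in top_two)
    weights = [scores[e] / total for e in top_two]
    return [PALETTE[rng.choices(top_two, weights)[0]]
            for _ in range(n_particles)]
```

With a dominant joy score, most of the 100,000 particles end up in joy's shade, with the runner-up emotion tinting the rest.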
“Around 100,000 agents are released on the canvas and their movements are guided by a swarm algorithm that we have written which is partly affected by the raw EEG data,” Chambers said.
“One particle leaves a random trail, but when you have 100, 1000, 10,000, 100,000 particles you start to have a painting that looks like brush strokes, because these particles kind of coalesce together, they break apart and they interact with each other to make the painting.”
How the swarm travels across the canvas is directly linked to the raw brainwaves as they are read, in effect shaping the patterns on the "paper", Papatheodorou says.
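One hypothetical way a raw signal could shape a stroke, purely to illustrate the idea (the studio has not published its mapping): feed each EEG sample into a particle's turning angle, so the brainwave's shape bends the trail it leaves behind.

```python
import math

def trail(start_x, start_y, eeg_samples, step=1.0):
    """Trace one particle's path, bending its heading by the raw
    EEG signal. The 0.1 gain is an arbitrary illustrative choice."""
    x, y, heading = start_x, start_y, 0.0
    points = [(x, y)]
    for sample in eeg_samples:
        heading += 0.1 * sample      # sample amplitude steers the stroke
        x += step * math.cos(heading)
        y += step * math.sin(heading)
        points.append((x, y))
    return points
```

A flat signal yields a straight mark; a lively one curls and loops, which is how raw data can leave a visible fingerprint on the painting.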
The colour associations are arbitrary, although based on common symbolism (red for anger, blue for sadness and so on), and the authors stress the artistic, not scientific, nature of their work.
The project started when Saatchi Wellness, a creative agency, asked random quark to find a way to represent emotions in an intuitive and accessible way.
The paintings have been exhibited in a gallery in London. Papatheodorou and Chambers also used similar emotion-sensing techniques for the Saatchi Wellness website, where the duo scans the Twittersphere and extracts the emotional state of the world using machine learning.
“We process random tweets in real-time and we measure their emotional state using IBM's Watson to measure basic feelings. We then use these parameters about how the world feels to guide the swarm you see on their website.”
Random quark sees the brainwave paintings as an experiment in human-computer interaction. “What we see as the future of computing is a way of giving computers context so they can make better decisions. When you talk to Alexa or Google Home, if they can understand how you’re feeling it can engage with you in a more intelligent way,” Chambers said.
In the future, computers will be able to understand more context, such as emotions, which will allow them to become more adaptive, with applications like adjusting your home environment to suit your mood when you come home from work. It could also change storytelling, making it more interactive so that a story adapts to your reactions as it is being told.
“Rather than writing a book or a song, artists in the future will likely write programs that generate them uniquely for each read or listen based on all the cues it takes in,” Papatheodorou said.