An exploration of the interplay between the virtual and the physical: manipulating digital objects with physical sensations and emotions. I’m interested in researching how to better represent “unseen” internal experiences in the external world (as objects in virtual or physical reality), so they can be more readily perceived and shared with others as a common language. This line of research could pave the way for technology that assists us toward greater connection and collective empathy.
In this experiment, I wear my EEG device (Muse S) to capture my brainwaves in real time. The headset streams data over OSC into my Max software, where I process the signal so that the more relaxed my state, the faster the object distorts.
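The core mapping can be sketched outside of Max as well. The snippet below is a hypothetical illustration (not the actual Max patch): it assumes a normalized relaxation score derived from an EEG band power (Muse exposes band powers such as alpha over OSC), smooths it, and maps it linearly to a distortion speed, so higher relaxation means faster morphing. All function names and parameter ranges here are invented for illustration.

```python
# Hypothetical sketch of the relaxation -> distortion-speed mapping
# described above; the real processing lives in a Max/MSP patch.

def smooth(prev, new, alpha=0.2):
    """Exponential moving average to tame a noisy EEG band-power signal."""
    return (1 - alpha) * prev + alpha * new

def relaxation_to_speed(score, min_speed=0.1, max_speed=5.0):
    """Linearly map a normalized relaxation score (0..1) to a morph speed."""
    clamped = max(0.0, min(1.0, score))
    return min_speed + clamped * (max_speed - min_speed)

# Example: a rising relaxation reading (eyes closing) speeds up the morph.
reading = 0.0
for sample in [0.1, 0.4, 0.8, 0.9]:
    reading = smooth(reading, sample)
    speed = relaxation_to_speed(reading)
```

In a live setup, `sample` would come from the OSC stream and `speed` would drive the distortion parameter; the smoothing step keeps the visual from jittering with every raw EEG fluctuation.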
In the video below, you can see that when I close my eyes, the shape starts to morph rapidly:
Next, I’d like to explore different directions or holds the shape can take, depending on whether I am in a meditative or an attentive state. Specific shapes could also be personalized to individuals (a kind of “soul visualization”), but that would require incorporating machine learning models, along with more training data and classification work to calibrate.
Original tweet published here. Max/MSP object inspired by a tutorial from Amazing Max Stuff.