Brain-Computer Artifact

Making brain waves tangible

Date

2022

Duration

4 weeks

Project Type

Interactive Installation

Keywords

BCI, EEG, Unity VFX, procedural mesh

Role

Research, Ideation, Prototyping

Team

Hyejun Youn, Alejandro Medina

We are proposing an interactive art installation that materializes sleep into spatial artifacts and invites the audience to experience the process. EEG data from a sleeping participant is visualized, and the trace of that visualization is materialized as a procedural mesh. The spatial artifacts then form part of a larger archive in which the visual language of each artifact speaks to the sleep identity of its person. As the archive grows, the complex abstractions have the potential to gain legibility.

Inspiration

Three wooden relief maps of the East Greenland coast have functioned as a storytelling device, showing how the Tunumiit cognitively organized their world. These maps could be placed next to one another to demonstrate the relative positions of the islands along the coast.

Technical Pipeline

There are five main aspects to this technical pipeline: EEG data collection and mapping, procedural mesh generation, real-time visual effects, visualization in virtual reality, and 3D printing. 



First, the Neurosity Crown EEG sensor collects and transmits data over WiFi to a computer in real time. Through a modified Neurosity SDK (the "Notion Unity SDK"), the real-time EEG data is fed into Unity. The Delta power band is then accessed through the SDK, and each of its eight channel streams maps to a starting point in the mesh-generation process. Each participant received a unique set of starting points through a randomized selection, so that the base shapes of the meshes differ from person to person but remain legibly similar for the same person. It is worth noting that we took creative control in individualizing the form of the meshes; in an ideal scenario, the EEG "fingerprint" itself would drive the mesh generation so that the correlation between form and data is less arbitrary.
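As a concrete illustration, here is a minimal C# sketch of the channel-to-seed mapping. The Notion Unity SDK's actual callback names and signatures are not documented here, so OnDeltaPowerByChannel is a hypothetical stand-in for however the SDK delivers the eight-channel Delta values; EegSeedMapper, participantId, and onSeedImpulse are likewise illustrative names.

```csharp
using UnityEngine;

// Sketch of the channel-to-seed mapping described above. OnDeltaPowerByChannel
// is a hypothetical hook standing in for the SDK's real Delta-band callback.
public class EegSeedMapper : MonoBehaviour
{
    public MeshFilter target;                        // sphere being morphed
    public string participantId = "participant-a";   // fixes the random seed
    public System.Action<int, float> onSeedImpulse;  // (vertex index, delta power)

    private int[] seedVertices = new int[8];         // one seed per EEG channel

    void Start()
    {
        // Pick eight starting vertices with a participant-seeded RNG so the
        // same person always gets the same base shape across sessions.
        int vertexCount = target.mesh.vertexCount;
        var rng = new System.Random(participantId.GetHashCode());
        for (int ch = 0; ch < 8; ch++)
            seedVertices[ch] = rng.Next(vertexCount);
    }

    // Hypothetical entry point: called with the eight-channel Delta band
    // values each time the SDK delivers a new sample.
    public void OnDeltaPowerByChannel(float[] delta)
    {
        for (int ch = 0; ch < 8 && ch < delta.Length; ch++)
            onSeedImpulse?.Invoke(seedVertices[ch], delta[ch]);
    }
}
```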



Second, the procedural mesh generation is implemented through a C# script in Unity. The script manipulates the vertices of a default Unity sphere mesh, translating them according to three parameters: force, duration, and effect radius. Force controls how far a vertex is translated along the direction of its normal; duration is the time the translation takes; effect radius determines how many neighboring vertices are dragged along. A Gaussian falloff produces a smooth curvature as the mesh morphs.
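A minimal sketch of this morphing step follows, using the three parameters named above; MeshMorpher and Push are illustrative names, and the project's actual script may be organized differently.

```csharp
using System.Collections;
using UnityEngine;

// Pushes one vertex along its normal over a duration, dragging neighbors
// within the effect radius with a Gaussian falloff.
public class MeshMorpher : MonoBehaviour
{
    public float duration = 0.5f;      // seconds to complete one translation
    public float effectRadius = 0.3f;  // world-space radius of the falloff

    private Mesh mesh;

    void Awake() => mesh = GetComponent<MeshFilter>().mesh;

    public void Push(int index, float force) =>
        StartCoroutine(PushRoutine(index, force));

    private IEnumerator PushRoutine(int index, float force)
    {
        Vector3[] start = mesh.vertices;    // positions at the start of the push
        Vector3[] normals = mesh.normals;
        Vector3 center = start[index];
        float sigma = effectRadius / 3f;    // ~99.7% of the bump inside the radius

        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            Vector3[] verts = mesh.vertices;
            float step = (force / duration) * Time.deltaTime;
            for (int i = 0; i < verts.Length; i++)
            {
                float d = Vector3.Distance(start[i], center);
                if (d > effectRadius) continue;
                // Gaussian falloff: full displacement at the center, smoothly
                // decaying toward the edge of the effect radius.
                float w = Mathf.Exp(-(d * d) / (2f * sigma * sigma));
                verts[i] += normals[i] * step * w;
            }
            mesh.vertices = verts;
            yield return null;
        }
        mesh.RecalculateNormals();
        mesh.RecalculateBounds();
    }
}
```

Wiring the mapper's onSeedImpulse delegate to Push connects each incoming Delta sample to a morph at its channel's seed vertex.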



Third, the procedural mesh is paired with Unity's VFX Graph for added dynamism and visual effect. The mesh is converted to a signed distance field (SDF) in real time and fed into the VFX Graph, so the particles can wrap around the morphing surface. Fourth, the Oculus Integration SDK allows the participant to view the mesh generation in virtual reality through an Oculus Quest 2 headset. Finally, the procedural mesh can be saved to an OBJ file at any point in the process; the OBJ file can then be 3D printed to become a tangible artifact.
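The OBJ export step can be a simple Wavefront writer over the mesh's vertices, normals, and triangles. The sketch below makes that concrete; ObjExporter is an illustrative name, and the project may have relied on an existing exporter asset instead.

```csharp
using System.Globalization;
using System.IO;
using System.Text;
using UnityEngine;

// Simplified OBJ writer for snapshotting the morphing mesh at any moment.
public class ObjExporter : MonoBehaviour
{
    // Invariant culture keeps "." as the decimal separator regardless of locale.
    static string F(float x) => x.ToString(CultureInfo.InvariantCulture);

    public void Save(string path)
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] verts = mesh.vertices;
        Vector3[] norms = mesh.normals;
        int[] tris = mesh.triangles;

        var sb = new StringBuilder("# EEG artifact snapshot\n");
        foreach (Vector3 v in verts)
            sb.AppendLine($"v {F(v.x)} {F(v.y)} {F(v.z)}");
        foreach (Vector3 n in norms)
            sb.AppendLine($"vn {F(n.x)} {F(n.y)} {F(n.z)}");
        for (int i = 0; i < tris.Length; i += 3)
        {
            // OBJ indices are 1-based; "v//vn" pairs a position with its normal.
            int a = tris[i] + 1, b = tris[i + 1] + 1, c = tris[i + 2] + 1;
            sb.AppendLine($"f {a}//{a} {b}//{b} {c}//{c}");
        }
        File.WriteAllText(path, sb.ToString());
    }
}
```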



Data to Mesh Mapping

Prototype

EEG Artifacts

Participant A EEG Artifact

Participant B EEG Artifact

Participant C EEG Artifact

Exhibition Design

Demo

Designed with ❤️ across realities

Davide Zhang © 2016 - 2023. All Rights Reserved.