In May 2018 I will be installing work for the Science Rendezvous, Toronto!
An iteration of Wonderlandish. This version of Wonderlandish will be made of glass sculptures depicting human, plant, and animal microbiota. Entering Wonderlandish, visitors will see artistic renderings of scientific data: life-size versions of the organisms mentioned above. A live component will allow visitors to interact with the microbiota on display through an Augmented Reality app.
Game developer, Chris Tihor of Ironic Iconic Studios is working with me on building the app and UI.
I have some (admittedly minor) experience building augmented reality work in Unity. I do not yet know how to make things generative and/or seemingly react to a user's bio-information.
In 2016 I sent out a proposal to Grow-Op for installing a piece where visitors/users could see their influence on the environment. I wanted to create a biofeedback type of app. Depending on, say, pulse or body heat, this would create either a positive or negative outcome: the environment would either become overpopulated with invasive species (in this case, plants not native to Ontario = negative impact) or thrive (where native, indigenous species flourish = positive).
The plant life would be generative in that they would grow as people relaxed or became excited.
Visitors wouldn’t necessarily be aware (at first) of how they were impacting the space, or why the heck they were destroying it 😉 or making it thrive. More on that later.
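To make the biofeedback idea concrete, here is a minimal sketch of the kind of mapping I have in mind. Everything here is an assumption for illustration: the baseline/stress pulse thresholds, the growth rate, and the idea of tracking two population numbers are all placeholders, not real project values.

```python
RESTING_BPM = 70   # assumed calm baseline; would need per-visitor calibration
STRESS_BPM = 100   # assumed "excited" pulse; also a placeholder

def growth_update(pulse_bpm, native_pop, invasive_pop, rate=0.05):
    """Nudge the simulated ecosystem toward native or invasive species,
    depending on how calm or excited the visitor's pulse suggests they are."""
    # Map pulse onto -1..1: at or below resting = calm, toward 1 = stressed.
    stress = (pulse_bpm - RESTING_BPM) / (STRESS_BPM - RESTING_BPM)
    stress = max(-1.0, min(1.0, stress))
    if stress <= 0:
        native_pop *= 1 + rate * -stress   # calm visitors: native plants thrive
    else:
        invasive_pop *= 1 + rate * stress  # excited visitors: invasives spread
    return native_pop, invasive_pop
```

Running this every frame (or every sensor reading) would give the gradual, generative growth described above; the same shape of logic could be rewritten in C# inside Unity.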
I can create or purchase (if funding happens) 3D models of bacteria, microbes, etc., that animate. At the moment I am uncertain whether a Kinect(s) would need to be used in something like this but I own one and have access to several if needed.
Sculptural depictions of humans, plants, and animals made using glass: flame-worked and cast. The sculptural forms would/could either have symbols painted onto them (to activate the AR) or be arranged in easily read patterns that would activate the AR component.
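Whichever AR toolkit ends up doing the actual symbol recognition (Vuforia-style image targets, for instance), the app's own logic can stay as simple as a lookup from recognized marker to the content it should spawn. The marker and asset names below are invented placeholders, just to show the shape of it:

```python
# Placeholder table: painted symbol -> animated model + sound to activate.
MARKER_CONTENT = {
    "symbol_ecoli":  {"model": "ecoli_animated",  "sound": "ecoli_loop"},
    "symbol_diatom": {"model": "diatom_animated", "sound": "diatom_loop"},
}

def on_marker_found(marker_name):
    """Return the model/sound pair to activate, or None for unknown symbols."""
    return MARKER_CONTENT.get(marker_name)
```

Keeping content in a table like this means new sculptures only need a new entry, not new code.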
Sound.
Bacteria, microbes and cells make sounds. Sadly (or perhaps luckily for us!) these sounds are more than a thousand times quieter than human ears can detect, without the use of this wild optical-fibre technique, which proved to be more sensitive than an atomic force microscope! It can detect forces below 160 femtonewtons and sounds below −30 decibels. Soooo, of course I am looking into either obtaining the data sets or sounds, or seeing how to work with the scientists to record specific microbes. 🙂
Either way, sound is an important aspect.
Even if it is made up for this first installation.
After writing all of this, I am thinking that maybe an AR app might be a more simplified way to start out, considering I do not have an Oculus on hand, or the SDK, or access to Gears or multiple headsets… currently. Also, I am thinking that if I can get something relatively functional happening, then perhaps crowdfunding can happen, as well as hitting up all the granting agencies.
I have worked with Max/MSP and Processing.
Processing more so with the Kinect and video mapping.
Arduino, with sensors triggering MIDI. I have a bunch of sensors, actually.
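The sensor-to-MIDI work is essentially one rescaling step, which would carry over directly to the biofeedback idea. A sketch of that mapping, assuming a typical 10-bit Arduino analog reading (0–1023) squeezed into MIDI's 7-bit range (0–127):

```python
def analog_to_midi(reading, lo=0, hi=1023):
    """Clamp a raw analog sensor reading, then rescale it to MIDI 0-127."""
    reading = max(lo, min(hi, reading))          # guard against noisy spikes
    return round((reading - lo) * 127 / (hi - lo))
```

On the Arduino side the same arithmetic is usually done with the built-in `map()` function before sending the value out as a MIDI control change.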
How do we begin?
Software: Unity, Blender, Ableton, Processing
Hardware: iPad, iPhone, Notebook, Android…
Sculpture: lost-wax cast and flame-worked glass, silicone cases for hardware devices. In order to build life-size versions of various animals I will have to construct them in parts (due to kiln/annealer limitations), perhaps using metal armatures to hold them together.
Wanted: scientists interested in collaborating! Microbiologist(s), and environmental or geographic researchers, perhaps relating to air pollution.