Building large-scale interactive exhibitions is some of the most interesting and challenging work I have done. Taking a set of ideas in consultation with a client and making something truly amazing, at scale, is a great feeling.
I wanted to come up with a simple 'single-goal' installation that would be intuitive and natural for people to engage with, and something I could make good technical progress on with a team of one.
I decided on an idea centred on marine conservation. The exhibit is designed to show that our marine environments are being damaged by man-made pollution, and that even small contributions - or simply becoming aware of a problem hidden under the waves - can have a meaningful effect. By cleaning up 'virtual garbage', visitors can understand more clearly the need to clean it up in the real world, and are delighted and entertained by the positive impact their actions have on the virtual environment.
Rather than simply document the idea and attempt to convince people it was worthwhile, I decided to build a proof of concept of the exhibit, both to make sure the idea was technically feasible and to demonstrate more convincingly the business's capability to deliver on the vision.
A simple idea sometimes requires a less-than-simple approach. To achieve what I really wanted - a 'living reef' that responds in real time to the introduction of negative externalities, as well as to stimulus from local actors (e.g. fish eating all the available food in an area) - I ended up writing a lot of code to support a voxel-based reef that can be seamlessly updated at runtime. I also leaned heavily on hardware instancing, with animation implemented in vertex shaders, to render a very dynamically complex scene while maintaining interactive framerates.
I developed this in C++, with OpenSceneGraph as the core rendering framework, PolyVox to handle the dynamic voxel reef, and the Bullet physics engine so that the fish can collide with each other and with a mesh representing the reef. The system needed to be multithreaded so that background reef updates don't stall rendering, keeping the frame smooth and glitch-free while the reef changes.
The system also takes live video input via Video4Linux for optical tracking of hand and body movements as user input, and has basic support for gestures from a Leap Motion controller.
Currently the project is at the tech demo/proof-of-concept stage, which the video linked below illustrates: at least 1,500 independently animated fish are rendered in a constantly evolving reef environment. Polished for presentation and rendered at a larger-than-life scale, it should be a visually stunning environment to view and interact with.
If you are interested in progress on this project, would like to know more, or want to talk about how it might fit with your own organisation's goals, please email me.