Note:
Unfortunately I've had to roll back my past two days' worth of work on this project, so the presentation images are still from Monday. There was a lot of work that needed (and still needs) to be done to isolate all of the data structures I have, and I kept running into errors. I will continue to iron those out in the coming days.
Concept
Create a sound "garden" where plant-like structures are interpreted as musical notes and chords.
The installation takes up considerable gallery wall-space and is meant to feel immersive.
The sound played back by each of the "L-Sound-Systems" is panned relative to its position on screen.
The structures are drawn in real time, with leaf-like shapes visualized at the node of each note.
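A minimal sketch of the mapping described above, assuming a simple bracketed L-system: the rewrite rule, the major-scale pitch mapping, the branch step size, and the screen width are placeholder assumptions, not the project's actual grammar or values. Pitch follows branching depth, and the pan value is derived from a node's horizontal position on screen.

```python
# Sketch only: a hypothetical bracketed L-system whose "F" symbols become
# notes, with pitch taken from branch depth and pan from screen position.

RULES = {"F": "F[+F]F[-F]F"}      # placeholder rewrite rule
SCALE = [0, 2, 4, 5, 7, 9, 11]    # C major scale degrees (semitones)
SCREEN_WIDTH = 1280               # assumed canvas width in pixels

def expand(axiom: str, generations: int) -> str:
    """Apply the rewrite rules to the axiom a fixed number of times."""
    s = axiom
    for _ in range(generations):
        s = "".join(RULES.get(ch, ch) for ch in s)
    return s

def interpret(lsystem: str, base_note: int = 60) -> list[dict]:
    """Walk the string, turning each 'F' into a note whose pitch follows
    branch depth and whose pan follows a crude left-to-right position."""
    notes, depth, x = [], 0, SCREEN_WIDTH / 2
    for ch in lsystem:
        if ch == "[":
            depth += 1
        elif ch == "]":
            depth -= 1
        elif ch == "+":
            x = min(SCREEN_WIDTH, x + 20)    # branch drawn toward the right
        elif ch == "-":
            x = max(0, x - 20)               # branch drawn toward the left
        elif ch == "F":
            pitch = base_note + SCALE[depth % len(SCALE)]
            pan = (x / SCREEN_WIDTH) * 2 - 1   # map x to [-1.0, +1.0]
            notes.append({"pitch": pitch, "pan": round(pan, 2)})
    return notes

if __name__ == "__main__":
    grown = expand("F", generations=2)
    for note in interpret(grown):
        print(note)
```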
Progress
At the moment only a single structure can be drawn and shown at a time, and only the pitch is driven directly by the L-system. The empty space is meant to be filled by additional structures.
Future Work
- Bug fixes
- Draw (i.e. reveal) plant structure in real time as it is "played"
- Allow for multiple plants to grow
- Match note groups from the parser with node groupings in the visualizer (incorrect green circles are currently drawn due to the mismatch)
- Time MIDI events properly
- Draw radial sine "leaves" (or another visualization) instead of the debugging circles (see the sketch after this list)
- Expand the L-system grammar and parameters to be more flexible and carry more information
- Add genetic wrapper to evolve variable input structures according to pleasing chord structures, note progressions, rhythms, etc.
- Explore audience interaction via keyboard, where users can input short sequences or entire pieces and watch them grow into a forest.
- Alternatively, consider allowing MIDI files to be input and interpreted
- Consider audience interaction via touchscreen to allow users to draw out their own plants
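The "radial sine leaves" item above could take many forms; the following is only one possible interpretation, treating each leaf as a single petal of the polar rose r = size * sin(k * theta) anchored at a note's node. The function name, parameters, and the choice of curve are assumptions for illustration, not the project's actual design.

```python
# Sketch: outline points for one leaf-like petal of r = size*sin(k*theta).
# The drawing library and actual parameters are not specified in the post.
import math

def leaf_outline(cx: float, cy: float, size: float, angle: float,
                 k: int = 2, steps: int = 64) -> list[tuple[float, float]]:
    """Return (x, y) points for one petal of r = size*sin(k*theta),
    rotated by `angle` and attached to the node at (cx, cy)."""
    points = []
    for i in range(steps + 1):
        theta = math.pi / k * i / steps       # one petal spans [0, pi/k]
        r = size * math.sin(k * theta)        # radial sine profile
        points.append((cx + r * math.cos(theta + angle),
                       cy + r * math.sin(theta + angle)))
    return points

if __name__ == "__main__":
    # Outline for a leaf attached to a node at (200, 300), pointing "up".
    for x, y in leaf_outline(200, 300, size=40, angle=math.pi / 2)[:5]:
        print(round(x, 1), round(y, 1))
```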