I've recently taken an interest in generative art. I find there is a sense of wonder in defining a set of constraints and then letting an algorithm take over to produce each generation. Even more so when you can feed in some external source of chaos (like an audio file).
Below are a few recent experiments in TouchDesigner (a node-based visual development platform).
Manipulating the vertices of a polygon with Simplex 3D noise. The frequency of the audio input adjusts the Z value of the noise map, which in turn controls the distance of each vertex from the centre of the sphere.
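In TouchDesigner this is wired up with nodes rather than code, but the underlying idea can be sketched in plain Python. The sketch below uses a simple trilinearly interpolated value noise as a self-contained stand-in for Simplex 3D, and a hypothetical normalised audio-frequency value (which in TouchDesigner would come from something like an Audio Spectrum CHOP) to pick the Z slice of the noise field that displaces each vertex radially:

```python
import math

def hash3(ix, iy, iz):
    """Deterministic pseudo-random value in [0, 1) for an integer lattice point."""
    n = ix * 374761393 + iy * 668265263 + iz * 1013904223
    n = (n ^ (n >> 13)) * 1274126177
    return ((n ^ (n >> 16)) & 0xFFFFFFFF) / 2**32

def smoothstep(t):
    return t * t * (3 - 2 * t)

def lerp(a, b, t):
    return a + (b - a) * t

def value_noise3(x, y, z):
    """3D value noise: interpolate random values at the 8 surrounding lattice corners.
    A simpler stand-in for Simplex 3D noise, used here to stay self-contained."""
    ix, iy, iz = math.floor(x), math.floor(y), math.floor(z)
    fx, fy, fz = smoothstep(x - ix), smoothstep(y - iy), smoothstep(z - iz)
    return lerp(
        lerp(lerp(hash3(ix, iy, iz),     hash3(ix + 1, iy, iz),     fx),
             lerp(hash3(ix, iy + 1, iz), hash3(ix + 1, iy + 1, iz), fx), fy),
        lerp(lerp(hash3(ix, iy, iz + 1),     hash3(ix + 1, iy, iz + 1),     fx),
             lerp(hash3(ix, iy + 1, iz + 1), hash3(ix + 1, iy + 1, iz + 1), fx), fy),
        fz)

def displace_polygon(n_vertices, base_radius, z_slice, amount):
    """Push each vertex of a regular polygon in or out along its radial
    direction, driven by the noise field sampled at (x, y, z_slice)."""
    pts = []
    for i in range(n_vertices):
        a = 2 * math.pi * i / n_vertices
        nx, ny = math.cos(a), math.sin(a)
        n = value_noise3(nx + 1, ny + 1, z_slice)  # offset so samples straddle lattice cells
        r = base_radius * (1 + amount * (n - 0.5))
        pts.append((r * nx, r * ny))
    return pts

# Hypothetical normalised frequency value extracted from the audio each frame:
audio_freq = 0.42
verts = displace_polygon(12, 1.0, z_slice=audio_freq * 4.0, amount=0.6)
```

Scrolling `z_slice` over time as the music's frequency content changes is what makes the shape breathe with the audio: each vertex slides smoothly in and out because neighbouring Z slices of the noise field are continuous.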
This is a modified version of this tutorial, this time using the frequency of the music to control the amplitude of the noise applied to each of the lines.
More Simplex 3D noise, this time applied to a plane, with the frequency of the music controlling the offset of all three coordinates (X, Y, and Z). There's also some beat detection to quickly increase the value of the noise's Y offset.
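The beat-detection part can be approximated with a crude energy-based onset detector: flag a beat whenever a frame's audio energy jumps well above its recent average, then kick the noise's Y offset and let it decay back. This is a simplified sketch under assumed names (`detect_beats`, `y_offset_envelope`), not TouchDesigner's own analysis:

```python
def detect_beats(energies, window=8, threshold=1.5):
    """Flag frame i as a beat when its energy exceeds `threshold` times
    the average of the previous `window` frames."""
    beats, history = [], []
    for i, e in enumerate(energies):
        avg = sum(history) / len(history) if history else 0.0
        if history and e > threshold * avg:
            beats.append(i)
        history.append(e)
        if len(history) > window:
            history.pop(0)  # keep only the trailing window
    return beats

def y_offset_envelope(energies, kick=1.0, decay=0.85):
    """Jump the noise Y offset by `kick` on each detected beat,
    then decay it geometrically back toward zero."""
    beats = set(detect_beats(energies))
    offset, out = 0.0, []
    for i, _ in enumerate(energies):
        if i in beats:
            offset += kick
        offset *= decay
        out.append(offset)
    return out

# Mostly quiet frames with a loud spike at index 4:
energies = [0.1, 0.1, 0.1, 0.1, 1.0, 0.2, 0.1, 0.1]
print(detect_beats(energies))  # → [4]
```

Feeding the decaying envelope into the noise's Y offset is what gives the sudden lurch on each beat followed by a smooth settle, rather than a constant drift.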