Idea: music visualization with spring networks

The basic idea is to connect a collection of springs into an arbitrary graph, then drive certain points in this graph with the waveform of a piece of music (possibly with some filtering, band separation, etc.). This could be restricted to two dimensions or allowed unrestricted use of three.

Spring constants could be chosen so the springs resonate with tones in the key of the piece. Choosing these constants and the graph connectivity to be aesthetically pleasing would likely be an art form in and of itself. A good starting point would be interconnected concentric polygonal rings of varying stiffness. Symmetry seems like a must.
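The tuning itself is just the simple harmonic oscillator formula: a mass m on a spring of constant k resonates at f = (1/2π)√(k/m), so k = m(2πf)². A minimal sketch (assuming unit masses; real audio frequencies would give enormous constants, so in practice you'd rescale everything to visible timescales):

```python
import math

def spring_constant(freq_hz, mass=1.0):
    """Spring constant k so that a mass on this spring resonates at
    freq_hz, from f = (1/2*pi) * sqrt(k/m), i.e. k = m * (2*pi*f)**2."""
    return mass * (2 * math.pi * freq_hz) ** 2

# Equal-tempered A major triad rooted at A4 = 440 Hz (A, C#, E)
triad = [440.0 * 2 ** (st / 12) for st in (0, 4, 7)]
constants = [spring_constant(f) for f in triad]
```

Note that k scales as f², so each octave up quadruples the spring constant.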

For a software implementation, a useful starting point would be CS 176 project 5; a cloth simulator that considers only edge terms is essentially a spring-network simulator. There are many ways to render the output; for example, draw nodes with opacity proportional to velocity, and/or draw edges with opacity proportional to stored energy. Use saturated colors on a black background, and render on top of blurred previous frames for a nice trail effect. Since I've already coded the gnarly math once, I might try to throw this together tomorrow evening, if I don't get distracted by something else.
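The edge-terms-only core is small enough to sketch here. This is a minimal 2D version, assuming unit masses, semi-implicit Euler integration, and simple velocity damping (the names and defaults are mine, not from any particular project); driving the network is then just overwriting a chosen node's position each frame from the waveform:

```python
import math

def step(positions, velocities, edges, dt=0.01, damping=0.02):
    """One semi-implicit Euler step of a 2D spring network.
    positions/velocities: lists of [x, y]; edges: (i, j, rest_len, k)."""
    n = len(positions)
    forces = [[0.0, 0.0] for _ in range(n)]
    for i, j, rest, k in edges:
        dx = positions[j][0] - positions[i][0]
        dy = positions[j][1] - positions[i][1]
        length = math.hypot(dx, dy) or 1e-9  # avoid divide-by-zero
        f = k * (length - rest)              # Hooke's law along the edge
        fx, fy = f * dx / length, f * dy / length
        forces[i][0] += fx; forces[i][1] += fy
        forces[j][0] -= fx; forces[j][1] -= fy
    for p, v, f in zip(positions, velocities, forces):
        v[0] = (v[0] + f[0] * dt) * (1 - damping)  # unit masses assumed
        v[1] = (v[1] + f[1] * dt) * (1 - damping)
        p[0] += v[0] * dt
        p[1] += v[1] * dt
    return positions, velocities
```

The velocity and stored-energy quantities for the rendering ideas above fall out directly: node speed is math.hypot(*v), and each edge's stored energy is k(length − rest)²/2.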

The variations are really endless. For example, with gravity and stiff (or entirely rigid) edges, you could make a chaos pendulum. By allowing edges to dynamically break and form based on proximity and/or energy, you could get all kinds of dynamic clustering behavior, which might look like molecules forming or something.
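The break-and-form variation is also easy to prototype. One plausible rule (my own guess at parameters, not a tested recipe): break any spring stretched past some multiple of its rest length, and bond any unconnected pair of nodes that drifts close enough together:

```python
import math

def update_edges(positions, edges, break_strain=1.5, bond_dist=0.3,
                 rest_len=0.25, k=5.0):
    """Break springs stretched past break_strain * rest length, then bond
    any unconnected node pair closer than bond_dist.
    edges: list of (i, j, rest_len, k) tuples."""
    def dist(i, j):
        return math.hypot(positions[j][0] - positions[i][0],
                          positions[j][1] - positions[i][1])
    kept = [e for e in edges if dist(e[0], e[1]) < break_strain * e[2]]
    connected = {frozenset((i, j)) for i, j, _, _ in kept}
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            if frozenset((i, j)) not in connected and dist(i, j) < bond_dist:
                kept.append((i, j, rest_len, k))
    return kept
```

Running this every few frames between simulation steps should give the clustering behavior; an energy-based break condition (snap when stored energy exceeds a threshold) would be a one-line swap.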

A hardware implementation (i.e., actual springs) would be badass in the extreme, although I imagine it would be finicky to set up and tune.

Idea: immersive video with one projector

This is an idea I had while lying in bed listening to Radiohead and hallucinating. (I was perfectly sober, I swear. The Bends is just that damn good.)

Build a frame structure (out of PVC or similar) with the approximate width/depth of a bed, and height of a few feet -- enough that you could comfortably lie on a mattress inside and not feel claustrophobic. Cover every side with white sheets, drawn taut. Mount a widescreen projector directly above the middle of this structure, pointing down. Then hang two mirrors such that the left third of the image is reflected 90 degrees to the left and the right third is reflected 90 degrees to the right (from the projector's orientation), with the middle third projecting directly onto the top of the frame. Then use more mirrors to get the left and right images onto the corresponding sides of the frame. (You'd probably also need some lenses to make everything focus at the same time; this is the only part I'm really iffy on. Fresnel lenses would probably be a good choice. Anyone who knows optics and has any idea how to set this up, please let me know.)

Anyway, the beauty of this setup is that it allows one to control nearly the whole visual field with a single projector and a single video output, thus minimizing complexity and expense. It's not hard to set up OpenGL to render three separate images to three sections of the screen; they could be different viewpoints in the same 3D scene, although as usual I'm more interested in the more abstract uses of this. In particular, you get control over both central and peripheral vision, which has psychovisual importance.
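The three-section split is just viewport arithmetic; a sketch of the rectangle computation (the actual draw loop would call glViewport with each rectangle and a view matrix rotated ±90° about the vertical axis for the side panes):

```python
def three_viewports(width, height):
    """Split a single framebuffer into left / center / right thirds,
    as (x, y, w, h) rectangles suitable for glViewport."""
    w = width // 3
    return [(0, 0, w, height),
            (w, 0, w, height),
            (2 * w, 0, width - 2 * w, height)]  # right pane absorbs remainder
```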

I'm really tempted to build this when I get back to Tech, but there's a high probability that someone else's expensive DLP projector will suffer an untimely demise at the hands of improvised mounting equipment.

Edit: I thought of an even simpler setup that does away with the mirrors and lenses. Make the enclosure a half-cylinder, and project a single widescreen image onto it (orienting left-right with head-feet), correcting for cylindrical distortion in software. The major obstacle here is making a uniformly cylindrical projection surface, but that shouldn't be too hard.
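The distortion correction reduces to a 1D remap across the image. Under the simplifying assumption that the projector is far enough above for near-parallel rays, a pixel at normalized horizontal offset u ∈ [−1, 1] lands on a half-cylinder of radius R at angle θ where sin θ = u, so that pixel should display source content from the arc-uniform coordinate s = θ/(π/2). A sketch of that mapping (my geometry guess, not verified optics):

```python
import math

def source_coord(u):
    """Pre-warp for projecting straight down onto a half-cylinder with
    near-parallel rays: projector pixel at normalized offset u in [-1, 1]
    lands at cylinder angle asin(u), so sample the source image at the
    arc-uniform coordinate asin(u) / (pi/2), also in [-1, 1]."""
    return math.asin(u) / (math.pi / 2)
```

Applying this per-column to each frame (e.g., as a texture-coordinate warp in a fragment shader) would give uniform apparent coverage along the curved surface.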


This did not work

Still, the images look pretty. Now I'm going to sleep.


Sensory Deprivation, a fucking ripoff

I tried the sensory deprivation experiment this afternoon. I prepared by shutting off my room: lights off, door closed, shades down, and so on. I then lay on my bed, covered my eyes with a mask, put in earplugs, checked my crashbox, and went exploring.

Initial preparations were encouraging, as I felt a heightened sensitivity in my skin and tinnitus-like effects. I experienced some closed-eye visuals along with a sudden suppression of conscious thought at an estimated 20 minutes in. However, nothing more came of this, and I exited the experiment at BLT 1:30 when I realized I had fallen asleep and was dreaming. Because I could not check whether my crashbox was recording in the dark, all actual data was lost. The only lingering effect was extreme difficulty writing this post, which may be attributed to just having woken up.

My dreams involved being trapped in the Dabney computer lab with a couple of nerds arguing about the best way to install a cracked version of CnC 3 on the lab machines, so sensory deprivation may hold promise as a method of torture.

--Biff-"sticking to chemistry"-motron


I am an RSS feed classifier

Thanks to Google Reader, I now spend much of my time trying to keep up with an unbelievable flood of information from the blogosphere. I'm now tagging the stuff I find interesting; there's also an RSS feed of that. I'm working through a large backlog right now, so it will be moving fast.

I know I'm late in getting into it, but the ideas behind this blogging / RSS / Web 2.0 stuff are really cool -- everyone produces a little bit of content while filtering and correlating other content. Of course, this is the same thing humans have been doing for millennia, but we're reaching unprecedented levels of participation, connectivity, and information processing rate. How far can this growth continue before we start to see fundamental shifts in the nature of human thought -- or have these shifts already occurred, maybe out of sight of our limited perspective? Maybe the development of language was such a shift, and we've just been pushing the performance of that development ever higher.