halfbakery: Yeah, I wish it made more sense too.
Since I imagine the input to many computer animation packages is parameter-based (i.e., values for location, shape, velocity, texture, color, etc.), why can't you generate values for those parameters by transforming output from a MIDI system?
Instant synchronized surreal music videos!
Somebody must have done this already, yes?
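A minimal sketch, in Python, of the kind of mapping the idea describes; the parameter names and the pitch-to-colour choices are illustrative assumptions, not anything prescribed by MIDI or by any particular animation package.

# Turn one MIDI note-on event into a bag of animation parameters.
def note_on_to_params(channel, note, velocity):
    """Map a MIDI note-on (channel 0-15, note 0-127, velocity 1-127)
    to some hypothetical animation parameters."""
    return {
        "layer":  channel,                # one visual layer per MIDI channel
        "hue":    (note % 12) / 12.0,     # pitch class -> position on a colour wheel
        "height": note / 127.0,           # absolute pitch -> vertical position
        "size":   velocity / 127.0,       # how hard the note was struck -> scale
    }

print(note_on_to_params(channel=0, note=60, velocity=100))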
(?) Men In Black II Visualizer
http://www.winamp.c...l?componentId=98687 [phoenix, Apr 04 2002, last modified Oct 04 2004]
(?) A Knights Tale Visualizer
http://www.winamp.c...l?componentId=53037 [phoenix, Apr 04 2002, last modified Oct 04 2004]
(?) CaveGirl
http://www.winamp.c...l?componentId=82560 [phoenix, Apr 04 2002, last modified Oct 04 2004]
Valentine Dancer
http://www.winamp.c...l?componentId=44849 Etc, etc, etc... [phoenix, Apr 04 2002, last modified Oct 04 2004]
(?) Hypnoticon
http://www.jbum.com...are.html#Hypnoticon MIDI -> Computer Graphics for Macintosh [jbum, Apr 05 2002, last modified Oct 04 2004]
Tons of info
http://www.midi.org/ [xrayTed, Apr 05 2002, last modified Oct 04 2004]
(?) VidiWall
http://www.vidiwall...list/belgium_xl.jpg Live Performance Medium for Animation [xrayTed, Apr 06 2002, last modified Oct 04 2004]
MIDI Controlled Animation
http://www.xb111.de/xb_fs.html [xrayTed, Apr 16 2002, last modified Oct 04 2004]
Animusic
http://www.animusic.com Prestored MIDI used to drive 3D animations [erich666, Jun 05 2002, last modified Oct 04 2004]
Cycling74 / Max URL (moved from [Umiachi]'s anno)
http://www.cycling7...roducts/jitter.html 29 Dec 02 | Very interesting looking authoring package currently running only on Mac. (Is this the remains of the old OS MAX product of years ago?). [bristolz, Oct 04 2004]
MAX Users site (URL moved from [Umiachi]'s anno)
http://www.goldbergs.com/max/ [bristolz, Oct 04 2004]
|
|
Yes. Check out WinAmp and its visualization plug-ins as an example. |
|
|
I've seen WinAmp, but they're using the total waveform of the music, not the parameters of the individual components & instruments. What they're doing is closer to an oscilloscope. |
|
|
How do you isolate one instrument in an orchestra? |
|
|
Well, MIDI data is data about each note of each instrument, isn't it? |
|
|
Part of the General MIDI specification defines what instrument is being played. Each device parsing a MIDI data stream synthesizes its own rendition of that particular instrument. |
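To make that concrete, here is a small sketch of the raw bytes involved, following the published MIDI 1.0 and General MIDI numbering; the tiny program table is truncated for illustration.

# How a General MIDI "instrument" travels in the data: a Program Change
# selects the voice for a channel, then Note On events play on that channel.
GM_PROGRAMS = {0: "Acoustic Grand Piano", 40: "Violin", 56: "Trumpet"}  # truncated

stream = bytes([
    0xC0, 40,        # Program Change, channel 0 -> program 40 ("Violin")
    0x90, 60, 100,   # Note On, channel 0, middle C (60), velocity 100
])

status, program = stream[0], stream[1]
print("channel", status & 0x0F, "now plays", GM_PROGRAMS[program])

status, note, velocity = stream[2], stream[3], stream[4]
print("note", note, "at velocity", velocity)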
|
|
I think this genre of animation is called "Waldo," or performance animation. |
|
|
I wrote a program that did this in 1994, called "Hypnoticon". It produces live computer animations of rain-storms and kaleidoscopic effects, paying attention to the pitch, attack-velocity, sustain, and harmonic content of the MIDI stream. |
|
|
The advantage of using MIDI over the raw audio stream is that you have access to much higher-level information. For example, some of the most interesting effects were those that tracked chord changes (something that would be difficult to do with just the audio waveform). The ability of the animation to synchronize with the music on so many levels simultaneously was quite mesmerizing. |
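A rough sketch of the chord-tracking part, assuming plain note-on/note-off events; the template set and the naming are illustrative, not how Hypnoticon actually did it.

TEMPLATES = {"major": {0, 4, 7}, "minor": {0, 3, 7}}   # pitch-class templates

held = set()   # MIDI note numbers currently held down

def handle(event, note):
    if event == "note_on":
        held.add(note)
    elif event == "note_off":
        held.discard(note)
    return current_chord()

def current_chord():
    pcs = {n % 12 for n in held}
    for root in range(12):
        shifted = {(p - root) % 12 for p in pcs}
        for name, template in TEMPLATES.items():
            if shifted == template:
                return "%d:%s" % (root, name)   # e.g. "0:major" = C major
    return None

for ev in [("note_on", 60), ("note_on", 64), ("note_on", 67)]:
    print(handle(*ev))    # last line prints "0:major"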
|
|
The program worked with the Apple MIDI manager, and could be used with either MIDI files, or live. I generally played with it live, using a piano keyboard, although we also experimented with a guitar MIDI converter. |
|
|
I should also mention that the performance artist John Lario did extensive work using MIDI to control real-time animation in his performances. |
|
|
I didn't realize you meant MIDI in the literal sense. I thought you were referring to digital audio in general. Sorry. |
|
|
I have to ask though: When I play a .MID or .MIDI file, does the file actually contain a separate stream of data for each instrument? Or are you talking about doing it 'live'? |
|
|
A MIDI stream will not have distinct 'streams' per se; the instrument definition, or voice, is part of the protocol packet.
|
|
|
As well as synthesizers, MIDI can be, and very often is, used to control other units such as mixing desks and lighting rigs. Any MIDI-enabled unit will respond to any MIDI signal (assuming it's on the right channel), and it's possible to control, say, a lighting rig with the MIDI stream from a keyboard. Middle C generates a 261 Hz sound from a synth and could also switch that blue light on. I know a couple of guys who play as a duo with MIDI backing tracks, and they control their lights from the sequencer. |
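A sketch of that middle-C-switches-the-blue-light example; the lighting-rig object here is a stand-in, not a real lighting API.

NOTE_ON = 0x90

class FakeLightingRig:                      # stand-in for a real MIDI-enabled rig
    def set_lamp(self, lamp, on):
        print("lamp", lamp, "ON" if on else "OFF")

rig = FakeLightingRig()

def on_midi_message(status, data1, data2):
    # A note-on for middle C (note 60, ~261.6 Hz on a synth) doubles as a light cue.
    if status & 0xF0 == NOTE_ON and data2 > 0 and data1 == 60:
        rig.set_lamp("blue", on=True)

on_midi_message(0x90, 60, 100)              # -> lamp blue ON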
|
|
I am a little confused as to the intent of this idea. Is the idea to use the MIDI protocol to pass parameters to a video synthesizer as part of a performance? Or is the idea more of a visualization engine for an arbitrary MIDI stream authored for an audio synthesizer? Each interpretation has merit in its own right, but I am wondering which was your intention. |
|
|
[phoenix], as I understand it, in the MIDI protocol there is just one stream. The stream consists of a chain of packets, with each packet defining an event. Each event identifies itself as belonging to a particular channel. I think that there are 16 channels defined in the basic MIDI spec. A MIDI enabled device is set to listen for events on one or more specific channels and ignore events on other channels. Each channel represents an instrument, and each event defines a parameter change (attack, decay, pitch, etc.) for the current note. Keep in mind that not all MIDI enabled devices are audio synthesizers, and so the 'instrument' may in fact be a light dimmer, a digital effects processor, or a mixing console. The parameters called 'pitch', 'velocity', 'decay', etc. take on new meanings specific to the instrument, and may control things like lamp color, time delay, or mix volume. |
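A sketch of that single-stream, 16-channel picture: one byte stream, each event tagged with its channel, demultiplexed to whichever 'instrument' (synth, dimmer, visualizer) is listening there. The handler wiring is hypothetical.

# One stream, 16 channels: demultiplex note events to whatever device is
# listening on each channel (synth, light dimmer, visualizer, ...).
def route(stream, handlers):
    i = 0
    while i < len(stream):
        status = stream[i]
        kind, channel = status & 0xF0, status & 0x0F
        if kind in (0x90, 0x80):            # note on / note off: 2 data bytes
            handlers.get(channel, print)(kind, stream[i + 1], stream[i + 2])
            i += 3
        else:
            i += 1                          # sketch only: skip anything else

route(bytes([0x90, 60, 100,                 # note on, channel 0
             0x95, 60, 100]),               # note on, channel 5
      handlers={0: lambda *e: print("synth:", e),
                5: lambda *e: print("dimmer:", e)})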
|
|
"...there is just one stream. The stream consists of a chain of packets..."
My question is: Is this the case when MIDI is being played back from a file on disk or only when the 'live' stream is being generated? I've got all the other stuff. |
|
|
[phoenix]: to my knowledge, there is no provision in the MIDI spec for MIDI file 'streaming', as in internet streaming audio, i.e. shoutcast.com, etc. In this context I believe what is meant by streaming is that MIDI uses a serial protocol to transmit to external devices via MIDI cable. When you open a .mid or .midi file on your computer, the file is opened in its entirety. Likewise, if you follow a MIDI link from a webpage, playback of the file begins when the download is complete, rather than 'buffering' the file, as is commonly done with .mp3s, etc. Is that what you are getting at? |
|
|
This idea has real potential. The graphic synth could 'sniff' MIDI data intended for an audio synth, or the graphic synth could be the direct recipient of the data, having its own implementation of General MIDI, plus System Exclusive data. Either way, the visual MIDI realm is largely unexplored. True enough, MIDI-controlled lighting rigs have been around for years, but never, to my knowledge, has animation been attempted.
|
|
|
There are possibilities for both recorded and live MIDI, but imagine this: you go to a <insert band name> concert at a large venue. Behind each band member is a huge video screen (see link: VidiWall). The huge video screens are used for MIDI visualization. The keyboard player is playing a MIDI-enabled Hammond organ, and tremulous, iridescent bubbles float up behind him as he plays each note (on the video screen). As each organ note ends (a MIDI "note OFF" event), each bubble bursts. The guitar player's video screen is adjacent to the keyboard player's. He is playing an old Hagstrom II guitar through an old Fender Bassman amp (all analog stuff). However, his guitar is outfitted with a hexaphonic pickup rig that converts the analog notes to MIDI information. As he plays, brilliant bars of light flash out from behind him (on the video screen). The angle of each bar corresponds to the note being played. As he depresses his MIDI-enabled Cry Baby wah-wah pedal, the bars of light bend and flicker toward the keyboard player like flames. He plays an out-of-tempo, staccato passage of shrill, distorted notes, and the flames shoot off and burst some of the keyboardist's bubbles. They burst with an audible pop (the animation events can, in turn, generate MIDI events).
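A sketch of the bookkeeping behind that bubble scene: note-on spawns a bubble, note-off (or note-on with velocity 0) pops it, and the pop can emit a MIDI event of its own. The percussion channel and note chosen for the audible "pop" are illustrative assumptions.

outgoing = []                      # MIDI events generated *by* the animation

class Bubble:
    def __init__(self, note, velocity):
        self.note, self.size = note, velocity / 127.0
    def pop(self):
        # An audible "pop": e.g. a snare hit on the percussion channel (illustrative).
        outgoing.append(("note_on", 9, 38, 100))

bubbles = {}                       # (channel, note) -> Bubble currently floating

def on_event(event, channel, note, velocity):
    if event == "note_on" and velocity > 0:
        bubbles[(channel, note)] = Bubble(note, velocity)
    elif event in ("note_off", "note_on"):   # note-on with velocity 0 counts as off
        b = bubbles.pop((channel, note), None)
        if b:
            b.pop()

on_event("note_on", 0, 64, 90)     # organ note starts: bubble appears
on_event("note_off", 0, 64, 0)     # organ note ends: bubble bursts
print(outgoing)                    # [('note_on', 9, 38, 100)]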
|
|
|
I think this would be cool, and less acid would be consumed at rock concerts.
|
|
|
Surely that would lead to *more* acid being consumed at rock concerts...? |
|
|
Wayne Lytle made an influential animated short for SIGGRAPH 1990 called "More Bells and Whistles" - it was fantastic. He used prerecorded MIDI to drive animation - prerecorded so that he could anticipate events (e.g., so that a ball would fly through the air and hit a gong at the moment the sound is heard). Go check out http://www.animusic.com for his latest efforts - click on the instruments; the short QuickTime clips are fun to watch. |
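A sketch of the look-ahead trick that prerecorded MIDI makes possible: because the hit times are known in advance, each ball can be launched early enough to land exactly on its note. The flight time and the hand-written event list are assumptions; in practice the times would come from any MIDI file reader.

FLIGHT_TIME = 0.75                  # seconds the ball spends in the air (assumed)

# (absolute time in seconds, MIDI note) of each gong hit, as read from the file
gong_hits = [(2.0, 52), (4.0, 52)]

schedule = [
    {"launch_at": hit - FLIGHT_TIME, "impact_at": hit, "note": note}
    for hit, note in gong_hits
]

for item in schedule:
    print("launch ball at %.2fs, gong sounds at %.2fs"
          % (item["launch_at"], item["impact_at"]))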
|
|
=).. 'Tis the world's most extensive multimedia system around these days; it freely integrates audio, MIDI and video, allowing control by virtually anything. |
|
|
Max is the MIDI side, MSP is the audio side, and Jitter is the new video side (but the system has an open architecture, and is the stomping grounds of many upper-division academics, so many, MANY user-created externals exist to expand on the core concepts). |
|
|
<another URL moved to links>
(not me, but links to some nice externals) |
|
|
The idea describes precisely what Animusic is; see the link added by [erich666]. There is a DVD you can buy only on that web site; I have it and it is absolutely stunning. Highly recommended for computer animation and/or music fans. |
|
|
"Highly recommended for computer animation and/or music fans." |
|
|
With emphasis on the "computer" part. I find the work to be somewhat sterile and artless, myself. |
|
|
The MIDIman object agent for LightWave's LScript will do this. |
|
|
[-] MIDI is the key being pressed, not the sound being played. |
|
| |