From a talk by Greg Hermanovic on the Interactive Dance Club, given at the Converging Worlds Conference, Toronto, February 1999

 

Side Effects Software's Houdini

Side Effects is the software company that develops and sells Houdini, a 3D animation product for artists creating special effects and characters for film, TV, games and visualization.

You have likely seen Houdini's work in Titanic, The Fifth Element, Blade and Godzilla. Soon DreamWorks' Prince of Egypt will be released, featuring the parting of the Red Sea, created with Houdini. In total, 30 films from 1997 and 1998 were or are being made with Side Effects tools. And this year we were honored with a Scientific and Technical Academy Award for the pioneering technology called "procedural modelling".

With the Interactive Dance Club, we were showing an unusual application of a 3D animation tool, yet one that is a natural evolution in the animation industry. Animation for film and video is created a frame at a time - it may take 2 to 20 minutes to render one frame of final film or video. In contrast, realtime animation is generated at 15 to 60 images per second, fast enough that it can respond to people driving the animation with input devices.

The IDC Inception

This past July (1998) I helped put on the Interactive Dance Club in Orlando for Siggraph 98. Siggraph is the largest annual graphics conference; it includes an exhibition, technical papers, panels and, foremost, parties.

Synesthesia's Ryan Ulyate and David Bianciardi conceived of the IDC in summer 1996, when they worked together for the first time on the Hirakata theme park in Osaka, Japan. Ryan was approached by SIGGRAPH in spring of 1997 when they were looking for a demonstration of integrated music and images. Ryan pitched the IDC to SIGGRAPH (a rather more ambitious idea than they originally had in mind!), sought out my expertise and brought me on board. The event was based on the literal name, Interactive Dance Club. The name stuck, the committee went for it, and we all dove into it.

The IDC shaped up to be an event driven by 8 organizers, spearheaded by Ryan and Dave, plus 12 animators and 2 extra musicians.

Challenges Faced

The intent of IDC was to put more control of music and visuals into the hands of the members of the audience. The visuals were not to be pre-rendered, and the sound was not to be pre-recorded. Despite the desire to create and use live 3D graphics, there were no suitable live 3D products available at the time.

We were walking into difficulty because we were expecting people from the audience (the players), who were not professional performers, to do the performing. The players were put in a situation where no learning time was afforded.

Considering the number of sound and visual sources and the desire to make the result danceable, we needed to avoid total audio-visual chaos like you find in an arcade or pachinko parlor. Rather, we needed to keep a musical flow going. This wasn't straightforward.

Unlike most games, there were going to be no winners, no scores, no killing and no chases, yet it had to engage players. At the same time, it needed to be technologically and visually interesting to jaded Siggraph attendees. And we had to balance this as a social event, keeping in mind how alienating and overwhelming technology can be. The scariest part was that most of the attendees at the Interactive Dance Club were a non-clubbing crowd!

Design Choices

We decided on six zones with computer animation and audio, plus three zones with audio only. Most zones were designed for one person at a time.

In the zones involving graphics, we chose not to have group-controlled elements, because it is difficult for one person to identify what they are changing when other people are changing things in the same scene. We quickly decided to stay away from VR head-mounted displays, wired-up people and cameras recognizing gestures. This is a club after all, not a torture chamber.

We went with some simple (and inexpensive!) custom devices that you could hit, stomp, turn and push, plus devices that trigger when a light beam is blocked and some proximity detectors. These were all converted to output MIDI. Some of the sensors were made and donated by Infusion Systems (Vancouver), who make the I-Cube product.

Each computer graphics (CG) zone had one massive projection screen. Five were donated by Digital Projections, and two by Barco. One nearby audio monitor in each zone was placed to let the player hear their own sound.

Our philosophy was that single actions would produce big effects - one gesture may trigger pre-made behaviors. We wanted to put the effort into the quality & responsiveness of the sound and images, not into sophisticated input devices.

It was important to have musical cohesion, so for each musical piece Ryan composed solid rhythm backing tracks, and on top of these he added 9 player-controlled music tracks. These were organized around a traditional song structure with different sections including verse, chorus, bridge, etc.
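To make the arrangement idea concrete, here is a minimal sketch of how such a piece might be represented as data. The section names, tempo and track labels are assumptions for illustration, not the actual IDC show data.

    # Illustrative sketch only: one possible representation of an IDC-style piece.
    # Section names, tempo and track labels are assumptions, not the real show data.

    song = {
        "tempo_bpm": 120,                                       # assumed tempo
        "sections": ["intro", "verse", "chorus", "verse", "bridge", "chorus"],
        "backing_tracks": ["kick", "snare", "bass"],            # the solid rhythm bed, always playing
        "player_tracks": [f"zone_{n}" for n in range(1, 10)],   # nine player-controlled parts
    }

    def active_tracks(section, triggering_zones):
        """Backing tracks always sound; a player track sounds only while its zone is active."""
        return song["backing_tracks"] + [t for t in song["player_tracks"] if t in triggering_zones]

    print(active_tracks("chorus", {"zone_2", "zone_7"}))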

In the IDC, the "EJs" (Experience Jockeys - a term coined by Synesthesia) controlled networked computers running Synesthesia's PiOS (Public Interactive Operating System). PiOS acquires and filters sensor data from the interactive zones, drives lighting consoles, controls music instrumentation, manages the audio systems (synthesizers, samplers, FX boxes and mixers), and finally sends MIDI to the SGIs and our Houdini software.
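To give a feel for the kind of routing PiOS performed, here is a rough sketch written with the Python "mido" MIDI library. The port names, channels and the noise-gate threshold are hypothetical; this is not Synesthesia's actual PiOS code.

    # Rough sketch of sensor-MIDI routing, in the spirit of what PiOS did.
    # Port names, channels and thresholds are hypothetical.
    import mido

    sensors_in  = mido.open_input("Zone Sensors")     # hypothetical port name
    houdini_out = mido.open_output("SGI Houdini")     # hypothetical port name
    audio_out   = mido.open_output("Samplers and FX") # hypothetical port name

    for msg in sensors_in:                            # blocks, yielding each incoming message
        if msg.type == "note_on":                     # e.g. a stomp pad or a broken light beam
            audio_out.send(msg)                       # trigger a sample
            houdini_out.send(msg)                     # and kick off a visual behavior
        elif msg.type == "control_change":            # e.g. a knob or proximity sensor
            if msg.value < 3:                         # crude noise gate on a jittery sensor
                continue
            houdini_out.send(msg.copy(channel=1))     # re-route to the zone's Houdini machine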

Six of the zones allowed players to control computer graphics as well as music. There were 4 to 18 sensors per zone. Data from these sensors was routed to PiOS, where it was arranged and filtered before being sent onward to Houdini.

There were four "experiences" (another Synesthesia term), running about 15 minutes each. The EJs selected sections of music to play and oversaw the 48-channel audio mix for the house system and the individual zones.

But the CG animators' job was even easier. Because the people from the crowd were doing the performing, the animators watched, laughed, and drank large quantities of alcoholic beverages.

Houdini's Pivotal Role

Houdini was used for the 3D visuals, from authoring the characters, environments and effects, to performing them in realtime. At the IDC, Houdini ran on six Octanes, received MIDI, and generated 3D graphics for the six screens. Digital video was delivered from each SGI Octane to the projectors.

We were shooting for a frame update rate of 15 to 30 images per second. After some user-level performance tuning, we usually reached those frame rates. We applied the same ingenious tricks that game makers employ to maximize realism within 1/30 of a second. The visuals reacted immediately, were repeatable, and were eventually made to work well with the sound after a lot of experimentation prior to each of the four nights of the IDC show.
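One common game-style trick is to measure how long each frame takes and shed scene detail when the frame budget is blown. A generic sketch of that idea follows; the 1/30 second target is from the talk, but the detail levels and adjustment factors are illustrative, and this is not Houdini's internal code.

    # Generic sketch of a frame-budget loop, the kind of trick game makers use.
    # The 30 fps target is from the talk; everything else here is illustrative.
    import time

    TARGET = 1.0 / 30.0        # aim for 30 frames per second
    detail = 1.0               # 1.0 = full geometry/texture detail, lower = cheaper scene

    def render_frame(detail):
        """Stand-in for drawing one frame; real cost depends on the scene and detail level."""
        time.sleep(0.02 * detail)

    for _ in range(300):       # run a few hundred frames for the sketch
        start = time.perf_counter()
        render_frame(detail)
        elapsed = time.perf_counter() - start
        if elapsed > TARGET:
            detail = max(0.25, detail * 0.9)    # over budget: drop detail a little
        elif elapsed < 0.5 * TARGET:
            detail = min(1.0, detail * 1.05)    # plenty of headroom: restore detail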

The scenes were fully textured 3D objects with lighting, or 2D abstract animations. The choice was up to the animator. The player was given specific control of colors/textures, lighting, camera moves, shape changes and behavior.

Houdini took a technological leap during the IDC production because it now elegantly combines a 3D modelling tool with a live performance tool. Many of these new realtime features are now shipping in Houdini 2.5 and will be extended further in 3.0.

It was clear that we were breaking new ground in interactive live performance, as the turnaround time between iterations of animation was very short: animators were making content and refining it rapidly. For example, four of the 24 IDC animations were made by a complete Houdini novice in 9 days!

To our surprise the participant response was positive and expressive. The event was successful over its run of four nights.

Reflections on the Content

The realtime animations created for IDC were designed to work well as interactive installations. As much as possible, they conformed to the Ten Commandments of IDC. Some observations:

  • Everything the players were controlling had to be in-frame all the time. You can't do cinematic panning of subjects off- and on-screen; the player has to see it all the time. 
  • This isn't MTV, so a scene cutting 4 times a second won't do. Smooth movement that matches the musical pace works well. 
  • Every gesture the player makes with the input devices needs to produce a visible and an audible change, and needs to be repeatable. 
  • Ever tried to learn a musical instrument in 10 seconds? By providing pre-made audio and visual parts, less needs to be learned. 
  • Get the player to a point quickly where they confidently ride & guide the visuals and sound - up on the surfboard and now perform! 
  • Everyone gets on a platform and thinks they're the drummer for Queen. But the animation movement was not percussive. Many of the effects take a second or two to sweep on, and a few more seconds to sweep off when the player lets go. The input devices sometimes did not suit the animation. 
  • There is a lot of stuff going on... the sound generated from other zones, the master rhythm tracks that change over time. The player looks at only their own screen, and hears the sound from their monitor. 
  • Some input devices, especially the on-off devices like stomp pads, suit the audio more flexibly than the visuals. 
  • It was important to consider the input devices - their resolution, their response time, the filtering needed on noisy devices, and zero-return devices like a car gas pedal. It is also important to have a variety of event-generating devices versus continuous-controller devices. A small sketch of this kind of filtering follows this list. 
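The filtering mentioned above and the slow sweep-on/sweep-off behavior can both come from the same simple idea: an exponential lag applied to the raw sensor value each frame. Here is a minimal sketch of that; the rise and fall times and the 30 Hz frame rate are assumptions, not the values used at the IDC.

    # Minimal sketch of a per-frame exponential lag: it smooths a noisy continuous
    # sensor, and the same mechanism makes an effect sweep on over a second or two
    # and sweep off after the player lets go. Time constants are illustrative.

    def make_lag(rise_time=1.0, fall_time=3.0, frame_rate=30.0):
        """Return a filter function; call it once per frame with the raw 0..1 sensor value."""
        state = 0.0
        def step(raw):
            nonlocal state
            t = rise_time if raw > state else fall_time
            alpha = 1.0 / max(1.0, t * frame_rate)   # fraction to move toward the target each frame
            state += (raw - state) * alpha
            return state
        return step

    smooth = make_lag()
    for raw in [1.0] * 60 + [0.0] * 120:   # player stomps for two seconds, then lets go
        value = smooth(raw)                # this smoothed value drives the visual parameter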

The players from the audience were faced with input devices, visuals and sound effects they had never seen before. That may be an overwhelming starting point, especially since players in the IDC are not going to shoot at things, are not racing, are not being scored and are not running from anything. They are just slipping into a groove. (In 1998, that's unusual!)

The pieces were designed as if one player would be in a zone for the whole song and would have enough time to learn what each scene did, what the controls did in each scene, and what effect their movements had on the sound and visuals.

But instead people were quickly jumping into the zones, slamming on things, not listening to the effect of each device and not looking at the visual effect of their actions! So we quickly simplified the visuals and sound. That's the beauty of Houdini: you can tune so much while it's playing, without any programming or tedious scripting.

Developing Animations On-Set and On-the-Fly

Greg: "I became committed to developing Houdini's CHOPs when we were doing live events, starting in 1994 with PRISMS, Houdini's predecessor. There we were, trying to type in scripts and match parentheses in math expressions during shows while the DJs were dishing it out and people were crawling all around us and spilling drinks on our keyboards. I then declared to myself: any tool that forces you to type scripts and expressions to change behaviors is a crappy tool for live performance. Hence the realization of CHOPs, a visual expression and programming language. CHOPs are making other math expression languages obsolete."
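CHOPs are Houdini's own node-based channel operators, so the sketch below is only a rough analogy in Python, not Houdini's API: the idea is to wire together small, reusable operators over channels of values instead of hand-typing math expressions during a show. The operator names and parameters here are invented for illustration.

    # Rough analogy only, not Houdini's CHOP networks or API: chaining small
    # operators over channels of values instead of typing expressions live.

    def midi_in(values):                 # stand-in for a MIDI input channel (0..127)
        return list(values)

    def rangemap(chan, lo, hi):          # map 0..127 into a parameter range
        return [lo + (v / 127.0) * (hi - lo) for v in chan]

    def lag(chan, amount=0.9):           # smooth sudden jumps so visuals sweep rather than snap
        out, state = [], chan[0]
        for v in chan:
            state = state * amount + v * (1 - amount)
            out.append(state)
        return out

    # "Wire" the operators together the way nodes are wired in a network:
    brightness = lag(rangemap(midi_in([0, 127, 127, 0, 0]), 0.0, 1.0))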

By the second day of performances at the IDC, we were spending less effort on making everything work, and we started to refine the content. It was kind of a surprise to be able to see out into the open again... with no wiring troubles, no software bugs... we had a chance to think of the art... and so many opportunities opened up in such a short time.

We started to make the sounds work better with the visuals. We started to make the visuals in one zone complement the visuals in other zones. And we started to use the incredible range of vivid, bright colors of the projectors from Digital Projections and Barco; we could look up at the 16x12 foot screens through the smoke machine's fog and arrive at a bolder, more graphical look for the visuals.

This was the moment when we wanted to freeze time and work on the content, to sit back and think about what we had created, and think of the literally thousands of relationships between the crowd's movements and reactions, the sounds, the visuals, the fog and the light show... how to turn the sensory rollercoaster into smoother tapestries. But that refinement was impossible... we had to roll through 4 shows in 4 nights, and squeeze all our refinements into the hours between shows, which we enthusiastically did. And we thrived on the chaos that we had unleashed... and took great amusement in watching the fun-house in action!

The IDC was an experiment, but it succeeded as a real production. It's a testament to the professionals involved and to the maturity of Houdini as a realtime production tool.

Most-Frequently-Asked-Question

    Why were there no goggles and head-mounted VR gear? 

    • You would look stupid. 
    • You would have fallen out of the gogo cage. 
    • It doesn't fit with the "shared experience" idea. 
    • Chicks don't dig guys with big HMDs on.