January 2, 2003



Since a large chunk of Indian cinema consists of romantic and family dramas, it has traditionally had little scope for SFX. It was only after a series of SFX-driven Hollywood blockbusters, like Star Wars and Jurassic Park, made a splash that a few techno-savvy Indian producers ventured into this area. The result: a new chapter opened up for the Indian film industry, with movies like Jeans, Virasaat and Hum Dil De Chuke Sanam (HDDCS).

In the South, SFX began to be used in the early 90s, mostly in mythological movies like Amman (Tamil) and Devi (Telugu). For Bollywood, it was a slow start. But as the new breed of filmmakers got experimental, a need for SFX arose in the late 90s. Among such movies, Sanjay Leela Bhansali's HDDCS is a good example.

Flying kites
You'd recall the song dheel de, dheel de de re bhaia from HDDCS, which is picturized with many people flying kites. The problem the cinematographer faced here was that, given the wind conditions, it was difficult to capture many kites in a single frame. So the kites were generated digitally and composited against a live background.

First, the live background (of many people acting as if they were flying kites) was shot. The positives of this film were scanned through a film scanner at a very high resolution. (In India, film positives are generally scanned at 2K resolution, that is, 2,000+ pixels across the width of each frame.) At 24 frames per second for feature film, a minute of footage is 1,440 frames, so the roughly 5 minutes of footage came to about 7,200 frames, which were scanned and then saved in RGB/.tga format. Obviously, huge amounts of disk space were required for this.
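The scale of such a scan is easy to estimate. A quick back-of-the-envelope calculation, assuming a 2K frame of 2048×1556 pixels and uncompressed 8-bit RGB (neither figure is stated in the article, both are common conventions), gives a sense of the disk space involved:

```python
# Rough estimate of the scan described above.
# Assumptions (not from the article): "2K" = 2048x1556 pixels,
# 8 bits per RGB channel, uncompressed frames.
WIDTH, HEIGHT, CHANNELS = 2048, 1556, 3
FPS = 24  # feature-film frame rate

frames_per_minute = FPS * 60           # 1,440 frames per minute of film
total_frames = frames_per_minute * 5   # ~5 minutes of footage

bytes_per_frame = WIDTH * HEIGHT * CHANNELS
total_gb = total_frames * bytes_per_frame / 1024**3

print(frames_per_minute)    # 1440
print(total_frames)         # 7200
print(f"{total_gb:.0f} GB")  # 64 GB
```

Tens of gigabytes was an enormous amount of storage for the late 90s, which is why "huge amounts of disk space" is no exaggeration.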

These digitized frames were ready for visual treatment.

The second step was to create the kites. This was done using 3DStudio Max and Lightwave. While the kites were being created, the live footage of the people flying kites (shot earlier) was used in the 3D viewport for framing reference. Next, the kites had to be rendered (3D meshes converted into images), which was done with an alpha channel matte. (For more on rendering, see Your SFX Primer, page 33.) The alpha channel is used in compositing or editing software to control an image's transparency level.

The outputs of the first two steps, the digitized frames of people (appearing to be) flying kites and the rendered kites, were put together. The shots were composited inside the multi-layer environment of Eyeon's Digital Fusion (a specialized compositing software). The rendered kite sequence was used as the foreground layer and the digitized frames of people as the background layer. Each set of kites was used as a separate layer while compositing.
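The layered compositing described above boils down to the standard alpha-over operation: each kite layer's matte decides where the kite covers what lies beneath it. A minimal NumPy sketch of the underlying math (not Digital Fusion's actual pipeline) on a toy two-pixel image:

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Porter-Duff 'over': blend a foreground onto a background
    using the foreground's alpha matte (0 = transparent, 1 = opaque)."""
    a = fg_alpha[..., None]                 # broadcast alpha across RGB
    return fg_rgb * a + bg_rgb * (1.0 - a)

# Toy 1x2-pixel frame: grey live-action background, plus a red "kite"
# layer whose matte is opaque only in the second pixel.
bg    = np.array([[[0.5, 0.5, 0.5], [0.5, 0.5, 0.5]]])
kite  = np.array([[[1.0, 0.0, 0.0], [1.0, 0.0, 0.0]]])
matte = np.array([[0.0, 1.0]])

# Each set of kites is a separate layer; composite the layers
# over the background one by one.
result = bg
for layer_rgb, layer_alpha in [(kite, matte)]:
    result = over(layer_rgb, layer_alpha, result)
# result[0, 0] stays grey (matte 0); result[0, 1] is the red kite.
```

With more kite layers, the same loop simply runs once per layer, which is exactly what a multi-layer compositor does behind its node graph.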

Chaand Chupa…
Another significant use of computerized SFX is in the song chaand chupa badaal mein, whose entire sequence was shot against a chroma (single-color, usually blue or green) background.

In the song sequence, a patch of clouds slowly covers the moon. Here, the entire backdrop, with the moon, patch of clouds and stars, was created using 3DStudio Max. Each was made as a separate layer. Later, the sequence was composited in Digital Fusion using the computer-generated backdrop as the background and the live shot as the foreground. Since the foreground was shot against chroma, the blue background was keyed out during compositing to reveal the computer-generated sky.
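Keying out a chroma background can be sketched in a few lines: wherever the blue channel clearly dominates, the live pixel is replaced by the CG backdrop. This shows only the core idea (real keyers handle soft edges and color spill); the threshold and pixel values are made up for illustration:

```python
import numpy as np

def chroma_key(fg, cg_bg, threshold=0.3):
    """Replace blue-screen pixels of the live foreground with the
    computer-generated background. A pixel is 'keyed out' when its
    blue channel exceeds the stronger of red/green by `threshold`."""
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    mask = (b - np.maximum(r, g)) > threshold   # True on the blue screen
    out = fg.copy()
    out[mask] = cg_bg[mask]
    return out

# Two pixels: an actor (skin tone) and the blue screen behind her.
live = np.array([[[0.8, 0.6, 0.5], [0.1, 0.1, 0.9]]])
sky  = np.array([[[0.0, 0.0, 0.2], [0.0, 0.0, 0.2]]])  # CG night sky
frame = chroma_key(live, sky)
# frame[0, 0] keeps the actor; frame[0, 1] becomes the CG sky.
```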

Fireworks
Another shot is the one at the end of the movie, when Aishwarya Rai and Ajay Devgan meet on a bridge and the camera slowly pulls out without any cut. Here, the director wanted fireworks in the background. It would have been very difficult to set off so many real fireworks at once and frame them in a single shot. So, digital effects were called in.

Thousands of fireworks were generated using the powerful particles and dynamics system in Alias/Wavefront's Maya. The advantage of using 3D-generated fireworks was immense: there was freedom to play around with their various parameters inside the software, such as customized colors, the lifespan of individual fireworks, speed, position and so on. But the challenge was to composite this complicated multi-layered shot. To do this, an animated digital mask was generated inside Digital Fusion. The entire shot had to be tracked frame by frame to adjust the speed and positions of all the layers together.
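A particle system of this kind can be sketched very simply: each spark gets a launch velocity, and its position is evaluated per frame under gravity. The sketch below is a hypothetical stand-in for the kinds of parameters (count, speed, lifespan) a package like Maya exposes, not its actual API:

```python
import math
import random

def burst(n=50, speed=3.0, gravity=-9.8, life=2.0, fps=24, seed=1):
    """Minimal particle sketch of one firework burst: n sparks launched
    radially from the origin, then falling under gravity. Returns a list
    of frames, each a list of (x, y) spark positions."""
    rng = random.Random(seed)
    sparks = []
    for _ in range(n):
        ang = rng.uniform(0, 2 * math.pi)   # random launch direction
        sparks.append((speed * math.cos(ang), speed * math.sin(ang)))
    frames = []
    for f in range(int(life * fps)):        # one entry per film frame
        t = f / fps
        frames.append([(vx * t, vy * t + 0.5 * gravity * t * t)
                       for vx, vy in sparks])
    return frames

frames = burst()
# 48 frames (2 s at 24 fps) of 50 spark positions each, ready to render.
```

Tweaking `n`, `speed` or `life` regenerates the whole burst instantly, which is exactly the kind of parametric freedom the article describes.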

After all the shots were rendered, the frames were exposed onto a film roll through a digital film recorder and then sent to the film-development lab for processing. And how much time did it take to do the SFX for HDDCS? About two and a half months, with a team of six to seven people.

Jaydip Sikdar worked on the SFX for HDDCS
