Event Documentation and Webcasting for Museums

At the Walker, we webcast many of our events live. It is a history fraught with hiccups and road bumps, but doing so has given our audiences the opportunity to watch lectures, artist talks, and events live from home or even abroad. More importantly, webcasting has focused our technique for documenting events. In the broadcast world, “straight to tape” describes programs, such as late-night talk shows, that are directed live and sent straight to videotape, free of post-production. For the most part, we also try to minimize our post-production process, which allows us to push out content relatively quickly before moving on to the next show.

At the heart of our process is a Panasonic AV-HS400 video mixer, which accepts both an HD-SDI camera feed and a VGA feed from the presenter’s laptop. The video mixer allows us to cut live between the speaker and his or her presentation materials, using either fades or straight cuts. In addition, the mixer’s picture-in-picture capability allows us to insert presentation materials into the frame, next to the speaker. Doing so gives viewers both the expressiveness of the presenter and the visual references the live audience is seeing. One thing to note: if a speaker begins moving around the stage, it becomes difficult to frame a picture-in-picture, so the technique works best when presenters stand still.


The camera we use is a Sony PMW-350K, part of the XDCAM family. We shoot from the back of the room in all of our public spaces, putting a lot of distance between the camera and the subject. As a result, we need all the zoom our lens can give. Our current lens is a Fujinon 8mm–128mm (16x), but realistically we could use something longer for better close-ups of the speaker. This is an important factor when considering cameras: where will your camera be positioned relative to the subject, and how much reach do you need to get a good shot? Having a camera close to the speaker isn’t always practical with a live audience present, so many shooters push the limits of their lenses. Working at that distance also puts a lot of strain on a tripod head. It is very easy to jiggle the frame when making slight camera moves at full zoom, so a good tripod head should go hand in hand with a long video lens.
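
For a rough sense of how focal length translates into framing, here is a back-of-the-envelope sketch. It assumes the PMW-350K’s 2/3-inch sensor (roughly 9.6mm wide) and a hypothetical camera-to-podium distance; neither number is a measured Walker figure.

    # Approximate horizontal frame width: distance * sensor_width / focal_length
    # (thin-lens approximation, good enough for framing estimates).
    sensor_width_mm = 9.6        # assumed 2/3-inch sensor width
    distance_m = 20.0            # hypothetical camera-to-speaker distance

    for focal_mm in (8, 64, 128):
        frame_width_m = distance_m * sensor_width_mm / focal_mm
        print(f"{focal_mm:3d}mm lens: frame ~{frame_width_m:.1f}m wide")

    # At 128mm the frame is only ~1.5m wide from 20m away, roughly a
    # waist-up shot, which is why a longer lens would buy tighter close-ups.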

For audio, the presenter’s microphone first hits the house soundboard and then travels to our camera, where levels are monitored and adjusted. From there, the audio and the camera’s images travel together through a single HD-SDI BNC cable to our video mixer, where the two signals part ways once again. This happens because the mixer draws audio only from whatever source is selected; if a non-camera source is up, such as the PowerPoint feed, no audio is present. To resolve this, an HD-SDI direct out from the mixer’s camera source feeds a device that re-embeds the audio into the final mixed video signal. The embedding device we use is an AJA FS-1 frame synchronizer.


With the frame synchronizer now kicking out a finished program, complete with embedded audio, our AJA Ki Pro records the content to an Apple ProRes file. We use a solid-state drive module as media, which pops out after an event is over and plugs directly into a computer for file transfer. An important thing to remember for anyone considering a mixer: an external recording device is necessary to capture the final product.

To webcast, the FS-1 frame synchronizer simultaneously sends a second finished signal to our Apple laptop. The laptop is outfitted with a video capture card, in our case a Matrox MXO2 LE breakout box that attaches via the ExpressCard slot. Once the computer recognizes the video signal, it is ready for webcasting. The particular service we use is Ustream. A link to our Ustream account is embedded in the Walker’s video page, The Channel, and viewers can watch the event live through their browsers. Live viewership runs the gamut from just a few people to more than 75. Design-related programs, like the popular lecture by designer Aaron Draplin in March, tend to attract the biggest audiences. Once an event has concluded, Ustream stores a recording within the account. We have the option to link to this recorded Ustream file through our website, but we don’t. Instead, we try to quickly process our own recording to improve its quality before uploading it to YouTube.


The most frustrating part of our webcasting experiment has been bandwidth. The Walker has very little of it, so we webcast over a DSL line shared with our FTP server. The upload speed on this line tops out at 750 kbps. In practice, we get more like 500 kbps, leaving us to broadcast at around 400 kbps. These are essentially dial-up numbers, which means the image quality is poor and our stream is periodically lost, even with the bit rate kept down. Viewers at home are therefore prone to multiple disruptions while watching an event. We hope to increase our bandwidth in the coming months to make the service more reliable.
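
The arithmetic behind that 400 kbps figure is simple: stream well below what the line actually delivers so that momentary dips don’t drop the broadcast. A quick sketch, with the safety margin being our own rule of thumb rather than a Ustream requirement:

    # Pick a stream bit rate that leaves headroom under the measured upload speed.
    measured_up_kbps = 500   # real-world upload on our nominal 750 kbps DSL line
    headroom = 0.8           # assumed safety margin against traffic spikes

    stream_kbps = measured_up_kbps * headroom
    print(stream_kbps)       # 400.0, roughly what we push to Ustream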

Earlier I mentioned that the Walker does as little post-production as possible for event documentation, but we still do some. Once the final ProRes file is transferred to an editing station, it is opened in Final Cut 7. The audio track is then exported as a stand-alone stereo file and opened in Soundtrack Pro, where it is normalized to 0 dB and given a layer of compression. With live events, speakers often turn their heads or move away from the microphone, which can make audio levels uneven. Compression helps bring the softer moments in line with the louder ones, limiting dynamic range and delivering a more consistent product.
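
For readers who want the concept rather than the Soundtrack Pro workflow, here is a minimal sketch of what peak normalization and compression do to a signal. It is a deliberately simplified per-sample compressor (real ones smooth the gain with an envelope follower), and the threshold and ratio values are illustrative, not the settings we use:

    import numpy as np

    def compress(x, threshold_db=-20.0, ratio=3.0):
        """Scale down samples above the threshold, shrinking the gap
        between loud and soft passages (reduced dynamic range)."""
        level_db = 20 * np.log10(np.abs(x) + 1e-10)
        over_db = np.maximum(level_db - threshold_db, 0.0)
        gain_db = -over_db * (1.0 - 1.0 / ratio)
        return x * 10 ** (gain_db / 20)

    def normalize_peak(x):
        """Scale so the loudest sample sits at 0 dBFS (full scale)."""
        peak = np.max(np.abs(x))
        return x / peak if peak > 0 else x

    # audio: float samples in [-1, 1] loaded from the exported stereo file
    # processed = normalize_peak(compress(audio))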

After the audio track is finished, it is dropped back into the timeline and the program’s front and back ends are trimmed. We try to cut out all topical announcements and unnecessary introductions; viewers don’t need to hear about this weekend’s events two years from now, so we don’t waste their time with them. In addition to tightening up the top of the show, an opening title slide with the program’s name and date is added. The timeline is then exported as a reference file and converted to an MP4 with the shareware program MPEG Streamclip.

MPEG Streamclip is a favorite of mine because it lists the final file size and lets users easily adjust the bit rate. With a 2GB file-size limit on YouTube uploads, we try to maximize the bit rate (typically 1800–3000 kbps) for our 1280 x 720 files. Encoding with a constant bit rate instead of a variable one also saves us a lot of time. With our events averaging 90 minutes, the sacrifice in image quality seems justified considering how long an HD variable-bit-rate encode can take.
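
The file-size math is worth spelling out, since it explains the top of that bit-rate range. Below is a sketch of the arithmetic, plus a rough open-source stand-in for the MPEG Streamclip step using ffmpeg; this is not our toolchain, and the file names, trim point, and audio settings are assumptions for illustration.

    import subprocess

    # How high can a constant bit rate run before a 90-minute program
    # busts YouTube's 2GB cap?
    runtime_s = 90 * 60                  # typical event length
    cap_bits = 2 * 1000**3 * 8           # 2GB expressed in bits
    audio_kbps = 128                     # assumed stereo AAC track

    max_video_kbps = int(cap_bits / 1000 / runtime_s - audio_kbps)
    print(max_video_kbps)                # ~2834, inside the 1800-3000 range

    subprocess.run([
        "ffmpeg",
        "-ss", "00:02:30",               # hypothetical trim past announcements
        "-i", "lecture_prores.mov",
        "-vf", "scale=1280:720",
        "-c:v", "libx264",
        "-b:v", f"{max_video_kbps}k",
        "-maxrate", f"{max_video_kbps}k",  # hold the rate steady (near-CBR)
        "-bufsize", f"{2 * max_video_kbps}k",
        "-c:a", "aac", "-b:a", f"{audio_kbps}k",
        "lecture_720p.mp4",
    ], check=True)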

Once we have the final MP4 file, it is uploaded to YouTube and embedded in the Walker’s video page.
