
How to manage and integrate onsite and offsite audience and presenter interactions in live webcasts.

Recording a presentation delivered to a live audience for later on-demand viewing is nothing new. Neither is webcasting live presentations. But the norms for presentations are becoming less and less clear, as online participant interaction is incorporated into the live audience's experience and presenters deliver their talks concurrently to onsite and online audiences.

In this article I will share two very different methods for achieving this by describing streams I produced for two different clients.

Time-Shifting a Live Production

The ability to manipulate the space-time continuum is the basis for many works of science fiction. For an observer standing on Earth, time and space are relative, but because the speed of light is almost 300 million meters per second, we will never experience noticeable relativistic time dilation unless we leave Earth. Einstein's theories of relativity have been confirmed by experiments in which atomic clocks sent into space on rockets return slightly behind identical clocks left on Earth, demonstrating that it is possible to experience time at a different rate than you would if you remained on the ground.

This phenomenon is integral to the storyline of Interstellar, where characters age at different rates relative to what we experience on Earth, depending on how fast they travel through space.

As much as the space-time continuum is fascinating to discuss and manipulate in works of fiction, on Earth, real-world video producers have to find other ways to control time.

My first foray into time-shifting live productions was my most complicated to date. Four people took turns presenting, and each presenter also showed a video or slides and was assigned her own dedicated camera to look into. For audio, each wore a wireless lavaliere microphone. On the surface this might not sound too complicated; four-camera live switches with video and slide playback are the bread and butter of many event production companies and aren't much more complicated than a two-camera live switch--you just have to manage more camera and computer inputs. What made this gig uniquely challenging was that instead of the four presenters being in one room, three were in my studio near Vancouver, B.C., while the fourth was joining us from a studio in Germany.

In our initial tests, the client impressed on us how important it was to have each of the four presenters appear equal in terms of their video and audio quality. The remote presenter (and client of my client) was the CEO of a company that, as of this writing, has a valuation just shy of $100 billion. A satellite connection for the fourth participant was quickly ruled out because none of us had that expertise. We couldn't deliver the presentation via Skype because the client didn't feel the connection would be of high enough quality.

The solution was to connect the remote presenter in Germany to my studio in Canada via a high-definition webcast on Ustream Enterprise (Figure 1). The incoming webcast feed went into the video switcher in my studio, which also handled all the live camera, video playback, and presentation slide feeds. The HD quality of the incoming webcast signal was sufficient, and the quality gap between the live camera feeds and the webcast feed narrowed further once the program output was rebroadcast to a global online audience.

Now here is where the time dilation requirement comes into play. A "live" webcast has a broadcast latency of 15-30 seconds before it is received by viewers. This meant there was no way to cut between the in-studio Canadian cameras and audio and the German webcast audio and video in real time. We had no ability to mess with time dilation, so we did the next best thing: we delayed our live in-studio video and audio to match the delay on the German webcast feed, and then cut the newly resynchronized sources in "real time"--15 seconds after it happened.

The technology I used for the delay was the NewTek 3Play 425 instant replay system (Figure 2). After we matched the timing of the delay manually, we crossed our fingers that the latency would not drift, which fortunately it did not. This only solved the synchronization of both feeds relative to each other--albeit delayed by 15 seconds--but we still needed the presenters to have conversations in real time on both ends so that they knew when it was their turn to speak. This was also important for the Q&A portion of the presentation when real (unplanted and unrehearsed) questions were read off-camera in Canada and presenters in both countries were asked to respond in turn.
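
To make the buffering concept concrete, here is a minimal Python sketch of how a fixed delay can be applied to a local feed by holding frames in a buffer before passing them on. This only illustrates the principle; it is not how the 3Play implements its delay, and the frame source and switcher functions are hypothetical placeholders.

```python
# Minimal sketch of delaying a local feed to match an incoming webcast's latency.
# This illustrates the buffering concept only; it is not how the NewTek 3Play
# works internally. get_local_frame() and send_to_switcher() are hypothetical
# placeholders for a real frame source and destination.
from collections import deque

FPS = 30                          # assumed frame rate of the local program feed
DELAY_SECONDS = 15                # measured latency of the incoming webcast feed
BUFFER_FRAMES = FPS * DELAY_SECONDS

def delay_feed(get_local_frame, send_to_switcher):
    """Hold each local frame for BUFFER_FRAMES frames before passing it on."""
    buffer = deque()
    while True:
        buffer.append(get_local_frame())        # newest frame in
        if len(buffer) > BUFFER_FRAMES:
            send_to_switcher(buffer.popleft())  # oldest frame out, ~15 seconds later
```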

The Canadian studio presenters could hear each other, since they were in the same room, but connecting real-time audio between Canada and Germany wasn't possible through the webcast feeds because of the delay. The solution came in the form of a phone line connection between the two sides using a phone bridge, with local playback on a small stage monitor (speaker).

It all came together, and although I definitely wish I had another chance at getting it just perfect or actually having a rehearsal before we went live, we successfully synchronized two sets of live and interactive video sources with a massive latency difference between the two (Figure 3 on the next page).

Operating the video switcher post-delay was a bit surreal. If I focused, I could see or hear into the video switcher's future by listening to the live audio feed on the phone bridge or by watching the live presenters either in person or on the monitor of the 3Play 425 pre-delay.

Of course, this also meant that I could see any mistakes the upstream real-time video switcher operator had just made. Unfortunately, I had no ability to correct a bad switch; downstream, I did not have a fallback delayed video signal to switch to. In our workflow, the second delayed signal through the instant replay machine was dedicated to a quad-view display that the client requested. We built this view with the Matrox MicroQuad, using the post-delay Canadian video feeds alongside the incoming German webcast feed, which already carried its own built-in webcast delay.

Needless to say, this was an extremely complicated setup, one that was made even more complicated by the introduction of an incoming high-latency webcast video feed and our desire to hide the fact that a remote presenter was not in the same studio as the other presenters. Nowadays, when we get asked to connect remote presenters with live presenters, we stick strictly to low-latency video connections so the audio and video can travel both ways in real time (Figure 4), and we embrace and do not try to hide the fact that some of the presenters and a portion of the audience are joining the live presentation online.

Integrating the Onsite and Online Webcast Audience Experiences

As live webcasts and webinars become more and more successful, it's not uncommon for us to be broadcasting content from a conference room with "only" a few hundred in-person attendees but with a total reach, including the online audience, that is two or three times that size. This is especially true with professional and regulatory agencies that need to connect with members located in every corner of our province who can't easily travel to the larger urban centers for the regular education updates we often take for granted. We encourage our clients to consider the online viewer experience when they plan their events. These can be simple gestures, such as welcoming the online audience at the outset, keeping housekeeping announcements off the webcast, and pausing if there is a technological interruption in the live video feed so online viewers don't feel like they're missing out.

With recent technological advancements, we prefer incoming remote presenter feeds to come in via Skype, Google Hangouts, or similar webcam chat services. Since we produced the previously discussed Canada-Germany webcast, both of these platforms have improved their video quality to HD. Skype's documentation lists its HD video calling requirement (bit.ly/lKnV6aw) as 1.2Mbps-1.5Mbps, and Google lists 2.6Mbps as ideal in its Hangouts video requirements (bit.ly/JLOODH). Neither specifies whether "HD" means 720p or 1080p, but 1.5Mbps-2.6Mbps is a decent video bandwidth, especially considering I still default to 720p30 webcasting at 2.5Mbps for most of my outgoing webcast productions.
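
To put those numbers side by side, here is a quick back-of-envelope calculation in Python that converts each quoted bitrate into bits per pixel at 720p30, a rough yardstick for comparing video quality at a fixed resolution and frame rate.

```python
# Back-of-envelope comparison of the bitrates quoted above. Bits per pixel
# (bitrate / (width * height * fps)) is a rough quality yardstick; the 720p30
# assumption matches my default outgoing webcast settings.
def bits_per_pixel(bitrate_mbps, width=1280, height=720, fps=30):
    return bitrate_mbps * 1_000_000 / (width * height * fps)

for label, mbps in [("Skype HD (low)", 1.2), ("Skype HD (high)", 1.5),
                    ("Hangouts ideal", 2.6), ("My 720p30 webcast", 2.5)]:
    print(f"{label:20s} {bits_per_pixel(mbps):.3f} bits per pixel")
```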

The improvement of video chat technology and the widespread availability of its basic requirements--a webcam, a laptop with a dual-core processor faster than 2GHz, and a high-speed internet connection--have made it easier to bring in remote presenters who otherwise would not be able to present at a conference, be it for cost, travel, or time restrictions. In my video business, we have experienced an increase in demand to have remote presenters join live presenters via video chat.

I've even used this technology in my own professional life. Last month I asked my brother to present remotely from Japan, where he now lives, to my local professional video association, the BCPVA (www.bcpva.com). He talked about his successes as a YouTube producer (bit.ly/lH7lXfK); in only 10 months he has gained more than 100,000 subscribers and 7 million views, and his most-viewed video went viral with 3 million views.

Challenges in the Audio Workflow

When connecting a remote presenter to an outgoing webcast or webinar production, I prefer to receive the incoming video chat signal on its own laptop and feed that signal directly into my video switcher via HDMI, for video only. The audio is connected to a soundboard via a DI direct box that lifts the electrical ground and prevents unwanted audio buzz. The audio from the remote video chat presenter and the local audio are then routed into my video switcher, before being hardware-encoded for the webcast or webinar.
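
As a quick summary of that routing, here is the signal chain written out as a small table in Python; the device labels are descriptive placeholders for the setup described above, not product names.

```python
# The signal chain described above, written out as a simple routing table.
# Device labels are descriptive placeholders for illustration only.
SIGNAL_CHAIN = [
    ("video chat laptop (HDMI out, video only)", "video switcher"),
    ("video chat laptop (headphone out)",        "DI box (ground lift)"),
    ("DI box (ground lift)",                     "soundboard"),
    ("local presenter microphones",              "soundboard"),
    ("soundboard (program mix)",                 "video switcher (embedded audio)"),
    ("video switcher (program out)",             "hardware encoder / webcast or webinar"),
]

for source, destination in SIGNAL_CHAIN:
    print(f"{source:45s} -> {destination}")
```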

On webinar platforms such as Adobe Connect, we have the ability to allow remote presenters to send their audio and video directly to the webinar audience, in addition to the live video cameras and audio that we mix and send, but this can really complicate the audio workflow. If you're not careful, you can create an audio feedback loop. The problem is that if your audio technician is amplifying the local audio and the incoming video chat audio through loudspeakers for the live audience, they have to be very careful not to send you back the same signal. The webinar audience already has a direct feed of the remote presenter, and if you send this same signal back into the outgoing webinar feed, you create a self-perpetuating echo that turns into a feedback squeal almost instantly.

A talented audio technician with the right equipment (a sound board with subgroups) can avoid sending you a redundant audio signal, but it is much easier and less risky to keep the video and audio in a different workflow. This isn't the only situation where you have to be really careful about not creating audio feedback loops. If a remote presenter has his or her speakers turned up too high and the webinar or video chat noise cancellation can't keep up, your local audio could feed back into the remote outgoing audio. This is the reason radio hosts ask callers to turn down their radios.
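
One way to think about what a board with subgroups gives you is a mix-minus: each return feed contains every source except the one that originated at that destination. Here is a minimal Python sketch of the idea, with made-up source names and silent placeholder buffers; it is a conceptual illustration, not a description of any particular console.

```python
# Conceptual mix-minus sketch: each return feed gets every source EXCEPT the one
# that originated at that destination, which avoids sending a presenter's own
# audio back to them. Source names and silent buffers are made up for illustration.
import numpy as np

def mix_minus(sources, destination):
    """Sum all audio sources except the one coming from `destination`."""
    return sum(signal for name, signal in sources.items() if name != destination)

sources = {
    "local_presenters": np.zeros(48000),   # placeholder one-second buffers at 48 kHz
    "remote_presenter": np.zeros(48000),
    "video_playback":   np.zeros(48000),
}

to_remote_presenter = mix_minus(sources, "remote_presenter")  # no echo back to them
to_webcast_mix      = sum(sources.values())                   # full program mix
```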

My preference is that remote presenters wear headphones (most people have a pair of the in-ear type that come with cellphones) to avoid creating a feedback loop and to improve the overall audio quality, since the noise cancellation doesn't have to work as aggressively. Of course, I would also prefer that presenters place their laptops at eye level so we aren't looking up their nostrils, and that they position themselves in a quiet room with soft front lighting. I find that external webcams, such as the highly rated Logitech C920 HD Pro Webcam, do a much better job than built-in laptop webcams for both audio and video.

If the environment is noisy, then having the presenter wear a webcam headset with a boom mic (USB or mini audio connector) will improve audio quality significantly. The same audio feedback problem can happen if the remote presenter audio levels are too high, the local microphones have an open polar pickup pattern (such as an omnidirectional or PZM/Boundary microphone), and the audio technician has left their feed open when not in use. Ensuring that all local presenter microphones are handheld-style (whether on stands or not)--with a more selective cardioid polar pattern--will virtually eliminate yet another potential audio feedback source.

Taking more control of remote presenter audio and video by isolating it from the live webinar signal also gives you the opportunity to provide the presenter with valuable feedback before you take them live, and it prevents the audience from seeing the usual physical webcam adjustments presenters make so they look good on camera. There is nothing worse than a remote presenter launching a webcam feed live, only for you to realize they are positioned in front of a bright window (and horribly backlit) or have made some other regrettable and avoidable mistake.

Of course, I would prefer that presenters didn't use a webcam at all but instead had a videographer actively filming them from their remote location. The easiest way to get an HDMI or HD-SDI signal into a laptop is with HDMI-to-USB or HD-SDI-to-USB capture hardware. I personally use the Magewell USB Capture HDMI for this purpose, as it works for both Adobe Connect webinars and Ustream webcasts and doesn't require the broadcasting workstation, desktop computer, or laptop to do as much of the heavy lifting of video processing as some capture cards that don't do any scaling. Webcams comply with the USB video class (UVC) device protocol, and hardware such as the Magewell USB Capture line that meets the UVC protocol is among the few options that will allow you to connect professional video cameras to a webinar broadcast.
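
Because UVC devices enumerate like ordinary webcams, any webcam-aware application can read them. As a rough illustration, the following Python sketch grabs a frame from such a device using OpenCV; the device index and the 720p request are assumptions that will vary by system.

```python
# Minimal frame grab from a UVC capture device using OpenCV
# (pip install opencv-python). Because UVC devices enumerate as webcams,
# the same code works for a built-in webcam or a Magewell-style HDMI-to-USB box.
# The device index (0) and the 720p request are assumptions for this sketch.
import cv2

cap = cv2.VideoCapture(0)                    # first video device on the system
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)      # request 1280x720 from the device
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)

ok, frame = cap.read()
if ok:
    print("Captured a frame:", frame.shape)  # e.g. (720, 1280, 3)
cap.release()
```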

Advancing the Slides

When producing live webinars or webcasts with remote presenters, it is important to decide who advances the slides. Typically, in webcasts we send only a single video feed and cut between the camera feeds and the slides, but with webinars, you show slides and video at the same time.

We usually have remote presenters advance their own slides for the webinar audience, and we advance the same slides locally from a local laptop, matching the timing of the webinar slides. Of course, this requires that you obtain the slides from the remote presenter in advance. It isn't a bad idea to have a backup presenter who can speak to the remote presenter's slides if the primary presenter is unavailable at the last minute.

In our last webinar production with a remote presenter, this very thing happened: our presenter's return flight home was delayed after take-off, and he was still in the air when it was his turn to present. Fortunately, the client had prepared for this possibility (and for the possibility that the technology connecting his video and audio would fail), and a replacement speaker, who was attending locally and was familiar with the content, delivered the presentation. The original presenter later joined the live and online audience for the Q&A to fill in any gaps. With webcasts, we have a local person rehearse with the remote presenter and advance the slides for the live audience; we split that signal and take a feed to our video switcher for the webcast audience.

Handling Q&A

Q&A with both a live and an online audience is another area where a bit of planning, coordination, and rehearsal comes in handy. Webcasts, being one-way video communication, don't give the audience an easy way to interact with other viewers or to submit questions. Ustream does have a chat window that viewers can use with a Facebook login or a free Ustream registration, but we have found that many businesspeople do not want to mix their private Facebook lives with their business personas, nor do they want to take the extra step of registering a free account so that they can chat with their name attached to their comments.

Enabling a public chat allows chatting without any registration, but the username is then an assigned name such as user12345, and this forced anonymity makes it hard to follow up with users offline without asking them to share their contact info in public. As a workaround, we encourage our webcast clients to create a simple custom Google form that they can embed on their website below the webcast video. Viewers can submit questions, along with their contact details, and the questions go to a Google Sheet in the creator's Google Drive account. There, the client can curate the questions for a moderator to read into a microphone, alternating with questions from the live audience.
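
As an example of how that curation step might be scripted, here is a hedged Python sketch that reads the submitted questions from the linked Google Sheet using the third-party gspread library; the credentials file, sheet name, and column headers are assumptions for illustration, not part of the workflow described above.

```python
# A sketch of pulling submitted questions out of the linked Google Sheet for the
# moderator, using the third-party gspread library (pip install gspread).
# The credentials file, sheet name, and column headers ("Question", "Email")
# are assumptions based on the form described above.
import gspread

gc = gspread.service_account(filename="credentials.json")  # service-account auth
worksheet = gc.open("Webcast Questions").sheet1

for row in worksheet.get_all_records():       # one dict per form submission
    question = str(row.get("Question", "")).strip()
    contact = row.get("Email", "")
    if question:
        print(f"{question}  (follow up: {contact})")
```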

With Adobe Connect webinars, it is much easier to enable both chat and viewer questions in the specialized pods (Figure 5). Both the webinar and webcast solutions allow organizers to follow up offline, at a later time, on questions that didn't get asked or answered.

Managing the blurred lines of presentations with mixed live and online presenters and audiences takes a lot of planning. Going through the exercise of committing connection diagrams to paper and thinking through potential audio loops goes a long way toward preventing unwanted feedback, both through the loudspeakers and from the online audience.

Shawn Lam (video@shawnlam.ca) is a professionally accredited and multi-award winning Vancouver-based video producer and technical director. He has written more than 50 articles for Streaming Media Producer and EventDV and is a respected video production expert and sought-after product reviewer. Comments? Email us at letters@streamingmedia.com, or check the masthead for other ways to contact us.
