Hello,
The mechanisms for supporting multiple video and audio feeds seem clear when a single encoder is encoding all of the feeds. In our environment, however, co-locating all of the encoders is impossible. We've built a custom player that synchronizes the feeds, based on the PIP and multiple-camera-switching examples we found on the IIS Media Services sites/forums. We're attempting to synchronize on the time at which each encoder starts, but we're having trouble getting playback synchronized consistently. We're using the Expression Encoder API for the encoding.
We don't need synchronization to the tick, but we're hoping to get the feeds within roughly 0.1 to 0.25 seconds of one another.
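For context, the player-side alignment boils down to a start-time offset: every feed is shifted by the difference between its reported start time and the earliest reported start time. Here is a simplified sketch of that calculation (FeedInfo and FeedAligner are placeholder names, not our actual player classes):

using System;
using System.Collections.Generic;
using System.Linq;

public class FeedInfo
{
    public string Name { get; set; }
    public DateTime StartTimeUtc { get; set; }   // recorded when the encoder reached StreamingStarted
}

public static class FeedAligner
{
    // Returns, per feed, how much later than the earliest feed it started.
    // For two feeds this reduces to the manual calculation we do today:
    // offset = Feed2.StartTime - Feed1.StartTime, and the earlier-starting
    // feed is seeked forward by that amount so both feeds show the same
    // wall-clock instant.
    public static Dictionary<string, TimeSpan> ComputeOffsets(IEnumerable<FeedInfo> feeds)
    {
        var list = feeds.ToList();
        DateTime earliest = list.Min(f => f.StartTimeUtc);
        return list.ToDictionary(f => f.Name, f => f.StartTimeUtc - earliest);
    }
}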
One of the challenges is understanding all of the sources of delay. Currently, each encoder calls a web service to record its feed's start time once the encoder's status reaches StreamingStarted. All encoders use the same encoding profile and report to the same machine, so variation between the encoders' system clocks isn't a factor. Yet with two encoders pointed at the same digital clock, and the feeds then synchronized manually (Feed2.StartTime - Feed1.StartTime), we see a variation of up to several seconds from one run of this test to the next. For example, in test #1 the feeds are within 1 second of each other after synchronizing; in test #2 they are about 3 seconds apart. Any guidance on how to achieve better results?
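For reference, the reporting step on each encoder is essentially the following (a simplified sketch; the endpoint URL and class name are placeholders, and the wiring into the Expression Encoder status handler is omitted):

using System;
using System.Net;

public class StartTimeReporter
{
    private readonly string _endpoint;   // placeholder, e.g. "http://timehost/ReportStartTime.ashx"
    private readonly string _feedName;

    public StartTimeReporter(string endpoint, string feedName)
    {
        _endpoint = endpoint;
        _feedName = feedName;
    }

    // Called from our status-change handler the first time the encoder
    // reports StreamingStarted. The web service timestamps the request on
    // its own clock when it arrives, so the encoders' system clocks never
    // matter. What this cannot account for is how much capture/encode
    // latency has already accumulated before StreamingStarted fires, or the
    // network delay of the call itself.
    public void Report()
    {
        using (var client = new WebClient())
        {
            client.Headers[HttpRequestHeader.ContentType] = "application/x-www-form-urlencoded";
            client.UploadString(_endpoint, "feed=" + Uri.EscapeDataString(_feedName));
        }
    }
}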
Specifically:
- Are there any mechanisms we could extend to allow multiple encoders to connect to the same publishing point?
- Are there any mechanisms we could extend that would allow us to override the encoding code to synchronize on ingest?
- Any mechanisms to synchronize the streams at the server? We've tried creating custom CSM files without much success. Should we continue pursuing this option even when live coverage is important?
- Any other ideas about the sources of this variation that we need to consider when synchronizing?
Thanks
Dev