Multiple live streams

Hey Guys,

I want to get your take on this scenario. We have been doing live streaming for an event the past couple of times it has run. The client wants to be able to do two live feeds this time.

I came up with two solutions.

  1. Two cameras, each going to a different laptop that sends its own stream, so my Flash app would basically have two incoming live streams.

  2. Some sort of hardware encoder that takes two video inputs and merges them into one output, so what goes out is ultimately a single stream with both cameras combined.

I would like to know your thoughts, thanks.

Hi, when you say “single stream of the cameras merged”, do you mean watching both video streams at the same time?

Like a VideoWall?


I’m in the middle of trying to do exactly the same thing: two streams into one player, displayed fullscreen on an overhead projector. I’m using XSplit to capture the RTMP streams and mix them into one feed. Wirecast links to XSplit seamlessly and can broadcast the mixed feed via RTSP to a media player such as VLC, where it can be shown on the projector. It all sounds really neat, but I’m having problems with video lag between Wirecast and VLC. I think it might be the hardware I’m testing it on, though: a Core i3 with 4 GB of RAM. The real job will be done on a Core i7 with 8 GB.

Backing up a little, showing two streams is not difficult. You can use two players in a web page and show a different video in each one. Combining the streams is probably more than you need.
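A minimal player for that can be very small: embed the same SWF twice in the page and hand each copy a different stream name through flashvars. A rough sketch only, with the RTMP URL, application name, and the “stream” flashvar all as placeholders:

```actionscript
// Minimal live-player sketch: embed this SWF twice and pass each copy
// a different stream name, e.g. stream=camera1 and stream=camera2.
package {
    import flash.display.Sprite;
    import flash.events.NetStatusEvent;
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    public class SimpleLivePlayer extends Sprite {
        private var nc:NetConnection;
        private var streamName:String;

        public function SimpleLivePlayer() {
            // which live stream to show comes from flashvars
            streamName = String(loaderInfo.parameters["stream"] || "camera1");
            nc = new NetConnection();
            nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
            nc.connect("rtmp://yourserver/live");          // placeholder URL
        }

        private function onStatus(e:NetStatusEvent):void {
            if (e.info.code != "NetConnection.Connect.Success") return;
            var ns:NetStream = new NetStream(nc);
            ns.client = { onMetaData: function(md:Object):void {} };
            var video:Video = new Video(480, 360);
            video.attachNetStream(ns);
            addChild(video);
            ns.play(streamName);                           // live stream name
        }
    }
}
```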

Richard

If two streams are involved, I think cue-points are going to be a necessary part of the solution.

There are a lot of different scenarios that could have different solutions. If the two videos have exactly the same encoding and their start times are synced, you could try just playing them at the same time. If one video is really just a series of relatively static images, like a PowerPoint presentation, that needs to be synced to a speaker in another video, that’s easy to do with cue-points.
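In the player, that cue-point case can be as small as a handler on the speaker stream’s client object. A sketch, assuming both NetStreams are already set up and playing, and with the “slideTime” parameter name made up for illustration:

```actionscript
import flash.net.NetStream;

// Each cue point embedded in the speaker video carries a (hypothetical)
// "slideTime" parameter saying where to seek in the slides video.
function syncSlidesToCuePoints(speaker:NetStream, slides:NetStream):void {
    speaker.client = {
        onMetaData: function(md:Object):void {},
        onCuePoint: function(cue:Object):void {
            if (cue.parameters && cue.parameters.slideTime != undefined) {
                slides.seek(Number(cue.parameters.slideTime));
            }
        }
    };
}

// usage, once both streams are playing:
// syncSlidesToCuePoints(speakerStream, slidesStream);
```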

Richard

You can insert cue-points live from the client with NetStream.send(), and from the server into a live stream with this method:

http://community.wowza.com/t/-/238#2

I made something a while back that involved an FLV converted from PowerPoint with Captivate, and a live stream where the speaker could flip the pages using NetStream.send(). It was a little involved and depended on Captivate’s output, but it speaks to my point that there are a lot of possible scenarios. A lot of synchronization efforts have come up short; many have tried, with little or no success, to replay a live chat, for example.
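Very roughly, the two sides look something like this. This is only a sketch, not the code from that project; the “flipPage” handler name and the one-page-per-N-seconds layout of the slides FLV are assumptions:

```actionscript
import flash.net.NetStream;

// --- speaker's machine, on the NetStream that is publishing the live feed ---
function flipTo(pubStream:NetStream, page:Number):void {
    pubStream.send("flipPage", page);     // the call travels inside the stream
}

// --- viewer's player, on the NetStream that is playing the live feed ---
function listenForFlips(liveStream:NetStream, slides:NetStream,
                        secondsPerPage:Number):void {
    liveStream.client = {
        onMetaData: function(md:Object):void {},
        // invoked on every subscriber when the publisher sends "flipPage"
        flipPage: function(page:Number):void {
            slides.seek(page * secondsPerPage);   // jump to that page in the slides FLV
            slides.pause();                       // and hold on it
        }
    };
}
```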

Richard

You need to use other tools to join these two streams into one. The only Java-based, documented tool I know of is Xuggler, but it is pretty difficult to use.

The other way is to do it on the player side: call both streams and join them in the player or on the screen, but in that case there are always two separate streams connected at the same time.
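A bare-bones sketch of that player-side approach, as timeline code; the server URL and stream names are placeholders:

```actionscript
import flash.events.NetStatusEvent;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

// One SWF, one connection, two streams, two Video objects side by side.
var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://yourserver/live");            // placeholder URL

function onStatus(e:NetStatusEvent):void {
    if (e.info.code != "NetConnection.Connect.Success") return;
    showStream("camera1", 0);                    // left
    showStream("camera2", 330);                  // right
}

function showStream(name:String, xPos:Number):void {
    var ns:NetStream = new NetStream(nc);
    ns.client = { onMetaData: function(md:Object):void {} };
    var v:Video = new Video(320, 240);
    v.x = xPos;
    v.attachNetStream(ns);
    addChild(v);
    ns.play(name);                               // still two separate streams
}
```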

regards,

Ale

I would be very interested in suggestions for this too. We currently embed two QuickTime streams using JavaScript or SMIL, which works great, but we’d like to offer a Flash version too. We do this for both live and on-demand events – keeping the videos separate allows users to move, resize, or hide/disable one stream independently of the other. We’ve been working on a custom Flash player that connects to both streams and displays them in a similar manner, but I’ve noticed it is difficult to keep them in sync.

I would have thought this would be a more popular delivery option (dynamic presenter + slides), but I have found very little information about it. I have seen a few proprietary boxes and systems to do this (Envivio and MediaSite), but I don’t think they are Flash-based, and I haven’t seen any custom solutions for presenting multiple Flash streams simultaneously.

Revisiting this one…

Embedding two different players might work for live streams, but I could see synchronizing VOD streams being troublesome. Is there a good way to manage sync?

Ideally, we would still want them together in one player, so the viewer could arrange/resize both as they wanted and still use fullscreen to view their custom arrangement. Like this demo:

http://128.101.97.135/courseblender/vlm/

(This particular player doesn’t appear to work with live streams, and I’ve encountered trouble with its RTMP VOD sync too.)
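One crude idea for the VOD case would be to poll both playheads and re-seek whichever one drifts. A rough sketch only; the tolerance and interval are arbitrary, and the two streams are assumed to be set up elsewhere:

```actionscript
import flash.events.TimerEvent;
import flash.net.NetStream;
import flash.utils.Timer;

// Re-seek the "slave" stream whenever it drifts too far from the "master".
function watchSync(master:NetStream, slave:NetStream,
                   tolerance:Number = 0.5):Timer {
    var t:Timer = new Timer(2000);               // check every 2 seconds
    t.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
        if (Math.abs(master.time - slave.time) > tolerance) {
            slave.seek(master.time);             // seeking is visible, so only when needed
        }
    });
    t.start();
    return t;
}

// usage, once both VOD streams are playing:
// watchSync(presenterStream, slidesStream);
```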

Hi, when you say “single stream of the cameras merged”, do you mean watching both video streams at the same time?

Like a VideoWall?

Yes, they play at the same time on the same screen. One will be of the speaker and the other will be the speaker’s PowerPoint.

You need to use other tools to join these two streams into one. The only Java-based, documented tool I know of is Xuggler, but it is pretty difficult to use.

The other way is to do it on the player side: call both streams and join them in the player or on the screen, but in that case there are always two separate streams connected at the same time.

regards,

Ale

Right. I wanted to try to avoid doing two streams on the player side. I’ll take a look at Xuggler.

I’ve been checking out Xuggler, but it requires Java knowledge, which I don’t have. On the forums, everyone says it will be pretty difficult, but it is possible. I am looking into a hardware encoder instead.

That’s a good example. That is essentially what we are trying to achieve.

But with cue points, you would have to determine them ahead of time. My situation is a live event where the audience can potentially ask questions, which would throw the cue points off. I am going to test your solution of two embedded SWFs in the page sometime next week.

We are using a switcher now. We have two cameras and a DV deck with some videos on it. The two cameras plug into the switcher via FireWire, and the DV deck plugs into the switcher via FireWire as well. The switcher we have can support up to four inputs; it has two outputs, and we only need one. So basically we switch between the two cameras, and when their videos come up we switch to the DV deck. The single output goes into my laptop, which is the main feed that I capture in Adobe Flash Media Live Encoder, and that essentially gets sent out to the viewers.

Now, we haven’t figured out actually displaying two cameras at the same time. We ended up not needing that, and the switcher works perfectly for what we’re doing.