H264 live broadcasting

So I’ve been searching for hours now… figured someone here might have the answer. What I’m doing is taking Bitmaps, feeding them into FFmpeg, and dumping out raw H.263 data (I’ll get to H.264 in a sec). I use my own FLV and RTMP libraries, written in C#, to build the headers/etc. and stream the data directly out to Wowza… bit by freaking bit, it’s been a battle. It does work though, and it worked with Red5 a long time ago too… anyways…

Now I’m trying to get the same result with H.264. I’ve got it all working up to the point where it’s supposed to show up on the other side. Frames are being generated, sent to Wowza, and sent to the client. No one is choking, or at least no one is complaining. I have even inspected the bits being generated, and everything looks in line with VIDEODATA/AVCVIDEOPACKET from the Adobe spec… but the client doesn’t see anything. The Flash video player doesn’t render anything at all, not even a black or white screen.

From what I’ve been reading, it has to do with the fact that I am not sending any metadata/moov-atom junk (basically a command that says it’s time to play the stream). Everything online talks about moving these headers to the start of your file before streaming… but I have no file, I have no start, and the end is unclear… so I need to generate them dynamically when a client connects. Fine, I’ve done this bit by bit before… but there is hardly anything on how these packets are formatted, let alone which ones I need.
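For what it’s worth, here is my best guess so far at the one packet everyone seems to agree has to go out first: the “AVC sequence header” VIDEODATA body, which wraps the SPS and PPS in an AVCDecoderConfigurationRecord. This is only my reading of the FLV spec and ISO 14496-15, and the names are my own, so treat it as a sketch rather than working code:

using System.IO;

static class AvcSequenceHeader
{
    // sps/pps are single raw NAL units WITHOUT Annex B start codes.
    public static byte[] Build(byte[] sps, byte[] pps)
    {
        var ms = new MemoryStream();

        // FLV VIDEODATA header
        ms.WriteByte(0x17);                      // FrameType = 1 (keyframe), CodecID = 7 (AVC)
        ms.WriteByte(0x00);                      // AVCPacketType = 0 => AVC sequence header
        ms.WriteByte(0x00);                      // CompositionTime = 0 (24-bit)
        ms.WriteByte(0x00);
        ms.WriteByte(0x00);

        // AVCDecoderConfigurationRecord
        ms.WriteByte(0x01);                      // configurationVersion
        ms.WriteByte(sps[1]);                    // AVCProfileIndication (copied from the SPS)
        ms.WriteByte(sps[2]);                    // profile_compatibility
        ms.WriteByte(sps[3]);                    // AVCLevelIndication
        ms.WriteByte(0xFF);                      // 6 reserved bits + lengthSizeMinusOne = 3 (4-byte NALU lengths)

        ms.WriteByte(0xE1);                      // 3 reserved bits + numOfSequenceParameterSets = 1
        ms.WriteByte((byte)(sps.Length >> 8));   // SPS length, 16-bit big-endian
        ms.WriteByte((byte)(sps.Length & 0xFF));
        ms.Write(sps, 0, sps.Length);

        ms.WriteByte(0x01);                      // numOfPictureParameterSets = 1
        ms.WriteByte((byte)(pps.Length >> 8));   // PPS length, 16-bit big-endian
        ms.WriteByte((byte)(pps.Length & 0xFF));
        ms.Write(pps, 0, pps.Length);

        return ms.ToArray();
    }
}

If I’m reading it right, this body goes out as a normal RTMP video message with timestamp 0, once per publish, before any actual frames. Is that the idea?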

Since I’m targeting Wowza anyway, hopefully someone here knows what I need to send and how. Otherwise I’m stuck with H.263 encoding, and that means more data == more $$…

So who’s the expert? Because this one is a big one.

Thanks in advance!

How are you sending frames to Wowza? If you are sending over RTP or MPEG-TS, do the key frames include PPS and SPS NAL units?

Charlie

How are you sending RTMP? What libraries are you using?

Charlie

We are sending via RTMP. FFmpeg is giving me NALUs; I am assuming they include the PPS and SPS, because that’s how FFmpeg is supposed to work. Usually you write a header and then the data out, but since I’m not using files, I skip the file-header part.

My best guess at this point is based on posts about moving the ‘moov atom’ from the end of a real file to the beginning, which lets H.264 start playing before the whole file has downloaded. I am assuming my problem is the same, only with true live streaming that trailing ‘moov atom’ will never get sent. So I am hoping that injecting the equivalent to the client before the rest of the data will solve the problem.
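The other part I’m not 100% sure about is the framing of the NALUs themselves: FFmpeg hands them to me Annex B style (start-code delimited), while the AVCVIDEOPACKET, as I read the spec, wants each NALU length-prefixed. So on top of the sequence header, my plan is to repackage each frame roughly like this (my interpretation only, the names are mine):

using System.IO;

static class AvcFrame
{
    // nalus: the NAL units for one frame, start codes already stripped.
    public static byte[] Build(byte[][] nalus, bool keyframe, int compositionTimeMs)
    {
        var ms = new MemoryStream();

        ms.WriteByte((byte)((keyframe ? 0x10 : 0x20) | 0x07));  // FrameType 1 (key) or 2 (inter), CodecID 7 (AVC)
        ms.WriteByte(0x01);                                     // AVCPacketType = 1 => AVC NALU
        ms.WriteByte((byte)((compositionTimeMs >> 16) & 0xFF)); // CompositionTime, 24-bit big-endian
        ms.WriteByte((byte)((compositionTimeMs >> 8) & 0xFF));
        ms.WriteByte((byte)(compositionTimeMs & 0xFF));

        foreach (var nalu in nalus)
        {
            // 4-byte length prefix, matching lengthSizeMinusOne = 3 in the sequence header
            ms.WriteByte((byte)((nalu.Length >> 24) & 0xFF));
            ms.WriteByte((byte)((nalu.Length >> 16) & 0xFF));
            ms.WriteByte((byte)((nalu.Length >> 8) & 0xFF));
            ms.WriteByte((byte)(nalu.Length & 0xFF));
            ms.Write(nalu, 0, nalu.Length);
        }

        return ms.ToArray();
    }
}

Does that match what Wowza expects, or does it pass the NALUs through however they arrive?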

How would I verify the PPS/SPS in the NALUs if I were looking at the raw bytes?
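In other words, if I dump the bytes, is something like this the right way to check? My understanding is that nal_unit_type is the low 5 bits of the byte right after each start code, with 7 = SPS, 8 = PPS, and 5 = an IDR slice, but I may be off:

using System.Collections.Generic;

static class NalInspector
{
    // Scan an Annex B buffer and return the nal_unit_type of every NAL unit found.
    public static IEnumerable<int> NalUnitTypes(byte[] data)
    {
        int i = 0;
        while (i + 3 < data.Length)
        {
            // A 00 00 01 match also catches the tail end of a 00 00 00 01 start code.
            if (data[i] == 0x00 && data[i + 1] == 0x00 && data[i + 2] == 0x01)
            {
                yield return data[i + 3] & 0x1F;  // low 5 bits = nal_unit_type
                i += 3;
            }
            else
            {
                i++;
            }
        }
    }
}

If the SPS/PPS are really in there, I’d expect to see a 7 and an 8 before the first 5 on every key frame.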

Thanks again for your help

As I said in the first post, I have my own library that I wrote in C# for FLV and RTMP bodies and headers. That part is working just fine, since H.263 works. I handle all the pings and server bandwidth calls this way as well. I’m even using it to send AMF to make function calls to my Wowza application.
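And since the AMF path already works, if it turns out an onMetaData message is part of what the player needs, my plan is to push it through that same path using the @setDataFrame convention, right after publish and before the sequence header. SendDataMessage below is just a stand-in for whatever my library call ends up being, and the values are made up:

using System.Collections.Generic;

// Sketch only: "rtmp" is my connection object and SendDataMessage is a placeholder
// for my own library; the metadata values are examples for a live stream.
var metadata = new Dictionary<string, object>
{
    { "duration",     0.0 },    // 0 for live
    { "width",        640.0 },
    { "height",       480.0 },
    { "framerate",    15.0 },
    { "videocodecid", 7.0 },    // 7 = AVC per the FLV spec
};

// Sent as an AMF0 data message on the publishing stream:
//   string "@setDataFrame", string "onMetaData", ECMA array { ... }
rtmp.SendDataMessage("@setDataFrame", "onMetaData", metadata);

Does Wowza actually require the onMetaData, or is the sequence header alone enough to get the player to render?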