Hello
I am a Korean developer, and I want to do live streaming with the Wowza system.
We encode video and audio from a capture board in real time using FFmpeg, and we want to live stream that encoded buffer as it is produced.
Is it possible?
The documentation only shows live streaming methods that use a file-based source, so I am asking here.
Thanks for the detailed answer, @Tim Dougherty.
We encode the video and audio data from the capture board frame by frame, and the encoded frames are stacked in a frame queue.
I want to take these encoded frames out of the queue one by one and keep streaming them live.
For example, we want to stream data encoded from the camera in real time through the Wowza server, just like YouTube live streaming.
I have read all the FFmpeg reference sites that you recommended to me.
Unfortunately, there doesn't seem to be an example that uses a buffer encoded in real time as the input source (they are all based on a file or a URL).
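Roughly, this is the shape of what we are trying to do; the capture-board and encoder calls below are just placeholders for our real FFmpeg-based code, not working code:

```python
import queue

encoded_frames = queue.Queue()  # holds the already-encoded H.264 access units (bytes)

def encoder_loop(capture_board):
    """Runs on its own thread: grab a raw frame, encode it, push it into the queue."""
    while True:
        raw = capture_board.read_frame()       # placeholder for our capture-board API
        encoded_frames.put(encode_h264(raw))   # placeholder for our FFmpeg-based encode step

def publish_loop():
    """What we are looking for: pop the encoded frames one by one and publish them live."""
    while True:
        frame = encoded_frames.get()
        # ??? how do we hand this buffer to Wowza (RTMP or another supported protocol)?
```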
Thanks for the answer, @Tim Dougherty. You can also consider playback over WebRTC since you're looking for close-to-real-time streaming, @Park Taesoon.
https://www.wowza.com/docs/ingest-rtsp-srt-or-rtmp-streams-into-wowza-streaming-engine-for-playback-with-webrtc
It sounds like you’re asking whether Wowza can ingest a live stream from FFmpeg. I can’t speak to the details of pulling low-level data out of an FFmpeg buffer, but a standard live stream from FFmpeg is possible and very common using RTMP, SRT, MPEG-TS, and more. This article offers some commands to get started: Use FFmpeg as a live encoder with Wowza Streaming Engine
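As a minimal sketch of that kind of FFmpeg-to-Wowza RTMP push (launched here from Python purely for illustration): the device name, Wowza host, application ("live"), and stream name ("myStream") are assumptions you would replace with your own values.

```python
import subprocess

WOWZA_RTMP_URL = "rtmp://your-wowza-host:1935/live/myStream"  # example application/stream names

# Read from a capture device, encode to H.264, and push to Wowza over RTMP.
# This is the video-only case; the capture board's audio would be added as a second -i input.
cmd = [
    "ffmpeg",
    "-f", "v4l2", "-i", "/dev/video0",              # example capture input on Linux (dshow on Windows)
    "-c:v", "libx264",
    "-preset", "veryfast", "-tune", "zerolatency",  # live-friendly encoder settings
    "-f", "flv", WOWZA_RTMP_URL,                    # RTMP ingest into Wowza Streaming Engine
]
subprocess.run(cmd, check=True)
```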
This sounds like some low-level FFmpeg configuration. I personally don’t have experience with this, but others here may.
As far as Wowza is concerned, if you’re able to achieve this in FFmpeg, transmitting the data via RTMP (or another supported protocol) is of course supported.
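For the pre-encoded buffer case specifically, one common approach (a sketch, not anything Wowza-specific) is to let FFmpeg read the already-encoded stream from stdin and only remux it to RTMP, with no re-encode. This assumes raw Annex B H.264 frames in a queue, a fixed framerate, and example Wowza host/application/stream names:

```python
import queue
import subprocess

encoded_frames: "queue.Queue[bytes]" = queue.Queue()  # filled by the capture/encode thread

# FFmpeg reads raw H.264 from stdin, remuxes it to FLV, and transmits over RTMP.
ffmpeg = subprocess.Popen(
    [
        "ffmpeg",
        "-f", "h264", "-framerate", "30",  # raw Annex B H.264 arriving on stdin, assumed 30 fps
        "-i", "pipe:0",
        "-c:v", "copy",                    # no re-encode, just remux and transmit
        "-f", "flv",
        "rtmp://your-wowza-host:1935/live/myStream",  # example Wowza RTMP ingest URL
    ],
    stdin=subprocess.PIPE,
)

while True:
    frame = encoded_frames.get()   # one encoded access unit at a time
    if frame is None:              # sentinel to stop publishing
        break
    ffmpeg.stdin.write(frame)

ffmpeg.stdin.close()
ffmpeg.wait()
```

The same pattern would apply to audio as well, though interleaving pre-encoded audio and video over a single pipe usually means muxing them into a container (MPEG-TS, for example) before handing the stream to FFmpeg.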