Welcome to the forums @Julian_Ridley.
First of all, I’d like to explain that in Streaming Cloud you can connect a WebRTC source and it is transcoded to HLS for playback. WebRTC playback is not yet available in Cloud, but it is coming very soon!
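To give a rough idea of what the HLS playback side looks like in a browser, here is a minimal sketch using the third-party hls.js library. The library choice and the manifest URL are assumptions for illustration only, not part of the Cloud product itself:

```ts
import Hls from "hls.js";

// Hypothetical HLS playback URL for a Cloud stream (placeholder value).
const manifestUrl = "https://example.cloud.wowza.com/live/mystream/playlist.m3u8";

const video = document.querySelector<HTMLVideoElement>("#player")!;

if (Hls.isSupported()) {
  // hls.js fetches the manifest and segments via Media Source Extensions.
  const hls = new Hls();
  hls.loadSource(manifestUrl);
  hls.attachMedia(video);
} else if (video.canPlayType("application/vnd.apple.mpegurl")) {
  // Safari can play HLS natively without hls.js.
  video.src = manifestUrl;
}
```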
Second, when using WebRTC with either Wowza Streaming Engine or Streaming Cloud, we are the server in the middle, so it is not peer-to-peer. We act as the middleman:
At its core, WebRTC was designed for peer-to-peer communication between a limited number of browsers. But by adding a media server to their WebRTC streaming solutions, content distributors can enhance the framework’s out-of-the-box capabilities and broadcast live streams to any destination.
You can read the full blog post here.
Third, as far as latency options in Cloud: because Cloud is converting the WebRTC stream to HLS for playback, there is a standard HLS latency of about 30 to 45 seconds, which is acceptable for some workflows. (Video segments, or chunks, are usually 10 seconds each in typical HLS.)
If, however, your workflow requires lower latency, you can select “Enable Low-Latency Streaming” and Cloud will create smaller segments of 2 seconds each.
(Three segments are required for HLS playback, so this brings the latency down to roughly 6 to 8 seconds.)
Which option makes sense depends on your use case and latency requirements.
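As a back-of-the-envelope illustration of where those numbers come from (the three-segment player buffer times the segment duration, plus some encode and delivery overhead; the overhead figure below is an assumption, not a measured value):

```ts
// Rough HLS latency estimate: the player typically buffers ~3 segments
// before starting playback, plus some encode/packaging/delivery overhead.
function estimateHlsLatencySeconds(
  segmentDurationSec: number,
  segmentsBuffered = 3,
  overheadSec = 2 // illustrative assumption, not a measured figure
): number {
  return segmentDurationSec * segmentsBuffered + overheadSec;
}

console.log(estimateHlsLatencySeconds(10)); // ~32s with standard 10s segments
console.log(estimateHlsLatencySeconds(2));  // ~8s with 2s low-latency segments
```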
Lastly, if you need closer-to-real-time playback in Streaming Engine, you have the option to ingest a non-WebRTC source like RTMP and have Engine play it back over WebRTC; WebRTC ingest with WebRTC playback is also available in Engine.
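On the playback side, WebRTC in the browser uses the standard RTCPeerConnection API regardless of how the stream was ingested. Here is a minimal, simplified sketch; the signaling round trip with Engine (how the SDP offer and answer are exchanged) is deliberately stubbed out, since the exact message format depends on your Engine setup and is an assumption here:

```ts
// Minimal WebRTC playback sketch using standard browser APIs.
// The signaling exchange with the server is left abstract on purpose.
async function playWebRtcStream(
  video: HTMLVideoElement,
  exchangeSdp: (offer: RTCSessionDescriptionInit) => Promise<RTCSessionDescriptionInit>
): Promise<void> {
  const pc = new RTCPeerConnection();

  // We only want to receive audio/video, not send.
  pc.addTransceiver("audio", { direction: "recvonly" });
  pc.addTransceiver("video", { direction: "recvonly" });

  // Attach incoming media tracks to the <video> element.
  pc.ontrack = (event) => {
    video.srcObject = event.streams[0];
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // exchangeSdp is a placeholder for your signaling round trip to the server.
  const answer = await exchangeSdp(offer);
  await pc.setRemoteDescription(answer);
}
```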
Once again, we will be supporting WebRTC ingest and WebRTC playback in Streaming Cloud in the very near future, and I’ll be sure to let you know when that is official.
Hope this helped clear things up a bit.