About Detecting a Client's Abnormal Disconnection and Handling It

I have a question about handling a client's abnormal disconnection.

In cases of a poor internet connection or other unexpected situations, clients sending RTMP can be abruptly disconnected from the session.

To catch and handle those situations, I've set the PingTimeout and ValidationFrequency properties in the application configuration, but this only prints 'pingtimeout' log messages. However, I need to run my own handling logic for abnormally disconnected sessions, so that ping-timed-out sessions are cleaned up properly in our service.
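For context, I set the properties in Application.xml roughly like this (the exact container, property names/casing, and values below are from memory and may vary by Wowza version, so treat this as an illustrative sketch rather than verified configuration):

```xml
<!-- Application.xml; placement within the application's <Properties> block is approximate -->
<Properties>
    <Property>
        <Name>pingTimeout</Name>
        <Value>12000</Value>
        <Type>Integer</Type>
    </Property>
    <Property>
        <Name>validationFrequency</Name>
        <Value>7500</Value>
        <Type>Integer</Type>
    </Property>
</Properties>
```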

So how can I catch the ping-timeout event in code? Is it possible to determine whether a session disconnection is normal or abnormal, and handle it in the onDisconnect or onUnpublish event?

Thank you in advance.

If your requirement is to disconnect unhealthy RTMP streams, you can implement a monitor that keeps track of each stream's incoming bitrate and disconnects the stream when it falls below a specified minimum value.

This is similar to the following module:

https://github.com/WowzaMediaSystems/wse-plugin-limitpublishedstreambandwidth

The module currently enforces a maximum bitrate, but you can easily add another if condition that also compares the incoming bitrate against a minimum value.

Here’s a gist for an example:

https://gist.github.com/wowza-mbrinker/1f0c6d82dcb3c54e0fcd61670b50895b

Some caveats:

  • You need to add a delay in the TimerTask, because when the stream first starts up it registers with a bitrate of 0. This delay can be adjusted as needed through an application property; 750 ms should usually be sufficient.

  • Not all RTMP encoders behave the same way. Some may persistently attempt to reconnect after being dropped.
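To illustrate the idea (the periodic check, the minimum-bitrate condition, and the startup delay) here is a framework-agnostic Java sketch. It does not use the Wowza SDK: `BitrateSource` and `DisconnectAction` are hypothetical stand-ins for reading a published stream's incoming bitrate and for dropping the session, which in a real module you would wire to the Wowza API as the gist above does.

```java
import java.util.Timer;
import java.util.TimerTask;

// Minimal sketch of a minimum-bitrate watchdog. BitrateSource and
// DisconnectAction are hypothetical interfaces standing in for the
// stream-monitoring and session-disconnect calls of a real server SDK.
class MinBitrateWatchdog {
    interface BitrateSource { long currentBitsPerSecond(); }
    interface DisconnectAction { void disconnect(); }

    private final Timer timer = new Timer(true); // daemon thread

    /**
     * Polls the bitrate every checkIntervalMs and disconnects the stream
     * once it drops below minBitsPerSecond. startupDelayMs gives the
     * encoder time to ramp up, so the initial 0 bps reading right after
     * publish does not trigger a false positive.
     */
    void watch(BitrateSource source, DisconnectAction action,
               long minBitsPerSecond, long startupDelayMs, long checkIntervalMs) {
        timer.scheduleAtFixedRate(new TimerTask() {
            @Override
            public void run() {
                if (source.currentBitsPerSecond() < minBitsPerSecond) {
                    action.disconnect(); // stream is unhealthy: drop it
                    cancel();            // stop monitoring this stream
                }
            }
        }, startupDelayMs, checkIntervalMs);
    }
}
```

The startup delay is passed as the first-execution delay of `scheduleAtFixedRate`, which matches the caveat above: the check simply never runs during the ramp-up window, rather than special-casing a 0 reading.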