Simulated live stream via schedule has problems when changing videos

Hi all,
We have a simulated live stream driven by a video schedule system using SMIL files, and on the player side we use Video.js.
As you know, SMIL files can contain playlist sections, and within a section you can sequence videos. Usually a section or playlist entry is used to resync the stream at fixed times, something like this:

<stream name="sched_en_m"></stream>
<playlist name="pl1" playOnStream="sched_en_m" repeat="false" scheduled="2022-08-18 00:48:00">
        <video src="mp4:a4b5d1d81e4c38d7f284b3d1f7df6675.mp4" start="0" length="-1"/>
        <video src="mp4:5a413c8c93217e76c58f833f2e377bbe.mp4" start="0" length="-1"/>
        <video src="mp4:2c4d155aa5328cb555924d40bcaf9087.mp4" start="0" length="-1"/>
</playlist>

I have the following problem: when one playlist takes over from the previous one, Video.js displays the animated loading icon, apparently because some signaling forces re-buffering.
Can anyone help me with this? Is there a way to remove the “STOP” metadata (I assume there is one), or to otherwise make the stream flow seamlessly?

I have even created an “aggregated app” that receives the stream locally from the source app via a stream target, assuming Wowza would smooth out the streaming, but it seems the problem propagates even in that setup.
Any ideas are highly appreciated!
Cheers!

Gabriel

Hi,
All video footage MUST be IDENTICAL (bitrate, fps, frame size, etc.).
Otherwise you get the effect you describe.
We don’t use SMIL in our playlists (we use another solution), but the principle is ALWAYS the same,
even when we mix VOD with live.
Maybe this will help.
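A quick way to confirm whether the clips in a playlist actually match is to compare the video-stream fields that ffprobe reports for each file. A minimal sketch, assuming you have already run something like `ffprobe -v quiet -print_format json -show_streams clip.mp4` and captured the JSON (the sample strings below are illustrative, not your real files):

```python
import json

# Fields that should be identical across every clip in the playlist
# for a seamless switch (the point made above).
KEYS = ("codec_name", "width", "height", "r_frame_rate", "profile")

def stream_params(ffprobe_json: str) -> tuple:
    """Extract the comparable video-stream fields from ffprobe's JSON output."""
    streams = json.loads(ffprobe_json)["streams"]
    video = next(s for s in streams if s.get("codec_type") == "video")
    return tuple(video.get(k) for k in KEYS)

def all_match(reports: list[str]) -> bool:
    """True only if every report describes the same video parameters."""
    params = [stream_params(r) for r in reports]
    return all(p == params[0] for p in params)

# Sample ffprobe output for two clips, trimmed to the relevant fields:
a = '{"streams":[{"codec_type":"video","codec_name":"h264","width":1280,"height":720,"r_frame_rate":"25/1","profile":"High"}]}'
b = '{"streams":[{"codec_type":"video","codec_name":"h264","width":1280,"height":720,"r_frame_rate":"30/1","profile":"High"}]}'
print(all_match([a, b]))  # frame rates differ -> False
```

Running this over every file referenced in the SMIL should point out exactly which clip breaks the uniformity.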
D

Thank you for the hint! We have a 10-year-old video repository, so from time to time we definitely have different video formats. We only standardized this a year or so ago, and bitrate is always variable, depending on the video content: audiobooks with a static cover image are of course encoded at lower bitrates than conferences with more movement on the actual video channel.

If you say you definitely see the same problem, I will try to replicate this in a controlled environment, but I still need a solution. We have powerful servers and video cards; they should be able to normalize the video streams. We generate four output formats after transcoding. Even when the sources are different, shouldn’t the transcoded output solve this? That is, shouldn’t the resulting 360p or 720p renditions have everything normalized?
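For what it’s worth, the kind of normalization pass I have in mind could be sketched as an ffmpeg command builder. This is only a sketch under assumptions (the 720p ladder values, GOP length, and audio settings are mine, not from our actual transcoder config); the ffmpeg flags themselves are standard:

```python
# Sketch: build an ffmpeg command that forces every source clip to a
# uniform 720p rendition (fixed codec, frame size, frame rate, and
# keyframe interval), so scheduled clips switch without re-buffering.
def normalize_cmd(src: str, dst: str, fps: int = 25, gop_seconds: int = 2) -> list[str]:
    gop = fps * gop_seconds  # keyframe every 2 s, independent of the source
    return [
        "ffmpeg", "-i", src,
        "-vf", "scale=1280:720,fps=%d" % fps,     # fixed frame size and rate
        "-c:v", "libx264", "-profile:v", "high",
        "-g", str(gop), "-keyint_min", str(gop),  # constant GOP length
        "-sc_threshold", "0",                     # no extra scene-cut keyframes
        "-c:a", "aac", "-ar", "44100", "-b:a", "128k",
        dst,
    ]

print(" ".join(normalize_cmd("input.mp4", "out_720p.mp4")))
```

If every rendition in the ladder is produced this way, the 360p/720p outputs should indeed be parameter-identical regardless of how varied the 10-year-old sources are.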