Hello,
I'll try to answer your questions to the best of my ability. Since it's quite a lot, please do ask again if something isn't clear.
1. Enable Stream Protocol
This is one of the things we plan to make clearer with our new interface.
Input-wise, only protocols that require a specific port are affected, as they need that port to be active in order to accept input.
All protocols named "X over HTTP" are output-only and require HTTP to be active in order to work. So if you want HLS, you just need to activate HTTP and enable Apple segmented over HTTP (HLS).
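For example, once HTTP and the HLS output are both active, the stream becomes available at a URL like the following (hostname and stream name are placeholders; 8080 is the default MistServer HTTP port):

```shell
# Placeholder hostname/stream name; 8080 is MistServer's default HTTP port.
echo "http://mist.example.com:8080/hls/mystream/index.m3u8"
```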
What isn't quite clear here is that, input-wise, you can still use MP4 URLs/links or HLS links even if those protocols are not enabled. Since input is something you control, it is never inactive.
2. DTSH message when pulling HLS
This is a default message that comes up whenever an input location does not have a DTSH file, which MistServer uses both to speed up reading input and to allow trick play on protocols that would otherwise not support it. The message can be safely ignored: it says it cannot find one and then that it cannot create one, which is normal when connecting to a different server.
HLS pull works with or without the HLS output being set up; however, it needs to read in the entire HLS stream every time the stream is considered inactive, which is a hassle. If this is a VOD stream, we recommend making the VOD asset available to all servers that use it, for example by putting it in storage all of them can access.
If the goal is sharing a live stream between MistServer instances, the easier setup is the DTSC protocol, which is our own internal media format and a lot faster in terms of latency when sharing media between MistServers.
If you're pulling from a completely different server and only HLS is available, then you'll just see this message more often.
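As a sketch, the stream source field for the two approaches might look like this (both hostnames and stream names are made up):

```shell
# Placeholder URLs; substitute your own hostnames and stream names.
# Option A: pull the HLS playlist from the other server (works, but re-reads
# the whole stream whenever it goes inactive, and triggers the DTSH message).
echo "http://origin.example.com:8080/hls/mystream/index.m3u8"
# Option B: pull over DTSC from another MistServer (lower latency).
echo "dtsc://origin.example.com/mystream"
```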
3. Reduce Live Latency
The settings you're using are off. The buffer time is the amount of live DVR buffer available for the stream. It gets raised to an amount at which every protocol output can have decent playback, so it doesn't matter too much; it's not added latency.
Latency-wise, MistServer automatically tries to put players as close to live as they can handle with stable playback. Of course, some protocols are far better for low latency than others. Best to worst in terms of latency in the browser:
WebRTC > WS/MP4 > MP4 = WebM > HLS(CMAF) = DASH(CMAF) > HLS(TS)
There are tricks for lowering the latency, such as:
- Make sure there's one keyframe per second
- Remove the B-frames so WebRTC can work
- Do not hook up encoders, or if you do, choose GStreamer (it's faster than FFmpeg) or use a hardware encoder
- WebRTC works faster if you configure STUN/TURN
- For HLS/CMAF you can limit the number of live segments available by editing the protocol in the protocol panel
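Putting the encoder-side tricks together, here is a hedged FFmpeg sketch (input file, frame rate, and ingest URL are placeholders; it assumes 30 fps input, so `-g 30` gives one keyframe per second):

```shell
# Sketch of low-latency encoder settings; all names/URLs are placeholders.
#   -g 30             : one keyframe per second at 30 fps input
#   -bf 0             : no B-frames, so WebRTC playback works
#   -tune zerolatency : minimize encoder-side buffering
CMD='ffmpeg -re -i input.mp4 -c:v libx264 -g 30 -bf 0 -tune zerolatency -c:a aac -f flv rtmp://mist.example.com/live/mystream'
echo "$CMD"
```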
Stream processes
These are meant to run/activate an encoder to add different qualities. They are completely optional and always increase latency, so they are definitely not recommended if you want low latency. You've also got them set to invalid settings, so they're probably starting and stopping continuously in the background; at minimum a resolution is needed for them to work.
If your stream is already in a quality that doesn't need any transcoding, I'd recommend removing these completely, or at the very least only encoding on the origin (or the edge), not both.
Sharing live streams between servers
Using DTSC is the fastest method between servers: you either have the origin push DTSC or the edge pull DTSC. It's as simple as having the origin push towards the edge using:
dtsc://edgeserver/streamname
or pull using
dtsc://originserver/streamname
If you want to manually add delay, I would push from the origin server; you can add the flag "?pushdelay=seconds", where seconds is the number of seconds you want the push to be delayed by. To use this you'll need to push over RTMP, SRT or RIST, however; DTSC ignores it, as it's made to send stream contents as fast as possible.
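For example, a delayed push from the origin to the edge could look like this (hostname, application, and stream name are placeholders; as noted above, the flag only takes effect for RTMP, SRT or RIST pushes, not DTSC):

```shell
# Placeholder push target; delays the push by 10 seconds.
echo "rtmp://edge.example.com/live/mystream?pushdelay=10"
```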
You can also simply push towards the edge using RTMP, SRT or RIST; which fits best depends on the network conditions.
4. Live automatic push
Yeah, a traditional setup with origin/edge servers is possible with MistServer. We also have a load balancer that can make things easier, but it's not included in the open-source version of MistServer.
Whether push or pull is better depends on your usage and preference. Technically there's no big difference, as they should get similar latencies, but only if you use similar protocols. HLS is not recommended as an output for sharing between servers; its main con is that, compared to other formats, it has a lot of overhead. So I would go with the more traditional push formats for live streams:
RTMP, SRT, RIST or DTSC (if it's between MistServers).
In general you can expect similar latencies between these protocols, but DTSC is optimized for low latency. It requires a TCP connection and uses port 4200 by default; you can set it up in the protocol panel.
Most of our users go with a push setup, mostly because it's easier to understand and thus to set up. Pull tends to work out better once you have too many servers to keep track of, however.