
JWPlayer can handle HLS - so that makes it work anywhere Flash is available. The audio is in there because some HLS client implementations seem to require it - I'm interested in using this to stream to anything that's HLS capable: connected TVs, Android, iOS, etc. (Also, I intend to inject audio at some point, so I figure I might as well have it in the encoding pipeline from the start.)

Typically the lowest latency you can achieve is around 2 x the fragment duration - so you can drive the delay down by having, say, 2s fragments, but that impacts player stability. The player holds at least one downloaded fragment in hand in case of adverse network conditions, and the server doesn't advertise a chunk of video until it's completely encoded - so that's one fragment's worth of delay before you even start. The delay can be minimised but not eliminated entirely. In this case we're not doing adaptive streaming, because the Pi is only encoding a single bit rate. You can read more about how it works in general here: ng_tr.html (that's mainly Adobe HDS, but the principle is the same).
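To make the fragment arithmetic concrete, here's a minimal sketch of cutting a stream into 2s HLS fragments with ffmpeg's hls muxer. The input and output names are illustrative rather than my actual setup, and it assumes a reasonably recent ffmpeg build:

    # Segment an incoming H.264/AAC stream into 2-second HLS fragments,
    # copying the codecs rather than re-encoding.
    #
    # With 2s fragments the best-case delay works out at roughly:
    #   ~2s for the fragment the server is still encoding (it isn't
    #       advertised in the playlist until it's complete), plus
    #   ~2s for the fragment(s) the player keeps in hand,
    # i.e. around 2 x fragment duration, before any network overhead.
    #
    # -hls_time 2                  target fragment duration in seconds
    # -hls_list_size 5             keep a rolling window of 5 fragments
    # -hls_flags delete_segments   remove fragments that leave the window
    ffmpeg -i input.ts -c:v copy -c:a copy -f hls \
        -hls_time 2 -hls_list_size 5 -hls_flags delete_segments \
        live.m3u8

Shrinking -hls_time below 2 drives the delay down further, but short fragments are exactly what makes players less stable under poor network conditions.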

The latency is inherent in HLS (and all HTTP chunked streaming), I'm afraid. In this case it's probably just cargo-culting.

Under some circumstances ffmpeg seems to have a better stab at analysing an incoming stream if it has a full fifo to read from - so I tend to chuck in a fifo and a small delay when I'm having difficulty getting something working. As you say, I'm not sure the fifo is still necessary at all.
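For what it's worth, the trick looks roughly like the sketch below. The capture command, paths, resolution and the 2-second delay are illustrative assumptions, not my exact setup:

    # Create a named pipe for the raw stream (path is illustrative).
    mkfifo /tmp/video.fifo

    # Start the capture writing into the fifo in the background.
    # raspivid as the source is an assumption; any H.264 producer works.
    raspivid -t 0 -w 1280 -h 720 -fps 25 -o /tmp/video.fifo &

    # The small delay mentioned above: start the capture first, so ffmpeg
    # has data waiting for it when it begins analysing the stream.
    sleep 2

    # Raw H.264 carries no container timing, so tell ffmpeg the format
    # and frame rate explicitly rather than leaving it to guess.
    ffmpeg -f h264 -framerate 25 -i /tmp/video.fifo \
        -c:v copy -f hls -hls_time 2 live.m3u8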
