Queueing Theory
Internal
TODO
TODO: 2018.07.20 - The Essential Guide to Queueing Theory.pdf in Learning.
Overview
Queueing in Pipelines
This section discusses queueing for pipelines.
A pipeline consists of a series of stages that process a stream of data fed into the pipeline. If there are no buffers (queues) between stages, a slow stage blocks the upstream stages: while busy with its current element, it cannot accept a new element from the stage above it, which in turn cannot hand its element over and must wait. The delay propagates upstream, possibly all the way to the pipeline input, preventing new stream elements from entering the pipeline at all. Introducing queues seems like an obvious optimization, but in most situations it is not.
Decoupling
Introducing queues between stages makes it possible to hold stream elements already processed by an upstream stage while the downstream stage is still busy, freeing the upstream stage to continue working. Queues decouple stages. However, if the difference in processing speed is constant, the queue will eventually fill up, so introducing it only delays blocking the upstream stage; it does not eliminate it.