Queueing Theory


Internal

TODO

TODO: 2018.07.20 - The Essential Guide to Queueing Theory.pdf in Learning.

Overview

Queueing in Pipelines

This section discusses queueing for pipelines.

A pipeline consists of a series of stages that process a stream of data fed into the pipeline. If there are no buffers (queues) between stages, a slow stage blocks the upstream stages from working: because the slow stage cannot finish processing its current element, it cannot accept a new stream element from the upstream stage, so the upstream stage cannot hand over its element and blocks in turn. The delay propagates upstream, possibly all the way to the pipeline input, preventing new stream elements from being processed altogether. Introducing queues seems like an obvious optimization, but in most situations it is not.
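
A minimal Go sketch of this blocking behavior, assuming a hypothetical two-stage pipeline with illustrative stage names and delays: the channel between the stages is unbuffered, so every send from the producer blocks until the slow downstream stage is ready to receive.

<syntaxhighlight lang="go">
package main

import (
	"fmt"
	"time"
)

// fastProducer sends stream elements as fast as it can. With an unbuffered
// channel, each send blocks until the downstream stage is ready to receive.
func fastProducer(out chan<- int) {
	for i := 0; i < 5; i++ {
		fmt.Printf("producer: trying to send %d\n", i)
		out <- i // blocks here while the slow stage is still busy
		fmt.Printf("producer: sent %d\n", i)
	}
	close(out)
}

// slowStage simulates a downstream stage that needs more time per element.
func slowStage(in <-chan int) {
	for v := range in {
		time.Sleep(500 * time.Millisecond) // simulated slow processing
		fmt.Printf("slow stage: processed %d\n", v)
	}
}

func main() {
	ch := make(chan int) // unbuffered: no queue between the stages
	go fastProducer(ch)
	slowStage(ch)
}
</syntaxhighlight>

Running this shows the producer pausing on every element; the slow stage's delay propagates directly upstream because there is nowhere to park finished elements.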

Decoupling

Introducing queues between stages makes it possible to hold stream elements already processed by an upstream stage while the downstream stage is still busy, freeing the upstream stage to continue doing work. Queues decouple stages. However, if the difference in processing speed is constant, the queue will eventually fill up, so introducing the queue only delays blocking the upstream stage; it does not eliminate it.
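
A sketch of the same idea in Go, under the assumption that the queue is a buffered channel (capacity 3 here is an arbitrary illustration): the producer runs ahead of the consumer until the buffer fills, after which it blocks again because the rate difference is constant.

<syntaxhighlight lang="go">
package main

import (
	"fmt"
	"time"
)

func main() {
	// A buffered channel acts as a queue between the two stages.
	queue := make(chan int, 3)

	go func() {
		for i := 0; i < 10; i++ {
			start := time.Now()
			queue <- i // does not block until the buffer is full
			fmt.Printf("producer: enqueued %d after waiting %v (queue length %d)\n",
				i, time.Since(start).Round(time.Millisecond), len(queue))
			time.Sleep(100 * time.Millisecond) // producer is consistently faster...
		}
		close(queue)
	}()

	for v := range queue {
		time.Sleep(300 * time.Millisecond) // ...than the consumer, so the queue fills up
		fmt.Printf("consumer: processed %d\n", v)
	}
}
</syntaxhighlight>

The first few sends return immediately (the stages are decoupled), but once the buffer is full the producer's wait time per send climbs toward the consumer's processing time, which is the delayed, not eliminated, blocking described above.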

Queueing and Overall Pipeline Performance

Queue Tuning