A typical messaging solution exchanges data between its distributed components through message queues: publishers put messages onto queues, and subscribers receive them. A subscriber can be implemented as a single-threaded or multi-threaded process, and it can run continuously or be started on demand.
Primary queuing mechanisms
At a high level, there are two primary queuing mechanisms that enable a queue listener (receiver) to obtain messages stored in a queue.
- Polling or poll-based model: A listener, typically part of a worker role instance, monitors a queue by checking it at regular intervals. The main processing logic is a loop in which messages are dequeued and dispatched for processing; the queue is polled until the listener is notified to exit the loop. Note that the Windows Azure pricing model counts storage transactions for every request performed against the queue, regardless of whether the queue is empty. A minimal polling loop is sketched just after this list.
- Triggering or push-based model: A listener subscribes to an event that is triggered, either by the publisher or by the queue service manager, whenever a message arrives on a queue, and then dispatches the message for processing. The listener therefore does not have to poll the queue to determine whether new work is available. A notification can be pushed to the queue listeners for every new message, only when the first message arrives in an empty queue, or only when the queue depth reaches a certain level. When using Windows Azure, consider Service Bus messaging entities such as queues and topics for this model.
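To make the poll-based model concrete, the following is a minimal sketch of a queue listener written against the azure-storage-queue Python SDK, which is a newer API than the classic worker-role libraries this article assumes. The connection string, queue name, and process_message handler are placeholders, not values from the original text.

```python
import time
import threading
from azure.storage.queue import QueueClient

# Placeholder connection details; substitute your own storage account settings.
CONNECTION_STRING = "<storage-connection-string>"
QUEUE_NAME = "work-items"

def process_message(body: str) -> None:
    # Hypothetical dispatch step standing in for the worker's real processing logic.
    print(f"processing: {body}")

def run_listener(stop_signal: threading.Event) -> None:
    """Main loop of a poll-based listener: dequeue, dispatch, repeat until told to exit."""
    queue = QueueClient.from_connection_string(CONNECTION_STRING, QUEUE_NAME)
    while not stop_signal.is_set():
        # Every receive call is a billable storage transaction, even when the queue is empty.
        for msg in queue.receive_messages(visibility_timeout=30):
            process_message(msg.content)
            queue.delete_message(msg)   # remove the message once it has been handled
        time.sleep(1)                   # fixed polling interval (no back-off yet)
```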
Best practices for optimizing transaction costs
In a queue-based messaging solution, the volume of storage transactions can be reduced using a combination of the following methods.
- Group related messages into a single larger batch, compress the batch, store the compressed image in blob storage, and keep a reference to the blob in the queue (see the first sketch following this list).
- Batch multiple messages together in a single storage transaction. The GetMessages method in the Queue Service API enables dequeuing a specified number of messages in a single transaction (illustrated in the second sketch following this list).
- While polling, avoid aggressive polling intervals and implement a back-off delay that increases the time between polling requests if a queue remains continuously empty (also illustrated in the second sketch following this list).
- Reduce the number of queue listeners: when using a pull-based model, use only one queue listener per role instance while a queue is empty. To further reduce the number of queue listeners per role instance to zero, use a notification mechanism to instantiate queue listeners only when the queue receives work items.
- If queues remain empty most of the time, automatically scale down the number of role instances, and continue to monitor the relevant system metrics to determine whether and when the application should scale the number of role instances back up to handle an increasing workload.
- Use a combination of polling and push-based notifications, enabling listeners to subscribe to a notification event (trigger) that is raised under certain conditions to indicate that new work has been placed on the queue (a rough sketch follows at the end of this list).
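As an illustration of the first recommendation, the sketch below groups related messages into one compressed payload, stores it in blob storage, and enqueues only a small reference to the blob. It uses the azure-storage-blob and azure-storage-queue Python SDKs; the container and queue names, the gzip compression, and the JSON envelope are illustrative choices rather than prescriptions from the original guidance.

```python
import gzip
import json
import uuid
from azure.storage.blob import BlobClient
from azure.storage.queue import QueueClient

CONNECTION_STRING = "<storage-connection-string>"   # placeholder

def enqueue_batch(messages: list[str]) -> None:
    """Compress a batch of related messages into one blob and enqueue a pointer to it."""
    blob_name = f"batch-{uuid.uuid4()}.json.gz"
    payload = gzip.compress(json.dumps(messages).encode("utf-8"))

    # Store the compressed batch in blob storage.
    blob = BlobClient.from_connection_string(
        CONNECTION_STRING, container_name="message-batches", blob_name=blob_name)
    blob.upload_blob(payload)

    # Keep only a small reference to the blob in the queue.
    queue = QueueClient.from_connection_string(CONNECTION_STRING, "work-items")
    queue.send_message(json.dumps({"blob": blob_name, "count": len(messages)}))
```

On the receiving side, the listener would download and decompress the referenced blob before dispatching the individual items for processing.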
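The second sketch combines batch retrieval with a back-off delay. In the Python SDK, the messages_per_page parameter of receive_messages plays roughly the role of the GetMessages batch size mentioned above; the handle function and the back-off bounds are assumptions made for the example.

```python
import time
import threading
from azure.storage.queue import QueueClient

MIN_DELAY, MAX_DELAY = 1, 60   # example back-off bounds in seconds (assumed values)

def handle(body: str) -> None:
    """Hypothetical stand-in for the real message-processing logic."""
    print(f"processing: {body}")

def run_batched_listener(queue: QueueClient, stop_signal: threading.Event) -> None:
    delay = MIN_DELAY
    while not stop_signal.is_set():
        got_work = False
        # messages_per_page=32 makes each underlying request (one storage transaction)
        # return up to 32 messages, mirroring the GetMessages batching described above.
        for msg in queue.receive_messages(messages_per_page=32, visibility_timeout=30):
            handle(msg.content)
            queue.delete_message(msg)
            got_work = True
        if got_work:
            delay = MIN_DELAY                   # reset the back-off once work arrives
        else:
            time.sleep(delay)                   # queue was empty: wait before polling again
            delay = min(delay * 2, MAX_DELAY)   # exponential back-off, capped at MAX_DELAY
```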
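Finally, a rough sketch of combining polling with push-based notification: a threading.Event stands in for the external trigger (for example, a Service Bus notification), waking a dormant listener that then polls until the queue is empty again. The wiring of the real notification source is assumed, not shown.

```python
import threading
from azure.storage.queue import QueueClient

def handle(body: str) -> None:
    """Hypothetical stand-in for the real message-processing logic."""
    print(f"processing: {body}")

def run_hybrid_listener(queue: QueueClient,
                        work_available: threading.Event,
                        stop_signal: threading.Event) -> None:
    """Sleep until notified that work has arrived, then poll until the queue is drained."""
    while not stop_signal.is_set():
        # Block instead of polling an empty queue; the timeout is a safety net in case
        # a notification is lost.
        if not work_available.wait(timeout=300):
            continue
        work_available.clear()
        # Iterating drains the queue in pages of up to 32 messages per storage transaction.
        for msg in queue.receive_messages(messages_per_page=32, visibility_timeout=30):
            handle(msg.content)
            queue.delete_message(msg)

# A publisher or queue-monitoring component is assumed to call work_available.set()
# after placing new work on the queue.
```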