We often need to handle asynchronous and periodic task execution. Background processing is particularly useful when tasks take a significant amount of time to complete or need to run on a schedule. A few examples are sending emails, generating reports, web scraping, and database audits.
Any tool we select must have features that address the concerns below.
- Creating periodic tasks
- Monitoring
- Ease of use and configuration
Python offers a variety of frameworks, such as Celery, RQ, and Dramatiq, that help us address these concerns.
In this article, we will learn about the Celery framework. To run tasks effectively, Celery requires a message broker, and an additional package is needed for monitoring. To set up Celery, we need the tools below.
- Background Tasks: Celery
- Message broker: Redis
- Monitoring: Flower
Setting up the system
We will use the Python libraries 'celery' to create background jobs, 'flower' to run the monitoring tool, and 'redis' as the Python client for the Redis message broker (a Redis server must be running).
Create virtual environment
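For example, using Python's built-in venv module (the directory name `venv` is just a convention, not from the original article):

```shell
# Create an isolated environment in the ./venv directory
python3 -m venv venv

# Activate it (Linux/macOS; on Windows use venv\Scripts\activate)
. venv/bin/activate
```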
Install packages
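Inside the activated environment, install the three packages with pip (versions are left unpinned here; pin them for reproducible builds):

```shell
# celery: task framework, redis: broker client, flower: monitoring UI
pip install celery redis flower
```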
Python code
Create a Python file and run
Run logs
Redis Keys View
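To inspect what Celery stores in Redis, redis-cli can list the keys. The exact key names vary by Celery version and configuration:

```shell
# List all keys in database 0 (KEYS is fine for a dev box; prefer SCAN in production)
redis-cli -n 0 keys '*'
```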
Run the command below to start the monitoring tool.
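Assuming the Celery app lives in a module named `tasks` (an assumption carried over from the setup above), Flower can be started as a Celery subcommand:

```shell
# Start the Flower monitoring UI on port 5555
celery -A tasks flower --port=5555
```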
The monitor will be running at http://0.0.0.0:5555.
Now, our worker and monitoring tool are set up. We can add more tasks and execute them using different approaches.
That's it! It is that easy to set up a background task processing framework.
Thanks for reading till the end.