Implementing Background Tasks in Python

We often need to run tasks asynchronously or on a schedule. This is particularly useful when tasks take a long time to complete or must run periodically, for example sending emails, generating reports, web scraping, or database audits.

Any tool we select must have features that address the concerns below.

  • Creating periodic tasks
  • Monitoring
  • Ease of use and configuration

Python offers a variety of frameworks, such as Celery, RQ, and Dramatiq, that help us address the above concerns.

In this article, we will learn about the Celery framework. To run tasks effectively, Celery requires a message broker, and an additional package is needed for monitoring. To set up Celery, we need the tools below.

  1. Background Tasks: Celery
  2. Message broker: Redis
  3. Monitoring: Flower

Setting up system

We will use the Python library 'celery' to create background jobs, 'flower' to start the monitoring tool, and Redis as the message broker.

Create virtual environment

python -m venv bgtasksvenv
bgtasksvenv\Scripts\activate        # Windows
# source bgtasksvenv/bin/activate   # Linux/macOS
Bash

Install packages

pip install -U Celery
pip install "celery[redis]"
pip install flower
Bash

Python code

from celery import Celery

# Replace the placeholders with your Redis credentials and host,
# e.g. redis://localhost:6379/0 for a local, unauthenticated Redis.
app = Celery('celery_bgjobs', broker='redis://<username>:<password>@<servername>')

@app.task
def hello():
    return 'hello world'
Python

Save the code in a Python file and start a worker, passing the module name without the .py extension:

celery -A <filename> worker --loglevel=INFO
Bash

Run logs


Redis Keys View


Run the command below to start the monitoring tool.

celery -A main_tasks flower
Bash

The monitor will be running at http://0.0.0.0:5555.

Monitor

Now, our worker and monitoring tool are set up. We can add more tasks and execute them using different approaches.

from main_tasks import hello

hello.delay()        # Approach 1: shorthand, accepts only task arguments
hello.apply_async()  # Approach 2: also accepts execution options
Python

That's it; setting up a background task framework in Python is that simple.

Thanks for reading till the end.