Celery and Redis with Django

Celery enhances Django applications by managing asynchronous tasks through a message broker like Redis. It offloads long-running jobs, optimizing performance and enabling efficient task scheduling and execution. This integration is essential for robust, distributed environments in Django projects.

Celery

Celery is an open-source Python library for executing tasks asynchronously. A task queue holds the tasks and allocates them to workers as appropriate. Although it also supports scheduling (running tasks at regular intervals), its primary focus is real-time operation. This dramatically improves responsiveness for end users. Celery works with various message brokers, including Redis and RabbitMQ.

Tryton, Flask, Pylons, web2py, and Tornado are just a few of the web frameworks Celery integrates with, in addition to Django.

Why Should We Use Celery?

Consider a scenario in which we want to send several emails at the end of the day or need to call an API once every minute (or hour). Such recurring tasks can be scheduled easily with Celery.

Consider a different situation: a user sends a request, but the page takes too long to load. Celery speeds up page load times by running some of the work as deferred tasks, either on the same server or on a different one.

Celery workers can then process files, send emails, update databases, refresh the UI via callbacks, and perform various other tasks.

Celery's main benefit is that our application can keep responding to client requests while work happens in the background, so end users are not kept waiting needlessly.

Installation

Python3

We must install Python 3 because Django 2 requires it. On a Mac, the installation command is:
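For instance, using Homebrew:

    brew install python3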

This installs pip3 and python3 together. The global Python version, however, still refers to Python 2 :
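For instance, checking the default interpreter:

    python --version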

Virtualenv

To set the Python version for the project, the virtualenvwrapper package is used. Depending on the project, we can switch between different versions of Python. To create an environment with virtualenvwrapper (assuming it has been installed), the following command needs to be run:
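A typical invocation, assuming virtualenvwrapper is installed and python3 is on the PATH:

    mkvirtualenv --python=`which python3` <env_name>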

Here <env_name> should be replaced with the environment's name, and the which python3 expression resolves to /usr/local/bin/python3, where Python 3 is installed. Running workon <env_name> activates the virtual environment. The Python version used inside the virtual environment can be checked with:
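For example, inside the activated environment:

    python --version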

Pip Packages :
The necessary Python packages can be installed inside the virtual environment by typing:
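For example (pin exact versions in a real project, as noted below):

    pip install django celery redis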

It's a good idea to pin package versions explicitly, in line with the Twelve-Factor App manifesto, because a predictable codebase is easier to maintain.

Redis

We must set up a message broker because Celery needs one. One of the most straightforward brokers to set up is Redis, which can be installed and started by executing the following commands:
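On a Mac with Homebrew, for instance:

    brew install redis
    redis-server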

That is all that is required to start Redis; since it runs by default on localhost at port 6379, that is the address we will use in our settings.py file.

Create Django Project

Assume we want to create a website. The website is our project, and the applications (the forum, the news, and the contact form) are its components. Because each application is independent, this structure makes moving applications between projects easier.

Open a terminal (or a command prompt on Windows), go to the location where your project should be created, and then enter this command:
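For example (a sketch; the project name django_celery_example matches the celery.py step below, and fonepolls is the app name used in the next paragraph):

    django-admin startproject django_celery_example
    cd django_celery_example
    python manage.py startapp fonepolls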

Next, we create a Django project and a Django application called fonepolls. The project structure is listed below.
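A sketch of what the layout typically looks like after running startproject and startapp (names follow the commands above):

    django_celery_example/
        manage.py
        fonepolls/
            __init__.py
            admin.py
            apps.py
            models.py
            tests.py
            views.py
        django_celery_example/
            __init__.py
            settings.py
            urls.py
            wsgi.py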

celery.py

Let's now begin installing and configuring Celery.

Create django_celery_example/celery.py beside django_celery_example/wsgi.py
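A minimal sketch of that file, following the standard Celery-with-Django pattern (the project name django_celery_example is taken from the path above):

    import os

    from celery import Celery

    # Set the default Django settings module for the 'celery' command-line program.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'django_celery_example.settings')

    app = Celery('django_celery_example')

    # Read configuration keys prefixed with CELERY_ from the Django settings file.
    app.config_from_object('django.conf:settings', namespace='CELERY')

    # Discover tasks.py modules in all installed Django apps.
    app.autodiscover_tasks()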

settings.py

Because Celery can read its configuration from the Django settings file, we can keep everything there for easier management.
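For instance, a minimal set of values, assuming the local Redis broker installed above:

    CELERY_BROKER_URL = 'redis://localhost:6379'
    CELERY_RESULT_BACKEND = 'redis://localhost:6379'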

There are a few things to keep in mind.

According to the Celery documentation, broker_url is the configuration key you should set for a message broker. However, the celery.py above calls app.config_from_object('django.conf:settings', namespace='CELERY'), which instructs Celery to read its settings from the CELERY namespace of the Django settings file. The key there must therefore be CELERY_BROKER_URL; a plain broker_url in your Django settings file would not take effect. All Celery configuration keys listed in the documentation must follow this naming rule.

When configuring, please refer to the documentation, because some configuration keys differ between Celery 3 and Celery 4.

Setup

The earlier part of this article describes how to set up the Python project. At this point, a directory structure like the following should already exist:
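A sketch of that layout, with <mysite> as a placeholder:

    <mysite>/
        manage.py
        <mysite>/
            __init__.py
            settings.py
            urls.py
            wsgi.py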

The actual project name should be used in place of <mysite> here.

celery.py

The next step is to create the file that defines the Celery instance, located at <mysite>/<mysite>/celery.py. The file should be configured as follows:
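A minimal sketch; mysite stands in for your actual project name, and the debug_task at the end is the sample task we will call shortly:

    import os

    from celery import Celery

    # Replace "mysite" with your project name in both places below.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')

    app = Celery('mysite')

    # Read CELERY_-prefixed keys from the Django settings file.
    app.config_from_object('django.conf:settings', namespace='CELERY')

    # Discover tasks.py modules in all installed apps.
    app.autodiscover_tasks()


    @app.task(bind=True)
    def debug_task(self):
        # A trivial task that prints information about its own request.
        print(f'Request: {self.request!r}')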

__init__.py

The celery.py file must be imported in <mysite>/<mysite>/__init__.py so that the app is loaded when Django starts.
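A minimal sketch of that file:

    # Ensure the Celery app is imported when Django starts, so that
    # @shared_task decorators are bound to it.
    from .celery import app as celery_app

    __all__ = ('celery_app',)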

settings.py

In the <mysite>/<mysite>/settings.py file, we need to configure Celery by adding the following variables:
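A hedged example of these variables; the result backend is included so task results can be inspected later:

    CELERY_BROKER_URL = 'redis://localhost:6379'
    CELERY_RESULT_BACKEND = 'redis://localhost:6379'
    CELERY_ACCEPT_CONTENT = ['application/json']
    CELERY_TASK_SERIALIZER = 'json'
    CELERY_RESULT_SERIALIZER = 'json'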

This tells Celery to use the Redis instance running on localhost at port 6379 as the broker, and to accept only the application/json content type with JSON serialization.

Rapid Testing

Run the following command inside the virtual environment to test our setup:
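With <mysite> replaced by the project name, a typical worker invocation is:

    celery -A <mysite> worker -l info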

The Celery worker should now be running and connected to the Redis host at redis://localhost:6379//.

In a different terminal, but within the same folder, activate the virtual environment (i.e., workon <env_name>) and then run:
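That is, open the Django shell:

    python manage.py shell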

This launches a Python interpreter loaded with the environment of the Django project. Once the interpreter is running, call the task:
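A sketch of the interactive session (replace mysite with your project name):

    >>> from mysite.celery import debug_task
    >>> result = debug_task.delay()
    >>> result.ready()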

Here, the debug_task defined in <mysite>/<mysite>/celery.py is imported. The function is then executed asynchronously via the message broker. The call returns an AsyncResult object that can be used both to retrieve the function's result and to determine whether the task has been completed.

The output of the debug_task task is displayed in the original terminal (the one running the worker).

What is Redis ?

Redis (REmote DIctionary Server) is an in-memory data structure store that can be used as a database, cache, or message broker.

Redis stores data as key-value pairs, with the keys used to look up and retrieve the data from the Redis instance.

Standard databases typically store data on disk, which incurs extra time and hardware resource costs. In contrast to traditional databases, Redis avoids this by keeping all the data in memory, where it can be accessed and processed more quickly.

Asynchronous Tasks with Django, Celery, and Redis, and Production Deployment Using Supervisor

What does an asynchronous task entail ?

An asynchronous task is a computation that runs on a background thread and publishes its result back to the main thread.

Let's make it a little bit simpler to understand :

When something is executed synchronously, you wait for it to complete before moving on to the next task. Asynchronous execution allows you to switch to another task before something is finished.

When using asynchronous execution, you start one routine, let it run in the background while you start the next, and at some point say "wait for this to finish". So instead of

    Start A -> wait for A -> Start B -> wait for B -> Start C -> wait for C -> Start D -> wait for D

the execution looks like

    Start A -> Start B -> Start C -> Start D -> wait for A to complete

The benefit is that by running B, C, and/or D concurrently with A (in the background, on a different thread or worker), you can better manage your resources and experience fewer "hangs" and "waits".

Celery Implementation with Django Step by Step :

Step - 1 : Start creating a Django application

You must first make a Django application. I'm not going to explain how to do that again, as it has already been described in the earlier section of this article; you can also consult the official Django documentation for additional information about installing Django.

Step - 2 : Installing Celery with pip

To install Celery, run the following command from the terminal or another command-line program, such as Git Bash:
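The command is simply:

    pip install celery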

Step - 3 : Add the celery.py File to Your Project Module

celery.py :
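A minimal sketch of this file; the module name my_django_project matches the worker commands used later, so adjust it to your own project:

    import os

    from celery import Celery

    # Point Celery at the project's settings module.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_django_project.settings')

    app = Celery('my_django_project')

    # Read CELERY_-prefixed settings from settings.py.
    app.config_from_object('django.conf:settings', namespace='CELERY')

    # Find tasks.py modules in all installed apps.
    app.autodiscover_tasks()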

Step - 4 : Adding the Celery App to Django

Add the following code to the __init__.py file, which is located in the project module next to the settings.py file, to make sure the Celery app is loaded when Django launches.

__init__.py
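A minimal sketch of the project-module __init__.py:

    # Load the Celery app whenever Django starts so that @shared_task
    # decorators are registered against it.
    from .celery import app as celery_app

    __all__ = ('celery_app',)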

Step - 5 : Download Redis as a Celery ‘broker’

First, get Redis from the official download page and install it. Then open a new terminal window and run the following command to start the server.
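The command is simply:

    redis-server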

By entering the following command into your terminal, you can verify that Redis is operating correctly:
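For example:

    redis-cli ping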

Redis should respond with PONG.

Step - 6 : The Django project should now include Redis as a dependency

Execute this command :
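This installs the Python Redis client into the project's environment:

    pip install redis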

Step - 7 : Configure Celery Stuff in the Django Settings File

Add the following code to your settings.py file after Django Redis has been set up :
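A hedged example of these settings, assuming the local Redis started above; the timezone value is just an illustration:

    CELERY_BROKER_URL = 'redis://localhost:6379'
    CELERY_RESULT_BACKEND = 'redis://localhost:6379'
    CELERY_ACCEPT_CONTENT = ['application/json']
    CELERY_TASK_SERIALIZER = 'json'
    CELERY_RESULT_SERIALIZER = 'json'
    CELERY_TIMEZONE = 'UTC'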

CELERY_TIMEZONE can be changed to reflect your time zone or other settings.

That's it! Celery should now work with Django. Check whether the Celery worker is ready to accept tasks:
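For example:

    celery -A my_django_project worker -l info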

Please replace the placeholder my_django_project with the name of your project.

The most crucial point: to run Celery tasks, a worker must always be running.

If Redis throws an error when the worker starts, use an older version of the redis package as a workaround.

Add a New Task to Celery Step by Step :

Step - 1 : Add the tasks.py file to your Django app

tasks.py :
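A minimal sketch of a task that simulates a slow job; the name my_first_task and its duration argument are what the rest of this section refers to:

    import time

    from celery import shared_task


    @shared_task
    def my_first_task(duration):
        # Simulate a long-running job by sleeping for `duration` seconds.
        time.sleep(duration)
        return 'first task done'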

You just need to assign your new task to Celery before you can use it.

Step - 2 : Assign Celery a Task

You must give Celery a task to complete, and you do so by calling the task function with the desired arguments. Celery provides two methods for calling tasks: delay() and apply_async().

The delay method can be used to pass arguments to the function, as shown below.
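For instance, from the Django shell; your_app is a placeholder for the app that contains tasks.py:

    >>> from your_app.tasks import my_first_task
    >>> my_first_task.delay(10)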

Open a new terminal tab, navigate to the project directory, activate your environment if applicable, and run the worker command once more to verify that Celery is behaving as expected:
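The worker command is the same as before:

    celery -A my_django_project worker -l info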

Please replace the placeholder my_django_project with the name of your project.

You'll notice in the worker output that Celery has registered the new task, my_first_task.

This is too cool! Right ?

Let's Complete and Test the Task with the Response

Design a View for your App :

views.py :
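A minimal sketch of a view that queues the task and returns immediately; the view name first_assignment and the import path assume the tasks.py sketch above:

    from django.http import HttpResponse

    from .tasks import my_first_task


    def first_assignment(request):
        # Hand the slow work to the Celery worker and return right away.
        my_first_task.delay(10)
        return HttpResponse('response done')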

In the views.py file, I defined the function first_assignment and used delay() to hand the task to the Celery worker.

Then use the URL of your app to call the view :

urls.py :
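A minimal sketch; the URL path and name are illustrative:

    from django.urls import path

    from . import views

    urlpatterns = [
        path('assign-task/', views.first_assignment, name='first_assignment'),
    ]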

Within just 2 or 3 seconds of submitting a request to this URL, you will see the response "response done". But what should actually have happened? I used the delay() method to pass the argument 10 to the my_first_task function, so the duration parameter takes the value 10 and the task calls sleep(10). The view should therefore have taken at least 10 seconds to respond, but it didn't; the response came back almost immediately.

This is the power of Celery: you received the response promptly, while a worker handles the task in the background.

Celery in Production Using Supervisor on a Linux Server, Step by Step :

It's simple to run Celery locally: just type celery -A yourprojectname worker -l info to get started. However, something a little more durable is required for production.

Thankfully, it's not at all challenging. Celery can be supervised by Supervisor in the same way as Gunicorn.

On a production server, the application must start and shut down with the operating system. A process control system is generally needed to accomplish this (especially for daemons, such as Gunicorn's workers, that don't handle it themselves). Supervisor is an excellent example and is relatively easy to set up.

Step - 1 : Install Supervisor on the Ubuntu Server
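On Ubuntu this is typically:

    sudo apt-get update
    sudo apt-get install supervisor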

Step - 2 : Adding a .conf File to Supervisor

Anything goes for app_name, but it needs to be related to the name of your project.
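For example, creating the file in Supervisor's configuration directory (app_name.conf is a placeholder):

    cd /etc/supervisor/conf.d/
    sudo touch app_name.conf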

Step - 3 : Include a few configurations in app_name.conf

The program name is completely up to you; anything goes. This name only matters to Supervisor, and you will use it to refer to this app in the future.

Write the configuration file as follows:
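A hedged sketch of a program entry for the Celery worker; the paths, user, and project name are placeholders to adapt to your server:

    [program:your_app_name]
    ; Full path to celery inside the project's virtual environment.
    command=/home/ubuntu/venv/bin/celery -A my_django_project worker -l info
    directory=/home/ubuntu/my_django_project
    user=ubuntu
    numprocs=1
    autostart=true
    autorestart=true
    stdout_logfile=/var/log/celery/worker.log
    stderr_logfile=/var/log/celery/worker.err.log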

Step - 4 : Tell Supervisor About the New Configuration

After adding a new program, run the following two commands to tell Supervisor to re-read the configuration files and apply any changes.
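Those commands are:

    sudo supervisorctl reread
    sudo supervisorctl update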

Congratulations! You have completed all the necessary steps, and Celery is now ready to run on the production server!

Conclusion

  • Starting an asynchronous task with Celery that runs in the background until it is finished is a common requirement, and this article explained why and how to do it.

  • With fewer long-running code paths interfering with the web application server's ability to handle new requests, the user experience is significantly improved.

  • To the best of my ability, I have provided a thorough explanation of the entire process, from creating the development environment to implementing Celery tasks in Django application code and consuming their results via Django.