Job Queuing 101: Start using Bull in your Node.js Project (Part I)
If you haven't implemented job queuing in a project yet, this article is for you: a guide to setting up job queues in a Koa application with the Bull package, including a simple UI to monitor jobs and handle their prioritization and repetition. We will build it from scratch in two parts. In this first part we will create our first queue and see the entire process from beginning to end; in the second part we will add more functionality as well as a UI.
For this tutorial, we are going to simulate that we are an online store that receives orders and we have to process them.
Before We Start
Even though the purpose of this article is not to explain job queuing in depth, let's briefly go over some concepts.
What is a Job Queue?
A job queue is a data structure that stores (or caches) an ordered list of jobs waiting to be processed by a job processor, which is in charge of processing these jobs in order of priority, retrying them if they fail, and much more.
Job queues are useful in scenarios where a job can be heavy, and running many such jobs in parallel on a machine with limited resources would be inefficient.
There are three roles in a job queue:
- Producer: in charge of adding new jobs to the queue.
- Consumer/Worker: responsible for processing the jobs that are waiting in the queue.
- Listener: listens for events happening either in a particular queue instance or globally.
What is Bull?
Bull is a Node.js library that implements a fast and robust queue system based on Redis. Its API makes access to Redis's low-level functionality simple and intuitive, and enriches it along the way.
You could implement queues with Redis directly, but that would mean more headaches and more time to get the most out of it.
What is Koa?
Koa.js is an open-source Node.js web framework designed by the team behind Express. As the official website says, the framework aims to be a smaller, more expressive, and more robust foundation for web applications and APIs. Koa uses asynchronous functions to help eliminate the need for callbacks and significantly improve error handling.
Let’s begin
Installation
We will start by creating a Koa server with the basics. So we will execute the following commands on the terminal:
$ npm init -y                    # Initialize a Node project
$ npm install koa                # Our application is ready to use Koa
$ npm install @koa/router        # Router for our Koa application
$ npm install koa-bodyparser     # Parses the request body
$ npm install nodemon --save-dev # For running our server during development
Once the package.json file is generated, we will need to do two things. First, we need to install two more packages: bull and @bull-board/koa. The latter is a third-party library developed on top of Bull to help you visualize your queues and their jobs with a UI.
$ npm install bull
$ npm install @bull-board/koa   # UI library
As I mentioned before, Bull works with Redis, since we need to store/cache the job description of each job, so the second thing we have to do is get Redis on our machine. This can be done in two ways: either install Redis directly on your computer, or run Redis with Docker.
I am going to use the second option, pulling the official redis image from Docker Hub. If you prefer the first option, refer to the installation instructions in the official Redis documentation for your operating system.
We need the Redis server to be up and running when we run our project.
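As a sketch, running Redis with Docker could look like this (the container name is arbitrary):

```shell
# Pull and run the official redis image, exposing the default port 6379
docker run --name bull-redis -p 6379:6379 -d redis

# Quick sanity check: the server should reply with PONG
docker exec bull-redis redis-cli ping
```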
First steps
Let's start by creating a folder named src with our entry file for the application, which will be named index.js, and setting up our Koa server.
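A minimal sketch of src/index.js; the port number and response text are assumptions:

```javascript
// src/index.js
const Koa = require('koa');
const Router = require('@koa/router');
const bodyParser = require('koa-bodyparser');

const app = new Koa();
const router = new Router();

// Health-check endpoint to confirm the server is running.
router.get('/', (ctx) => {
  ctx.body = { status: 'Server is up and running' };
});

// Parse request bodies before hitting the routes.
app.use(bodyParser());
app.use(router.routes()).use(router.allowedMethods());

app.listen(3000, () => console.log('Server listening on port 3000'));
```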
We have just created a simple Koa server with a router exposing an endpoint that indicates whether our server is running.
Creating our First Queue
We want to create a queue that handles the orders we are going to receive; this will be our producer. So let's start by creating a folder named queues and a file named order-queue.js:
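A sketch of what queues/order-queue.js could look like; the queue name "orders" is an assumption:

```javascript
// queues/order-queue.js
const Queue = require('bull');

// Create the orders queue, connecting to the local Redis instance.
// Redis listens on port 6379 by default.
const ordersQueue = new Queue('orders', 'redis://localhost:6379');

// Producer: receives an order and inserts it into the queue.
// The empty options object is where priority, retries, etc. will go later.
const createNewOrder = (order) => ordersQueue.add(order, {});

module.exports = { ordersQueue, createNewOrder };
```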
With the code above, all we did was create our orders queue, passing in our Redis URL, which in this case is the local one (redis://localhost:6379), since Redis listens on port 6379 by default.
Then we created the createNewOrder method, which receives an order and inserts it into our queue. For now, we will leave the options object empty, but later we will see what we can put there.
Creating our First Consumer
We will create a new file named orders-queue-consumer.js where we will define the process function. We will export it, since we will use it in the queue declared before.
The process function will be called every time the worker is idle and there are jobs waiting in the queue.
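A minimal sketch of orders-queue-consumer.js; the logging body is illustrative:

```javascript
// queues/orders-queue-consumer.js
// Bull calls this with each job whenever the worker is idle
// and there are waiting jobs in the queue.
const ordersProcess = async (job) => {
  console.log('Processing order:', job.data);
  // Resolving marks the job as completed; throwing marks it as failed.
  return { processed: true };
};

module.exports = ordersProcess;
```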
Now we just need to register the function we created as the callback of the queue's process method, by adding this:
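Back in queues/order-queue.js, the wiring could look like this:

```javascript
// queues/order-queue.js (continued)
const ordersProcess = require('./orders-queue-consumer');

// Register the consumer: Bull invokes ordersProcess for every waiting job.
ordersQueue.process(ordersProcess);
```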
We just imported the function that we exported before and registered it with ordersQueue.process(ordersProcess).
Time to Test our First Order
All we want to do now is test that everything is working. To do that, we are going to create an endpoint that simulates a new order being received in our online store, so let's create it real quick in our index.js file under the route /order as a POST method.
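A sketch of this endpoint in src/index.js; the order fields and response message are hypothetical:

```javascript
// src/index.js (excerpt) — assumes the router from the earlier setup
const { createNewOrder } = require('./queues/order-queue');

// POST /order: simulates receiving a new order and enqueues it.
router.post('/order', async (ctx) => {
  const order = ctx.request.body; // e.g. { productId: 1, quantity: 2 } (hypothetical fields)
  await createNewOrder(order);
  ctx.body = { message: 'Order queued' };
});
```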
Now, to see it all working, we just open Insomnia (or a similar API client) and make a request to this endpoint. Besides receiving a response with a 200 status, of course, we should be able to check in our local Redis database that the order has been queued. So let's start our server and make the request.
To inspect our Redis storage, I personally use a VSCode extension named Redis Client. After making the request, we can see that a job was enqueued with its properties, as seen below.
And we will see that the process callback has been invoked and, therefore, the order was logged as shown below:
Check part II to start adding functionalities and a dashboard to control our queue!