
I have one server that runs a Django application served by gunicorn and a Celery task queue.

Gunicorn docs suggest (2 x $num_cores) + 1 as the default number of workers.

Celery docs show that the number of Celery workers defaults to 1 x $num_cores.

And both suggest experimenting to find the proper number.

My question is, what would be a good rule of thumb for running both Gunicorn and Celery on the same machine? On an eight core machine should I start with 17 Gunicorn and 8 Celery workers? Or would it make sense to start with say 9 Gunicorn and 4 Celery workers?

The system is CPU-bound, in case that helps.

YPCrumble

1 Answer

Any sort of performance tuning is often more soothsaying than hard-and-fast rules. I recently had to do something similar with my own Django app after someone decided to abuse it. When you run both Gunicorn and Celery on the same machine, it's better to start with a balanced split rather than maxing out the Gunicorn workers. This is my approach:

  • Gunicorn Workers:

    • Start below Gunicorn's standalone default of 2 * $num_cores + 1, which would be 17 workers on an 8-core machine
    • For an 8-core machine, a range of 8-12 Gunicorn workers is a reasonable starting point
    • The reasoning is that you want to leave some CPU headroom for the Celery workers, which matters especially since your tasks are CPU-bound.
  • Celery Workers:

    • You can use the recommended default number of Celery workers
    • That default is 1 * $num_cores
    • For an 8-core machine, that means starting with 8 Celery workers.
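The arithmetic above can be sketched in a few lines; the `worker_split` helper is purely illustrative, not a real Gunicorn or Celery API:

```python
def worker_split(cores):
    """Starting worker counts when Gunicorn and Celery share one machine."""
    gunicorn_max = 2 * cores + 1   # Gunicorn docs' standalone default
    gunicorn = cores + 1           # start lower to leave CPU for Celery
    celery = cores                 # Celery's default: one worker per core
    return gunicorn_max, gunicorn, celery

print(worker_split(8))  # (17, 9, 8): standalone max 17; shared start 9 + 8
```

These are starting points to measure against, not fixed rules.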

Therefore, on an 8-core machine, a good starting setup could look like:

  • Gunicorn: 8-12 workers
  • Celery: 8 workers

The goal is not to overwhelm the machine with Gunicorn workers, so that enough CPU is left over for the Celery workers.
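For reference, launching with those starting counts might look like the following sketch; `myproject` is a placeholder for your actual Django project name, and the numbers are starting values to tune:

```shell
# Gunicorn: 9 workers (cores + 1), below the standalone 17-worker default
gunicorn myproject.wsgi:application --workers 9 --bind 0.0.0.0:8000

# Celery: 8 worker processes, one per core (Celery's default concurrency)
celery -A myproject worker --concurrency=8
```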