The app branch is a work-in-progress to remove the use of a global configuration in Celery.
Celery can now be instantiated, which means several instances of Celery may exist in the same process space. Also, large parts can be customized without resorting to monkey patching.
Creating a Celery instance:
>>> from celery import Celery
>>> app = Celery()
>>> app.config_from_object("celeryconfig")
>>> #app.config_from_envvar("CELERY_CONFIG_MODULE")
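Because the app is now an ordinary object, several instances can coexist in the same process. A minimal sketch (the CELERY_ALWAYS_EAGER setting and the celeryconfig module are only used for illustration):

from celery import Celery

# Two independent app instances in the same process,
# each with its own configuration and task registry.
app1 = Celery()
app1.conf.CELERY_ALWAYS_EAGER = True

app2 = Celery()
app2.config_from_object("celeryconfig")

@app1.task
def add(x, y):
    return x + y

@app2.task
def mul(x, y):
    return x * y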
Creating tasks:
@app.task
def add(x, y):
    return x + y
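Tasks defined this way are called like any other Celery task, e.g. with delay() or apply_async(); retrieving the result below assumes a result backend is configured (or CELERY_ALWAYS_EAGER is enabled for local execution):

>>> result = add.delay(2, 2)
>>> result.get(timeout=10)
4
>>> add.apply_async(args=(2, 2), countdown=5)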
Creating custom Task subclasses:
Task = app.create_task_cls()

class DebugTask(Task):
    abstract = True

    def on_failure(self, *args, **kwargs):
        import pdb
        pdb.set_trace()

@app.task(base=DebugTask)
def add(x, y):
    return x + y
Starting a worker:
worker = app.Worker(loglevel="INFO")
Getting access to the configuration:
app.conf.CELERY_ALWAYS_EAGER = True
app.conf["CELERY_ALWAYS_EAGER"] = True
Controlling workers:
>>> app.control.inspect().active()
>>> app.control.rate_limit(add.name, "100/m")
>>> app.control.broadcast("shutdown")
>>> app.control.discard_all()
Other interesting attributes:
# Establish a broker connection.
>>> app.broker_connection()

# AMQP-specific features.
>>> app.amqp
>>> app.amqp.Router
>>> app.amqp.get_queues()
>>> app.amqp.get_task_consumer()

# Loader
>>> app.loader

# Default result backend
>>> app.backend
As you can probably see, this opens up a whole new dimension of customization possibilities.
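To make that concrete, a small sketch that only touches the attributes listed above (the connect()/close() calls assume the returned connection object follows the usual kombu connection API):

# Which loader and result backend did this app instance end up with?
>>> type(app.loader).__name__
>>> type(app.backend).__name__

# Explicitly establish and close a broker connection.
>>> conn = app.broker_connection()
>>> conn.connect()
>>> conn.close()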
Deprecated:

- celery.task.ping, celery.task.PingTask
  Inferior to the ping remote control command. Will be removed in Celery 2.3.
Removed:

- celery.utils.defaultdict: use celery.utils.compat.defaultdict()
- celery.utils.all: use celery.utils.compat.all()
- celery.task.apply_async: use app.send_task
- celery.task.tasks: use celery.registry.tasks
Aliases (pending deprecation):

- celery.conf.* -> app.conf

  NOTE: All configuration keys are now named the same as in the configuration,
  so the key "CELERY_ALWAYS_EAGER" is accessed as:

      >>> app.conf.CELERY_ALWAYS_EAGER

  instead of:

      >>> from celery import conf
      >>> conf.ALWAYS_EAGER

  - .get_queues -> app.amqp.get_queues
Default app usage:

To be backward compatible, it must be possible to use all the classes/functions without passing an explicit app instance.
This is achieved by having all app-dependent objects use default_app if the app instance is missing.
from celery.app import app_or_default

class SomeClass(object):

    def __init__(self, app=None):
        self.app = app_or_default(app)
The problem with this approach is that the app instance may be lost along the way while everything appears to work normally, and such app-instance leaks are hard to test for. The environment variable CELERY_TRACE_APP helps here: when it is set, celery.app.app_or_default() raises an exception whenever it has to fall back to the default app instance.
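For illustration, a minimal sketch of the fallback behaviour (assuming CELERY_TRACE_APP is not set, so the fallback to the default app is silent):

>>> from celery import Celery
>>> from celery.app import app_or_default

>>> app = Celery()

>>> # An explicit app instance is passed through unchanged.
>>> app_or_default(app) is app
True

>>> # With no app given, the process-wide default app is returned
>>> # instead; with CELERY_TRACE_APP set this fallback raises an
>>> # exception rather than silently succeeding.
>>> default = app_or_default(None)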
Examples of such app-dependent classes include:

- celery.loaders.base.BaseLoader
- celery.backends.base.BaseBackend
- celery.worker.job.TaskRequest
- celery.events.EventDispatcher
- celery.pidbox.BroadcastConsumer
- celery.worker.controllers.Mediator
- celery.beat.EmbeddedService
- celery.bin.amqp.AMQPAdmin