
celery — Distributed processing


This module is the main entry point for the Celery API. It includes the commonly needed objects for calling tasks and creating Celery applications.

Celery – Celery application instance
group – group tasks together
chain – chain tasks together
chord – chords enable callbacks for groups
signature – object describing a task invocation
current_app – proxy to the current application instance
current_task – proxy to the currently executing task

Celery application objects

New in version 2.5.

class celery.Celery(main='__main__', broker='amqp://localhost//', ...)
Parameters:
  • main – Name of the main module if running as __main__. This is used as a prefix for task names.
  • broker – URL of the default broker used.
  • loader – The loader class, or the name of the loader class to use. Default is celery.loaders.app.AppLoader.
  • backend – The result store backend class, or the name of the backend class to use. Default is the value of the CELERY_RESULT_BACKEND setting.
  • amqp – AMQP object or class name.
  • events – Events object or class name.
  • log – Log object or class name.
  • control – Control object or class name.
  • set_as_current – Make this the global current app.
  • tasks – A task registry or the name of a registry class.
  • include – List of modules every worker should import.
  • fixups – List of fixup plug-ins (see e.g. celery.fixups.django).
  • autofinalize – If set to False a RuntimeError will be raised if the task registry or tasks are used before the app is finalized.
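
For example, a minimal application can be created like this (the project name, broker URL and included module are illustrative, not defaults):

from celery import Celery

# A hypothetical project: tasks are defined in proj/tasks.py.
app = Celery('proj',
             broker='amqp://guest@localhost//',
             include=['proj.tasks'])

if __name__ == '__main__':
    app.start()   # run the celery command-line program with sys.argv
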
main

Name of the __main__ module. Required for standalone scripts.

If set, this will be used instead of __main__ when automatically generating task names.

conf

Current configuration.

user_options

Custom options for command-line programs. See Adding new command-line options.

steps

Custom bootsteps to extend and modify the worker. See Installing Bootsteps.

current_task

The instance of the task that is being executed, or None.

amqp

AMQP related functionality: amqp.

backend

Current backend instance.

loader

Current loader instance.

control

Remote control: control.

events

Consuming and sending events: events.

log

Logging: log.

tasks

Task registry.

Accessing this attribute will also finalize the app.

pool

Broker connection pool: pool. This attribute is not related to the worker's concurrency pool.

Task

Base task class for this app.

timezone

Current timezone for this app. This is a cached property taking the time zone from the CELERY_TIMEZONE setting.

close()

Close any open pool connections and do any other steps necessary to clean up after the application.

Only necessary for dynamically created apps, for which you can also use the with statement instead:

with Celery(set_as_current=False) as app:
    with app.connection() as conn:
        pass
signature()

Return a new Signature bound to this app. See signature().

bugreport()

Return a string with information useful for the Celery core developers when reporting a bug.

config_from_object(obj, silent=False, force=False)

Reads configuration from object, where object is either an object or the name of a module to import.

Parameters:
  • silent – If true then import errors will be ignored.
  • force – Force reading configuration immediately. By default the configuration will be read only when required.
>>> celery.config_from_object("myapp.celeryconfig")

>>> from myapp import celeryconfig
>>> celery.config_from_object(celeryconfig)
config_from_envvar(variable_name, silent=False, force=False)

Read configuration from environment variable.

The value of the environment variable must be the name of a module to import.

>>> import os
>>> os.environ["CELERY_CONFIG_MODULE"] = "myapp.celeryconfig"
>>> celery.config_from_envvar("CELERY_CONFIG_MODULE")
autodiscover_tasks(packages, related_name="tasks")

With a list of packages, try to import modules of a specific name (by default ‘tasks’).

For example if you have an (imagined) directory tree like this:

foo/__init__.py
   tasks.py
   models.py

bar/__init__.py
    tasks.py
    models.py

baz/__init__.py
    models.py

Then calling app.autodiscover_tasks(['foo', 'bar', 'baz']) will result in the modules foo.tasks and bar.tasks being imported.

Parameters:
  • packages – List of packages to search. This argument may also be a callable, in which case the value returned is used (for lazy evaluation).
  • related_name – The name of the module to find. Defaults to “tasks”, which means it looks for “module.tasks” for every module in packages.
  • force – By default this call is lazy so that the actual autodiscovery will not happen until an application imports the default modules. Forcing will cause the autodiscovery to happen immediately.
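
With Django, for instance, the packages argument is commonly given as a callable so that the installed apps are read lazily from the settings (a sketch assuming a standard Django project):

from django.conf import settings

# Lazily look for a "tasks" module in every installed Django app.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
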
add_defaults(d)

Add default configuration from dict d.

If the argument is a callable function then it will be regarded as a promise, and it won’t be loaded until the configuration is actually needed.

This method can be compared to:

>>> celery.conf.update(d)

with the difference that 1) no copy will be made, and 2) the dict will not be transferred when the worker spawns child processes, so it's important that the same configuration also happens at import time when pickle restores the object on the other side.
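
A minimal sketch of the lazy form, using a callable (the setting values are illustrative):

def load_defaults():
    # Only evaluated the first time the configuration is actually needed.
    return {'CELERY_TASK_SERIALIZER': 'json',
            'CELERY_TIMEZONE': 'Europe/Oslo'}

app.add_defaults(load_defaults)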

setup_security()

Set up the message-signing serializer. This will affect all application instances (a global operation).

Disables untrusted serializers and, if configured to use the auth serializer, registers the auth serializer with the provided settings in the Kombu serializer registry.

Parameters:
  • allowed_serializers – List of serializer names, or content_types that should be exempt from being disabled.
  • key – Name of private key file to use. Defaults to the CELERY_SECURITY_KEY setting.
  • cert – Name of certificate file to use. Defaults to the CELERY_SECURITY_CERTIFICATE setting.
  • store – Directory containing certificates. Defaults to the CELERY_SECURITY_CERT_STORE setting.
  • digest – Digest algorithm used when signing messages. Default is sha1.
  • serializer – Serializer used to encode messages after they have been signed. See CELERY_TASK_SERIALIZER for the serializers supported. Default is json.
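
A typical call might restrict serializers to json and sign messages with a key/certificate pair (the file paths below are hypothetical):

app.setup_security(allowed_serializers=['json'],
                   key='/etc/ssl/private/worker.key',
                   cert='/etc/ssl/certs/worker.pem',
                   serializer='json')
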
start(argv=None)

Run celery using argv.

Uses sys.argv if argv is not specified.

task(fun, ...)

Decorator to create a task class out of any callable.

Examples:

@app.task
def refresh_feed(url):
    return …

with setting extra options:

@app.task(exchange="feeds")
def refresh_feed(url):
    return …

App Binding

For custom apps the task decorator will return a proxy object, so that the act of creating the task is not performed until the task is used or the task registry is accessed.

If you are depending on binding to be deferred, then you must not access any attributes on the returned object until the application is fully set up (finalized).

send_task(name[, args[, kwargs[, ...]]])

Send task by name.

Parameters:
  • name – Name of task to call (e.g. “tasks.add”).
  • result_cls – Specify custom result class. Default is using AsyncResult().

Otherwise supports the same arguments as Task.apply_async().
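
For example, calling a task by name without importing its module (task name and arguments are illustrative):

>>> result = app.send_task("tasks.add", args=(2, 2))
>>> result.get()
4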

AsyncResult

Create new result instance. See AsyncResult.

GroupResult

Create new group result instance. See GroupResult.

worker_main(argv=None)

Run celery worker using argv.

Uses sys.argv if argv is not specified.

Worker

Worker application. See Worker.

WorkController

Embeddable worker. See WorkController.

Beat

Celerybeat scheduler application. See Beat.

connection(url=default[, ssl[, transport_options={}]])

Establish a connection to the message broker.

Parameters:
  • url – Either the URL or the hostname of the broker to use.
  • hostname – URL, hostname or IP-address of the broker. If a URL is used, then the other arguments below will be taken from the URL instead.
  • userid – Username to authenticate as.
  • password – Password to authenticate with.
  • virtual_host – Virtual host to use (domain).
  • port – Port to connect to.
  • ssl – Defaults to the BROKER_USE_SSL setting.
  • transport – Defaults to the BROKER_TRANSPORT setting.

Returns: kombu.Connection instance.

connection_or_acquire(connection=None)

For use within a with-statement to get a connection from the pool if one is not already provided.

Parameters: connection – If not provided, then a connection will be acquired from the connection pool.
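
A short usage sketch; the connection is returned to the pool when the block exits:

with app.connection_or_acquire() as conn:
    # conn is a kombu.Connection drawn from the application's pool.
    conn.ensure_connection()
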
producer_or_acquire(producer=None)

For use within a with-statement to get a producer from the pool if one is not already provided.

Parameters: producer – If not provided, then a producer will be acquired from the producer pool.
mail_admins(subject, body, fail_silently=False)

Sends an email to the admins in the ADMINS setting.

select_queues(queues=[])

Select a subset of queues, where queues must be a list of queue names to keep.
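
For example (queue names are illustrative):

>>> app.select_queues(['images', 'video'])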

now()

Return the current time and date as a datetime object.

set_current()

Makes this the current app for this thread.

finalize()

Finalizes the app by loading built-in tasks and evaluating pending task decorators.

Pickler

Helper class used to pickle this application.

Canvas primitives

See Canvas: Designing Workflows for more about creating task workflows.

class celery.group(task1[, task2[, task3[, … taskN]]])

Creates a group of tasks to be executed in parallel.

Example:

>>> res = group([add.s(2, 2), add.s(4, 4)])()
>>> res.get()
[4, 8]

A group is lazy so you must call it to take action and evaluate the group.

Will return a group task that, when called, calls each of the tasks in the group (and returns a GroupResult instance that can be used to inspect the state of the group).

class celery.chain(task1[, task2[, task3[, … taskN]]])

Chains tasks together, so that each task is applied as a callback of the previous task in the chain.

If called with only one argument, then that argument must be an iterable of tasks to chain.

Example:

>>> res = chain(add.s(2, 2), add.s(4))()

is effectively (2 + 2) + 4:

>>> res.get()
8

Calling a chain will return the result of the last task in the chain. You can get to the other tasks by following the result.parent attributes:

>>> res.parent.get()
4
class celery.chord(header[, body])

A chord consists of a header and a body. The header is a group of tasks that must complete before the callback is called. A chord is essentially a callback for a group of tasks.

Example:

>>> res = chord([add.s(2, 2), add.s(4, 4)])(sum_task.s())

is effectively Σ((2 + 2) + (4 + 4)):

>>> res.get()
12

The body is applied with the return values of all the header tasks as a list.

class celery.signature(task=None, args=(), kwargs={}, options={})

Describes the arguments and execution options for a single task invocation.

Used as the parts in a group or to safely pass tasks around as callbacks.

Signatures can also be created from tasks:

>>> add.subtask(args=(), kwargs={}, options={})

or the .s() shortcut:

>>> add.s(*args, **kwargs)
Parameters:
  • task – Either a task class/instance, or the name of a task.
  • args – Positional arguments to apply.
  • kwargs – Keyword arguments to apply.
  • options – Additional options to Task.apply_async().

Note that if the first argument is a dict, the other arguments will be ignored and the values in the dict will be used instead.

>>> s = signature("tasks.add", args=(2, 2))
>>> signature(s)
{"task": "tasks.add", args=(2, 2), kwargs={}, options={}}
__call__(*args, **kwargs)

Call the task directly (in the current process).

delay(*args, **kwargs)

Shortcut to apply_async().

apply_async(args=(), kwargs={}, ...)

Apply this task asynchronously.

Parameters:
  • args – Partial args to be prepended to the existing args.
  • kwargs – Partial kwargs to be merged with the existing kwargs.
  • options – Partial options to be merged with the existing options.

See apply_async().

apply(args=(), kwargs={}, ...)

Same as apply_async() but executes the task inline instead of sending a task message.

freeze(_id=None)

Finalize the signature by adding a concrete task id. The task will not be called, and you should not call the signature twice after freezing it, as that would result in two task messages using the same task id.

Returns: celery.AsyncResult instance.
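
A minimal sketch of the intended use, assuming an add task as in the earlier examples: the result instance is obtained up front, and the signature is then sent exactly once:

>>> sig = add.s(2, 2)
>>> res = sig.freeze()      # res.id is fixed before the task is sent
>>> sig.delay()             # sends the task message using the frozen id
>>> res.get()
4
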
clone(args=(), kwargs={}, ...)

Return a copy of this signature.

Parameters:
  • args – Partial args to be prepended to the existing args.
  • kwargs – Partial kwargs to be merged with the existing kwargs.
  • options – Partial options to be merged with the existing options.
replace(args=None, kwargs=None, options=None)

Replace the args, kwargs or options set for this signature. These are only replaced if the given argument is not None.

link(other_signature)

Add a callback task to be applied if this task executes successfully.

Returns: other_signature (to work with reduce()).

link_error(other_signature)

Add a callback task to be applied if an error occurs while executing this task.

Returns: other_signature (to work with reduce()).
set()

Set arbitrary options (same as .options.update(…)).

This is a chaining method call (i.e. it will return self).
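
For example (queue name and countdown are illustrative):

>>> add.s(2, 2).set(queue='priority.high', countdown=10).delay()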

flatten_links()

Gives a recursive list of dependencies (unchain if you will, but with links intact).

Proxies

celery.current_app

The currently set app for this thread.

celery.current_task

The task currently being executed (only set in the worker, or when eager/apply is used).
