Django + Celery parallel tasking error: "No result backend configured"
Running django-celery 3.1.16, celery 3.1.17, and Django 1.4.16. I am trying to run parallel tasks using 3 workers and collect the results using the following:
from celery import group

positions = []
jobs = group(celery_calculate_something.s(data.id) for data in a_very_big_list)
results = jobs.apply_async()
positions.extend(results.get())
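Note the generator expression inside group(...): it builds one task signature (a deferred call) per item in the list. As a plain-Python analogy, the same construction can be sketched with functools.partial standing in for Celery's .s() signatures (the Row objects and the doubling body are invented for illustration only):

```python
from functools import partial

def celery_calculate_something(id):
    # stand-in body for illustration; the real task does <do stuff>
    return id * 2

# fake rows with an .id attribute, mimicking the items in a_very_big_list
a_very_big_list = [type('Row', (), {'id': i})() for i in range(3)]

# analogous to: group(celery_calculate_something.s(data.id) for data in a_very_big_list)
jobs = [partial(celery_calculate_something, data.id) for data in a_very_big_list]

# analogous to apply_async() + get(), but executed serially in-process
positions = [job() for job in jobs]
print(positions)  # [0, 2, 4]
```

The difference in Celery is that the signatures are serialized and sent to workers, and collecting the return values requires a result backend.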
The task celery_calculate_something returns an object to be placed in the results list:
@app.task(ignore_result=False)
def celery_calculate_something(id):
    <do stuff>
No matter what I try, I get the same result when calling get() on results:
No result backend configured. Please see the documentation for more information.
However, the result backend is configured: I have many other tasks with ignore_result=False merrily adding results to the task meta table in django-celery. The problem only occurs with results returned from group(). I should note that I have not set the backend explicitly in settings; django-celery seems to set it automatically for you.
I have a worker collecting events using:
manage.py celery worker -l info -E
and celerycam running:
python manage.py celerycam
Inspecting the results object returned (an instance of GroupResult), I can see that its backend attribute is an instance of DisabledBackend. Is that the problem? What have I misunderstood?
You did not configure the result backend. You need tables to store results: since you have django-celery, add it to INSTALLED_APPS in your settings.py file and perform the migration (python manage.py migrate). After that, open your celery.py file and modify the backend to djcelery.backends.database:DatabaseBackend.
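Concretely, the relevant settings.py pieces might look like the sketch below. djcelery.setup_loader() and the CELERY_RESULT_BACKEND setting name come from django-celery's documented usage; the INSTALLED_APPS contents shown are illustrative:

```python
# settings.py (sketch, assuming django-celery is installed)
import djcelery
djcelery.setup_loader()  # lets Celery read its configuration from Django settings

INSTALLED_APPS = (
    # ... your other apps ...
    'djcelery',  # provides the task meta tables created by `manage.py migrate`
)

# explicit database result backend (django-celery's task meta tables)
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
```

With the backend set here or on the Celery app itself, GroupResult.get() has somewhere to read results from instead of falling back to DisabledBackend.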
here's example
app = Celery('almanet',
             broker='amqp://guest@localhost//',
             backend='djcelery.backends.database:DatabaseBackend',
             include=['alm_crm.tasks'])  # references the tasks module; don't forget to put the whole absolute path
After that you can import results:

from celery import result
Then you can save the result and extract it later by job id:
from celery import group

positions = []
jobs = group(celery_calculate_something.s(data.id) for data in a_very_big_list)
results = jobs.apply_async()
results.save()
some_task_result = result.GroupResult.restore(results.id)
print some_task_result.ready()
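Once restored, the GroupResult supports the usual result methods (ready(), successful(), get() are part of Celery's standard result API). A sketch of how collection might proceed; this requires a running broker and worker, so it is illustrative only:

```python
from celery import result

restored = result.GroupResult.restore(results.id)
if restored.ready():            # have all subtasks finished?
    if restored.successful():   # did none of them fail?
        positions = restored.get()  # return values, in submission order
```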