This is the result of running sed -i -e 's|/etc/airtime-saas/|/etc/airtime/|' `grep -irl 'airtime-saas' airtime_mvc/ python_apps/` :P It might need more testing. The airtime-saas part never really made sense; zf1 has environments for that, i.e. you would create a saas env based on production, for instance. I believe legacy upstream was using this to share configuration between customers (e.g. the analyser runs only once and writes to a shared S3 bucket). I assume they mount the airtime-saas folder onto individual customers' instances with a global config. Like I said, I don't feel that this makes sense, since all it does is make hacking at the configs in airtime-saas a bit easier. A serious SaaS operation should be using something like Puppet or Ansible to achieve this.

airtime-celery
==============

airtime-celery is a Celery_ daemon for handling backend tasks asynchronously.
Communication and the Celery results backend are both handled with AMQP (RabbitMQ).

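To make that AMQP wiring concrete, the sketch below assembles a broker/results-backend URL of the shape Celery expects. The credentials, host, port, and vhost shown are placeholder values, not settings shipped with this project:

```python
from urllib.parse import quote

def amqp_url(user, password, host, port, vhost):
    # Build an amqp:// URL as Celery expects it. The vhost must be
    # percent-encoded, so "/airtime" becomes "%2Fairtime".
    return "amqp://{}:{}@{}:{}/{}".format(
        quote(user, safe=""), quote(password, safe=""),
        host, port, quote(vhost, safe=""))

# Placeholder values -- substitute the rabbitmq settings from airtime.conf.
print(amqp_url("airtime", "s3cret", "localhost", 5672, "/airtime"))
# amqp://airtime:s3cret@localhost:5672/%2Fairtime
```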
Installation
============

::

    $ sudo python setup.py install

Each instance of airtime-celery has its own worker, and multiple instances can be run in parallel.
`Celery is thread-safe`_, so this parallelization won't cause conflicts.

.. _Celery: http://www.celeryproject.org/
.. _Celery is thread-safe: http://celery.readthedocs.org/en/latest/userguide/application.html

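Why parallel instances don't conflict can be pictured with a toy model: plain Python threads stand in for worker instances, and a ``queue.Queue`` stands in for RabbitMQ. The broker delivers each task to exactly one consumer, so workers never process the same task twice. This illustrates the delivery model only; it is not code from this project:

```python
import queue
import threading

# Toy stand-ins: the Queue plays RabbitMQ, the threads play airtime-celery
# worker instances. queue.Queue delivers each item to exactly one consumer.
task_queue = queue.Queue()
for i in range(10):
    task_queue.put(i)

results = []
results_lock = threading.Lock()

def worker():
    while True:
        try:
            task = task_queue.get_nowait()  # atomically claims one task
        except queue.Empty:
            return
        with results_lock:
            results.append(task)

workers = [threading.Thread(target=worker) for _ in range(3)]
for w in workers:
    w.start()
for w in workers:
    w.join()

# Every task was handled exactly once, no matter how many workers ran.
print(sorted(results))  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```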
Usage
=====

This program must be run with sudo::

    $ sudo service airtime-celery {start | stop | restart | graceful | kill | dryrun | create-paths}

Developers
==========

To debug, you can run celery directly from the command line::

    $ cd /my/airtime/root/python_apps/airtime-celery
    $ RMQ_CONFIG_FILE=${LIBRETIME_CONF_DIR}/airtime.conf celery -A airtime-celery.tasks worker --loglevel=info

This worker can be run alongside the service without issue.

You may want to use the setuptools develop target to install::

    $ cd /my/airtime/root/python_apps/airtime-celery
    $ sudo python setup.py develop

You will need to allow the "airtime" RabbitMQ user to access all exchanges and queues within the /airtime vhost::

    $ sudo rabbitmqctl set_permissions -p /airtime airtime .\* .\* .\*

Logging
=======

By default, logs are saved to::

    /var/log/airtime/airtime-celery[-DEV_ENV].log

Troubleshooting
===============

If you run into issues getting Celery to accept tasks from Airtime:

1) Make sure Celery is running ($ sudo service airtime-celery status).

2) Check the log file (/var/log/airtime/airtime-celery[-DEV_ENV].log) to make sure Celery started correctly.

3) Check the rabbitmq settings in your $LIBRETIME_CONF_DIR/airtime.conf. Make sure they align with
   $LIBRETIME_CONF_DIR/$ENVIRONMENT/rabbitmq.ini.

4) Check RabbitMQ to make sure the celeryresults and task queues were created in the correct vhost.

5) Make sure the RabbitMQ user (the default is airtime) has permissions on all vhosts being used.
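The config-alignment check in step 3 can be scripted with the standard library: parse the rabbitmq section of both files and diff them. The inline file contents, section name, and keys below are assumptions standing in for a real install; point ``read()`` at $LIBRETIME_CONF_DIR/airtime.conf and $LIBRETIME_CONF_DIR/$ENVIRONMENT/rabbitmq.ini instead:

```python
import configparser

# Hypothetical contents standing in for the two real files; in an actual
# check, call ConfigParser.read() on the paths named in this README.
airtime_conf = """
[rabbitmq]
host = localhost
port = 5672
user = airtime
vhost = /airtime
"""

rabbitmq_ini = """
[rabbitmq]
host = localhost
port = 5672
user = airtime
vhost = /airtime
"""

a = configparser.ConfigParser()
a.read_string(airtime_conf)
b = configparser.ConfigParser()
b.read_string(rabbitmq_ini)

# Compare every key in the [rabbitmq] section of both files.
mismatches = {k: (v, b["rabbitmq"].get(k))
              for k, v in a["rabbitmq"].items()
              if b["rabbitmq"].get(k) != v}
print(mismatches)  # {} when the two files agree
```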