Rewrite config from /etc/airtime-saas to plain /etc/airtime
This is the result of sed -i -e 's|/etc/airtime-saas/|/etc/airtime/|' `grep -irl 'airtime-saas' airtime_mvc/ python_apps/` :P It might need more testing. The airtime-saas part never really made sense; zf1 already has environments for this, i.e. you would create a saas environment based on production, for instance. I believe legacy upstream was using this to share configuration between customers (e.g. the analyser runs only once and writes to a shared S3 bucket). I assume they mount the airtime-saas folder onto individual customer instances with a global config. Like I said, I don't think this makes sense, since all it does is make hacking at the configs in airtime-saas a bit easier. A serious SaaS operation should be using something like Puppet or Ansible to achieve this.
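For reference, the pattern the changed files converge on looks roughly like this (a minimal sketch assembled from the diff below, not copied from any single file; the fallback values match what the diff introduces):

    import os

    # Config directory is overridable via the environment and falls back to the
    # standard install location when LIBRETIME_CONF_DIR is unset.
    LIBRETIME_CONF_DIR = os.getenv('LIBRETIME_CONF_DIR', '/etc/airtime')

    # Deployment environment (production, saas, ...), defaulting to production.
    ENVIRONMENT = os.getenv('ENVIRONMENT', 'production')

    # The main config sits directly in the conf dir ...
    RMQ_CONFIG_PATH = os.path.join(LIBRETIME_CONF_DIR, 'airtime.conf')

    # ... while environment-specific files live one level deeper.
    CLOUD_STORAGE_CONFIG_PATH = os.path.join(LIBRETIME_CONF_DIR, ENVIRONMENT, 'cloud_storage.conf')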
parent 4557395a86
commit e28ad471f9
8 changed files with 23 additions and 18 deletions
@@ -28,7 +28,7 @@ Developers
 To debug, you can run celery directly from the command line:
 
     $ cd /my/airtime/root/python_apps/airtime-celery
-    $ RMQ_CONFIG_FILE=/etc/airtime/airtime.conf celery -A airtime-celery.tasks worker --loglevel=info
+    $ RMQ_CONFIG_FILE=${LIBRETIME_CONF_DIR}/airtime.conf celery -A airtime-celery.tasks worker --loglevel=info
 
 This worker can be run alongside the service without issue.
 
@@ -57,8 +57,8 @@ If you run into issues getting Celery to accept tasks from Airtime:
 
 2) Check the log file (/var/log/airtime/airtime-celery[-DEV_ENV].log) to make sure Celery started correctly.
 
-3) Check your /etc/airtime/airtime.conf rabbitmq settings. Make sure the settings here align with
-   /etc/airtime-saas/production/rabbitmq.ini.
+3) Check your $LIBRETIME_CONF_DIR/airtime.conf rabbitmq settings. Make sure the settings here align with
+   $LIBRETIME_CONF_DIR/$ENVIRONMENT/rabbitmq.ini.
 
 4) Check RabbitMQ to make sure the celeryresults and task queues were created in the correct vhost.
 
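A quick way to check that the two files actually agree is to dump both rabbitmq sections next to each other, along these lines (a throwaway sketch; the 'rabbitmq' section name and the ini layout are assumptions, adjust to whatever the files really contain):

    import os
    from configparser import ConfigParser

    conf_dir = os.getenv('LIBRETIME_CONF_DIR', '/etc/airtime')
    environment = os.getenv('ENVIRONMENT', 'production')

    paths = {
        'airtime.conf': os.path.join(conf_dir, 'airtime.conf'),
        'rabbitmq.ini': os.path.join(conf_dir, environment, 'rabbitmq.ini'),
    }

    # Print both rabbitmq sections so mismatched hosts, users or vhosts stand out.
    for name, path in paths.items():
        parser = ConfigParser()
        parser.read(path)
        settings = dict(parser.items('rabbitmq')) if parser.has_section('rabbitmq') else '(no rabbitmq section)'
        print(name, settings)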
@@ -10,7 +10,7 @@ from boto.s3.key import Key
 # https://github.com/docker/docker-registry/issues/400
 u'fix getaddrinfo deadlock'.encode('idna')
 
-CLOUD_CONFIG_PATH = '/etc/airtime-saas/cloud_storage.conf'
+CLOUD_CONFIG_PATH = os.path.join(os.getenv('LIBRETIME_CONF_DIR', '/etc/airtime'), 'cloud_storage.conf')
 STORAGE_BACKEND_FILE = "file"
 SOCKET_TIMEOUT = 240
 
@@ -6,7 +6,7 @@ from libcloud.storage.providers import get_driver
 from libcloud.storage.types import Provider, ContainerDoesNotExistError, ObjectDoesNotExistError
 
 
-CLOUD_CONFIG_PATH = '/etc/airtime-saas/cloud_storage.conf'
+CLOUD_CONFIG_PATH = os.path.join(os.getenv('LIBRETIME_CONF_DIR', '/etc/airtime'), 'cloud_storage.conf')
 STORAGE_BACKEND_FILE = "file"
 
 class CloudStorageUploader:
@@ -8,8 +8,9 @@ import os
 import airtime_analyzer.airtime_analyzer as aa
 
 VERSION = "1.0"
-DEFAULT_RMQ_CONFIG_PATH = '/etc/airtime/airtime.conf'
-DEFAULT_CLOUD_STORAGE_CONFIG_PATH = '/etc/airtime-saas/production/cloud_storage.conf'
+LIBRETIME_CONF_DIR = os.getenv('LIBRETIME_CONF_DIR', '/etc/airtime')
+DEFAULT_RMQ_CONFIG_PATH = os.path.join(LIBRETIME_CONF_DIR, 'airtime.conf')
+DEFAULT_CLOUD_STORAGE_CONFIG_PATH = os.path.join(LIBRETIME_CONF_DIR, os.getenv('ENVIRONMENT', 'production'), 'airtime.conf')
 DEFAULT_HTTP_RETRY_PATH = '/tmp/airtime_analyzer_http_retries'
 
 def run():
@@ -22,7 +22,7 @@ def teardown():
 def test_basic():
     filename = os.path.basename(DEFAULT_AUDIO_FILE)
     q = Queue.Queue()
-    #cloud_storage_config_path = '/etc/airtime-saas/production/cloud_storage.conf'
+    #cloud_storage_config_path = os.path.join(os.getenv('LIBRETIME_CONF_DIR', '/etc/airtime'), '/production/cloud_storage.conf')
     #cloud_storage_config = config_file.read_config_file(cloud_storage_config_path)
     cloud_storage_config = SafeConfigParser()
     cloud_storage_config.add_section("current_backend")
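With the hard-coded /etc/airtime-saas path gone, the test no longer needs a config file on disk at all; it just builds the cloud storage config in memory, roughly like this (sketch only; the 'current_backend' section comes from the diff above, while the 'storage_backend' option name is an assumption for illustration):

    from configparser import ConfigParser  # SafeConfigParser on Python 2

    # Build an in-memory cloud storage config that selects the local-file
    # backend, so the test never touches S3 or any /etc path.
    cloud_storage_config = ConfigParser()
    cloud_storage_config.add_section("current_backend")
    cloud_storage_config.set("current_backend", "storage_backend", "file")  # assumed option name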