Rewrite config from /etc/airtime-saas to plain /etc/airtime
These are the results of sed -i -e 's|/etc/airtime-saas/|/etc/airtime/|' `grep -irl 'airtime-saas' airtime_mvc/ python_apps/` :P It might need more testing. The airtime-saas part never really made sense: zf1 has environments for that, i.e. you would create a saas environment based on production. I believe legacy upstream was using this to share configuration between customers (e.g. the analyser runs only once and writes to a shared S3 bucket); I assume they mount the airtime-saas folder onto individual customer instances with a global config. Like I said, I don't feel this makes sense, since all it does is make hacking at the configs in airtime-saas a bit easier. A serious SaaS operation should be using something like Puppet or Ansible to achieve this.
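For reference, the new lookup (copied from the diff below) resolves the config directory from LIBRETIME_CONF_DIR and falls back to /etc/airtime. A minimal sketch of that behaviour; the override value here is only an example, nothing in the repo sets it:

import os

# With no override, the path resolves under the new default, /etc/airtime.
CLOUD_CONFIG_PATH = os.path.join(
    os.getenv('LIBRETIME_CONF_DIR', '/etc/airtime'), 'cloud_storage.conf')
print(CLOUD_CONFIG_PATH)  # -> /etc/airtime/cloud_storage.conf

# Deployments that still want a shared config dir can export the variable
# (via puppet/ansible, a unit file, etc.) instead of a hard-coded path.
os.environ['LIBRETIME_CONF_DIR'] = '/srv/shared-config'  # example value only
CLOUD_CONFIG_PATH = os.path.join(
    os.getenv('LIBRETIME_CONF_DIR', '/etc/airtime'), 'cloud_storage.conf')
print(CLOUD_CONFIG_PATH)  # -> /srv/shared-config/cloud_storage.conf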
parent 4557395a86
commit e28ad471f9
8 changed files with 23 additions and 18 deletions
@@ -10,7 +10,7 @@ from boto.s3.key import Key
 # https://github.com/docker/docker-registry/issues/400
 u'fix getaddrinfo deadlock'.encode('idna')
 
-CLOUD_CONFIG_PATH = '/etc/airtime-saas/cloud_storage.conf'
+CLOUD_CONFIG_PATH = os.path.join(os.getenv('LIBRETIME_CONF_DIR', '/etc/airtime'), 'cloud_storage.conf')
 STORAGE_BACKEND_FILE = "file"
 SOCKET_TIMEOUT = 240
 

@@ -6,7 +6,7 @@ from libcloud.storage.providers import get_driver
 from libcloud.storage.types import Provider, ContainerDoesNotExistError, ObjectDoesNotExistError
 
 
-CLOUD_CONFIG_PATH = '/etc/airtime-saas/cloud_storage.conf'
+CLOUD_CONFIG_PATH = os.path.join(os.getenv('LIBRETIME_CONF_DIR', '/etc/airtime'), 'cloud_storage.conf')
 STORAGE_BACKEND_FILE = "file"
 
 class CloudStorageUploader:
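A quick sanity check after moving config directories, purely illustrative and not part of this commit, assuming only the standard-library ConfigParser (the commit-era code is Python 2, where the module name differs):

import os
from configparser import ConfigParser  # Python 3 name; Python 2 uses the ConfigParser module

conf_path = os.path.join(
    os.getenv('LIBRETIME_CONF_DIR', '/etc/airtime'), 'cloud_storage.conf')

parser = ConfigParser()
# ConfigParser.read() returns the list of files it could actually parse,
# so an empty result means the file is missing or unreadable.
if not parser.read(conf_path):
    raise SystemExit('no readable config at %s' % conf_path)
print('loaded sections:', parser.sections())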