This is the workaround for <https://github.com/savonet/liquidsoap/issues/390>.
I still need to do proper testing on it, and maybe we should figure out the proper "formula" for arriving at the 0.04 value.
The podcast downloader fails pretty badly when the podcast name contains non-ASCII chars. The main failure happens during logging; I have learned way too much about Python's stupid unicode implementation.
This adds additional debug logging and also properly outputs the real reason a download fails. The content of the tags should be written as UTF-8 (or whatever is input into it); this commit mainly touches (and fixes) logging.
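For illustration only, here is a minimal sketch (not the project's actual code; the logger and helper names are made up) of the failure mode and the kind of fix applied here: coerce names to unicode before they reach the logger, so the log call itself can never blow up and mask the real download error.

```python
# -*- coding: utf-8 -*-
# Sketch only: in Python 2, formatting a byte string containing non-ASCII
# characters into a unicode log message raises UnicodeDecodeError, so the
# logging call crashes before the real download error is ever reported.
import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger("podcast_downloader")

def log_download_failure(podcast_name, error):
    # Hypothetical helper: normalize to unicode up front so logging never fails.
    if isinstance(podcast_name, bytes):
        podcast_name = podcast_name.decode("utf-8", "replace")
    logger.debug(u"Attempted download of %s", podcast_name)
    logger.error(u"Download of %s failed: %s", podcast_name, error)

log_download_failure(b"Caf\xc3\xa9 des artistes", IOError("HTTP 404"))
```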
* [x] reconfigured the build matrix with more PHP jobs and a separate Python job (we can add more Python jobs later)
* [x] run tests on Travis' trusty beta container (it's closer to what we need anyway)
* [x] install packages needed for analyzer tests in build env
* [x] added docs on how to run nosetests locally
* [x] don't run initctl in the analyzer setup so the setup can also be used on Travis (and add it to the install script directly)
* [x] ignore replaygain checks on Travis (it has proven quite impossible to get the needed python-gi module to work in the provided virtualenv)
I tried a lot of solutions to get the replaygain checks to run, but decided that this has gone far enough; maybe someone who is more of a Pythonista than me can take a crack at it and get it solved. Even without running those tests on CI/CD, there are still plenty of others.
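As a rough illustration (not the actual test code; the class and test names are invented, and the real skip mechanism in this PR may differ), skipping the replaygain checks on Travis could look like a conditional skip keyed off the `TRAVIS` environment variable that Travis CI sets:

```python
# Sketch under assumptions: the analyzer tests run under nose/unittest, and
# this just shows the general approach of skipping gi-dependent tests on CI.
import os
import unittest

# Travis CI sets TRAVIS=true in its build environment.
ON_TRAVIS = os.environ.get("TRAVIS") == "true"

@unittest.skipIf(ON_TRAVIS, "python-gi is not importable in the Travis virtualenv")
class ReplayGainAnalyzerTest(unittest.TestCase):
    def test_replaygain_is_calculated(self):
        # Would import the gi-based replaygain code and assert on its output.
        self.assertTrue(True)

if __name__ == "__main__":
    unittest.main()
```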
This PR only has part of what is needed to get the Python tests running on Travis as per #15. I only took a quick shot at anything non-analyzer and figured I would not be able to "fix" them without digging a bit deeper (i.e. also getting rid of std_err_override).
This is the result of running ``sed -i -e 's|/etc/airtime-saas/|/etc/airtime/|' `grep -irl 'airtime-saas' airtime_mvc/ python_apps/` `` :P
It might need more testing. The airtime-saas part never really made sense; zf1 has environments for that, i.e. you would create a saas environment based on production, for instance.
I believe legacy upstream was using this to share configuration between customers (i.e. the analyzer runs only once and writes to a shared S3 bucket). I assume they mount the airtime-saas folder onto individual customers' instances with a global config. Like I said, I don't feel that this makes sense, since all it does is make hacking at the configs in airtime-saas a bit easier. A serious SaaS operation should be using something like Puppet or Ansible to achieve this.
* Fix various small bugs in auto ingestion and tab implementation
* Update TaskManager run conditions to piggyback on API calls - this guarantees a certain frequency of requests and greatly reduces the chance of lock contention (see the sketch below)
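The actual TaskManager lives in the PHP side of the codebase; purely as an illustration of the pattern described above (all names and intervals here are invented), a Python sketch of running due tasks opportunistically on incoming API calls, with a non-blocking lock so concurrent requests skip the work instead of contending for it:

```python
# Sketch of the "piggyback periodic work on API calls" pattern; not the
# project's implementation. Names and the interval value are assumptions.
import threading
import time

class TaskManager:
    RUN_INTERVAL = 300  # seconds between task runs (assumed value)

    def __init__(self):
        self._lock = threading.Lock()
        self._last_run = 0.0

    def run_tasks_if_due(self):
        """Called at the start of every API request."""
        now = time.time()
        if now - self._last_run < self.RUN_INTERVAL:
            return  # tasks ran recently; nothing to do
        # Non-blocking acquire: if another request is already running the
        # tasks, skip instead of queueing up and contending for the lock.
        if not self._lock.acquire(blocking=False):
            return
        try:
            self._last_run = now
            self._run_all_tasks()
        finally:
            self._lock.release()

    def _run_all_tasks(self):
        # Placeholder for ingestion, cleanup jobs, etc.
        pass

# Hypothetical usage inside an API request handler:
task_manager = TaskManager()

def handle_api_request(path):
    task_manager.run_tasks_if_due()  # every API hit gives tasks a chance to run
    return "handled %s" % path
```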