Run python-rq worker process on application start
I hosted my Django app on Heroku, but due to a few limitations I moved from Heroku to a cloud-based server. I followed this tutorial on running background tasks in Python. Everything is running fine, except that I have to manually run python worker.py to start the worker process. On Heroku we can use a Procfile to run processes when the app starts, but now I am on a cloud-based server running Ubuntu 14.04. So what is the alternative to a Procfile?

worker.py:

    import os

    import redis
    from rq import Worker, Queue, Connection

    # Queues to listen on, in priority order
    listen = ['high', 'default', 'low']

    # Use Redis To Go when available, otherwise fall back to a local Redis instance
    redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')
    conn = redis.from_url(redis_url)

    if __name__ == '__main__':
        with Connection(conn):
            worker = Worker(map(Queue, listen))
            worker.work()
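For reference, jobs are enqueued from the app roughly like this; a minimal sketch, where count_words and tasks.py are placeholders rather than my actual code:

    # tasks.py -- a hypothetical job module (rq workers import jobs by dotted path)
    def count_words(text):
        return len(text.split())

    # elsewhere in the app: push a job onto one of the queues worker.py listens on
    import redis
    from rq import Queue
    from tasks import count_words

    conn = redis.from_url('redis://localhost:6379')
    q = Queue('default', connection=conn)
    job = q.enqueue(count_words, 'some text to count')
    print(job.id)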
I ended up using Upstart. I created a new config file, rqworker.conf, with sudo nano /etc/init/rqworker.conf, containing the following:

    description "Job queues for directory"

    # run on the normal multi-user runlevels
    start on runlevel [2345]
    stop on runlevel [!2345]

    # restart the worker if it dies, and drop privileges
    respawn
    setuid myuser
    setgid www-data

    exec python3.5 worker.py

Then I just started the service with sudo service rqworker start, and now my worker processes are running in the background.
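A few commands that may help when managing the job; the log path assumes Ubuntu's default of writing Upstart job output under /var/log/upstart/:

    # check whether the worker is running
    sudo service rqworker status

    # restart after deploying new code
    sudo service rqworker restart

    # tail the worker's output
    sudo tail -f /var/log/upstart/rqworker.log

Note that worker.py is resolved relative to the job's working directory, so a chdir stanza pointing at the app directory may be needed in the config above.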
Use a process manager like upstart, systemd or supervisor.
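Ubuntu 14.04 ships Upstart, but on releases that use systemd (15.04 and later) an equivalent unit file would look roughly like this; a sketch only, where the unit name rqworker.service and the path /srv/myapp are assumptions, not taken from the question:

    [Unit]
    Description=RQ worker
    After=network.target redis-server.service

    [Service]
    # user/group mirror the Upstart answer; WorkingDirectory is an assumed app path
    User=myuser
    Group=www-data
    WorkingDirectory=/srv/myapp
    ExecStart=/usr/bin/python3.5 worker.py
    Restart=always

    [Install]
    WantedBy=multi-user.target

Save it as /etc/systemd/system/rqworker.service, then run sudo systemctl daemon-reload followed by sudo systemctl enable --now rqworker.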