No Downtime Code Releases


bliy...@rentlytics.com

Apr 19, 2016, 1:37:19 AM
to Django users
Hey,

I have two issues I'm looking at solving at work, and I'm looking for a couple of suggestions as to how other people have solved them.  The two things are:

* scaling out our Django installation to allow for smaller releases (I'm thinking microservices, but it could also be internal Django apps or who knows what else)
* minimizing the impact of migrations during releases (aka we want to be able to release in the middle of the afternoon)

Currently we put up a maintenance page whenever we are doing database operations (aka migrations).  This seems like a recommended best practice.

One way I was thinking about addressing this issue was to break all of our models out into a separate repo.  That way we'd only need to deploy migrations when the models themselves have changed.  For code that needs the models, we could pip install the repo as an app and away we go.  Likewise, it seems like I could break up different parts of our app via a similar strategy.

Does this seem viable?  How have other people solved this kind of problem?

Thanks,

-Ben

Avraham Serour

Apr 19, 2016, 4:13:30 PM
to django-users
I don't think you would gain anything by separating your models into a different repository. What are you trying to gain here?

If you put up a maintenance page when doing migrations, it won't matter whether the models come from a different package or not.

You can still run migrations on a live system; you just have to take into account that parts of the system may still be using something that is not there yet/anymore.

So you should break migrations into two steps whenever you are adding or removing something.

When adding a model or field, you should first run the migration and only after that deploy the new code using the new model/field.

When removing something, you should first stop using it and then migrate.

You can plan your deployments/releases so that you know in advance whether you are adding or removing something, and never add and remove in the same release: commit and deploy the model, and only after that commit the code using the new model.

Or you can check out the code on the side and run migrations from that separate env; this way you can add a new model and use it in the same commit.

For removing, you can just do it backwards.
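The ordering rule above can be sketched as a tiny helper; the step descriptions are placeholders, not real commands:

```python
def release_steps(change):
    """Order schema migration vs. code deploy so that old and new code
    can both run against the database at every point in the release."""
    if change == "add":
        # Schema first: old code simply never reads the new nullable column.
        return ["run migration (add nullable column / new table)",
                "deploy code that uses the new column"]
    if change == "remove":
        # Code first: nothing references the column by the time it is dropped.
        return ["deploy code that no longer references the column",
                "run migration (drop the column)"]
    # Adding and removing in the same release cannot be ordered safely.
    raise ValueError("split into separate add and remove releases")
```

The point is simply that the safe ordering is reversed between the two cases, which is why adds and removes should never share a release.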


Avraham




Fred Stluka

Apr 19, 2016, 7:08:57 PM
to django...@googlegroups.com
Ben,

I minimize downtime as much as possible by doing things in
advance like copying all of the new files to a staging area on
the PROD system, automatically inserting PROD passwords,
running collectstatic, dumping the DB in case of problems,
etc.  Then, I put up the maintenance page, quickly rsync the
new files into place, run migrations, and hide the maintenance
page.

We used to shoot for releases with no downtime by copying
the *.py files into place, and letting Django notice and re-load
the *.pyc files automatically, but we ran into some strange
issues sometimes.  Seems like Django continued to use some
of the cached *.pyc files for a while.

It's worked out better to always delete all *.pyc files before
rsyncing the *.py files into place, and to always restart the
Apache server just before hiding the maintenance page, so
we're sure everything gets reloaded cleanly.
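The sequence described above could be sketched roughly as a release script; every path, database name, and service command below is a placeholder, not the actual setup:

```python
import subprocess

# Placeholder paths/commands illustrating the sequence: prepare in advance,
# then do the fast work inside the maintenance window.
STEPS = [
    # Prepared in advance, while the site is still serving traffic:
    ["python", "manage.py", "collectstatic", "--noinput"],
    ["pg_dump", "-f", "/srv/backups/pre_release.sql", "appdb"],
    # Maintenance window starts here:
    ["touch", "/srv/app/maintenance.on"],
    ["find", "/srv/app", "-name", "*.pyc", "-delete"],       # flush stale bytecode
    ["rsync", "-a", "--delete", "/srv/staging/", "/srv/app/"],
    ["python", "manage.py", "migrate", "--noinput"],
    ["apachectl", "restart"],                                # reload everything cleanly
    ["rm", "/srv/app/maintenance.on"],                       # hide the maintenance page
]

def release(runner=subprocess.check_call):
    """Run each step in order; `runner` is injectable for dry runs and tests."""
    for cmd in STEPS:
        runner(cmd)
```

Keeping the slow steps (collectstatic, DB dump) ahead of the maintenance page is what keeps the window down to seconds.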

This has also been a good idea as we've added more caching:
- Template files
- Fully assembled pages
- DB data
- etc.

Hope this helps,
--Fred
Fred Stluka -- mailto:fr...@bristle.com -- http://bristle.com/~fred/
Bristle Software, Inc -- http://bristle.com -- Glad to be of service!
Open Source: Without walls and fences, we need no Windows or Gates.

Avraham Serour

Apr 19, 2016, 9:49:59 PM
to django-users
You can easily do code reloading with uwsgi's graceful reload; no user will ever know you reloaded your application, and there's no need to juggle files. You aren't using mod_wsgi or anything like that, right?

Vijay Khemlani

Apr 19, 2016, 10:57:18 PM
to django...@googlegroups.com
Also, you don't need to restart Apache / nginx or whatever, or delete the pyc files; just reload uwsgi / gunicorn.
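For gunicorn (and uwsgi's master process), a graceful reload amounts to sending SIGHUP to the master; a minimal sketch, assuming a conventional pidfile path that you'd adjust for your setup:

```python
import os
import signal

def graceful_reload(pidfile="/run/gunicorn.pid"):
    """Ask the server's master process to reload workers gracefully
    via SIGHUP. The default pidfile path is an assumption; point it
    at wherever your process manager writes the pid."""
    with open(pidfile) as f:
        pid = int(f.read().strip())
    os.kill(pid, signal.SIGHUP)
    return pid
```

One caveat: with gunicorn's --preload, SIGHUP alone won't pick up new application code (the USR2 re-exec dance is needed); without preload, the freshly forked workers re-import the code.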

graeme

Apr 20, 2016, 4:29:48 AM
to Django users




Graceful reloading by itself does not really solve the problem - the biggest problem is ensuring migrations do not break running code (e.g. because a column is missing).



Daniel Chimeno

Apr 20, 2016, 8:26:50 AM
to Django users

bliy...@rentlytics.com

Apr 20, 2016, 6:59:27 PM
to Django users
My thought process for separating the models into a separate repo is something like this:

* I predominantly put up our (Heroku) maintenance page when migrations are run.
* If the models are in a separate repo, I only need to run migrations when that repo is deployed.  If no migrations are deployed in my non-model repos, I can skip putting up the maintenance page.

Most of our db changes are to accommodate back-end data warehousing.  The other/additional way I was thinking about splitting our app up was separating the back-end processes from our front-end APIs, which primarily deal with security and authorization.  The security and authorization models do not change very often, so an app that only deals with that part would not need to restart very often.  That said, there are a couple of models that are shared with our back-end processes.  Having those models split out would potentially let us pip install them into a separate service.

However, it sounds like you guys are saying there are other concerns, namely flushing the pyc files.  I'm not sure this is relevant to me since we are using Heroku, and an entirely new server/slug is deployed when we do a code release.  It's not clear to me whether that slug deployment would drop connections or cause other kinds of problems.

Alex Heyden

Apr 20, 2016, 7:07:56 PM
to django...@googlegroups.com
I wouldn't recommend tying your repository structure to your deployment needs. Put that logic in the deployment script instead. You can see whether any migrations are pending with manage.py migrate --list (or manage.py showmigrations on Django 1.8+).
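A sketch of that deploy-script check: capture the migration listing and only raise the maintenance page when something is actually unapplied. Django's listing prints an empty "[ ]" checkbox next to unapplied migrations, which is all the parser needs:

```python
def has_pending_migrations(listing):
    """Return True if the output of `manage.py migrate --list` (or
    `manage.py showmigrations` on newer Django) contains an entry whose
    checkbox is empty, i.e. a migration that has not been applied yet."""
    return any(line.lstrip().startswith("[ ]") for line in listing.splitlines())
```

In a deploy script you would feed this the captured stdout of the management command and branch on the result, skipping the maintenance page entirely for code-only releases.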

Tom Christie

Apr 21, 2016, 11:44:33 AM
to Django users
I'd also recommend against splitting your models into a separate repo.

There's a decent article on migrations without downtime here: http://pankrat.github.io/2015/django-migrations-without-downtimes/

Essentially "don't run your migrations at the same time as you deploy" and "split deployment/migration process into multiple steps, so you don't ever need to be in maintenance mode". Doing this is more work, so weigh those costs against your costs from being in maintenance mode for a few seconds every now and then, and figure out which is more important from a business point of view.

For your other consideration of "smaller releases" I'd strongly recommend feature flags. Together with a well-tested codebase they'll make it far easier to get to continuous integration and release.
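A toy illustration of the feature-flag idea (a real project would use something like django-waffle and keep flags in the database, not a module-level dict, so they can be flipped without a deploy):

```python
# Module-level dict purely for illustration; a real flag store lives in
# the database or a config service so it can be toggled at runtime.
FLAGS = {"new_checkout": False}

def flag_enabled(name, default=False):
    """Look up a named flag, falling back to a safe default."""
    return FLAGS.get(name, default)

def checkout_flow():
    # Ship the new code path "dark", then flip the flag when it's ready.
    return "new checkout" if flag_enabled("new_checkout") else "old checkout"
```

This is what decouples "deploying code" from "releasing a feature": the new path can be merged and deployed in small pieces long before any user sees it.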

Small, frequent releases also require good monitoring. I'd recommend taking a look at opbeat on that front.

Cheers,

   Tom