Oops, replied to coop only -- resending to the list:
I'm not aware of any special-purpose testing tools for the suite of tools
that I use (SQLAlchemy, Alembic).
However, what we've done for Socorro is add an "alembic downgrade -1" and
an "alembic upgrade heads" step to every integration test run, as part of
the script that travis-ci runs. This has caught just about every migration
problem that can be caught within the constraints of our existing data
fixtures. It has also enabled everyone on the team to write migrations
with confidence that they've been tested before deploy.
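The roundtrip check is easy to wire into any test runner. Here's a minimal sketch of the idea that runs standalone, using stdlib sqlite3 in place of Alembic (the table and column names are invented for the example):

```python
import sqlite3

# Stand-ins for Alembic's versioned migration scripts: (upgrade, downgrade)
# pairs, ordered oldest-first. Names here are hypothetical.
def up_1(c): c.execute("CREATE TABLE crash (id INTEGER PRIMARY KEY)")
def down_1(c): c.execute("DROP TABLE crash")

def up_2(c): c.execute("ALTER TABLE crash ADD COLUMN signature TEXT")
def down_2(c):
    # Older SQLite has no DROP COLUMN, so rebuild the table instead.
    c.execute("CREATE TABLE crash_new (id INTEGER PRIMARY KEY)")
    c.execute("INSERT INTO crash_new SELECT id FROM crash")
    c.execute("DROP TABLE crash")
    c.execute("ALTER TABLE crash_new RENAME TO crash")

REVISIONS = [(up_1, down_1), (up_2, down_2)]

def columns(c):
    # Column names of the crash table, in declaration order.
    return [row[1] for row in c.execute("PRAGMA table_info(crash)")]

def roundtrip_check(c):
    """The moral equivalent of 'alembic downgrade -1; alembic upgrade heads'."""
    for up, _ in REVISIONS:      # migrate to head
        up(c)
    before = columns(c)
    REVISIONS[-1][1](c)          # downgrade -1
    REVISIONS[-1][0](c)          # upgrade heads
    assert columns(c) == before  # the roundtrip must not lose schema

conn = sqlite3.connect(":memory:")
roundtrip_check(conn)
```

A broken downgrade (or an upgrade that can't be reapplied) makes the assertion fail, which is exactly the signal you want from CI before a deploy.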
The main barrier to a well-oiled migrations system, in my experience, is
that very few people write the migrations, and those migrations go
unreviewed and untested. The bulk of the responsibility for writing them
ends up falling on maybe one person, who then has a hard time getting code
review or participation from the rest of the team (no judgement intended!
-- it's just how I've experienced this problem).
I've found that the best way to get migrations working is to:
* make the automated acceptance tests fail if a migration is broken
* use a migration system that works well with the existing code base and
team practices
For Socorro, this meant: migrations in Python, NOT SHELL; using a
well-supported and up-to-date package (Alembic); and using an ORM for the
modeling -- not necessarily for writing queries in the app (SQLAlchemy).
I gave a presentation about our use of alembic at PyCon this year that
covers most of what I just said, but with slides and 45 minutes instead of
a couple of minutes (lol):
https://www.youtube.com/watch?v=_ZdqwCr4c7Q
> _______________________________________________
> dev-webdev mailing list
> dev-w...@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-webdev