deleting a sqlite database


lars van gemerden

Feb 25, 2012, 6:57:56 AM
to sqlalchemy
Hi all,

I am looking for a way to delete/empty an SQLite database completely (it
can be in-memory or file-based). I have tried:

- metadata.drop_all(engine); this does not work with cyclic
references

- the solution from http://www.sqlalchemy.org/trac/wiki/UsageRecipes/DropEverything;
it requires the names of foreign key columns to be returned by the
inspector class, but these names come back empty

- some other ways.

I am unit-testing the dynamic generation of tables/classes from
definitions in another database. To be able to run multiple tests in
one run, my tearDown method must delete/empty the database.

Any tips or solutions? Have I overlooked something obvious? It seems
that it should not be that complicated.

Cheers, Lars


Michael Bayer

Feb 25, 2012, 9:46:06 AM
to sqlal...@googlegroups.com

On Feb 25, 2012, at 6:57 AM, lars van gemerden wrote:

> Hi all,
>
> I am looking for a way to delete/empty an SQLite database completely (it
> can be in-memory or file-based). I have tried:
>
> - metadata.drop_all(engine); this does not work with cyclic
> references
>

The cycles here are only between the ForeignKey objects themselves, in Python. SQLite doesn't care. Add the use_alter=True flag to each ForeignKey in a cycle.
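A minimal Core sketch of what this looks like (table and column names here are invented for illustration):

```python
from sqlalchemy import (MetaData, Table, Column, Integer,
                        ForeignKey, create_engine, inspect)

metadata = MetaData()

# Two tables whose foreign keys point at each other; without use_alter,
# create_all()/drop_all() cannot topologically sort them and raise a
# circular dependency error.
parent = Table(
    "parent", metadata,
    Column("id", Integer, primary_key=True),
    Column("favorite_child_id", Integer,
           ForeignKey("child.id", use_alter=True, name="fk_favorite_child")),
)
child = Table(
    "child", metadata,
    Column("id", Integer, primary_key=True),
    Column("parent_id", Integer, ForeignKey("parent.id")),
)

engine = create_engine("sqlite://")
metadata.create_all(engine)    # succeeds despite the cycle
created = set(inspect(engine).get_table_names())
metadata.drop_all(engine)      # likewise
```

Marking one ForeignKey in the cycle with use_alter (plus a constraint name) is enough to break the sort; on SQLite, which doesn't support ALTER for constraints, the flagged constraint is simply left out of the DDL.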


> - solution from http://www.sqlalchemy.org/trac/wiki/UsageRecipes/DropEverything;
> it requires names of foreign key columns to be returned by the
> inspector class, however these names are empty

Are you using "PRAGMA foreign_keys = ON"? I've run this recipe with that flag on and off, and while it doesn't get the foreign key constraints explicitly, it still drops all the tables just fine. Perhaps I'm not actually achieving the "foreign_keys=ON" part, since SQLite still lets me drop a table that others depend on.
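For reference, the usual way to turn that pragma on for every connection is a "connect" event listener (this is the standard pysqlite recipe, not something specific to this thread):

```python
from sqlalchemy import create_engine, event, text
from sqlalchemy.engine import Engine

@event.listens_for(Engine, "connect")
def set_sqlite_pragma(dbapi_connection, connection_record):
    # pysqlite leaves foreign key enforcement off by default;
    # issue the pragma on each new DBAPI connection
    cursor = dbapi_connection.cursor()
    cursor.execute("PRAGMA foreign_keys=ON")
    cursor.close()

engine = create_engine("sqlite://")
with engine.connect() as conn:
    print(conn.execute(text("PRAGMA foreign_keys")).scalar())  # prints 1
```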


lars van gemerden

Feb 26, 2012, 12:47:02 PM
to sqlalchemy
I was wrong: the method emptied the database, but I was checking the
tables in the metadata.

This time I am also removing the tables from the metadata, but if I
generate the same tables in two separate test methods (with a call to
tearDown and setUp in between), I still get an error about a backref
name on a relationship already existing. If I run the same two tests
in separate runs the error does not occur, so I conclude that somehow
the metadata retains the class definitions. My tearDown and setUp are:

class SQLAtest(unittest.TestCase):

    def setUp(self):
        Base.metadata.create_all(engine)
        self.session = Session()

    def tearDown(self):
        self.session.close()
        tools.drop_tables(Base.metadata, engine, excludes)

where tools.drop_tables removes all tables from the database (using
the DropEverything function) and removes all tables except the
excludes from the metadata (with metadata.remove(table)).

Some tables (with class definitions) are defined in modules, while
others are generated on the fly (from the class definitions). From
the first I just want to remove the records (or regenerate the tables
from the existing metadata), while the others I want to remove
completely.

I have also considered creating a new metadata object in setUp, but I
would not know how to reinitialize it with the classes/tables
hard-coded in the modules.

Is there a way to achieve this, or am I missing something?

Cheers, Lars


On Feb 25, 3:46 pm, Michael Bayer <mike...@zzzcomputing.com> wrote:
> On Feb 25, 2012, at 6:57 AM, lars van gemerden wrote:
>
> > Hi all,
>
> > I am looking for a way to delete/empty an SQLite database completely (it
> > can be in-memory or file-based). I have tried:
>
> > - metadata.drop_all(engine); this does not work with cyclic
> > references
>
> The cycles here are only between the ForeignKey objects themselves, in Python.   SQLite doesn't care.   Add the use_alter=True flag to each ForeignKey in a cycle.
>
> > - solution from http://www.sqlalchemy.org/trac/wiki/UsageRecipes/DropEverything;

Michael Bayer

Feb 26, 2012, 4:47:37 PM
to sqlal...@googlegroups.com

On Feb 26, 2012, at 12:47 PM, lars van gemerden wrote:

> I was wrong: the method emptied the database, but I was checking the
> tables in the metadata.
>
> This time I am also removing the tables from the metadata, but if I
> generate the same tables in two separate test methods (with a call to
> tearDown and setUp in between), I still get an error about a backref
> name on a relationship already existing.

OK, I think you're mixing concepts up here; a backref is an ORM concept. The Table and MetaData objects are part of Core and know absolutely nothing about the ORM or mappings. Removing a Table from a particular MetaData has almost no effect, as all the ORM mappings still point to it. In reality the MetaData.remove() method is mostly useless, except that a create_all() will no longer hit that Table, foreign key references will no longer find it, and you can replace it with a new Table object of the same name. But again, this has nothing to do with the ORM and nothing to do with the state of the removed Table, which still points to that MetaData and will otherwise function normally.

If you want to remove mappings, you can call clear_mappers(). The use case for removing individual mappers is not supported as there is no support for doing all the reverse bookkeeping of removing relationships(), backrefs, and inheritance structures, and there's really no need for such a feature.

Like MetaData.remove(), there's almost no real world use case for clear_mappers() except that of the SQLAlchemy unit tests themselves, or tests of other ORM-integration layers like Elixir, which are testing the ORM itself with various kinds of mappings against the same set of classes.
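A tiny sketch of what clear_mappers() does (the class name here is invented):

```python
from sqlalchemy import Column, Integer, inspect
from sqlalchemy.orm import declarative_base, clear_mappers

Base = declarative_base()

class Widget(Base):
    __tablename__ = "widget"
    id = Column(Integer, primary_key=True)

clear_mappers()  # removes ALL mappings in the process, not just one

# the same class name can now be mapped again on a fresh Base
Base = declarative_base()

class Widget(Base):
    __tablename__ = "widget"
    id = Column(Integer, primary_key=True)
```

Note that it is all-or-nothing: there is no API to unmap a single class.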

Unit tests in a real-world application would normally run against a schema that's an integral part of the application and doesn't change with regard to classes. There's virtually no reason in normal applications against a fixed schema to tear down mappings and table metadata between tests. SQLAlchemy docs stress the Declarative pattern very much these days because we're really trying to get across that the composition of class, table metadata, and mapping is best regarded as an atomic structure: it exists only as that composite, or not at all. Breaking it apart has little use unless you're testing the mechanics of the mapping itself.

Throughout all of this, we are *not* talking about the tables and schema that are in the actual database. It is typical that unit tests drop all those tables between test suites and recreate them for the next suite. Though I tend to favor not actually dropping/recreating, and instead running the tests within a transaction that's rolled back at the end, as it's much more efficient, especially on backends like Oracle, PostgreSQL, and MSSQL where creates/drops are more expensive. Dropping and recreating the tables in the database is independent of the structure represented by MetaData/Table; that structure lives on and can be reused. MetaData/Table describes only the *structure* of a particular schema. It is not linked to the actual *presence* of those tables within a target schema.
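That transaction-per-test pattern can be sketched roughly like this (table and class names are invented; the full recipe for joining a Session to an external transaction is in the SQLAlchemy docs):

```python
import unittest

from sqlalchemy import MetaData, Table, Column, Integer, create_engine, text
from sqlalchemy.orm import Session

engine = create_engine("sqlite://")
metadata = MetaData()
things = Table("things", metadata, Column("id", Integer, primary_key=True))
metadata.create_all(engine)

class RollbackTest(unittest.TestCase):
    def setUp(self):
        # open a connection and an external transaction;
        # the Session runs inside it
        self.connection = engine.connect()
        self.trans = self.connection.begin()
        self.session = Session(bind=self.connection)

    def tearDown(self):
        self.session.close()
        self.trans.rollback()    # undoes everything the test did
        self.connection.close()

    def test_insert_is_rolled_back(self):
        self.session.execute(text("INSERT INTO things (id) VALUES (1)"))
        count = self.session.execute(
            text("SELECT count(*) FROM things")).scalar()
        self.assertEqual(count, 1)
```

After each test, the rollback leaves the tables in place but empty of that test's data, with no drop/create cost.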


lars van gemerden

Feb 27, 2012, 9:55:57 AM
to sqlalchemy
Thank you for the extensive reply. It makes things a lot clearer;
still, I am not sure how to continue.

Conceptually I would like to create 2 sets of tables/classes in a
database (as part of a prototype):

1) one set of tables/classes holding the parameters from which to
generate other classes/tables,
2) one set of tables/classes that is automatically generated from the
parameters in the first set. It will feature joined inheritance with
only one root base table/class.

The only database link between these two sets is the 'polymorphic on'
column in the root base table in set 2, which is a foreign key to a
Type table in set 1.

For a typical test I would like to:

1) create records in the set 1 tables (representing classes/tables
with their attributes/foreign keys and fields),
2) from these records generate the tables/classes, where the tables
will be in set 2,
3) add records to the generated tables/classes and test whether
adding, updating, deleting and querying work as intended.

To be able to perform several of these tests in one run, I need to
empty the tables of set 1. However, I need to completely remove any
data (mappings, class definitions, records, tables) from set 2
between individual tests.

I (naively) thought of some ways this might be possible:

1) use two separate metadata objects for the same database, bind them
to separate 'Base' classes (one for each set), and replace the one
representing set 2 before each individual test,
2) find some way to remove all data concerning the set 2 tables from
mappings, metadata, database, etc. between tests,
3) use two databases, one for each set of tables, and forego the
foreign key relationship between them (or maybe copy set 1 to the
second database).

Please advise on which of these approaches is possible or more
straightforward, or whether another approach might be more
appropriate.

Cheers, Lars

Michael Bayer

Feb 27, 2012, 10:25:45 AM
to sqlal...@googlegroups.com

On Feb 27, 2012, at 9:55 AM, lars van gemerden wrote:

> Thank you for the extensive reply. It makes things a lot clearer;
> still, I am not sure how to continue.
>
> Conceptually I would like to create 2 sets of tables/classes in a
> database (as part of a prototype):
>
> 1) one set of tables/classes holding the parameters from which to
> generate other classes/tables,
> 2) one set of tables/classes that is automatically generated from the
> parameters in the first set. It will feature joined inheritance with
> only one root base table/class.

>
> The only database link between these two sets is the 'polymorphic on'
> column in the root base table in set 2, which is a foreign key to a
> Type table in set 1.
>
> For a typical test I would like to:
>
> 1) create records in the set 1 tables (representing classes/tables
> with their attributes/foreign keys and fields),
> 2) from these records generate the tables/classes, where the tables
> will be in set 2,
> 3) add records to the generated tables/classes and test whether
> adding, updating, deleting and querying work as intended.
>
> To be able to perform several of these tests in one run, I need to
> empty the tables of set 1. However, I need to completely remove any
> data (mappings, class definitions, records, tables) from set 2
> between individual tests.


so what I would do is:

1. the "other tables" generated would be in their own MetaData object. After you've done drop_all() for those, throw that whole MetaData away.
2. the "other classes" generated, I'm assuming these are generated as the result of some function being called and aren't declared within a module. When you're done with them, throw them away too.
3. The next run of tests would redo the entire process, using a new MetaData object, and generating new classes.

Key here is that there's no point at which the dynamically generated classes/tables/mappers need to be present as de-composed elements, once they've been composed. They're just thrown away.

Using two Base classes would be an easy way to achieve this.
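A Core-only sketch of that lifecycle (table names here are invented):

```python
from sqlalchemy import (MetaData, Table, Column, Integer, String,
                        create_engine, inspect)

engine = create_engine("sqlite://")

# fixed schema: one MetaData that lives for the whole test run
fixed_metadata = MetaData()
type_table = Table("type", fixed_metadata,
                   Column("name", String(50), primary_key=True))
fixed_metadata.create_all(engine)

def build_generated_schema():
    # the dynamic tables go into their own, disposable MetaData
    generated = MetaData()
    Table("object", generated,
          Column("id", Integer, primary_key=True),
          Column("type_name", String(50)))
    generated.create_all(engine)
    return generated

# one "test": build, use, then drop and throw the whole MetaData away
generated = build_generated_schema()
generated.drop_all(engine)
del generated  # nothing needs to be removed piecemeal

# the next test simply repeats the process
generated = build_generated_schema()
```

The fixed MetaData never changes; the generated one is rebuilt from scratch each time, so nothing has to be de-composed.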


Michael Bayer

Feb 27, 2012, 10:28:10 AM
to sqlal...@googlegroups.com

>
>>
>> The only database link between these two sets is the 'polymorphic on'
>> column in the root base table in set 2, which is a foreign key to a
>> Type table in set 1.

also, I'd set up this foreign key using a literal Column so you don't have to worry about string lookups:

ForeignKey(MyRoot.__table__.c.type)


and definitely don't make any backrefs into the fixed system! Otherwise, then yes, we might have to look at clearing out mappers entirely between tests.

lars van gemerden

Feb 27, 2012, 5:44:00 PM
to sqlalchemy
Thank you very much again... that worked like a charm.

For future reference, to give some idea, this is the function I now
use to generate the root base table/class of the generated
joined-inheritance tree:

def createObject():
    dct = dict(association_tables = {},
               __tablename__ = 'Object',
               id = Column(Integer, primary_key = True),
               type_name = Column(String(50),
                                  ForeignKey(datatypes.Type.__table__.c.name),
                                  nullable = False, default = 'Object'),
               created_at = Column(DateTime, default = func.now()),
               __mapper_args__ = {'polymorphic_on': 'type_name'},
               type = relationship(datatypes.Type, uselist = False))
    return type('Object', (declarative_base(bind = datatypes.engine),
                           MethodMixin), dct)

where datatypes.Type is the main class/table holding the generated
class/table parameters, MethodMixin holds some extra methods (like
__repr__), and association_tables holds the associations for
many-to-many relationships (still to be tested). Classes and their
corresponding tables share the same name.

I haven't tested much, but for now everything seems to work. Note
that each time this function runs, a new 'Base' class is created for
the generated class 'Object'.

Questions are very welcome.

Cheers, Lars