compatibility between SQLite and PostgreSQL


Alceu Rodrigues de Freitas Junior

Aug 14, 2020, 10:00:46 AM
to sqlalchemy
Greetings,

My name is Alceu; this is my first post to this group, and I'm a newbie regarding SQLAlchemy.

I have some projects that use SQLAlchemy and PostgreSQL. They run fine, but the automated tests that use PostgreSQL take ~5 minutes to complete, even after trying to tune the PostgreSQL configuration for better performance.

Those tests are meant to be executed locally, so SQLite (in memory) would be a much faster option.

The problem is, creating a model for the SQLite dialect results in something incompatible with PostgreSQL (I'm quite aware that their inner mechanisms are very different).

Is it possible to generate models that are compatible between the two database systems, so I could switch from one to the other depending on the environment? Retaining data is not required, since all data created by the tests is disposable anyway.

Thanks!
Alceu

polkapiquee

Aug 14, 2020, 11:55:30 PM
to sqlalchemy
Yes, it is possible to create SQLAlchemy models that work on both PostgreSQL and SQLite; I've done it myself. It works fine for simple models, and it is relatively easy to mimic Postgres functions as SQLite UDFs.
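As a sketch of that UDF trick (the `concat_ws` function here is just an illustrative example of a Postgres built-in that SQLite lacks): you can register a Python function on every new SQLite connection via SQLAlchemy's `connect` event, under the same name the Postgres function uses.

```python
import sqlalchemy as sa
from sqlalchemy import event

# In-memory SQLite engine for fast local tests
engine = sa.create_engine("sqlite://")

@event.listens_for(engine, "connect")
def register_udfs(dbapi_connection, connection_record):
    # Mimic Postgres's concat_ws(sep, ...) as a SQLite UDF.
    # narg=-1 lets the function accept a variable number of arguments.
    def concat_ws(sep, *args):
        return sep.join(str(a) for a in args if a is not None)

    dbapi_connection.create_function("concat_ws", -1, concat_ws)

with engine.connect() as conn:
    result = conn.execute(sa.text("SELECT concat_ws('-', 'a', 'b')")).scalar()
    # result == "a-b"
```

Queries written against the Postgres function name then run unchanged on SQLite, as long as the Python stand-in matches the semantics closely enough for your tests.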

However it will get harder if you need to use RETURNING queries or other postgres goodies. You may be able to factor out an abstraction but too many of these and you're not really testing your production code any more.

So it depends on which PostgreSQL goodies you feel you need.

I would also say that speeding up unit tests is normally possible, especially if you lean heavily on native Postgres mechanisms for loading fixtures and tearing them down. Stored procedures or savepoints can be helpful with the teardown.
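One common shape for the rollback-based teardown (a sketch only; the helper name is illustrative, not from any particular project): run each test inside an outer transaction on a dedicated connection, and roll it back unconditionally afterward, so no cleanup SQL is ever needed.

```python
from contextlib import contextmanager

import sqlalchemy as sa
from sqlalchemy.orm import Session

@contextmanager
def rollback_session(engine):
    """Yield a Session whose work is always rolled back at exit,
    so test fixtures never need explicit deletion."""
    connection = engine.connect()
    transaction = connection.begin()   # outer transaction owned by the test harness
    session = Session(bind=connection)
    try:
        yield session
    finally:
        session.close()
        transaction.rollback()         # everything the test did is reverted
        connection.close()
```

A test then uses `with rollback_session(engine) as session: ...` and can insert freely; the next test starts from a clean database.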

Good luck!

Giles Brown

Jonathan Vanasco

Aug 17, 2020, 2:22:55 PM
to sqlalchemy


On Friday, August 14, 2020 at 10:00:46 AM UTC-4, Alceu Rodrigues de Freitas Junior wrote:
 
I have some projects that use SQLAlchemy and PostgreSQL. They run fine, but the automated tests that use PostgreSQL take ~5 minutes to complete, even after trying to tune the PostgreSQL configuration for better performance.

Those tests are meant to be executed locally, so SQLite (in memory) would be a much faster option.

I usually target PostgreSQL for production and continuous integration, and SQLite for developer testing and unit tests. There are some tradeoffs when leveraging SQLite in-memory vs. file-based, regarding locking, concurrency, and per-test setup. In my experience, a file-backed SQLite works better for higher-level tests, and memory-backed is otherwise fine.
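For reference, the two variants differ only in the connection URL (the file path below is illustrative):

```python
import sqlalchemy as sa

# In-memory: fastest, but the database is private to the engine's
# connection and vanishes when it closes.
mem_engine = sa.create_engine("sqlite://")

# File-backed: visible to multiple connections, which matters for
# higher-level tests that open more than one connection at a time.
file_engine = sa.create_engine("sqlite:///./test.db")
```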


The problem is, creating a model for the SQLite dialect results in something incompatible with PostgreSQL (I'm quite aware that their inner mechanisms are very different).

Is it possible to generate models that are compatible between the two database systems, so I could switch from one to the other depending on the environment? Retaining data is not required, since all data created by the tests is disposable anyway.

Exactly what issues are you having, and why do you need to create a different model?  You should be able to use the same model for both.

When it comes to managing core differences between the databases (like the very different database functions), I use the @compiles decorator to write custom constructs that compile differently on each backend.
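A minimal example of that approach, adapted from the compiler-extension recipes in the SQLAlchemy documentation: a single `utcnow()` construct that emits backend-specific SQL.

```python
from sqlalchemy.ext.compiler import compiles
from sqlalchemy.sql import expression
from sqlalchemy.types import DateTime

# One construct used everywhere in the model/query code...
class utcnow(expression.FunctionElement):
    type = DateTime()
    inherit_cache = True

# ...that compiles to different SQL per dialect.
@compiles(utcnow, "postgresql")
def pg_utcnow(element, compiler, **kw):
    # Postgres: normalize CURRENT_TIMESTAMP to UTC
    return "TIMEZONE('utc', CURRENT_TIMESTAMP)"

@compiles(utcnow, "sqlite")
def sqlite_utcnow(element, compiler, **kw):
    # SQLite's CURRENT_TIMESTAMP is already UTC
    return "CURRENT_TIMESTAMP"
```

Your models and queries reference `utcnow()` only; the dialect in effect at execution time picks the right SQL, so the same model runs on both backends.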

What you're talking about is a very common usage of SQLAlchemy. Please share more details so we can help you get up and running.