Hi Mike,
Thanks for that info. It was just what I needed. I also want to thank you for your YouTube tutorials on SQLAlchemy. They are fantastic.
I don’t want to make this a huge post, but I have a real pet peeve concerning ORMs. I come from a Java background where I used MyBatis as my ORM. What I loved about MyBatis was:
- I could use raw SQL, which I personally feel is superior. My argument here is simple: why learn another “language” for issuing SQL statements when we have already spent a fair amount of time learning SQL? Also, raw SQL is easily testable with either command-line or GUI tools.
- The ORM should just use the mapped models to execute SQL, with mappings that in and of themselves don’t (and shouldn’t) care about the tables. Unless you are creating a table with the ORM, which I have found to be rare, the ORM shouldn’t care about the table structure beyond field names (with the possibility of aliases) and data types. Why define more than we need in order to populate a plain old object (POO, to keep it language agnostic)? Why include characteristics like primary keys, nullability, etc.? Some Pydantic-like validation is handy, but it can be table agnostic. Let’s extract the data via SQL and return POOs. In that regard, I liken the ORM mapping to a Data Transfer Object (DTO).
- As I have already mentioned, how often do you really use an application to create tables? Often they already exist. Furthermore, it is just more natural to use command-line SQL or a GUI to create the tables. In fact, it is not uncommon to use a GUI like pgAdmin or DBeaver to create the database elements you need and then use that tool to derive all sorts of scripts for common activities such as backup, restore, etc. that can be scheduled.
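For what it’s worth, the style I’m describing is possible with SQLAlchemy Core itself: raw SQL in, plain objects out, with no table metadata declared. A minimal sketch (the `UserDTO` class and `users` table are invented for illustration; an in-memory SQLite database stands in for a pre-existing one, and SQLAlchemy 1.4+ is assumed):

```python
from dataclasses import dataclass

from sqlalchemy import create_engine, text


# A plain old object (POO): no primary key, no nullability,
# just the fields we want back from the query.
@dataclass
class UserDTO:
    name: str
    email: str


# In-memory SQLite stands in for a database that already exists.
engine = create_engine("sqlite://")

with engine.begin() as conn:
    # Table created outside the ORM, as it usually would be.
    conn.execute(text(
        "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)"
    ))
    conn.execute(text(
        "INSERT INTO users (name, email) VALUES ('ann', 'ann@example.com')"
    ))

with engine.connect() as conn:
    rows = conn.execute(text("SELECT name, email FROM users"))
    # Map result rows straight into plain dataclasses -- the DTO step.
    users = [UserDTO(**row._mapping) for row in rows]
```

The query can be any hand-written SQL, and the dataclass only has to match the columns the SELECT returns, not the table definition.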
There is a very handy Java framework called BeanIO (http://beanio.org/) that I feel exemplifies the points I am trying to make. With BeanIO, it is possible to extract data from a variety of file formats and populate POJOs. BeanIO is only interested in the layout of the data; it is a convenience framework that allows for OOP design of an application. I feel that MyBatis does this too: it has substantial DB integration, but it strives to connect the POJO to the database without enforcing a design. Using so-called Entities enforces a design that ORMs should not impose if all you are looking for is a translation from SQL to a POO.
Once again, thanks for your help, and sorry for the ranting, but as I’ve said, my pet peeve is with ORMs that enforce more than I think is necessary to translate SQL to a POO.
Hi Simon,
Thanks for responding to my post. It turns out that MyBatis can do exactly what you are describing, which essentially sounds like a bulk ETL process. Again, the key difference is that MyBatis doesn’t require the mapping to carry all the DB-specific definitions, which I frankly prefer. There is a tool, the MyBatis Generator, that does exactly this, and I have used it when I didn’t want to write my own mapping files because the tables had hundreds of fields.
In many cases you are correct: I was only loading data. The data was retrieved by raw SQL and could involve joins with other tables, much as in a view. I just needed a data transfer mechanism to translate the SQL results into a POJO. Your experience differed in that you did need to create the tables from your Python code, and I agree that SQLAlchemy is perfect for that. I created the tables ahead of time, usually with command-line psql, or, as you said, the tables already existed. In fact, I’d sometimes create temp tables with the schema of existing tables, and I also did that with command-line psql in a Bash script.
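That temp-table trick also works as raw SQL through SQLAlchemy, for anyone who wants it inside Python rather than a Bash script. A minimal sketch (the `users` table is invented for illustration; an in-memory SQLite database stands in for the real one, where `WHERE 0` clones the column layout without copying rows -- PostgreSQL additionally supports `CREATE TEMP TABLE ... (LIKE users)`):

```python
from sqlalchemy import create_engine, text

# In-memory SQLite stands in for the real database.
engine = create_engine("sqlite://")

with engine.begin() as conn:
    # An "existing" table with some data in it.
    conn.execute(text("CREATE TABLE users (id INTEGER, name TEXT)"))
    conn.execute(text("INSERT INTO users VALUES (1, 'ann')"))

    # Clone the column layout into a temp table without copying any rows.
    conn.execute(text(
        "CREATE TEMP TABLE users_tmp AS SELECT * FROM users WHERE 0"
    ))
    count = conn.execute(text("SELECT COUNT(*) FROM users_tmp")).scalar()
```

Note that temp tables are scoped to the connection, so all the work has to happen on the same connection, as above.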
Thanks for your insights.
Rgds
mjg