Issuing Raw SQL and Returning a List of Objects


Mike Graziano

Aug 17, 2023, 4:26:59 PM
to sqlalchemy
To all,

I am new to Python and SQLAlchemy.  I was a Java developer who used the MyBatis ORM.  I have been using Pony ORM for a while, but I gather that SQLAlchemy is the gold standard for ORMs in Python.  There is something MyBatis can do, though, that I can't seem to find in SQLAlchemy, but maybe I have not googled enough.

In short, I don't want to use SQLAlchemy's select().  I want to issue raw SQL and have SQLAlchemy's ORM capability populate a collection of objects.  That may not be Pythonic, but I'd like to know how to do it with declarative mapping.  Is it possible?  If so, I'd love to see an example.

Thx & rgds

mjg

Mike Bayer

Aug 17, 2023, 8:04:58 PM
to noreply-spamdigest via sqlalchemy
the raw SQL to ORM mapping pattern has a lot of limitations but it is documented at https://docs.sqlalchemy.org/en/20/orm/queryguide/select.html#getting-orm-results-from-textual-statements .
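as a minimal sketch of that pattern (the User class, the users table, and the database URL below are only placeholders):

from sqlalchemy import create_engine, select, text
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column

class Base(DeclarativeBase):
    pass

class User(Base):
    __tablename__ = "users"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str]

engine = create_engine("sqlite:///example.db")  # placeholder URL

with Session(engine) as session:
    # the textual SELECT's columns are matched to User's mapped columns by name
    stmt = select(User).from_statement(
        text("SELECT id, name FROM users WHERE name = :name").bindparams(name="spongebob")
    )
    for user in session.execute(stmt).scalars():
        print(user.id, user.name)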
--
SQLAlchemy -
The Python SQL Toolkit and Object Relational Mapper
 
 

Mike Graziano

Aug 21, 2023, 1:48:18 PM
to sqlalchemy

Hi Mike,

 

Thanks for that info.  It was just what I needed. I also want to thank you for your YouTube tutorials on SQLAlchemy. They are fantastic.

 

I don’t want to make this a huge post, but I have a real pet peeve concerning ORMs.  I come from a Java background where I used MyBatis as my ORM.  What I loved about MyBatis was:

 

-   I could use raw SQL, which I personally feel is superior.  My argument here is simple: why learn another “language” for issuing SQL statements when we have already spent a fair amount of time learning SQL?  Also, raw SQL is easily testable with either command-line or GUI tools.

-   The ORM should just use the mapped models to execute SQL, with mappings that in and of themselves don’t (and shouldn’t) care about the tables.  Unless you are creating a table with the ORM, which I have found to be rare, the ORM shouldn’t care about the table structure beyond field names (with the possibility of aliases) and data types.  Why define more than we need in order to populate a plain old object (POO, to keep it language agnostic)?  Why include characteristics like primary key, nullability, etc.?  Some Pydantic-like validation is handy, but it can be table agnostic.  Let’s extract the data via SQL and return POOs.  In that regard, I liken the ORM to a Data Transfer Object (DTO).

-   As I have already mentioned, how often do you really use an application to create tables?  Often they already exist.  Furthermore, it is just more natural to use command-line SQL or a GUI to create the tables.  In fact, it is not uncommon to use a GUI like PgAdmin or DBeaver to create the database elements you need and then use that tool to derive all sorts of scripts for common activities such as backup, restore, etc., which can be scheduled.

 

There is a very handy Java framework called BeanIO (http://beanio.org/) that I feel exemplifies the points I am trying to make.  With BeanIO, it is possible to extract data from a variety of file formats and populate POJOs.  BeanIO is only interested in the layout of the data; it is a convenience framework that allows for OOP design of an application.  I feel that MyBatis does this also.  It has substantial DB integration, but it strives to connect the POJO to the database without enforcing design.  Using so-called Entities enforces a design that an ORM should not impose if all you are looking for is a translation from SQL to a POO.

 

Once again, thanks for your help, and sorry for the ranting, but as I’ve said, I have a pet peeve with ORMs that enforce more than I think is necessary to translate SQL to a POO.

Simon King

Aug 23, 2023, 12:49:23 PM
to sqlal...@googlegroups.com
My perspective: the SQLAlchemy ORM really comes into its own when you are making use of its Unit of Work system to load a batch of objects from the database, manipulate those objects, and then flush your changes back to the database. If you are only *loading* data then you don't need a lot of the functionality of the ORM, and you might consider using SQLAlchemy Core instead.
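
As a rough sketch of that workflow (User here stands in for any declaratively mapped class, and engine for your Engine; both are assumed to exist already):

from sqlalchemy import select
from sqlalchemy.orm import Session

with Session(engine) as session:
    # load a batch of objects
    users = session.scalars(select(User).where(User.name.like("m%"))).all()
    # manipulate them as ordinary Python objects
    for user in users:
        user.name = user.name.title()
    # the unit of work flushes every pending change back to the database
    session.commit()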

Using SQLAlchemy Core to execute SQL strings is very simple:
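Something along these lines, where the database URL and the users table are just placeholders:

from sqlalchemy import create_engine, text

engine = create_engine("sqlite:///example.db")  # any supported database URL works here

with engine.connect() as conn:
    result = conn.execute(
        text("SELECT id, name FROM users WHERE name = :name"),
        {"name": "spongebob"},
    )
    for row in result:
        print(row.id, row.name)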


You can use the objects that come back from those calls directly (they have attributes named after the columns from the query), or you could trivially convert them into instances of some class that you've defined.
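
Continuing the sketch above, one hypothetical way to do that conversion (the UserDTO class is made up for illustration):

from dataclasses import dataclass

@dataclass
class UserDTO:
    id: int
    name: str

with engine.connect() as conn:
    rows = conn.execute(text("SELECT id, name FROM users")).mappings()
    users = [UserDTO(**row) for row in rows]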

It sounds like the sort of work you do involves writing code to access pre-existing databases, in which case writing SQL directly makes a lot of sense, and you have no need for the schema-definition parts of SQLAlchemy. But there are other classes of application for which the schema-definition tools are very useful. I have written many applications for which the database didn't already exist, so allowing SQLAlchemy to create the tables was the obvious way to go (with Alembic for migrations as the schema changed over time). SQLAlchemy also gives a certain amount of independence from the underlying database, meaning that I can run most of my tests using SQLite despite using Postgres or MySQL in production.
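
For completeness, a small sketch of that side of things (the Customer model is made up for illustration):

from sqlalchemy import create_engine
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

class Base(DeclarativeBase):
    pass

class Customer(Base):
    __tablename__ = "customers"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str]

# in production this URL would point at Postgres or MySQL;
# in tests an in-memory SQLite database works with the same code
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)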

In summary: use the right tool for the job :-)

Simon


Mike Graziano

Aug 24, 2023, 8:06:11 AM
to sqlalchemy

Hi Simon,

Thanks for responding to my post.  It turns out that MyBatis can do exactly what you are saying, which essentially sounds like a bulk ETL process.  Again, the key difference is that MyBatis doesn’t require that the mapping be done with all the DB-specific definitions, which I frankly prefer.  There is a tool, the MyBatis Generator, that does exactly this, and I have used it when I didn’t want to write my own mapping files because the tables had hundreds of fields.

In many cases, you are correct that I was only loading data.  The data was retrieved by raw SQL and could involve joins with other tables, much as in a view.  I just needed a data transfer mechanism to translate the SQL results to a POJO.  Your experience differed in that you did need to create the tables with your Python code, and I agree that SQLAlchemy is perfect for that.  I created my tables ahead of time, usually with command-line psql, or, as you said, the tables already existed.  In fact, I’d sometimes create temp tables with the schema of existing tables, and I did that with command-line psql in a Bash script as well.

Thanks for your insights.

 Rgds

 mjg

mkmo...@gmail.com

Nov 2, 2023, 11:13:07 AM
to sqlalchemy
Hi Mike,

If I understand correctly, you want to work with raw SQL and don't want any ORM getting in your way. I'm the same way, and it is trivial to use SQLAlchemy Core for this purpose.

from sqlalchemy import text

# "conn" is an open Core connection, e.g. from engine.connect()

# mappings().fetchall() returns a list of dict-like objects
results = conn.execute(text('select foo, bar from baz')).mappings().fetchall()
for row in results:
    print(row['foo'], row['bar'])

# fetchall() without mappings() returns a list of named-tuple-like objects
results = conn.execute(text('select foo, bar from baz')).fetchall()
for row in results:
    print(row.foo, row.bar)
    print(row[0], row[1])

Mike Graziano

Nov 2, 2023, 1:49:13 PM
to sqlalchemy
Hi there,

This is great.  Thanks for adding to the discussion.

Rgds

mjg
