sqlalchemy.exc.InvalidRequestError: Can't attach instance; another instance with key is already present in this session


Evgenii

Jul 19, 2021, 8:31:30 AM
to sqlalchemy

Hello!
I’m using a many-to-many relation, and this relationship bar_list must hold a list of
instances, some of which can be repeated (e.g. [inst1, inst2, inst1]).
I attach very simplified code below (all database interaction is hidden
under the hood and the user accesses the database at the top level, but this example
reflects my problem).

foo_bar_association = Table(
    'foo_bar', Base.metadata,
    Column('foo_id', Integer, ForeignKey('foo.id')),
    Column('bar_id', Integer, ForeignKey('bar.id'))
)

class FooTable(Base):
    __tablename__ = 'foo'

    id = Column(Integer, primary_key=True)
    type = Column(String, nullable=False)

    bar_list = relationship('BarTable',
                            secondary=foo_bar_association,
                            lazy='subquery')

    def __init__(self, type_, bar_list):
        self.type = type_
        self.bar_list = bar_list

class BarTable(Base):
    __tablename__ = 'bar'

    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)

    def __init__(self, name):
        self.name = name

When the two instances referring to the same row, [bar_one, bar_same_one], are loaded in different sessions and then related (before session.close()), I get this error:
sqlalchemy.exc.InvalidRequestError: Can't attach instance; another instance with key is already present in this session.

with Session() as session:
    bar_one = session.query(BarTable).get(1)

with Session() as session:
    bar_same_one = session.query(BarTable).get(1)

with Session() as session:
    foo = FooTable('some_type', [bar_one, bar_same_one])
    session.add(foo)
    session.commit()

But I don’t get any error when I load both instances in the same session:

with Session() as session:
    bar_one = session.query(BarTable).get(1)
    bar_same_one = session.query(BarTable).get(1)

with Session() as session:
    foo = FooTable('some_type', [bar_one, bar_same_one])
    session.add(foo)
    session.commit()

Nor after:

with Session() as session:
    foo = FooTable('some_type', [bar_one, bar_one])
    session.add(foo)
    session.commit()

I can work around this:

  1. Find the unique instances in bar_list and replace the non-unique ones with them,
    or
  2. Collect all the bar_list ids and load all the instances in the same session before
    adding the foo instance.

But neither of these feels like a pure-Python way; both seem complicated and ugly.
I hope there is a simple SQLAlchemy solution (such as a simple attribute on the relationship).
Python 3.7.10
SQLAlchemy==1.4.15

Mike Bayer

Jul 19, 2021, 9:22:01 AM
to noreply-spamdigest via sqlalchemy
This is all expected behavior. The main reason you're having problems is that you are using multiple sessions and mixing their results together. If you need to do this, there are a few approaches, the most basic being to use the merge() method: https://docs.sqlalchemy.org/en/14/orm/session_api.html?highlight=session%20merge#sqlalchemy.orm.Session.merge

However, the main issue is that you are mixing results from multiple sessions, which in the vast majority of cases is unnecessary. The session corresponds to working in a single transaction at a time, and you should normally be able to complete all the work you have for a particular operation within that single scope.
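For illustration, here is a minimal, self-contained sketch of the merge() approach, using the mappings from the original post (the custom __init__ is dropped in favor of declarative's default keyword constructor, and an in-memory SQLite database is added so it runs standalone):

```python
from sqlalchemy import Column, ForeignKey, Integer, String, Table, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

# Same shape as the mappings in the original post.
foo_bar_association = Table(
    'foo_bar', Base.metadata,
    Column('foo_id', Integer, ForeignKey('foo.id')),
    Column('bar_id', Integer, ForeignKey('bar.id'))
)

class FooTable(Base):
    __tablename__ = 'foo'
    id = Column(Integer, primary_key=True)
    type = Column(String, nullable=False)
    bar_list = relationship('BarTable', secondary=foo_bar_association)

class BarTable(Base):
    __tablename__ = 'bar'
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)

# In-memory SQLite so the sketch runs standalone.
engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
Session = sessionmaker(engine)

with Session() as session:
    session.add(BarTable(name='one'))
    session.commit()

# Two detached copies of the same row, loaded in different sessions.
with Session() as session:
    bar_one = session.get(BarTable, 1)
with Session() as session:
    bar_same_one = session.get(BarTable, 1)

with Session() as session:
    # merge() reconciles each detached copy against the session's
    # identity map, so both resolve to one in-session object.
    merged = [session.merge(b) for b in (bar_one, bar_same_one)]
    assert merged[0] is merged[1]
    foo = FooTable(type='some_type', bar_list=merged)
    session.add(foo)
    session.commit()
```

Note that merge() resolves the InvalidRequestError only; appending the same object twice still inserts two rows into foo_bar, which is the duplicate-row situation discussed later in this thread.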
--
SQLAlchemy -
The Python SQL Toolkit and Object Relational Mapper
 
 
To post example code, please provide an MCVE: Minimal, Complete, and Verifiable Example. See http://stackoverflow.com/help/mcve for a full description.
---
You received this message because you are subscribed to the Google Groups "sqlalchemy" group.
To unsubscribe from this group and stop receiving emails from it, send an email to sqlalchemy+...@googlegroups.com.

Evgenii

Jul 20, 2021, 4:11:46 AM
to sqlalchemy

Mike, thank you for the answer.
But I have another problem, with deleting the instance.
Even though all instances belong to the same session (so it is possible to persist the foo instance):

with Session() as session:
    b1 = session.query(BarTable).get(1)
    b2 = session.query(BarTable).get(1)
    foo = FooTable('some_type', [b1, b2])
    session.add(foo)
    session.commit()

I get this error:

sqlalchemy.orm.exc.StaleDataError: DELETE statement on table 'foo_bar' expected to delete 1 row(s); Only 2 were matched.

after a simple delete, e.g.:

with Session() as session:
    foo = session.query(FooTable).get(1)
    session.delete(foo)
    session.commit()

Please tell me how to delete an instance with repeated elements in a relationship.

P.S. The error seems to be raised implicitly, because "Only 2 were matched".

On Monday, July 19, 2021 at 4:22:01 PM UTC+3, Mike Bayer wrote:

Mike Bayer

Jul 20, 2021, 9:10:12 AM
to noreply-spamdigest via sqlalchemy
If you are mapping ORM classes to the same table that is also used as the "secondary" table in a relationship(), that can lead to the ORM inserting more than one row for that table. Based on the name "foo_bar", I would imagine something like this might be going on.



Please tell me how to delete an instance with repeated elements in a relationship.

So, assuming this is a development database where you can start all over again, the approach here is to *never have* repeated elements in a relationship. When you make a "secondary" table, make sure you put the columns that refer to the related tables inside a constraint, such as a UNIQUE constraint, or, more commonly, make them the primary key. I should add this to the docs at https://docs.sqlalchemy.org/en/14/orm/basic_relationships.html#many-to-many as this seems not to be mentioned:

association_table = Table('association', Base.metadata,
    Column('left_id', Integer, ForeignKey('left.id'), primary_key=True),
    Column('right_id', Integer, ForeignKey('right.id'), primary_key=True)
)

class Parent(Base):
    __tablename__ = 'left'
    id = Column(Integer, primary_key=True)
    children = relationship("Child",
                    secondary=association_table)

class Child(Base):
    __tablename__ = 'right'
    id = Column(Integer, primary_key=True)





Evgenii

Jul 20, 2021, 10:00:33 AM
to sqlalchemy
As mentioned before, I create repeated elements in the relationship deliberately. Moreover, SQLAlchemy allows me to do that, but it fails when deleting instances and modifying relationships.
Unfortunately, this is not a case where I can start all over again. All examples are maximally simplified and depersonalized, and they clearly describe my problem.
Since you STRONGLY recommend to *never have* repeated elements in a relationship, is there another way to do this in SQLAlchemy?
Just imagine, in your simple example, that a Parent instance has a list of two absolutely identical children (children with the same id).
Is this possible via SQLAlchemy?

On Tuesday, July 20, 2021 at 4:10:12 PM UTC+3, Mike Bayer wrote:

Mike Bayer

Jul 20, 2021, 10:23:42 AM
to noreply-spamdigest via sqlalchemy


On Tue, Jul 20, 2021, at 10:00 AM, Evgenii wrote:
As mentioned before, I create repeated elements in the relationship deliberately.

Assuming this implies the table can have no candidate key, this is an antipattern in SQL, and there are lots of answers/articles/etc. on the web explaining why all relational database tables need to have some kind of uniqueness to each row (a candidate key). Pure duplicate rows add essentially no information, and it's not possible to remove some of the rows and not all of them.

The best you can do here is to use the association object pattern, such that each row in your "secondary" table contains additional data, such as a "sort by" key or similar, that is also part of the primary key and can be used to differentiate multiple rows that refer to the same entities.




Moreover, SQLAlchemy allows me to do that,

This is correct. The unit of work unfortunately does not unconditionally dedupe when inserting rows into "secondary", as it does not require that the table's existing contents be loaded into memory in order to persist; therefore a unique/primary key constraint on "secondary" should be in place. If you use collection_class=set, the collection is deduped on the Python side if it's loaded, but this is not a guarantee of uniqueness.
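A sketch of that collection_class=set variant, reusing the Parent/Child names from the earlier example (the in-memory SQLite setup is mine, for illustration): the set dedupes on the Python side, while the composite primary key on the association table enforces uniqueness in the database itself.

```python
from sqlalchemy import Column, ForeignKey, Integer, Table, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

# Composite primary key enforces uniqueness in the database itself.
association_table = Table(
    'association', Base.metadata,
    Column('left_id', Integer, ForeignKey('left.id'), primary_key=True),
    Column('right_id', Integer, ForeignKey('right.id'), primary_key=True)
)

class Parent(Base):
    __tablename__ = 'left'
    id = Column(Integer, primary_key=True)
    # a set collection dedupes on the Python side when loaded
    children = relationship('Child', secondary=association_table,
                            collection_class=set)

class Child(Base):
    __tablename__ = 'right'
    id = Column(Integer, primary_key=True)

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
Session = sessionmaker(engine)

with Session() as session:
    child = Child()
    parent = Parent(children={child})
    parent.children.add(child)   # re-adding the same object is a no-op
    session.add(parent)
    session.commit()
    assert len(parent.children) == 1
```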

but it fails when deleting instances and modifying relationships.

Correct, because there is no way to delete only some of the "dupes": if there are dupes in the table, that means things have gone wrong. For a production DB, this should be repaired manually and proper constraints applied. SQLAlchemy detects this dupe condition as soon as it can, as it can indicate not just dupe rows but also mapping mis-configurations.



Unfortunately, this is not a case where I can start all over again. All examples are maximally simplified and depersonalized, and they clearly describe my problem.
Since you STRONGLY recommend to *never have* repeated elements in a relationship, is there another way to do this in SQLAlchemy?

Just imagine, in your simple example, that a Parent instance has a list of two absolutely identical children (children with the same id).
Is this possible via SQLAlchemy?

you would need to add additional columns to your "secondary" table that distinguish between the "dupe" rows and use the association object pattern for persistence.
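A sketch of that association object pattern, using the names from the original post plus a hypothetical `position` column as the extra primary key column (my choice of discriminator, not from the thread). Because each association row has a distinct primary key, the "same bar twice" case persists and deletes cleanly, avoiding the StaleDataError:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

class FooBar(Base):
    """Association object: one mapped row per (foo, bar, position)."""
    __tablename__ = 'foo_bar'
    foo_id = Column(ForeignKey('foo.id'), primary_key=True)
    bar_id = Column(ForeignKey('bar.id'), primary_key=True)
    # hypothetical extra key column: makes "the same bar twice"
    # two distinct, individually deletable rows
    position = Column(Integer, primary_key=True)
    bar = relationship('BarTable')

class FooTable(Base):
    __tablename__ = 'foo'
    id = Column(Integer, primary_key=True)
    type = Column(String, nullable=False)
    bar_associations = relationship('FooBar', order_by='FooBar.position',
                                    cascade='all, delete-orphan')

class BarTable(Base):
    __tablename__ = 'bar'
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
Session = sessionmaker(engine)

with Session() as session:
    bar = BarTable(name='one')
    foo = FooTable(type='some_type')
    # the "same" bar appears twice, distinguished by position
    foo.bar_associations = [FooBar(bar=bar, position=0),
                            FooBar(bar=bar, position=1)]
    session.add(foo)
    session.commit()

with Session() as session:
    foo = session.get(FooTable, 1)
    assert [a.bar.name for a in foo.bar_associations] == ['one', 'one']
    session.delete(foo)   # cascade removes the association rows first
    session.commit()
```

The delete-orphan cascade on bar_associations is what lets the foo row and its association rows go together; the bar row itself is untouched.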


