Sure, if your process will be the only one inserting and changing these
rows. Work through your 20k Python objects in batches of 1000 (or
whatever size you like): collect the key values from the objects in the
batch, run a single database SELECT to see which of those keys are
already present, and then divide the batch into two parts: data needing
an INSERT and data needing an UPDATE.
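A minimal sketch of that select-then-split loop, using an in-memory SQLite database and a hypothetical `users` table keyed on `id` (your table, key column, and object shape will differ):

```python
import sqlite3

# Hypothetical schema with some pre-existing rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users (id, name) VALUES (?, ?)",
                 [(1, "old-alice"), (2, "old-bob")])

# One batch's worth of incoming Python objects (plain dicts here).
batch = [{"id": 1, "name": "alice"},
         {"id": 2, "name": "bob"},
         {"id": 3, "name": "carol"}]

# 1. Collect the key values from the batch.
keys = [obj["id"] for obj in batch]

# 2. One SELECT to learn which keys are already in the database.
placeholders = ",".join("?" * len(keys))
existing = {row[0] for row in conn.execute(
    f"SELECT id FROM users WHERE id IN ({placeholders})", keys)}

# 3. Split the batch and run one bulk statement per part.
to_update = [obj for obj in batch if obj["id"] in existing]
to_insert = [obj for obj in batch if obj["id"] not in existing]

conn.executemany("UPDATE users SET name = ? WHERE id = ?",
                 [(o["name"], o["id"]) for o in to_update])
conn.executemany("INSERT INTO users (id, name) VALUES (?, ?)",
                 [(o["id"], o["name"]) for o in to_insert])
conn.commit()
```

This is only safe as written because nothing else can insert one of those keys between the SELECT and the INSERT; that's the "only writer" assumption doing the work.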
If you've got write contention for this data, you'd need to work more
granularly (likely row by row) instead, keeping in mind the database
engine's transaction model and ideally taking advantage of any tools the
engine provides (like MySQL's ON DUPLICATE KEY UPDATE or standard SQL's
MERGE). Performance and engine agnosticism may be mutually exclusive
here.
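For illustration, here's what letting the engine resolve insert-vs-update looks like, using SQLite's upsert syntax (the same hypothetical `users` table as above; MySQL spells this `INSERT ... ON DUPLICATE KEY UPDATE`, and other engines use `MERGE`, so the exact statement is not portable):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (id, name) VALUES (1, 'old-alice')")

# The engine decides per row, atomically, whether this is an insert
# or an update -- no separate SELECT, so no race with other writers.
# (SQLite's ON CONFLICT upsert requires SQLite 3.24+.)
conn.executemany(
    "INSERT INTO users (id, name) VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
    [(1, "alice"), (2, "bob")])
conn.commit()
```

Because the conflict resolution happens inside the statement, this stays correct under concurrent writers, which the select-then-split approach does not.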