
Storing large files in multiple schemas: BLOB or BYTEA


tigran2-...@riatest.com

Oct 10, 2012, 6:12:05 AM

Hi,

 

I need to store large files (from several MB up to 1 GB) in a Postgres database. The database has multiple schemas. Postgres appears to offer two options for storing large binary data: large objects (LOB) and BYTEA. However, we hit problems with each of these options.

 

1. LOB. This works almost ideally: it can store up to 2 GB and allows streaming, so we do not hit memory limits in our PHP backend when reading the LOB. However, all blobs are stored in pg_catalog and are not part of any schema. This leads to a big problem when you try to use pg_dump with the -n and -b options to dump just one schema with its blobs: it dumps the schema data correctly, but then includes ALL blobs in the database, not just the blobs that belong to that particular schema.

Is there a way to dump a single schema with its blobs using pg_dump or some other utility?
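
For reference, this is roughly how we stream a LOB from PHP in chunks (just a sketch; the table, column, and connection details below are placeholders, not our real code):

<?php
// Sketch: read a large object in 1 MB chunks so the whole file
// never has to fit in PHP's memory limit.
// "files", "data_oid" and id 42 are placeholder names/values.
$db = pg_connect('dbname=mydb');

// The table row only stores the OID that points at the large object.
$res = pg_query_params($db, 'SELECT data_oid FROM files WHERE id = $1', array(42));
$oid = (int) pg_fetch_result($res, 0, 0);

// Large object operations must run inside a transaction.
pg_query($db, 'BEGIN');
$lo = pg_lo_open($db, $oid, 'r');
while (($buf = pg_lo_read($lo, 1048576)) !== false && $buf !== '') {
    echo $buf; // pass one chunk at a time to the client
}
pg_lo_close($lo);
pg_query($db, 'COMMIT');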

 

2. BYTEA. These are correctly stored per schema, so pg_dump -n works correctly, but I cannot seem to find a way to stream the data. This means there is no way to access the data from PHP if it is larger than the memory limit.

 

Is there any other way to store large data in Postgres that allows streaming and works correctly with multiple schemas per database?

 

Thanks.

John DeSoi

Oct 11, 2012, 9:05:25 AM

On Oct 10, 2012, at 6:12 AM, tigran2-...@riatest.com wrote:

> 2. BYTEA. These are correctly stored per schema, so pg_dump -n works correctly, but I cannot seem to find a way to stream the data. This means there is no way to access the data from PHP if it is larger than the memory limit.

You can get the octet length and then use the substring function to grab large columns in chunks. See

http://www.postgresql.org/docs/current/interactive/functions-binarystring.html
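
Something along these lines, for example (an untested sketch; "files", "data", and id 42 are placeholders):

<?php
// Sketch: fetch a bytea column in fixed-size pieces using octet_length()
// and substring(), so the full value never has to fit in PHP's memory limit.
$db = pg_connect('dbname=mydb');

$chunk = 1048576; // 1 MB per round trip

// Total size of the stored value, in bytes.
$res = pg_query_params($db,
    'SELECT octet_length(data) FROM files WHERE id = $1', array(42));
$length = (int) pg_fetch_result($res, 0, 0);

for ($offset = 1; $offset <= $length; $offset += $chunk) {
    // substring() on bytea is 1-based, like the text version.
    $res = pg_query_params($db,
        'SELECT substring(data FROM $1::int FOR $2::int) FROM files WHERE id = $3',
        array($offset, $chunk, 42));
    // pg_unescape_bytea() turns the escaped output back into raw bytes.
    echo pg_unescape_bytea(pg_fetch_result($res, 0, 0));
}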


John DeSoi, Ph.D.



--
Sent via pgsql-php mailing list (pgsq...@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-php
