
Inserting Large Object into PostgreSQL returns 53200 Out of Memory error
PostgreSQL 9.1, Npgsql 2.0.12

I have binary data I want to store in a PostgreSQL database. Most files load fine; however, a large binary file (664 MB) is causing problems. When I try to load the file into PostgreSQL using Large Object support through Npgsql, the server returns an 'out of memory' error.

I'm running this at present on a workstation with 4 GB RAM, with 2 GB free while PostgreSQL is in an idle state.

This is the code I am using, adapted from the [PG Foundry Npgsql User's Manual](http://npgsql.projects.pgfoundry.org/docs/manual/UserManual.html):

```csharp
using (var transaction = connection.BeginTransaction())
{
    try
    {
        var manager = new NpgsqlTypes.LargeObjectManager(connection);
        var noid = manager.Create(NpgsqlTypes.LargeObjectManager.READWRITE);
        var lo = manager.Open(noid, NpgsqlTypes.LargeObjectManager.READWRITE);
        lo.Write(BinaryData);
        lo.Close();
        transaction.Commit();
        return noid;
    }
    catch
    {
        transaction.Rollback();
        throw;
    }
}
```

I've tried modifying PostgreSQL's memory settings from the defaults to all manner of values, adjusting:

- shared_buffers
- work_mem
- maintenance_work_mem

So far I've found PostgreSQL to be a great database system, but this is a show-stopper at present and I can't seem to get a file of this size into the database. I don't really want to have to deal with manually chopping the file into chunks and reassembling it client-side if I can help it.

Please help!?
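For context on the chunking the question mentions: passing the entire 664 MB buffer to a single `lo.Write` call asks the driver and server to handle it in one piece, whereas a loop of smaller writes keeps each round trip modest. Below is a minimal sketch under assumptions not confirmed by the question: that Npgsql's `LargeObject` exposes a `Write(byte[], int, int)` overload like the JDBC-derived API it is modeled on, and the `ChunkSize` constant and `WriteLargeObject` helper name are illustrative, not part of Npgsql.

```csharp
// Hypothetical helper: writes the buffer in fixed-size chunks instead of
// one giant Write. Assumes Npgsql 2.x's LargeObject mirrors the JDBC-style
// Write(byte[] buf, int offset, int count) overload.
private const int ChunkSize = 256 * 1024; // 256 KB per write; illustrative value

public int WriteLargeObject(NpgsqlConnection connection, byte[] binaryData)
{
    using (var transaction = connection.BeginTransaction())
    {
        try
        {
            var manager = new NpgsqlTypes.LargeObjectManager(connection);
            var noid = manager.Create(NpgsqlTypes.LargeObjectManager.READWRITE);
            var lo = manager.Open(noid, NpgsqlTypes.LargeObjectManager.READWRITE);

            // Write the buffer a slice at a time rather than all at once.
            for (int offset = 0; offset < binaryData.Length; offset += ChunkSize)
            {
                int count = Math.Min(ChunkSize, binaryData.Length - offset);
                lo.Write(binaryData, offset, count);
            }

            lo.Close();
            transaction.Commit();
            return noid;
        }
        catch
        {
            transaction.Rollback();
            throw;
        }
    }
}
```

The chunk loop leaves the file whole inside a single large object, so nothing has to be reassembled client-side on read; only the write path changes.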
 
