
Is there a way to back up a PostgreSQL database with a huge number of tables?

An attempt with pg_dump for a database of about 28000 tables resulted in the following error message:

pg_dump: WARNING:  out of shared memory
pg_dump: SQL command failed
pg_dump: Error message from server: ERROR:  out of shared memory
HINT:  You might need to increase max_locks_per_transaction.
pg_dump: The command was: LOCK TABLE public.link10292 IN ACCESS SHARE MODE
pg_dump: *** aborted because of error

Increasing max_locks_per_transaction from 64 to 256 resulted in the server failing to start.

Anything else I can try? (PostgreSQL 9.0, Mac OS X.)

krlmlr

1 Answer


You should really increase max_locks_per_transaction. As noted in http://www.postgresql.org/docs/9.0/static/runtime-config-locks.html , raising this parameter may also require increasing the System V shared memory available to the server, which is likely why it failed to start with the higher setting. On OS X this is done as explained in http://www.postgresql.org/docs/9.0/static/kernel-resources.html#SYSVIPC (look for the OS X section). At a minimum you should probably raise shmmax.
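
For reference, a minimal sketch of the two changes; the shared-memory values below are only illustrative (1 GB) and should be sized for your RAM and settings, since the lock table grows roughly with max_locks_per_transaction * (max_connections + max_prepared_transactions):

# postgresql.conf
max_locks_per_transaction = 256

# /etc/sysctl.conf on OS X (in some versions all five kern.sysv.* values must be
# set or they are ignored; shmmax must be a multiple of 4096; reboot to apply)
kern.sysv.shmmax=1073741824   # max shared memory segment size, in bytes (1 GB here)
kern.sysv.shmmin=1
kern.sysv.shmmni=32
kern.sysv.shmseg=8
kern.sysv.shmall=262144       # total shared memory, in 4 kB pages (1 GB here)

After rebooting with the larger shared memory limits, restart PostgreSQL with the higher max_locks_per_transaction and retry pg_dump.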

eppesuig