
I'm running out of space on my hard drive, which holds an 18 GB PostgreSQL database.

When I try to pg_dump to a different drive, PostgreSQL creates temporary files on the disk it's dumping from, so I run out of space and the dump fails. This is the command I used:

pg_dump -U myusername 'database' --format=tar > /anotherdrive/filetodump.tar

It eats free space on the source drive the same way when I pipe the output through gzip.

I tried a suggestion from Stack Overflow: a small file is created in the new directory, but nothing else happens, and pg_dump still writes to the original disk.

How do I change the temp directory for pg_dump?

Note: My work_mem setting is already quite high; I can't change that.
My database version is 9.0.13.

Erwin Brandstetter
kissgyorgy

1 Answer


The temporary files are a side effect of the --format=tar option.

A backup in the default plain format completes without temporary files on the local drive. Omit the --format option to produce a plain-format dump that streams directly to the destination.
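To make that concrete, here is a sketch of the plain-format invocation, reusing the username and destination path from the question. Since plain format streams to stdout, it can also be piped through gzip without creating temporary files on the source drive:

```shell
# Plain-format dump (the default when --format is omitted):
# pg_dump streams SQL text straight to stdout, so nothing is
# written to the source drive.
pg_dump -U myusername database > /anotherdrive/filetodump.sql

# Optionally compress on the fly; the compression happens in the
# pipe, still without local temporary files.
pg_dump -U myusername database | gzip > /anotherdrive/filetodump.sql.gz
```

A plain-format dump is restored with psql rather than pg_restore, e.g. `psql -U myusername database < /anotherdrive/filetodump.sql`.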

Erwin Brandstetter