Questions tagged [pg-dump]
12 questions
91 votes, 10 answers
pg_dump and pg_restore: input file does not appear to be a valid archive
I used pg_dump on one machine and copied the resulting file to another, where I tried to restore it. I believe the schema is the same. However, I get:
pg_restore: [archiver] input file does not appear to be a valid archive
I have done the following…
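That error usually means pg_restore was pointed at a plain-text dump, which only psql can replay; pg_restore reads only the custom, directory, and tar archive formats. A minimal check-and-fix sketch, with db.dump, source_db, and target_db all as placeholder names:
head -c 5 db.dump                 # custom-format archives start with the magic bytes "PGDMP"
psql -d target_db -f db.dump      # a plain-text (default -Fp) dump is replayed with psql
pg_dump -Fc -f db.dump source_db  # or re-dump in custom format...
pg_restore -d target_db db.dump   # ...which pg_restore can read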
gruszczy (1,013)
24 votes, 1 answer
PostgreSQL 13 - Speed up pg_dump to 5 minutes instead of 70 minutes
We use pg_dump nightly to take a snapshot of our database. For a long time we did this with a simple command:
pg_dump -Fc database_name
This takes about an hour and produces a file of more than 30 GB.
How can we speed things up?
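One common lever, sketched here with a placeholder output path and a job count that should match the server's cores: the directory format (-Fd) is the only one pg_dump can write in parallel, via -j:
pg_dump -Fd -j 8 -f /backups/database_name.dir database_name  # 8 parallel worker jobs
pg_restore -j 8 -d database_name /backups/database_name.dir   # the restore side parallelizes too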
Janning (1,721)
6 votes, 1 answer
Using pg_dump when connecting via a service
I connect to my postgres server using psql "service=subscription". How do I use the service with pg_dump?
I tried to do the following:
pg_dump -h "service=subscription" > /home/fogest/dump.out
This, however, did not work.
How should I be doing…
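pg_dump accepts a libpq connection string in place of the database name, so the service can be passed there rather than through -h; the PGSERVICE environment variable is an equivalent alternative:
pg_dump "service=subscription" > /home/fogest/dump.out
PGSERVICE=subscription pg_dump > /home/fogest/dump.out  # same effect via the environment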
ComputerLocus (203)
4 votes, 1 answer
pg_dump with -w flag does not read from .pgpass file
I am trying to backup my PostgreSQL database called crewdb on Ubuntu 18.04 LTS from a script with the following command in it:
pg_dump -h localhost -p 5432 -U postgres -w -C -F p -b -v -f ~/Dropbox/postgres_backup/crewdb.backup.sql crewdb
I know…
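With -w, pg_dump never prompts and simply fails unless libpq finds a matching .pgpass entry, so the entry must match host, port, database, and user exactly, and the file must not be group- or world-readable. A sketch with a placeholder password:
# ~/.pgpass format: hostname:port:database:username:password
echo 'localhost:5432:crewdb:postgres:secret' >> ~/.pgpass
chmod 600 ~/.pgpass  # libpq ignores the file if permissions are looser
If the script runs from cron or as another account, note that ~/.pgpass resolves against that user's home directory; the PGPASSFILE environment variable can point elsewhere.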
speed bump (145)
2 votes, 1 answer
docker postgres backup creates an empty file when run as a cron job
I'm trying to create a cron job that creates DB backups every night.
My crontab has the job:
* * * * * /home/user/scripts/backup.sh (set to run every minute for testing)
In backup.sh, I have:
docker exec -it dbContainer pg_dump -U username -d…
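A likely culprit is the -t flag: cron provides no TTY, so docker exec -it fails immediately while the shell redirect still creates the (empty) output file. A sketch of backup.sh without TTY allocation, with the database name and output path as placeholders:
#!/bin/sh
# no -i/-t: cron has no terminal to attach, and pg_dump needs neither
docker exec dbContainer pg_dump -U username -d dbname > /home/user/backups/db.sql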
Ethan Solomon (131)
2 votes, 2 answers
pg_dump complains PostgreSQL version 11 is not installed after installing postgres-client-13
We just upgraded our PostgreSQL servers to v13. We heavily use pg_dump on Ubuntu 18 systems to transfer data between databases. After upgrading the servers, pg_dump would complain about a version mismatch.
Easy enough, I installed the…
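On Debian/Ubuntu, pg_dump is a wrapper (pg_wrapper from postgresql-client-common) that dispatches to a versioned binary, and it can keep resolving to an older one. Calling the v13 binary installed by the client package directly sidesteps the wrapper; host and database names below are placeholders:
/usr/lib/postgresql/13/bin/pg_dump --version
/usr/lib/postgresql/13/bin/pg_dump -h db.example.com mydb > dump.sql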
Christian (91)
2 votes, 0 answers
Not able to run pg_dump due to a permissions error, despite having root permissions
I am trying to run this command:
sudo pg_dump -U bm_clients -Z 9 -v baydb | aws s3 cp - s3://thebay.com/bay.dump.gz
The output is as follows:
pg_dump: reading extensions
pg_dump: identifying extension members
pg_dump: reading schemas
pg_dump: reading…
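The log is cut off before the actual error, so this is only a guess: sudo elevates the pg_dump process, but the connection still runs as the bm_clients role, which must be able to read every dumped table (and the aws side of the pipe runs as the invoking user, not root). Granting read access, with the schema assumed to be public:
sudo -u postgres psql -d baydb -c 'GRANT SELECT ON ALL TABLES IN SCHEMA public TO bm_clients;'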
shubhendu (121)
1 vote, 1 answer
How to pg_dump from old version of postgres?
I'm really scratching my head. I was under the impression that newer versions of pg_dump can export from older versions. However, I keep getting this error:
pg_dump: error: aborting because of server version mismatch
pg_dump: detail: server version:…
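That impression is right in one direction only: pg_dump can dump servers older than itself, never newer, so this error usually means the pg_dump on PATH is older than the server. Comparing the two (the host is a placeholder):
pg_dump --version                                  # client binary version
psql -h db.example.com -At -c 'SELECT version();'  # server version
If the client is the older of the two, installing or pointing at a client at least as new as the server resolves the mismatch.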
FlavorScape (831)
0 votes, 1 answer
Postgres pg_dump using compression - Keeping schema only for some tables
I am currently taking nightly exports of a Postgres 10.5 database, but exporting schema only for 2 very large tables whose data I don't need. I am compressing the export when done, and would really like to use pg_dump's internal compression…
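The custom format compresses internally (-Z sets the level), and --exclude-table-data keeps a table's schema while skipping its rows, which matches the setup described; table, file, and database names here are placeholders:
pg_dump -Fc -Z 9 \
  --exclude-table-data=public.big_table_1 \
  --exclude-table-data=public.big_table_2 \
  -f nightly.dump mydb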
emmdee (2,397)
0 votes, 0 answers
Replicate AWS RDS Postgres instances
We're facing a situation where we need to consolidate a few small Postgres instances into one bigger instance.
I can't figure out how to replicate new data from the "old" DB to the "new" DB when the replacement happens.
I'll try to simplify it:
the old DB instance name is X
the new DB instance is…
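One way to keep the new instance caught up during the cutover, sketched with placeholder hosts, names, and credentials: native logical replication (Postgres 10+; on RDS this requires the rds.logical_replication parameter on the source and primary keys on the replicated tables):
# on the old instance X:
psql -h old-x.example.rds.amazonaws.com -d mydb -c "CREATE PUBLICATION consolidate_pub FOR ALL TABLES;"
# on the new instance, after loading a schema-only dump:
psql -h new.example.rds.amazonaws.com -d mydb -c "CREATE SUBSCRIPTION consolidate_sub CONNECTION 'host=old-x.example.rds.amazonaws.com dbname=mydb user=repl_user password=***' PUBLICATION consolidate_pub;"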
Eyal Solomon (75)
0 votes, 1 answer
How to run a bash backup script with simultaneous processes without getting stuck in an infinite loop?
I wrote this script to run PostgreSQL pg_dump on multiple schemas. Since it takes a long time to finish them all, I improved it by running simultaneous dumps with this:
for SCHEMA in $SCH
do
  while [ $(jobs | wc -l | xargs) -ge $PROC ]; do sleep 5;…
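A minimal sketch of the same throttling pattern that avoids the stuck loop; the database name and output path are placeholders. Counting only running jobs (jobs -rp) matters, because finished-but-unreaped jobs otherwise stay in the count and the wait loop may never exit:
PROC=4
for SCHEMA in $SCH; do
  while [ "$(jobs -rp | wc -l)" -ge "$PROC" ]; do
    sleep 5  # or "wait -n" on bash 4.3+ to resume as soon as any job exits
  done
  pg_dump -n "$SCHEMA" mydb > "/backups/$SCHEMA.sql" &
done
wait  # block until the remaining dumps finish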
0 votes, 1 answer
Will the DB dump be consistent if a table is being inserted each second?
I want to create a consistent backup (dump) of a PostgreSQL database using sudo -u postgres pg_dump database_name > db_backup.sql. Several tables have a row inserted every second, so these tables are quite large.
I have…
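In short: pg_dump runs inside a single repeatable-read transaction and works from one snapshot taken when it starts, so the dump is internally consistent no matter how fast rows arrive; rows inserted after the start are simply absent. The command as written already has that property; the gzip below is only an optional space saver:
sudo -u postgres pg_dump database_name | gzip > db_backup.sql.gz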
tukusejssirs (113)