I have a Rails app server that uses Postgres for the DB backend. I’m spinning up a new server and I want to get a clean and up to date copy of the database. I’ve tried it once so far while the server was in production by doing the following:
- As user postgres:
pg_dump databasename > databasename.sql
- Copy the databasename.sql file over to the new server
- As user postgres on the new server, enter psql and create the database and alter ownership
- As user postgres run
psql databasename < databasename.sql
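The steps above can be sketched end to end as a shell session. This is a hypothetical walkthrough, not the poster's exact commands: the hostnames (oldserver, newserver), the owner role (appuser), the paths, and the table name in the sanity check are all placeholders. It also requires live Postgres servers, so it is illustrative only.

```shell
# 1. On the old server, as user postgres, dump the database:
pg_dump databasename > databasename.sql

# 2. Copy the dump to the new server (path is a placeholder):
scp databasename.sql postgres@newserver:/var/lib/postgresql/

# 3. On the new server, as user postgres, create the database.
#    createdb -O sets the owner in one step, equivalent to running
#    CREATE DATABASE plus ALTER DATABASE ... OWNER TO inside psql:
createdb -O appuser databasename

# 4. Restore the dump:
psql databasename < databasename.sql

# 5. Rough sanity check: compare row counts of a key table on both hosts
#    (some_table is a placeholder for one of the app's ~30 tables):
psql -h oldserver -d databasename -c "SELECT count(*) FROM some_table;"
psql -d databasename -c "SELECT count(*) FROM some_table;"
```

Because a plain-SQL dump replays CREATE TABLE and INSERT/COPY statements in order, matching row counts after the restore are a reasonable quick check that nothing was truncated in transit.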
All of this will occur after I shut down Nginx/Passenger, so all sessions will be terminated and no more data can come into or go out of the system. The only thing running will be Postgres, which no one can connect to directly.
Does this sound like a reasonable plan to migrate the database from one server to another? The database is about 12MB, with roughly 30 tables and maybe 50K records, so it's pretty small.
Any advice on how to do this cleanly and to make sure my data is “sane” would be great.
How to solve:
This is a reasonable plan if a planned maintenance window of a few minutes up to perhaps an hour is acceptable to you.
The database is very small and should import within seconds on the new server.
If the downtime is an issue for you, consider replicating the server with Slony (http://slony.info/) and then promoting the replicating slave to be the new master.
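For a rough idea of what the Slony route involves, a slonik configuration fragment along these lines sets up a two-node cluster and subscribes the new server to a replication set. Everything here is a placeholder sketch (cluster name, conninfo strings, node IDs, and the example table), not a tested setup; every replicated table must be added to the set explicitly, and the final promotion is a separate switchover step:

```text
cluster name = migration;

node 1 admin conninfo = 'dbname=databasename host=oldserver user=postgres';
node 2 admin conninfo = 'dbname=databasename host=newserver user=postgres';

init cluster (id = 1, comment = 'old master');

create set (id = 1, origin = 1, comment = 'all app tables');
set add table (set id = 1, origin = 1, id = 1,
               fully qualified name = 'public.some_table',
               comment = 'example table');

store node (id = 2, event node = 1, comment = 'new server');
store path (server = 1, client = 2,
            conninfo = 'dbname=databasename host=oldserver user=postgres');
store path (server = 2, client = 1,
            conninfo = 'dbname=databasename host=newserver user=postgres');

subscribe set (id = 1, provider = 1, receiver = 2, forward = no);
```

Once the subscription has caught up, the app can be stopped briefly while the set is moved to node 2, reducing the outage from "dump plus restore" to the switchover itself.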