List: General Discussion
From: John Mancuso
Date: July 10 2007 8:54pm
Subject: RE: mysqldump/restore
#!/usr/bin/perl

# Dump every database to its own file.  -N skips the "Database" column
# header, so no sed cleanup is needed.
@db = `mysql -N -ppassword -e "show databases"`;
foreach $db (@db){
	chomp($db);
	`mysqldump -ppassword $db > $db.sql`;
}

# Reload the dumps in parallel; backgrounding each client with & lets
# multiple imports run simultaneously, which should be much faster.
foreach $db (@db){
	system("mysql -ppassword $db < $db.sql &");
}


How 'bout something like that? Not perfect but should be ok

John Mancuso
Linux Administrator/MySQL DBA
IT Infrastructure
American Home Mortgage
w: 631-622-6382
c: 516-652-2475

-----Original Message-----
From: B. Keith Murphy [mailto:kmurphy@stripped] 
Sent: Tuesday, July 10, 2007 4:36 PM
To: mysql
Subject: mysqldump/restore

Hey all. Let me throw out a quick question. We are upgrading from some
version of 4.1 to the latest version of 5.0. MySQL recommends that you
dump and re-import data with this type of move. Not only do they
recommend the dump/import - it would be nice to optimize all our tables
at the same time!! 

I have done a test dump of a couple of gigs of data with just a straight
mysqldump. That worked, but the import is happening slower than
Christmas. I know that one technique for speeding up the import is to
configure the dump so that it generates file(s) with the schema for the
table(s) and file(s) with the actual data. Then you can use the LOAD
DATA INFILE command to import the data after generating the new tables
(maybe by using 'mysql < table1.sql'). 
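For the dump side, mysqldump's --tab option produces exactly that split: per table, a .sql file with the schema and a .txt file of tab-separated data ready for LOAD DATA INFILE. A minimal dry-run sketch (the directory and database names are stand-ins, and the echo just prints each command rather than running it):

```shell
#!/bin/sh
# Dry run: print the mysqldump --tab command for each database instead
# of executing it (drop the echo to run for real).  "sales" and
# "inventory" are stand-in database names; /tmp/dump must be writable
# by the server, since the server itself writes the .txt files.
cmds=$(for db in sales inventory; do
    echo "mysqldump -ppassword --tab=/tmp/dump $db"
done)
echo "$cmds"
```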

So here is the problem. It worked, but it was slow. Not because of the
speed of the import but because I am having to issue a command (and type
in the root password) for each table import. I could do the following:
'cat *.sql > database.sql' to get a complete schema for all the tables
in the database and then just 'mysql < database.sql' to set up all the
tables at the same time. You can't do that with the import of the actual
data; you have to load the data into each table individually, with
something like LOAD DATA INFILE '/root/table1.txt'. 

This is fine for one table, but what happens when you have 100 tables? I can't
take the time to sit there and hit the up-arrow to retrieve the command
and then type in the new file name. Is there some way to automate this
data import? 
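One way to automate that loop, sketched with stand-in paths: mysqlimport wraps LOAD DATA INFILE, derives each table name from its file name (table1.txt loads into table1), and prompts for the password only once for the whole batch. The shell loop below shows the same idea done by hand, as a dry run that only prints the commands it would issue:

```shell
#!/bin/sh
# /root/dump and "mydb" are stand-ins.  The one-shot route:
#
#   mysqlimport -p mydb /root/dump/*.txt
#
# The equivalent explicit loop (keep the password in ~/.my.cnf so mysql
# doesn't prompt on every iteration); basename strips the .txt suffix
# to recover the table name, just as mysqlimport does:
cmds=$(for f in /root/dump/table1.txt /root/dump/table2.txt; do
    tbl=$(basename "$f" .txt)
    echo "mysql mydb -e \"load data infile '$f' into table $tbl\""
done)
echo "$cmds"
```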

Thanks, 

Keith 

--
B. Keith Murphy
Database Administrator
iContact
2635 Meridian Parkway, 2nd Floor
Durham, North Carolina 27713
(o) 919-433-0786
(c) 850-637-3877 