From: Mogens Melander
Date: July 18 2007 8:44am
Subject: Re: speeding imports
On Tue, July 17, 2007 13:31, Baron Schwartz wrote:
> Mogens Melander wrote:
>> On Tue, July 17, 2007 04:29, Baron Schwartz wrote:
>>> B. Keith Murphy wrote:
>>>> The problem is that I am realizing that this dump/import is going to
>>>> take
>>>> hours and in some cases days. I am looking for any way to speed this up.
>>>> Any suggestions?
>>> The fastest way I've found is to do SELECT INTO OUTFILE on the master,
>>> which
>>> selects into a sort of tab-delimited format by default -- don't specify
>>> any
>>> options like field terminators or whatnot.  This file can then be imported
>>> directly into LOAD DATA INFILE, again without options.
>>> I think this is faster than loading files full of SQL statements, which
>>> have to be parsed and query-planned etc.

That method has proven very quick in the past.
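For reference, that round trip might look roughly like this from the shell (the database, table, and path names here are made-up examples, not from the thread; the MySQL server itself writes the outfile, so it needs the FILE privilege and a directory it can write to):

```shell
# On the master: SELECT INTO OUTFILE with no field/line options
# produces the default tab-delimited format.
mysql -e "SELECT * FROM shop.orders INTO OUTFILE '/tmp/orders.txt'"

# On the target server: LOAD DATA INFILE, again with no options,
# so the defaults match on both sides and no SQL parsing or
# per-statement query planning is involved.
mysql shop -e "LOAD DATA INFILE '/tmp/orders.txt' INTO TABLE orders"
```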

>>> I thought mysqldump had an option to dump this way, but I can't see it
>>> now.
>> I think you are looking for the --single-transaction option :)
> I found the option I meant:
>    -T, --tab=name      Creates tab separated textfile for each table to given
>                        path. (creates .sql and .txt files). NOTE: This only
>                        works if mysqldump is run on the same machine as the
>                        mysqld daemon.

Yup, that was what I was trying to write 8^) I'm using that one together with the other.
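Put together, a --tab dump and reload could look something like this (database, table, and paths are illustrative; note that mysqldump must run on the same host as mysqld, and the dump directory must be writable by the server):

```shell
# Dump every table in `shop` as a .sql file (schema only) plus a
# tab-delimited .txt file (the data) under /tmp/dump.
mysqldump --tab=/tmp/dump shop

# Reload one table on the target: create the schema, then bulk-load
# the tab-delimited data with LOAD DATA INFILE.
mysql shop < /tmp/dump/orders.sql
mysql shop -e "LOAD DATA INFILE '/tmp/dump/orders.txt' INTO TABLE orders"
```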


Mogens Melander
+45 40 85 71 38
+66 870 133 224

