List: General Discussion
From: Jerry Schwartz  Date: January 3, 2011 11:08pm
Subject: RE: This just seems to slow
Folks, this is getting creepy. It seems like each of the variations you've 
given me works sometimes and not others. I haven't found the pattern yet.

During the hours it took my initial data loads to finish, I rewrote the import 
process so that (I hope) I won't have to go through this again.

Thanks for your help.

Regards,

Jerry Schwartz
Global Information Incorporated
195 Farmington Ave.
Farmington, CT 06032

860.674.8796 / FAX: 860.674.8341
E-mail: jerry@stripped
Web site: www.the-infoshop.com


>-----Original Message-----
>From: mos [mailto:mos99@stripped]
>Sent: Monday, January 03, 2011 3:25 PM
>To: Jerry Schwartz; mysql@stripped
>Subject: RE: This just seems to slow
>
>Jerry,
>     Try this:
>
>mysqlimport -uusername -ppassword --verbose --debug-info --delete
>--columns=`dm_history_dm_id`,`DM_History_Customer_ID` --local
>--fields-terminated-by=","  --fields-optionally-enclosed-by="\""
>--lines-terminated-by="\r\n" --host=localhost yourdbname t_dmu_history.txt
>
>
>I use Load Data Infile all the time, usually inside of a program like Delphi.
>
>Mike
>
>At 09:56 AM 1/3/2011, Jerry Schwartz wrote:
>>Okay, I have a confession to make: I have never gotten Load Data Infile or
>>mysqlimport to work.
>>
>>Here's my CSV file, named "t_dmu_history.txt":
>>
>>13071,299519
>>13071,299520
>>13071,299521
>>13071,299522
>>13071,299524
>>13071,299526
>>13071,299527
>>...
>>
>>Here's my mysqlimport command:
>>
>>mysqlimport -uaccess -pxxx --delete
>>--columns=`dm_history_dm_id`,`DM_History_Customer_ID`
>>  --local --silent
>>--fields-terminated-by=',' --lines-terminated-by='\r\n' --host=localhost
>>maintable_usa t_dmu_history.txt
>>
>>I'm running on Windows Vista, and mysqlimport is "Ver 3.7 Distrib 5.1.31,
>>for Win32 (ia32)"
>>
>>It runs for a while, but I wind up with only one record:
>>
>>localhost >select * from t_dmu_history;
>>+------------------+------------------+------------------------+
>>| t_dmu_history_id | DM_History_DM_ID | DM_History_Customer_ID |
>>+------------------+------------------+------------------------+
>>|                1 |            13071 |                   NULL |
>>+------------------+------------------+------------------------+
>>1 row in set (0.00 sec)
>>
>>Obviously mysqlimport is parsing the input file incorrectly, but I don't
>>know why.
>>
>>Here's the table itself:
>>
>>+------------------------+---------+------+-----+---------+----------------+
>>| Field                  | Type    | Null | Key | Default | Extra          |
>>+------------------------+---------+------+-----+---------+----------------+
>>| t_dmu_history_id       | int(11) | NO   | PRI | NULL    | auto_increment |
>>| DM_History_DM_ID       | int(11) | YES  | MUL | NULL    |                |
>>| DM_History_Customer_ID | int(11) | YES  | MUL | NULL    |                |
>>+------------------------+---------+------+-----+---------+----------------+
>>
>>        Table: t_dmu_history
>>Create Table: CREATE TABLE `t_dmu_history` (
>>   `t_dmu_history_id` int(11) NOT NULL AUTO_INCREMENT,
>>   `DM_History_DM_ID` int(11) DEFAULT NULL,
>>   `DM_History_Customer_ID` int(11) DEFAULT NULL,
>>   PRIMARY KEY (`t_dmu_history_id`),
>>   KEY `DM_History_DM_ID` (`DM_History_DM_ID`),
>>   KEY `DM_History_Customer_ID` (`DM_History_Customer_ID`)
>>) ENGINE=InnoDB AUTO_INCREMENT=2 DEFAULT CHARSET=utf8
>>
>>Regards,
>>
>>Jerry Schwartz
>>Global Information Incorporated
>>195 Farmington Ave.
>>Farmington, CT 06032
>>
>>860.674.8796 / FAX: 860.674.8341
>>E-mail: jerry@stripped
>>Web site: www.the-infoshop.com
>>
>>
>> >-----Original Message-----
>> >From: mos [mailto:mos99@stripped]
>> >Sent: Sunday, January 02, 2011 11:42 PM
>> >To: mysql@stripped
>> >Subject: Re: This just seems to slow
>> >
>> >Jerry,
>> >       Use "Load Data Infile" when loading a lot of data. Whoever is
>> > giving you the data should be able to dump it to a CSV file. Your imports
>> > will be much faster.
>> >
>> >Mike
>> >
>> >At 07:51 PM 1/2/2011, you wrote:
>> >>I'm trying to load data into a simple table, and it is taking many hours
>> >>(and still not done). I know hardware, etc., can have a big effect, but
>> >>NOTHING should have this big an effect.
>> >>
>> >>=====
>> >>us-gii >show create table t_dmu_history\G
>> >>*************************** 1. row ***************************
>> >>        Table: t_dmu_history
>> >>Create Table: CREATE TABLE `t_dmu_history` (
>> >>   `t_dmu_history_id` int(11) NOT NULL AUTO_INCREMENT,
>> >>   `DM_History_DM_ID` int(11) DEFAULT NULL,
>> >>   `DM_History_Customer_ID` int(11) DEFAULT NULL,
>> >>   PRIMARY KEY (`t_dmu_history_id`),
>> >>   KEY `DM_History_DM_ID` (`DM_History_DM_ID`),
>> >>   KEY `DM_History_Customer_ID` (`DM_History_Customer_ID`)
>> >>) ENGINE=InnoDB AUTO_INCREMENT=1446317 DEFAULT CHARSET=utf8
>> >>=====
>> >>
>> >>Here's a snip of what the input file looks like:
>> >>=====
>> >>SET autocommit=1;
>> >>
>> >>#
>> >># Dumping data for table 'T_DMU_History'
>> >>#
>> >>
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299519);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299520);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299521);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299522);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299524);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299526);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299527);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299528);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299529);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299531);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299532);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299533);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299534);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13071, 299535);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13073, 298880);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13073, 298881);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13073, 298882);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13073, 298883);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13073, 298884);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13073, 298885);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13073, 298886);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13073, 298887);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13073, 298889);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13073, 298890);
>> >>INSERT INTO `T_DMU_History` (`DM_History_DM_ID`, `DM_History_Customer_ID`) VALUES (13073, 298891);
>> >>=====
>> >>
>> >>There are about 870000 records.
>> >>
>> >>I realize that using one INSERT per row is going to hurt, but I don't
>> >>control the format of the incoming data.
>> >>
>> >>Besides, I'd have thought this would be pretty quick regardless of how
>> >>clumsy the method was.
>> >>
>> >>Is that "autocommit" a problem? This is a bulk load into an empty table,
>> >>so I'm not worried about ACID.
>> >>
>> >>Any suggestions?
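If autocommit=1 is indeed the issue, then with InnoDB each single-row INSERT becomes its own transaction, forcing a log flush per row, which alone can account for hours on ~870,000 rows. The standard mitigation is to wrap the load in one transaction (or a few large ones); a sketch of how the dump's preamble could be changed:

```sql
-- Disable per-statement commits for the duration of the bulk load,
-- instead of the dump's SET autocommit=1.
SET autocommit=0;
-- ... the ~870,000 INSERT statements from the dump file ...
COMMIT;
SET autocommit=1;
```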
>> >>
>> >>
>> >>
>> >>
>> >>
>> >>Regards,
>> >>
>> >>Jerry Schwartz
>> >>Global Information Incorporated
>> >>195 Farmington Ave.
>> >>Farmington, CT 06032
>> >>
>> >>860.674.8796 / FAX: 860.674.8341
>> >>E-mail: jerry@stripped
>> >>Web site: www.the-infoshop.com
>> >>
>> >>
>> >>
>> >>
>> >>
>> >>--
>> >>MySQL General Mailing List
>> >>For list archives: http://lists.mysql.com/mysql
>> >>To unsubscribe:    http://lists.mysql.com/mysql?unsub=1
>> >
>> >
>> >--
>> >MySQL General Mailing List
>> >For list archives: http://lists.mysql.com/mysql
>> >To unsubscribe:    http://lists.mysql.com/mysql?unsub=1




Thread
This just seems to slow (Jerry Schwartz, 3 Jan)
  • Re: This just seems to slow (mos, 3 Jan)
    • RE: This just seems to slow (Daevid Vincent, 3 Jan)
      • Re: This just seems to slow (Wagner Bianchi, 3 Jan)
        • RE: This just seems to slow (Jerry Schwartz, 3 Jan)
      • RE: This just seems to slow (Jerry Schwartz, 3 Jan)
        • Re: This just seems to slow (MySQL), 3 Jan
    • RE: This just seems to slow (Jerry Schwartz, 3 Jan)
      • RE: This just seems to slow (Gavin Towey, 3 Jan)
        • RE: This just seems to slow (Jerry Schwartz, 3 Jan)
      • RE: This just seems to slow (mos, 3 Jan)
        • RE: This just seems to slow (Jerry Schwartz, 4 Jan)
          • RE: This just seems to slow (mos, 4 Jan)
            • RE: This just seems to slow (Jerry Schwartz, 4 Jan)
  • mysqlimport doesn't work for me (Jerry Schwartz, 3 Jan)
    • Re: mysqlimport doesn't work for me (Carsten Pedersen, 3 Jan)
      • RE: mysqlimport doesn't work for me (Jerry Schwartz, 3 Jan)