List: General Discussion
From: mos  Date: June 1 2002 6:44pm
Subject: Re: I need 50.000 inserts / second
At 07:07 PM 5/31/2002, you wrote:
>Hi Jeremy,
>
>I've done some tests grouping the INSERTs in batches of 1000 and 10000, and
>it got much faster; on my Pentium III 700 at work I could get it under 3 s.
>But I'm still doing concatenation to get the INSERT clause in the form
>"values (x,x), (x,x) ...".  Is there a way to make a template query in
>mysql++ that could take 1000 or more parameters??
>
>Please tell me the alternative you were thinking about.
>
>Best regards,
>Cesar

Cesar,
         I haven't read the whole thread so I don't know if you tried this 
or not. But have you tried LOAD DATA INFILE 'filename.txt'???

Syntax:

LOAD DATA [LOW_PRIORITY | CONCURRENT] [LOCAL] INFILE 'file_name.txt'
[REPLACE | IGNORE]
INTO TABLE tbl_name
[FIELDS
[TERMINATED BY '\t']
[[OPTIONALLY] ENCLOSED BY '']
[ESCAPED BY '\\' ]
]
[LINES TERMINATED BY '\n']
[IGNORE number LINES]
[(col_name,...)]
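For example, a concrete invocation for a two-column table like the one in this thread might look like the following (the table and file names are just illustrative; LOCAL is needed when the file lives on the client machine rather than the server):

```sql
LOAD DATA LOCAL INFILE 'samples.txt'
INTO TABLE teste1
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(datahora, valor1);
```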

This is much faster than individual INSERTs. You also have a lot of options, 
like ignoring duplicate rows or replacing them with the new data. Give it a 
try and let us know how much faster it is. Cheers. :-)

Mike



>----- Original Message -----
>From: "Jeremy Zawodny" <jzawodn@stripped>
>To: "Cesar Mello - Axi" <cesar@stripped>
>Cc: <mysql@stripped>
>Sent: Thursday, May 30, 2002 9:04 PM
>Subject: Re: I need 50.000 inserts / second
>
>
> > On Fri, May 31, 2002 at 01:49:11AM -0300, Cesar Mello - Axi wrote:
> > > Hello,
> > >
> > > I intend to use MySQL in a data acquisition software. The actual
> > > version stores the acquired data straight in files. The sample rate
> > > can get up to 50 kHz. I would like to know if there is some way to
> > > improve MySQL insert rate.  The following C++ code with mysql++
> > > takes 5 seconds to execute in my Athlon 1.33 machine:
> >
> > Is the process disk bound or cpu bound?  Put another way, does the cpu
> > max out during this process?
> >
> > > sql_create_2 (teste1, 1, 2, double, datahora, double, valor1)
> > >
> > > int main() {
> > >   try { // its in one big try block
> > >
> > >     Connection con(use_exceptions);
> > >     con.connect("cesar");
> > >     Query query = con.query();
> > >
> > >     teste1 row;
> > >     // create an empty stock object
> > >
> > >  for (int i=1;i<50000;i++)
> > >  {
> > >   row.datahora = (double) i;
> > >   row.valor1 = i / 1000;
> > >
> > >   query.insert(row);
> > >   query.execute();
> > >  }
> > >
> > > As you can see there are only two fields: a double timestamp and a
> > > double value. In the real application there are some more double
> > > values. I need to decrease this time to less than 1 second. Is there
> > > any kind of buffered inserts or maybe a way that I could pass a
> > > matrix?
> >
> > You can boost the performance by inserting several records at the same
> > time in one query using MySQL's extended insert syntax:
> >
> >   INSERT INTO my_table VALUES (foo, foo), (bar, bar), (z, z)...
> >
> > Try batching them in groups of 100 or even 1000 to see how it
> > performs.
> >
> > > I'm shocked with the performance of MySQL, a similar query to
> > > compute 1 million records takes 1.17 seconds in MySQL and around 6
> > > seconds in the current system. So if I can decrease the insert time
> > > I'll definitely use MySQL!
> >
> > Welcome to MySQL. :-)
> >
> > There's an alternative that I have in mind, but it'd be good to see
> > how much closer those get you before getting into it.
> >
> > Jeremy
> > --
> > Jeremy D. Zawodny, <jzawodn@stripped>
> > Technical Yahoo - Yahoo Finance
> > Desk: (408) 349-7878   Fax: (408) 349-5454   Cell: (408) 685-5936
> >
> > MySQL 3.23.51: up 1 days, processed 28,531,290 queries (274/sec. avg)
> >
>
>
>---------------------------------------------------------------------
>Before posting, please check:
>    http://www.mysql.com/manual.php   (the manual)
>    http://lists.mysql.com/           (the list archive)
>
>To request this thread, e-mail <mysql-thread110681@stripped>
>To unsubscribe, e-mail <mysql-unsubscribe-mos99=fastmail.fm@stripped>
>Trouble unsubscribing? Try: http://lists.mysql.com/php/unsubscribe.php

Thread
I need 50.000 inserts / second - Cesar Mello - Axi, 31 May
  • Re: I need 50.000 inserts / second - Jeremy Zawodny, 31 May
  • Re: I need 50.000 inserts / second - Steve Edberg, 31 May
  • Re: I need 50.000 inserts / second - Harald Fuchs, 31 May
  • Re: I need 50.000 inserts / second - Mark, 31 May
    • Re: I need 50.000 inserts / second - Benjamin Pflugmann, 31 May
    • Re: I need 50.000 inserts / second - Dan Nelson, 31 May
      • Re: I need 50.000 inserts / second - Jeremy Zawodny, 1 Jun
  • Re: I need 50.000 inserts / second - Mark, 31 May
  • Re: I need 50.000 inserts / second - Cesar Mello - Axi, 31 May
    • Re: I need 50.000 inserts / second - mos, 2 Jun
  • Re: I need 50.000 inserts / second - Richard Clarke, 2 Jun
RE: I need 50.000 inserts / second - domi, 31 May
  • Re: I need 50.000 inserts / second - Cesar Mello - Axi, 31 May
    • Re: I need 50.000 inserts / second - Tõnu Samuel, 31 May
    • Re: I need 50.000 inserts / second - Brent Baisley, 31 May