List: General Discussion
From: domi
Date: May 31 2002 6:20am
Subject: RE: I need 50.000 inserts / second
Hi !!

You could maybe buffer the data in Your application
and then run inserts later... like this.

struct oneRow {
  double timestamp;
  double data;
  /* etc, etc */
};

struct oneRow rows[num_of_rows];

for (int i = 0; i < num_of_rows; i++)
{
   // collect data
   rows[i].timestamp = (double) i;
   rows[i].data = i / 1000.0;
   // etc etc
}

// then you can loop through your
// buffered data and do the inserts "delayed"
for (int i = 0; i < num_of_rows; i++)
{
   // do inserts
}

I might be misunderstanding You, since I can't quite put this together...
   You wrote:
   The following C++ code with mysql++ takes 5 seconds to 
   execute in my Athlon 1.33 machine:

   And later on:
   I'm shocked with the performance of MySQL, a similar query 
   to compute 1 million records takes 1.17 seconds in MySQL

So, if a "similar" query handles 1 million rows in about a second,
how come You have problems with "only" 50 K/s with another
"similar" query?

=d0Mi=

> Hello,
> 
> I intend to use MySQL in a data acquisition software. The actual version
> stores the acquired data straight in files. The sample rate can get up to 50
> kHz. I would like to know if there is some way to improve MySQL insert rate.
> The following C++ code with mysql++ takes 5 seconds to execute in my Athlon
> 1.33 machine:
> 
> sql_create_2 (teste1, 1, 2, double, datahora, double, valor1)
> 
> int main() {
>   try { // it's in one big try block
> 
>     Connection con(use_exceptions);
>     con.connect("cesar");
>     Query query = con.query();
> 
>     teste1 row;
>     // create an empty stock object
> 
>  for (int i=1;i<50000;i++)
>  {
>   row.datahora = (double) i;
>   row.valor1 = i / 1000;
> 
>   query.insert(row);
>   query.execute();
>  }
> 
> 
> As you can see there are only two fields: a double timestamp and a double
> value. In the real application there are some more double values. I need to
> decrease this time to less than 1 second. Is there any kind of buffered
> inserts or maybe a way that I could pass a matrix?
> 
> I'm shocked with the performance of MySQL, a similar query to compute 1
> million records takes 1.17 seconds in MySQL and around 6 seconds in the
> current system. So if I can decrease the insert time I'll definitely use
> MySQL!
> 
> Thank you for the attention.
> 
> Best regards,
> Cesar
> 