List: General Discussion
From: Johan De Meersman
Date: May 16 2014 2:04pm
Subject: Re: Advices for work with big tables
----- Original Message -----
> From: "Antonio Fernández Pérez" <antoniofernandez@stripped>
> Subject: Advices for work with big tables
> 
> Hi,
> 
> I'm writing to the list because I need your advice.
> 
> I'm working with a database that has some tables with a lot of rows; for
> example, one table holds 8GB of data.
> 
> What can I do to work smoothly with this table?

The two easiest points of optimisation are:
 * Make sure your queries can use indexes as much as possible.
   * Percona Toolkit has good stuff for figuring that out. Have a look at
     pt-query-digest (a quick sketch follows below this list).
 * Provide enough memory to your MySQL server to fit the entire database (and
   don't forget to allocate it to your InnoDB buffer pool if that applies :-) ).
   * Failing that, provide at least enough memory to keep the most frequently
     used dataset in memory.
     * Failing *that*, although at this point you're already running into disk
       bottlenecks, provide enough memory to keep your indexes in memory.
   * Use faster disks for the data that doesn't fit in memory (SSD, FusionIO etc).
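
As a quick sketch of the index check (the table and column names here are
invented for illustration), EXPLAIN tells you whether a query can use an index:

  explain select id, created_at from orders where customer_id = 42;

If the "key" column in the output is NULL, the query is scanning the whole
table, and an index such as

  alter table orders add index idx_customer (customer_id);

may be worth considering. To find the worst offenders across the whole
workload, pt-query-digest can summarise the slow query log (assuming slow
query logging is enabled and writes to this path):

  pt-query-digest /var/log/mysql/mysql-slow.log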

Memory required for the full dataset:
  select sum(data_length+index_length) from information_schema.tables;
Memory required for indexes only:
  select sum(index_length) from information_schema.tables;
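
To see where the space actually goes, a per-table breakdown is often handy
(a sketch; replace 'your_database' with your own schema name):

  select table_schema, table_name,
         round((data_length + index_length) / 1024 / 1024) as size_mb
  from information_schema.tables
  where table_schema = 'your_database'
  order by (data_length + index_length) desc
  limit 10;

Once you know the target size, the buffer pool is sized through
innodb_buffer_pool_size in my.cnf, for example:

  [mysqld]
  innodb_buffer_pool_size = 12G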

There's no easy way to figure out your active data set size; that depends on
which queries are performed most often.
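
One rough way to gauge it once the server has been running under normal load
for a while is to watch how much of the buffer pool actually fills up:

  show global status like 'Innodb_buffer_pool_pages%';

Comparing Innodb_buffer_pool_pages_data against Innodb_buffer_pool_pages_total
(pages are 16KB by default) gives a crude upper bound on the hot set, assuming
the pool is larger than the working set.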


Depending on what type of disk cabinet you have, it may be possible to replace some drives
with full SSDs, or with disks that have a built-in SSD cache.


-- 
Unhappiness is discouraged and will be corrected with kitten pictures.