>>>>> "Sergey" == Sergey Vojtovich <svoj@stripped> writes:
Sergey> Nothing else on my mind.
Sergey> On Tue, Mar 02, 2010 at 08:11:27AM -0500, Zardosht Kasheff wrote:
>> Are there other problems besides what has been listed? I ask because
>> the performance advantage we can gain from this optimization is huge.
I would recommend that you do the following:
- Always do it for ALTER TABLE (the statement either succeeds or fails
  as a whole, so the optimization is always safe here)
- If there are no triggers and no unique constraints, always do the
  index build in the background. (This is more or less what
- Before enabling the optimization, check that there are no triggers
  on the table.
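As a rough illustration of the rules above, here is a minimal sketch of the enable-decision; the helper name and parameters are mine, not anything from the server source:

```cpp
#include <cassert>

// Hypothetical helper: decide whether the bulk-load index optimization
// is safe to enable. Mirrors the recommendations above: ALTER TABLE is
// always safe (it either works or the whole statement fails), and a
// plain bulk load is safe only when the table has no triggers and no
// unique constraints.
static bool bulk_index_optimization_safe(bool is_alter_table,
                                         bool has_triggers,
                                         bool has_unique_keys)
{
  if (is_alter_table)
    return true;                      // failure aborts the ALTER anyway
  return !has_triggers && !has_unique_keys;
}
```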
If you want to do this 'completely right' for bulk load:
- Add a table_flag: HA_UNIQUE_DONE_IN_BACKGROUND (to sql/handler.h)
- In sql/sql_insert.cc:mysql_insert(), after:

    if (duplic == DUP_REPLACE &&
        (!table->triggers || !table->triggers->has_delete_triggers()))

  add something like:

    if (table->file->table_flags() & HA_UNIQUE_DONE_IN_BACKGROUND)
    {
      int count= table->file->found_duplicates();
    }
- Do the same in sql/sql_load.cc
- In your_handler::start_bulk_insert(), set up your engine to do the
  background index build, and in end_bulk_insert() wait until all
  indexes are generated and save the number of duplicates for the
  'found_duplicates' call.
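The handler protocol in the last step could look roughly like this toy model; every name here is hypothetical (it is not real handler API code), and the "background" build is reduced to a single worker thread that runs after the load instead of overlapping it:

```cpp
#include <cassert>
#include <set>
#include <thread>
#include <vector>

// Toy model of the proposed protocol: start_bulk_insert() resets state,
// end_bulk_insert() runs the unique-index build on a worker thread and
// waits for it, and found_duplicates() reports how many rows collided
// on the unique key.
class toy_engine
{
  std::vector<int> pending_keys;   // rows queued during the bulk load
  int              duplicates= 0;

public:
  void start_bulk_insert() { pending_keys.clear(); duplicates= 0; }

  void write_row(int key) { pending_keys.push_back(key); }

  void end_bulk_insert()
  {
    std::thread index_builder([this]
    {
      std::set<int> unique_index;
      for (int key : pending_keys)
        if (!unique_index.insert(key).second)
          ++duplicates;             // same key seen twice
    });
    index_builder.join();           // wait until the index is generated
  }

  int found_duplicates() const { return duplicates; }
};
```

An SQL layer that saw HA_UNIQUE_DONE_IN_BACKGROUND set would then call found_duplicates() after end_bulk_insert() to learn how many duplicate-key rows the background build found.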