In the last episode (Dec 21), Asif Lodhi said:
> Would you like to express your opinion as to what design strategy to
> take if a table (used for read operations only) is supposed to get
> more than 3GB of data per day? With 1000 simultaneous users ?
With that data rate, you'll definitely have to use partitioning or
MERGE tables, generating a table per day or something similar;
otherwise aging records out of the table would be impossible. You
didn't mention the insert rate, the size of the records, or the
expected queries, so it's difficult to suggest which storage engine
to use. The ARCHIVE engine, for example, would significantly reduce
your 3GB of daily data, but it doesn't support indexes.
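For illustration, a table-per-day setup using MERGE tables might look
something like the sketch below. The table and column names are made
up, and the underlying daily tables must all be identical MyISAM
tables for the MERGE to work:

```sql
-- Hypothetical daily tables; one is created per day.
CREATE TABLE log_20041220 (
    ts  DATETIME NOT NULL,
    msg VARCHAR(255),
    KEY (ts)
) ENGINE=MyISAM;

CREATE TABLE log_20041221 LIKE log_20041220;

-- The MERGE table presents all the daily tables as one for reads.
CREATE TABLE log_all (
    ts  DATETIME NOT NULL,
    msg VARCHAR(255),
    KEY (ts)
) ENGINE=MERGE UNION=(log_20041220, log_20041221) INSERT_METHOD=LAST;

-- Aging out a day is then just an ALTER of the UNION list plus a
-- DROP, rather than an expensive DELETE over billions of rows:
ALTER TABLE log_all UNION=(log_20041221);
DROP TABLE log_20041220;
```

The main win is that dropping a whole day's table is nearly free,
while DELETEing a day's worth of rows from one huge table would
thrash the indexes and lock readers out.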