Before I spend more time on trying to get query times of <2 seconds on
tables with between 500,000 and 1,250,000 rows, I thought I'd ask if
anyone has done any web-based statistical pages which are driven by
mysql databases with tables this size.
Is it feasible to directly query tables this size from a web
application to return graphs, etc. based on the returned recordsets,
provided my mysql server is configured correctly and the tables are
constructed correctly, or would it be better to create scheduled
queries and dump the summarised data into smaller tables for querying
by a web-based application? Obviously I'm trying to avoid waits of
longer than around 15 seconds for a stats page.
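In case it helps clarify what I mean by the second option, here's a rough
sketch of the roll-up idea. It uses Python's sqlite3 purely as a
self-contained stand-in for mysql, and the table and column names
(hits, hits_daily, etc.) are invented for illustration:

```python
import sqlite3

# Stand-in for the large stats table (in mysql this would be the
# 500,000+ row table; names here are hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hits (day TEXT, page TEXT, bytes INTEGER)")
conn.executemany(
    "INSERT INTO hits VALUES (?, ?, ?)",
    [("2004-01-01", "/index", 100),
     ("2004-01-01", "/index", 250),
     ("2004-01-02", "/about", 300)],
)

# Scheduled job: roll the detail rows up into a much smaller summary
# table, which the web page then queries instead of the full table.
conn.execute("""
    CREATE TABLE hits_daily AS
    SELECT day, page, COUNT(*) AS hits, SUM(bytes) AS total_bytes
    FROM hits
    GROUP BY day, page
""")

# The stats page reads only the summary table.
rows = conn.execute(
    "SELECT day, page, hits FROM hits_daily ORDER BY day, page"
).fetchall()
print(rows)
```

The same pattern in mysql would just be a cron job running an
INSERT ... SELECT ... GROUP BY into the summary table.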
I've tried spending time going through the docs and mailing list
archives to understand the way queries use indexes in mysql - I estimate
a few more weeks of doing this might help some ;) but I suspect I may be
expecting too much by trying to use the larger tables directly to drive
such pages.
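For what it's worth, the quickest way I've found to check whether a query
is actually using an index is to look at the query plan. The snippet
below demonstrates the idea with sqlite3's EXPLAIN QUERY PLAN as a rough
stand-in for mysql's EXPLAIN; the index and table names are again made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hits (day TEXT, page TEXT, bytes INTEGER)")
# Index on the column the stats page filters by.
conn.execute("CREATE INDEX idx_hits_day ON hits (day)")

# EXPLAIN QUERY PLAN reports whether the WHERE clause can be served
# from the index rather than a full table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(bytes) FROM hits WHERE day = ?",
    ("2004-01-01",),
).fetchall()
print(plan)
```

In mysql you'd run `EXPLAIN SELECT ...` and check the `key` and `rows`
columns for the same information.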