In message <PPEEKJBNGLCGILHOJMMOMEBPLEAA.dan@stripped>, "Dan Harrington" writes
> I have an ASCII CSV or Tab Delimited file that is roughly 3.5
> gigabytes, and I want to load it into a mysql database so I can do
> some analysis. First of all, I'm wondering, is there anything I
> should be aware of, or worried about, size-wise?
Any idea how many records?
> I know that I can't even look at the file using basic text functions
> in my Linux box like 'head' or 'split'
But you can hack up a small perl script that reads the file line by
line, flushing output as it goes, so you can at least look at it.
> There is a list of the fields in the file, so I know what my table
> should look like, but I don't want to crash the SQL server if its
> too large a file, or something else like that.
I doubt MySQL would crash over a few million records. You'd want to
check that the disk space is there, though :-)-O
> I didn't know how big it was originally, so I was just going to use
> phpMyAdmin to load the file through a web browser.... though I
> don't know if that will work either. Is there a size limitation to
> HTTP-POST (I assume it uses that method to upload).
Pumping 3.5 GB through a network unnecessarily is fairly silly.
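Instead of uploading through phpMyAdmin, you can copy the file onto the
database host and let the server read it directly with LOAD DATA
INFILE. A sketch, assuming the comma-separated variant and a
pre-created table; the path and table name here are placeholders:

```sql
-- Assumes a table whose columns match the file's field list.
LOAD DATA INFILE '/var/tmp/bigfile.csv'
INTO TABLE analysis
FIELDS TERMINATED BY ','        -- use '\t' for the tab-delimited file
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
```

This avoids both the HTTP POST size limits and the round trip through the web server entirely.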
Dr. Eberhard W. Lisse \ / Obstetrician & Gynaecologist (Saar)
<el@stripped> el108 * | Swakopmund State Hospital
Private Bag 5004 \ / Telephone: +49 177 214 3196 (cell)
Swakopmund, Namibia ;____/ Currently on Post Graduate Study Leave