List: General Discussion
From: Dan Harrington  Date: May 22 2002 4:41pm
Subject: Loading massive data set from CSV
Greetings everyone,

I have an ASCII CSV (or tab-delimited) file that is
roughly 3.5 gigabytes, and I want to load it into a MySQL
database so I can do some analysis.

First of all, I'm wondering: is there anything I should be aware of,
or worried about, size-wise?
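One size-related detail worth checking (a sketch, not something from the original thread; the table and column names are placeholders): MyISAM tables in MySQL 3.23 default to roughly a 4 GB size limit unless MAX_ROWS and AVG_ROW_LENGTH are raised at CREATE TABLE time, so a 3.5 GB import can bump into it once indexes are added.

```shell
#!/bin/sh
# Sketch: write a CREATE TABLE statement with MAX_ROWS raised so MyISAM
# allocates wider row pointers than the ~4 GB default allows.
# Column names are assumptions -- substitute the real field list.
cat > create.sql <<'SQL'
CREATE TABLE analysis (
  id INT,
  field2 VARCHAR(255),
  field3 VARCHAR(255)
) TYPE=MyISAM MAX_ROWS=100000000 AVG_ROW_LENGTH=100;
SQL
# Then feed it to the server (credentials/db name are placeholders):
# mysql -u user -p mydb < create.sql
```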

I know that I can't even look at the file using basic text utilities
on my Linux box, like 'head' or 'split'.

Initially I was thinking I'd use 'split' to break it into smaller chunks,
but split won't read it.

I can't even use 'wc -l' to find out how many lines or records are in the file.
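The failures of 'head', 'split', and 'wc' on a 3.5 GB file are most likely the 2 GB limit of tools built without large-file support, which was common on 32-bit Linux systems at the time; a coreutils build with LFS enabled handles the file with the same commands. A small sketch of the chunk-and-count workflow on stand-in data:

```shell
#!/bin/sh
# Sketch of the chunking workaround, demonstrated on a small generated
# file; with a large-file-aware 'split' the same commands apply to the
# real 3.5 GB file. File and prefix names are placeholders.
seq 1 1000 | sed 's/$/,field2,field3/' > sample.csv  # stand-in CSV data
split -l 250 sample.csv chunk_   # produces chunk_aa .. chunk_ad
wc -l chunk_*                    # each chunk holds 250 lines
```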

There is a list of the fields in the file, so I know what my table should
look like, but I don't want to crash the SQL server if it's too large a file,
or something else like that.  I didn't know how big it was originally, so I
was just going to use phpMyAdmin to load the file through a web browser...
though I don't know if that will work either.  Is there a size limitation
on HTTP POST (I assume that's the method it uses to upload)?
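A browser upload of 3.5 GB is unlikely to survive PHP's post/upload size limits. One way around HTTP entirely (a sketch; the path, table, and database names are assumptions) is to have the mysql client run LOAD DATA INFILE, which reads the file straight from disk:

```shell
#!/bin/sh
# Sketch: generate a LOAD DATA statement and run it through the mysql
# command-line client instead of uploading via phpMyAdmin/HTTP POST.
# With LOCAL, the client streams the file to the server; without LOCAL,
# the file must sit on the server host. The quoted heredoc delimiter
# keeps '\n' literal for SQL.
cat > load.sql <<'SQL'
LOAD DATA LOCAL INFILE '/data/bigfile.csv'
INTO TABLE analysis
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
SQL
# Run it once the table exists (credentials are placeholders):
# mysql --local-infile=1 -u user -p mydb < load.sql
```

For a tab-delimited variant of the file, swapping the terminator to `FIELDS TERMINATED BY '\t'` is the only change needed.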

Comments?

Thanks
Dan
