I'm writing a search engine in Perl right now... it runs great and fast
too... except for one thing. The program that goes out and fetches the web
pages currently does so sequentially (waiting for #304 to finish before
starting #305). I would like to program it to fork off up to 30 or 40
processes to handle this (using all the proper SQL transaction code,
etc.), but when I do, I get errors like...
DBD::mysql::st execute failed: Lost connection to MySQL server during
query at ./spider.pl line 157, <GEN0> chunk 32.
I'm trying to determine the possible reasons for this. One question: if I
open a connection before forking, can all child processes share the same
$dbh handle (speaking Perl/DBI here), or do I need to reconnect in each
child with its own handle? If that's not the problem, what else could be
causing it?
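[For reference, a DBI handle must not be shared across fork(): parent and children would all write to the same MySQL socket, and a child's exit destroys the handle and closes the server connection out from under everyone else, which is a classic cause of "Lost connection to MySQL server during query". A minimal sketch of the per-child-connection pattern is below; the DSN, credentials, `pages` table, and URL list are all hypothetical placeholders, not from the original post:]

```perl
#!/usr/bin/perl
# Sketch: one DBI connection per forked child.
# Hypothetical DSN, credentials, and schema -- adjust to the real spider.
use strict;
use DBI;

my @urls     = ('http://example.com/1', 'http://example.com/2');
my $max_kids = 30;
my $kids     = 0;

for my $url (@urls) {
    # Throttle: once we hit the limit, reap one child before forking again.
    if ($kids >= $max_kids) { wait; $kids--; }

    my $pid = fork;
    die "fork failed: $!" unless defined $pid;

    if ($pid == 0) {
        # Child: open its OWN connection. Do not reuse a handle the
        # parent opened -- the shared socket stream would get corrupted.
        my $dbh = DBI->connect(
            'DBI:mysql:database=spider;host=localhost',
            'user', 'password', { RaiseError => 1 }
        ) or die $DBI::errstr;

        # ... fetch $url here, then record the result ...
        my $sth = $dbh->prepare('UPDATE pages SET fetched = 1 WHERE url = ?');
        $sth->execute($url);

        $dbh->disconnect;
        exit 0;
    }
    $kids++;    # parent continues the loop
}

# Parent: reap any remaining children.
1 while wait != -1;
```

[If the parent really must open a handle before forking, each child should set `$dbh->{InactiveDestroy} = 1` on the inherited handle before exiting, so the child's destructor doesn't close the parent's server connection.]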
Jonathan A. Zdziarski
Sr. Systems Administrator