Roy Schestowitz <newsgroups@xxxxxxxxxxxxxxx> wrote:
> The issue here is not bandwidth. If so much bandwidth is devoured in
> the process of fetching data (e.g. scanning gigantic text files),
There is no way this can use up bandwidth, except when you send a "fake
header" to keep the connection alive (and even that uses very little bandwidth).
> might sooner or later suffer from latency in page delivery. I suggest
> a makeover that involves MySQL, or maybe better -- PostgreSQL.
Based on what? I would say: if you understand how the files work and
there is little overhead, stick with this system. Switch to a database only
if you have time to learn its ins and outs.
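If you do go the database route, the win is indexed lookups instead of scanning
a whole file per request. A minimal sketch, using Python's bundled SQLite as a
stand-in for MySQL/PostgreSQL (the table and column names here are made up for
illustration, not taken from the site in question):

```python
import sqlite3

# Hypothetical example: replace a gigantic text file of key/value records
# with an indexed table, so a lookup no longer reads the entire file.
conn = sqlite3.connect(":memory:")  # use a file path for a persistent DB
conn.execute("CREATE TABLE records (key TEXT PRIMARY KEY, value TEXT)")
conn.executemany(
    "INSERT INTO records VALUES (?, ?)",
    [("alpha", "1"), ("beta", "2"), ("gamma", "3")],
)
# The PRIMARY KEY is indexed, so this is a tree lookup, not a full scan.
row = conn.execute(
    "SELECT value FROM records WHERE key = ?", ("beta",)
).fetchone()
print(row[0])  # prints "2"
```

The same schema and queries carry over to PostgreSQL or MySQL almost verbatim;
only the connection setup changes.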
> To make the transition less painful, is there any simple way of
> splitting these text files? Relational databases are more elegant, but
> they are not a must.
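One simple way, as a sketch in Python: if the files are plain newline-delimited
records, you can split them into fixed-size chunks. The chunk size and the
`.0`, `.1`, ... naming scheme below are arbitrary choices, not anything from
the original setup:

```python
# Split a large newline-delimited text file into fixed-size chunk files.
# Chunk size and output naming are illustrative assumptions.

def split_file(path, lines_per_chunk=100_000):
    """Write path.0, path.1, ... each holding up to lines_per_chunk lines,
    and return the list of chunk file names."""
    chunk_paths = []
    out = None
    with open(path, encoding="utf-8") as src:
        for i, line in enumerate(src):
            if i % lines_per_chunk == 0:
                if out:
                    out.close()
                chunk_paths.append(f"{path}.{i // lines_per_chunk}")
                out = open(chunk_paths[-1], "w", encoding="utf-8")
            out.write(line)
    if out:
        out.close()
    return chunk_paths
```

Because it streams line by line, it never holds the whole file in memory, so it
works on files far larger than RAM.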
John                         Perl SEO tools: http://johnbokma.com/perl/
                                             or have them custom made
Experienced (web) developer: http://castleamber.com/