
Re: Browser limits

__/ [ BearItAll ] on Wednesday 16 August 2006 14:47 \__

> I wouldn't normally use a browser to download large files, much easier with
> wget. So I haven't hit the 2G limit before.
> 
> But I was on a site this morning and the only way they gave was via the
> browser, though they did tell you that most browsers would hit a size limit
> including Firefox (of course I missed that bit of text until I discovered
> the file wasn't fully downloaded, as you do). I thought that Opera would
> have no trouble, but it turns out that it is the same.
> 
> Gzilla is the one to use for these large downloads, it's working fine so
> far.
> 
> But I just wondered if anyone had seen a reason for the download size
> limit? I can understand the limit of number of concurrent downloads,
> browsers are too big-n-bulky for that sort of job, but the file size limit
> doesn't seem to have a reason other than a number the browser writers
> picked out of the air.

The only factor I can think of is a filesystem file-size ceiling (4 GB on
FAT32). I can recall being forced to slice zip files to make file transfers
possible. Likewise, as my external hard-drive comes with a braindead
filesystem, tars need to be sliced and reassembled. Could the limit, which
needs to be imposed somewhere (probably not if properly implemented), be
driven by some obscure convention? The Vista (beta 2) ISO is 3.5 GB.
