zlacker

[return to "Copyparty – Turn almost any device into a file server"]
1. skibz+M2[view] [source] 2025-07-28 15:14:17
>>saint1+(OP)
The author of this tool uploaded a YouTube video demonstrating it a few days ago: https://www.youtube.com/watch?v=15_-hgsX2V0

At one point in his demo, he uploads a file but terminates the upload more or less halfway. Then he begins downloading the file - which only progresses to the point it had been uploaded, and subsequently stalls indefinitely. And, finally, he finishes uploading the file (which gracefully resumes) and the file download (which is still running) seamlessly completes.

I found that particularly impressive.

2. nkrisc+S6[view] [source] 2025-07-28 15:40:43
>>skibz+M2
It's very impressive, particularly if you remember waking up to a failed download from the night before over dial-up.
3. Mister+7D[view] [source] 2025-07-28 18:45:55
>>nkrisc+S6
Most files were available via FTP, which supported resume.
4. henry7+H21[view] [source] 2025-07-28 20:55:21
>>Mister+7D
Not most. There was (and still is) so much locked behind HTTP on poor servers.
5. dspill+vA2[view] [source] 2025-07-29 10:22:17
>>henry7+H21
The vast majority of web servers out there¹ support partial downloads and have done for years. That the most common UA for accessing them (web browsers) doesn't support the feature² without addons is not a server-side problem.

Sometimes there are server-side problems: some dynamic responses (e.g. files behind a user account, where access rights must be checked before sending) are badly designed and unnecessarily break sub-range downloads. This could be seen as a “poor server” issue, but I think it is more a “daft dev/admin” or “bad choice of software” problem.

--------

[1] admittedly not all, but…

[2] wget and curl do, though not automatically without a wrapper script
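
The resume mechanism discussed above is just the HTTP `Range` request header plus a `206 Partial Content` response. As a minimal sketch (not copyparty's implementation; the URL and filenames are placeholders), the kind of wrapper script mentioned in [2] might look like:

```python
import os
import urllib.request

def range_header(existing_bytes):
    """Build a Range header value asking for everything past the
    bytes we already have on disk (RFC 9110 byte ranges)."""
    return f"bytes={existing_bytes}-"

def resume_download(url, dest, chunk_size=65536):
    """Resume (or start) a download into `dest` by requesting only
    the missing tail of the file from the server."""
    have = os.path.getsize(dest) if os.path.exists(dest) else 0
    req = urllib.request.Request(url, headers={"Range": range_header(have)})
    with urllib.request.urlopen(req) as resp:
        # 206 Partial Content: server honored the range, so append.
        # 200 OK: server ignored it and resent the whole file, so rewrite.
        mode = "ab" if resp.status == 206 else "wb"
        with open(dest, mode) as f:
            while chunk := resp.read(chunk_size):
                f.write(chunk)
```

A "daft dev" failure mode from the comment above: a dynamic endpoint that streams the whole file with a plain 200 regardless of the `Range` header, which is why the sketch falls back to rewriting from scratch instead of blindly appending.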

[go to top]