Limiting bandwidth

by sgifford (Prior)
on Oct 26, 2004 at 00:28 UTC ( [id://402398] )

sgifford has asked for the wisdom of the Perl Monks concerning the following question:

I'm using Net::FTP to download a large file every day. The script that does this lives on a borrowed server on a borrowed network, and the organization that is graciously letting me use their network has started complaining that our download is taking up too much bandwidth. I'd like to slow down the rate at which the file is downloaded. Does anybody know of a good way to do that?

I'd much prefer a way to do this that doesn't involve changing any OS-level network settings (for example, firewalls or kernel-supported rate limiting). I only have remote access to the box, and I'm reluctant to do anything that could inadvertently cut off my access.

I see that many programs (such as wget) have options to limit bandwidth, so this must be possible in userspace.

Thanks for any advice!

Replies are listed 'Best First'.
Re: Limiting bandwidth
by BrowserUk (Patriarch) on Oct 26, 2004 at 01:10 UTC

    For a very quick fix to avoid upsetting your generous benefactor, you could simply add a select delay into the while loop in /lib/Net/FTP.pm (~ line 488). I'd put it just before the line (~490):

    select undef, undef, undef, 0.005; ## Adjust: larger => slower; smaller => faster.
    last unless $len = $data->read($buf,$blksize);

    You can then adjust the value manually to strike a balance between your needs and theirs.

    Adding a more formal parameter that allowed the rate to be specified in terms of kb/second wouldn't be too hard to do for your system, using a little math and some fudge factor (a rough sketch follows below).

    Making that generic enough to offer it as a patch would be considerably harder, as you would have to incorporate some means of self-calibration across systems with different performance. And once it became a part of the standard interface, woe betide the author if it only achieved 1.9kb/sec when 2kb/sec was requested.
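
    For what it's worth, here's one way that rate-based variant might look. The make_throttle helper and the 20 KB/s figure are my own invention, not anything in Net::FTP: keep a running total of bytes read, work out how long the transfer should have taken at the target rate, and sleep off the difference.

    use strict;
    use warnings;
    use Time::HiRes qw(time sleep);

    sub make_throttle {
        my ($target_bps) = @_;              # desired bytes per second
        my ($start, $total) = (time, 0);
        return sub {
            my ($bytes) = @_;
            $total += $bytes;
            my $due     = $total / $target_bps;  # seconds this much data should have taken
            my $elapsed = time - $start;
            sleep($due - $elapsed) if $due > $elapsed;
        };
    }

    # In the patched loop, call after each successful read:
    #     $throttle->($len);
    my $throttle = make_throttle(20 * 1024);    # cap at roughly 20 KB/s

    Because it tracks the average rate since the start of the transfer rather than using a fixed per-block delay, it self-corrects for varying block times without any per-system calibration.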


    Examine what is said, not who speaks.
    "Efficiency is intelligent laziness." -David Dunham
    "Think for yourself!" - Abigail
    "Memory, processor, disk in that order on the hardware side. Algorithm, algorithm, algorithm on the code side." - tachyon
Re: Limiting bandwidth
by jfroebe (Parson) on Oct 26, 2004 at 01:12 UTC

    Well, if you aren't already, compress the file prior to downloading it. Then, as you suggested, use wget to limit the bandwidth. Read the source of wget to see how it's done, then implement it in Perl if you really need it in Perl.
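
    For reference, wget's built-in cap can be driven straight from the daily Perl script. The URL, credentials, and paths below are placeholders:

    # Shell out to wget; --limit-rate accepts a k/m suffix.
    system('wget', '--limit-rate=20k',
           '-O', '/local/path/bigfile.zip',
           'ftp://user:pass@ftp.example.com/pub/bigfile.zip') == 0
        or warn "wget exited with status $?";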

    Setting the blocksize on Net::FTP won't make much difference at all.

    Not much else to tell ya

    Jason L. Froebe

    Team Sybase member

    No one has seen what you have seen, and until that happens, we're all going to think that you're nuts. - Jack O'Neil, Stargate SG-1

Re: Limiting bandwidth
by steves (Curate) on Oct 26, 2004 at 00:48 UTC

    Net::FTP supports a BlockSize setting that might have an effect if set very low. But I think that's doubtful.

    I think my first try would be to call get with a filehandle as my local file (get supports either a file name or a filehandle). You could then presumably tie the filehandle to a homegrown object that implements a WRITE method that limits bandwidth by going to sleep periodically or something. Completely untried, but that would be fun to give a shot ...
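
    A completely untested sketch of that idea; the Throttle::Handle class name and the rate arithmetic are mine, not an existing module. It implements both WRITE (the syswrite path) and PRINT (the print path), so the transfer is throttled whichever way Net::FTP writes the local file. Host, credentials, and filenames are placeholders.

    package Throttle::Handle;
    use strict;
    use warnings;
    use Time::HiRes qw(time sleep);

    sub TIEHANDLE {
        my ($class, $path, $bps) = @_;
        open my $fh, '>', $path or die "open $path: $!";
        binmode $fh;
        return bless { fh => $fh, bps => $bps, start => time, total => 0 }, $class;
    }

    sub _nap {                         # sleep until the average rate drops to $bps
        my ($self) = @_;
        my $due     = $self->{total} / $self->{bps};
        my $elapsed = time - $self->{start};
        sleep($due - $elapsed) if $due > $elapsed;
    }

    sub WRITE {                        # called for syswrite() on the tied handle
        my ($self, $buf, $len, $offset) = @_;
        $len    = length $buf unless defined $len;
        $offset ||= 0;
        my $written = syswrite($self->{fh}, $buf, $len, $offset);
        return $written unless defined $written;
        $self->{total} += $written;
        $self->_nap;
        return $written;
    }

    sub PRINT { my $self = shift; $self->WRITE(join('', @_)) }  # print() path
    sub CLOSE { close $_[0]{fh} }

    package main;
    use Net::FTP;

    tie *SLOW, 'Throttle::Handle', 'bigfile.zip', 20 * 1024;    # ~20 KB/s
    my $ftp = Net::FTP->new('ftp.example.com') or die "connect: $@";
    $ftp->login('user', 'pass')      or die $ftp->message;
    $ftp->binary;
    $ftp->get('bigfile.zip', \*SLOW) or die $ftp->message;
    $ftp->quit;
    close SLOW;

    The nice part of this approach is that it needs no changes to Net::FTP itself.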

Re: Limiting bandwidth
by crenz (Priest) on Oct 26, 2004 at 10:18 UTC

    Another approach is to think about whether you really have to copy the whole file every day. Using rsync might help your bandwidth usage as well.

      Also, rsync has a --bwlimit=KBPS option.
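
      For instance, from the daily script (host and paths are placeholders; --bwlimit is in KB/s):

      # rsync with compression and a bandwidth cap of ~20 KB/s.
      system('rsync', '--compress', '--bwlimit=20',
             'user@remote.example.com:/pub/bigfile.zip', '/local/path/') == 0
          or warn "rsync failed: $?";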

Re: Limiting bandwidth
by jarich (Curate) on Oct 26, 2004 at 12:26 UTC
    What kind of file is it? For example, is it a log file for some program? Is it a file that gets changed often, or do you download it every day just in case it was changed yesterday? Do you need all the data contained in it?

    With answers to these questions we might be able to help reduce the times you need to download it at all, and that might make your benefactors very happy.

    A few scenarios come to mind:

    1. It's a file which only changes once every 2 days or less often
    2. It's a log file or similar and you only care about the last X many lines, where X is much smaller than the total number of lines
    3. It's a file which changes all the time and you care about all of it

    Let's consider these options:

    It's a file which only changes rarely

    Run a hashing algorithm over the file (eg md5sum) and download the result. If the hash is the same as the previous day then don't download it.

    A disadvantage with this is that you may need to ask their sysadmin to assist you somewhat, by giving you sufficient access to run this command, or by agreeing to run a scheduled task every night which creates the hash for you.
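
    If you can get a hash published, the client side might look like this (the host, credentials, and the bigfile.zip.md5 naming are all assumptions):

    use strict;
    use warnings;
    use Net::FTP;

    my $ftp = Net::FTP->new('ftp.example.com') or die "connect: $@";
    $ftp->login('user', 'pass') or die $ftp->message;
    $ftp->binary;

    # Fetch just the tiny hash file first.
    $ftp->get('bigfile.zip.md5', 'new.md5') or die $ftp->message;

    my $slurp = sub { local (@ARGV, $/) = ($_[0]); <> };   # read a whole file
    if (!-e 'old.md5' or $slurp->('old.md5') ne $slurp->('new.md5')) {
        $ftp->get('bigfile.zip') or die $ftp->message;     # hash changed: fetch
        rename 'new.md5', 'old.md5';
    }
    $ftp->quit;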

    It's a log file

    Rotate the file more often (daily), so that you only need to download the bits you care about.

    It's a file which changes all the time

    This is the hard one. All you can do here is:

    • Make sure you compress the file
    • Shape your traffic
    • Hope they'll still like you.
    • Consider whether you can download it every two days late at night, instead of daily, so as to minimise the disruption caused.

    Of course, if it is one of the first two, then so long as you compress your file as well, one of those methods should help a lot.

    Hope this helps.

    jarich

      Hi jarich, it's a ZIP file full of images that changes every day. So unfortunately none of these ideas will help with my particular situation, but they're definitely good things to keep in mind!
Re: Limiting bandwidth
by neilh (Pilgrim) on Oct 26, 2004 at 01:12 UTC
    You could use Shaper/CBQ to limit the bandwidth used.
    That is, of course, if they'll install it for you.

    Neil

