Here's the situation in my humble monastic cell... I recently bought a Synology NAS, installed Synology Drive and started syncing my drives to have a backup at hand in case an electronic fiend struck down my system.
The thing is, I also use a nice little script I wrote to scan various sites for free e-books and comic strips that I can then download. The e-book script uses WWW::Mechanize::Firefox and MozRepl on Firefox 45.9, while the comics script uses LWP::Simple. The latter runs without a problem at all times. However, whenever the Synology Drive (SD) software is running, my e-book script simply stagnates and can hardly communicate with MozRepl. Normally they exchange data at hundreds of KB/s but, when SD is doing its syncing, the rate drops to a meagre few B/s. Even pausing SD brings no improvement; the only thing that does the trick is a reboot, making sure SD stays paused while my script is running.
I have tried applying network filters that leave ample bandwidth for Perl and MozRepl, but to no effect, which is no surprise, as they both run on the same machine. I have also tried limiting Synology Drive's CPU usage, but that didn't work either; it never appeared to have a voracious appetite for CPU time anyway. I've read that many syncing apps use so much memory that they bring machines to their knees, but with 64GB on board I would be surprised to see it all vanish down SD's drainpipe (the problem appears even if SD has only been running for a couple of minutes before I trigger my script).
So... I am at a loss as to how to go about spotting the problem (let alone solving it). Any pointers, anybody?
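One thing I thought of trying, to at least narrow things down: time a raw round-trip to MozRepl on its default port (localhost:4242), bypassing WWW::Mechanize::Firefox entirely, to see whether the slowdown lives in the repl link itself or higher up. This is only a rough sketch; the 100 KB payload size and the `rate` helper are my own inventions for illustration:

```perl
#!/usr/bin/perl
# Rough MozRepl throughput probe -- a sketch, not a fix.
# Assumes MozRepl is listening on its default localhost:4242.
use strict;
use warnings;
use IO::Socket::INET;
use Time::HiRes qw(time);

# Format a bytes/seconds pair as a human-readable rate.
sub rate {
    my ($bytes, $secs) = @_;
    return 'n/a' if $secs <= 0;
    my $bps = $bytes / $secs;
    return $bps >= 1024 ? sprintf('%.1f KB/s', $bps / 1024)
                        : sprintf('%.1f B/s',  $bps);
}

my $sock = IO::Socket::INET->new(
    PeerAddr => 'localhost',
    PeerPort => 4242,            # MozRepl's default port
    Timeout  => 5,
);

if ($sock) {
    # Ask the repl for ~100 KB of data and time the exchange.
    my $t0 = time;
    print $sock "'x'.repeat(100000);\n";
    my ($buf, $total) = ('', 0);
    while (my $n = sysread($sock, $buf, 65536)) {
        $total += $n;
        last if $total >= 100_000;
    }
    my $elapsed = time - $t0;
    printf "received %d bytes in %.2fs (%s)\n",
           $total, $elapsed, rate($total, $elapsed);
} else {
    warn "Cannot reach MozRepl on localhost:4242: $!\n";
}
```

Running this once with SD paused and once while it is syncing should show whether the raw repl link itself collapses, or whether the stall only appears once WWW::Mechanize::Firefox gets involved.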