PerlMonks
Confuse and lost in Transfer files process

by vrempire (Sexton)
on Sep 18, 2000 at 07:59 UTC ( [id://32913] )

vrempire has asked for the wisdom of the Perl Monks concerning the following question:

Hi,
Currently I am having a problem transferring files across servers.
Previously, I used Net::FTP to transfer the files. But since I am not the administrator of the server, I do not have the privilege to install libnet on it.
So I tried an alternative: driving the UNIX ftp command from inside the Perl script.
This is the script:
    #!/usr/bin/perl
    print "Content-type:text/html\n\n";
    use File::Copy;

    $remotehost1 = "a";
    $remotepath  = "/b";
    $remoteuser  = "c";
    $remotepass  = "d";
    @fileftp = qw(x.wml y.wml z.wml);
    $cmd = "ftp -n";

    foreach $wmlfile (@fileftp) {
        if (-f $wmlfile) {
            $ftp_commands = "
    open $remotehost1
    user $remoteuser $remotepass
    cd $remotepath
    asc
    put $wmlfile
    bye
    ";
            open (CMD, "|$cmd");
            print CMD $ftp_commands;
            close (CMD);
            print "File $wmlfile has been transferred \n";
            print "<br>\n";
            $finish = 'transfer';
            copy ("$wmlfile", "./$finish/$wmlfile");
        }
    }

This script works fine:
1. The files are successfully put on the remotehost1 server. OK.
2. The files in @fileftp are copied to the 'transfer' folder. OK.

Now I want to delete the files in @fileftp from their original folder on the local host after I copy them to 'transfer'.
But when I insert these additional lines below the program:
    foreach $filefinish (@fileftp) {
        unlink $filefinish;
    }

Then all hell breaks loose. What happens is:
1. The files cannot be put on the remote host server. KO.
2. The files in @fileftp are copied from the local folder to 'transfer'. OK.
3. The files in @fileftp are deleted from where I originally put them (above the 'transfer' directory). OK.
When I run it from the command line, it shows:
    C:\Perl\bin>perl file.pl
    File x.wml has been transferred <br>
    File y.wml has been transferred <br>
    File z.wml has been transferred <br>
    x.wml: File not found
    y.wml: File not found
    z.wml: File not found

    C:\perl\bin>
Please help me with this problem; I have tried to solve it in many different ways but am still stuck in the same place. I would really appreciate it if anyone could tell me what is wrong and how to fix it. Thank you in advance.

Replies are listed 'Best First'.
Re: Confuse and lost in Transfer files process
by tye (Sage) on Sep 18, 2000 at 08:26 UTC

    My first reaction is that you have 3 processes running FTP trying to authenticate to the remote server, each with all of their commands buffered up when your script runs ahead and deletes all of the files. A short while later, each FTP process gets authenticated and reads the commands from the input buffer and can't access the files that have already been deleted.

    The only thing that makes me think this shouldn't be the case is:

    close (CMD);
    But this is indented differently than the rest of your script so perhaps you added that to the posting but it isn't actually in the code that is causing you problems.

    You see, an explicit close on a file handle to a piped command also waits for the subprocess to exit. So the presence of that code should make what I described very unlikely.
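    For readers who want to see the race in isolation, here is a toy script (not from the thread; the file name and child program are made up) that forces it: the parent deletes the file while a slow child, standing in for the still-authenticating ftp process, has its command sitting in the pipe buffer.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Create a file for the child to look for.
my $file = 'race-demo.txt';
open my $fh, '>', $file or die "$file: $!";
print $fh "payload\n";
close $fh;

# The child sleeps first (like ftp waiting to authenticate), then reads
# the filename from its input and tries to find the file.
open(my $cmd, '|-', $^X, '-e',
    'sleep 1; chomp(my $f = <STDIN>); print -f $f ? "found $f\n" : "$f: File not found\n"')
    or die "Can't fork: $!\n";
print $cmd "$file\n";

unlink $file;   # the parent races ahead, as the unlink loop did
close $cmd;     # only now do we wait for the child to finish
```

    Because the filename is still in transit when unlink runs, the child wakes up to a file that no longer exists and prints "File not found", just like the ftp output in the question.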

    My second reaction is "Don't you ever check for failure?!" I can almost excuse a quick hack missing some checks for failure. But once that quick hack starts not working the way you expected, one of the first things you need to do is add any checks for failure that you left out at first. Well, you left them all out:

    sub fail { # Allow failures to show on the web page.
        print @_;
        exit 0;
    }
    foreach $wmlfile (@fileftp) {
        if (-f $wmlfile) {
            $ftp_commands = "
    open $remotehost1
    user $remoteuser $remotepass
    cd $remotepath
    asc
    put $wmlfile
    bye
    ";
            open (CMD, "|$cmd")
                or fail "Can't fork() to run $cmd ($wmlfile): $!\n";
            print CMD $ftp_commands
                or fail "Can't write commands to $cmd ($wmlfile): $!\n";
            close (CMD)
                or fail "$cmd ($wmlfile) failed? ($?): $!\n";
            print "File $wmlfile has been transferred \n";
            print "<br>\n";
            $finish = 'transfer';
            copy ($wmlfile, "./$finish/$wmlfile")
                or fail "Can't save $wmlfile to $finish: $!\n";
        }
    }

    The error message for close failing is a bit ambiguous, because its failing could mean the subprocess returned a non-zero status, or it could be some error in the parent process, such as not being able to flush some buffered output (though the subprocess failing seems by far the most likely reason in this case).
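    As a minimal illustration (not part of tye's post) of what close reports, here is a script that pipes to a child which exits nonzero: close returns false, and the child's raw wait status is left in $?.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Open a pipe to a child that simply exits with status 3.
open(my $cmd, '|-', $^X, '-e', 'exit 3')
    or die "Can't fork: $!\n";

# close() reaps the child; it returns false because the child's exit
# status was nonzero, and the wait status is left in $?.
if (close($cmd)) {
    print "child succeeded\n";
} else {
    printf "close returned false; child exit status was %d\n", $? >> 8;
}
```

    In a real script you would check $! as well, since close can also fail for parent-side reasons, as noted above.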

    Update: I've replaced the calls to die so that the errors would be visible on the web page, since this is a CGI script.

            - tye (but my friends call me "Tye")
Re: Confuse and lost in Transfer files process
by swiftone (Curate) on Sep 18, 2000 at 20:05 UTC
    The others appear to have helped to debug your code, but here's my suggestion: install libnet and do it in Perl. I know you aren't the system administrator, but you can find quick and easy advice on how to install and use modules when you aren't the admin at: Using modules without admin privelages
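    A sketch of what that looks like once the module lives in a directory you own (the path below is hypothetical): use lib prepends it to @INC, so Net::FTP can be found without touching the system perl, and the whole shell-out dance collapses to a few method calls.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical private module directory; adjust to wherever you
# installed libnet without admin rights.
use lib '/home/you/perllib';
print "perl now searches: $INC[0]\n";

# With Net::FTP loadable from there, the upload becomes:
sub ftp_put {
    my ($host, $user, $pass, $path, @files) = @_;
    require Net::FTP;
    my $ftp = Net::FTP->new($host) or die "Can't connect to $host: $@\n";
    $ftp->login($user, $pass) or die "Login failed: ", $ftp->message;
    $ftp->cwd($path)          or die "cwd failed: ",   $ftp->message;
    $ftp->ascii;  # same as the 'asc' ftp command
    for my $file (@files) {
        $ftp->put($file) or die "put $file failed: ", $ftp->message;
    }
    $ftp->quit;
}

# ftp_put('a', 'c', 'd', '/b', qw(x.wml y.wml z.wml));  # needs a live server
```

    Every step returns a status you can check, which neatly solves the silent-failure problem from the original script.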
RE: Confuse and lost in Transfer files process
by moen (Hermit) on Sep 18, 2000 at 11:32 UTC
    I tested your code and everything worked just fine. No errors, nothing except the expected output. I did this on a Win2000 box, then I also tested it on a Win98 box, and guess what... I got File Not Found errors. Figures, eh? :) By the way, it works fine on Linux too.

      Ah, a Win9x "quirk". I'll have to play with that when I get back to my Win9x box.

      So I suggest you add an extra synchronization step. The most reliable I've come up with is something like:

      sub fail { # Allow failures to show on the web page.
          print @_;
          exit 0;
      }
      $finish = 'transfer';
      foreach $wmlfile (@fileftp) {
          if (-e "$finish/$wmlfile") {
              unlink "$finish/$wmlfile"
                  or die "Can't delete $finish/$wmlfile: $!\n";
          }
          if (-f $wmlfile) {
              $ftp_commands = "
      open $remotehost1
      user $remoteuser $remotepass
      cd $remotepath
      asc
      put $wmlfile
      lcd $finish
      get $wmlfile
      bye\n";
              open (CMD, "|$cmd")
                  or fail "Can't fork() to run $cmd ($wmlfile): $!\n";
              print CMD $ftp_commands
                  or fail "Can't write commands to $cmd ($wmlfile): $!\n";
              close (CMD)
                  or fail "$cmd ($wmlfile) failed? ($?): $!\n";
              for (0..100) {
                  last if -f "$finish/$wmlfile";
                  sleep 5;
              }
              die "$finish/$wmlfile not found!\n" unless -f "$finish/$wmlfile";
              print "File $wmlfile has been transferred \n";
              print "<br>\n";
          }
      }

              - tye (but my friends call me "Tye")
Re: Confuse and lost in Transfer files process
by vrempire (Sexton) on Sep 22, 2000 at 04:25 UTC
    Hi friends, thank you for all your answers. I finally got my program to run right after all.
    Since TIMTOWTDI, I did it using the same program I posted, but with a different arrangement.
    From what I understand from tye, my algorithm was right in the first place, but when it actually runs, the internal ordering of the processes does not work out.
    I do not know the details, but I think I have the idea.
    It seems to involve buffering and the speed at which the Perl program is interpreted. Before the process of FTPing the file completes, the file has already been unlinked out from under the buffered commands being sent to the other server. Please correct me if I am wrong.
    So that is why there is a time delay during execution:
    C:\Perl\bin>perl file.pl
    File x.wml has been transferred <br>
    File y.wml has been transferred <br>
    File z.wml has been transferred <br>
    <--Time delay around 3 seconds-->
    <--The file is inside the output buffer, waiting to be sent-->
    <--while waiting for the remotehost to open a port or connection, the unlink process has unlinked all the files-->
    <--when the remotehost has given permission to enter the server, the files have already been deleted-->
    <--So that's why the error message below is sent-->
    x.wml: File not found
    y.wml: File not found
    z.wml: File not found
    <--mission 1, which is to send the files to the other server, is not accomplished-->
    C:\perl\bin>

    So what I did is put the unlink step above the FTP step, so each time new files are to be transferred, the program first deletes the previous files that have already been transferred, and then goes on to FTP the new files.
    This means that after the new files have been transferred, they are not deleted until another user runs the program again, so I can check which files the user has transferred. And if the server cannot be connected to, the files selected for transfer have not yet been deleted from the local host, so I can take them and transfer them manually (by e-mail, etc.) since they still exist.
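    One reading of that rearrangement, as a runnable sketch (the file names are the demo names from the thread, the FTP step is stubbed out, and the setup lines exist only to make the demo self-contained):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Copy;

# The previous run's archived files are deleted *first*, so unlink can
# never race a still-connecting ftp subprocess working on this batch.
my $finish  = 'transfer';
my @fileftp = qw(x.wml y.wml z.wml);

# Demo setup: make sure the input files and archive folder exist.
mkdir $finish unless -d $finish;
for my $f (@fileftp) {
    open my $fh, '>', $f or die "$f: $!";
    close $fh;
}

# 1. Delete whatever the previous run left behind in 'transfer'.
unlink glob "$finish/*";

# 2. Only then transfer (stubbed) and archive the new batch.
for my $wmlfile (@fileftp) {
    next unless -f $wmlfile;
    # ... the ftp subprocess would run here, reaped by close() ...
    copy($wmlfile, "$finish/$wmlfile")
        or die "Can't save $wmlfile to $finish: $!\n";
    print "File $wmlfile has been transferred\n";
}
```

    The archived copies then double as a record of the last successful run, which is the audit benefit described above.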
    Anyway, the program is working fine now. Thanks for all your support.
    V R E M P I R E
    P/S: Got the idea while washing the clothes ;)
