
Net::FTP interrupted

(OP)
Hello

I have been using Net::FTP in a script, run from cron at night, to back up server data. It was working; then I bumped my head and ran out of allocated space on the backup server. I contracted for additional space, but the process still *seems* to be hanging: it transfers some data, but not all of it. I figure the script doesn't complete, because it no longer stops the FTP server (proftpd) as it did at first, and of the data transferred, one file is incomplete and the other is missing entirely.

I have looked at /var/log/messages, but that only lists logins. The config file /etc/proftpd.conf does not seem to offer a way to log error messages.

I am testing the following cron statement to send output to a log file; I will know tomorrow whether it worked:

01 1 * * 2,4,6 /backup/serverbackup.pl >>/backup/backup.log
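
I may also append 2>&1, so that the die messages, which go to STDERR, end up in the same log:

01 1 * * 2,4,6 /backup/serverbackup.pl >>/backup/backup.log 2>&1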

I would be grateful for any suggestions on how to write results to a log file when the ftp->put call completes, or when it is interrupted, stalls, or is shut down prematurely.

Thanks, Mike

CODE

#!/usr/bin/perl

# START FTP
system "/backup/startftp.pl"; 

$dateStamp = `/bin/date +"%w%a"`; chop ($dateStamp);

$HostingbackupPath = "/htdocs";
$HostingtarPath = "/backup/hostingfiles";
$HostingtarName = "HOSTING.$dateStamp.tar";
$HostinggzName = "$HostingtarPath/$HostingtarName.gz";
$MysqltarPath = "/backup/mysqlfiles";
$MysqltarName = "MYSQL.$dateStamp.tar";
$MysqlgzName = "$MysqltarPath/$MysqltarName.gz";
$EtctarPath = "/backup/etcfiles";
$EtcbackupPath = "/etc";
$EtctarName = "ETC.$dateStamp.tar";
$EtcgzName = "$EtctarPath/$EtctarName.gz";

# Dump each database listed in /backup/dblist to its own dated file
@db = `cat /backup/dblist`;

foreach $db (@db) {
        chomp $db;
        $dbase = "$db.$dateStamp";
        `/usr/bin/mysqldump -u root -pPASS $db > $MysqltarPath/$dbase`;
}
system "tar cvfp $MysqltarPath/$MysqltarName $MysqltarPath/*.$dateStamp ";
system "gzip -f $MysqltarPath/$MysqltarName";



system "tar cvfp $HostingtarPath/$HostingtarName $HostingbackupPath/*";
system "gzip -f $HostingtarPath/$HostingtarName";



use Net::FTP;

# Transfer the compressed archives to the backup server
$ftp = Net::FTP->new("111.222.333.444") or die "Couldn't connect to 111.222.333.444 $@\n";
$ftp->login("USER", "PASS") or die "Couldn't login\n";
$ftp->put($HostinggzName) or die "Couldn't upload $HostinggzName\n";
$ftp->put($MysqlgzName) or die "Couldn't upload $MysqlgzName\n";
$ftp->quit();

# STOP FTP
system "/backup/stopftp.pl"; 

RE: Net::FTP interrupted

Why chop vs chomp?
$dateStamp = `/bin/date +"%w%a"`; chop ($dateStamp);
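
For what it's worth, chop removes the last character unconditionally, while chomp removes only a trailing newline; here both happen to work because the date output ends in a newline, but chomp is the safer habit:

CODE

$dateStamp = `/bin/date +"%w%a"`; chomp ($dateStamp);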

You can explore the following two options of Net::FTP's constructor to get some ideas:

Timeout - Set a timeout value (defaults to 120)
Debug - debug level (see the debug method in Net::Cmd)
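
For example, a rough sketch of the constructor with both options set (the host and the 60-second value are just placeholders):

CODE

use Net::FTP;

# Shorter timeout so a stalled transfer fails instead of hanging forever;
# Debug => 1 echoes the whole FTP conversation to STDERR.
$ftp = Net::FTP->new("111.222.333.444",
                     Timeout => 60,
                     Debug   => 1)
    or die "Couldn't connect to 111.222.333.444 $@\n";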

RE: Net::FTP interrupted

I would also wrap the whole thing in an alarm; when the alarm fires you could issue your stopftp command.
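
Something along these lines, as a rough sketch (the 600-second limit is arbitrary, adjust it to your transfer times):

CODE

# Wrap the uploads in eval with an alarm; if they take too long the
# SIGALRM handler dies out of the eval so we can log it and clean up.
eval {
    local $SIG{ALRM} = sub { die "FTP transfer timed out\n" };
    alarm 600;
    $ftp->put($HostinggzName) or die "Couldn't upload $HostinggzName\n";
    $ftp->put($MysqlgzName)   or die "Couldn't upload $MysqlgzName\n";
    alarm 0;
};
if ($@) {
    warn "backup aborted: $@";     # goes to STDERR, so redirect cron's STDERR to the log
    system "/backup/stopftp.pl";   # same cleanup as the normal path
}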

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
noevil
Travis - Those who say it cannot be done are usually interrupted by someone else doing it; Give the wrong symptoms, get the wrong solutions;

RE: Net::FTP interrupted

It depends on the FTP package you're using and whether it's an anonymous login vs. an account login. A couple of FTP packages will not write to the log file until the transfer has completed successfully if you are an anonymous user.

Monitor the log file and look for changes in size. If there has been no change to the file in some defined time period, that is an indication the transfer has stalled. Or a small change over a length of time, ...

Can you include the hash switch in the FTP command? Direct STDOUT & STDERR to a log file and that way you can see something moving.
Generally you will get better performance using rsync. And if you use the --progress switch and direct STDOUT & STDERR to a log file, you will be able to read its progress. The security advantage of rsync over FTP is a given: rsync running over SSH transfers encrypted data, while FTP does not. (An up-to-date rsync connects over SSH on port 22 by default.)
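
(Net::FTP also has a hash() method that prints a mark to a filehandle for every block transferred, which gives much the same effect as the hash switch.)

As a rough sketch of the rsync route from inside the existing script (the remote user and /backupspace/ path are made up, and it assumes SSH key authentication is already set up):

CODE

# rsync over SSH instead of FTP: -a preserves attributes, -v and --progress
# write per-file progress to STDOUT, which cron can redirect to the log file.
system "rsync -av --progress $HostinggzName $MysqlgzName "
     . "backup\@111.222.333.444:/backupspace/";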
