
FTP script - Probable memory leak - please help


FunkyChicken1970

Hi,

I've been asked to write a script that polls a local folder and uploads any files it finds. The script does the job, but it crashes after a few hours: watching Task Manager under Windows shows that each time files are FTP'd up, an additional 4-8 KB of RAM is used.

My main question is how do I release the memory? It seems the FTP operations are chewing up memory on each iteration. Other comments are more than welcome as Perl is quite new to me, although the basic script logic looks OK to me.

Below is the worker loop.

Thanks in advance!

Code:
use POSIX qw(strftime);
#use Time::Local;
use Net::FTP;
#use vars qw/ %opt /;
use File::chdir;
use File::Copy;
#use Getopt::Std;
use Net::SMTP;

$dirroot = "d:\\folder";
$dirout = "d:\\folder\\out";
$dirarchive = "d:\\folder\\archive";

$errors=0;
$smtprecipient = "me\@company.com";
$smtpsender = "report\@company.com";
$smtpserver = '192.168.0.200';
$emailsignature = "-\nEnd of report\n";
@messages = "";
#$today = strftime "%Y%m%d", localtime;

$ftpsite = "ftp.compnay.com";
$ftpusername = "username";
$ftppassword = "password";

##################
($second, $minute, $hour, $dayOfMonth, $month, $yearOffset, $dayOfWeek, $dayOfYear, $daylightSavings) = localtime();
$emailsubject = "FTP Upload Process - Started Successfully";
$emailbody = "Upload process has started at " . $hour . ", scheduled to complete at 17:59\n";
email();
$numtrades = 0;
$errors = 0;
$count = 0;
$CWD = $dirout;

while ($hour < 18) {
	@files = glob "*.swf";
#	printf "Waiting for folder contents to stabilise..\n";
	sleep (5);
	@files = glob "*.swf";
	if (@files > 0) {
		$ftp = Net::FTP->new($ftpsite, Timeout => 30, Debug => 1);
		$ftp->login($ftpusername, $ftppassword) or push(@messages, "Unable to log into FTP site.\nCheck that the login details haven't changed, aborting!\n");
		foreach (@files) {
			# Upload one file at a time
			$ftp->put($files[$count]) or push(@messages, "Unable to upload " . $files[$count] . ", aborting!\n");
			# Move to archive folder if upload OK
			move ($files[$count], $dirarchive);
			$count = $count + 1;
		} # end of foreach file
		$ftp->quit;
	}
	$numtrades = $numtrades + $count;
	$count = 0;
	@files = -1;
	$ftp = -1;
	#sleep (55);
}
 
Bit of a shot in the dark - perhaps putting the ftp call in a subroutine, with its variables declared with 'my' inside the sub would help? At least then each time you exited the sub, its variables would all go out of scope, and storage should be reclaimed?

Also, you don't seem to reset $hour anywhere in the while loop, so assuming it kicks off sometime before 1800 I can't see how it would ever terminate.
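
Something like this at the top of each pass round the loop would sort that (untested sketch):

Code:
# refresh $hour on every pass so the loop can actually see 18:00 arrive
(undef, undef, $hour) = localtime();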

Might be better to do @files = undef; $ftp = undef; rather than using -1, too. And maybe uncomment the final sleep call to avoid a fairly tight polling loop.
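
For the subroutine idea, something roughly like this is what I had in mind (untested, names are just examples):

Code:
sub upload_files {
	my ($site, $user, $pass, $archive, @files) = @_;
	my $uploaded = 0;

	# $ftp is lexical, so it goes out of scope when the sub returns
	# and the connection object can be cleaned up
	my $ftp = Net::FTP->new($site, Timeout => 30)
		or do { push @messages, "Unable to connect to $site\n"; return 0 };
	$ftp->login($user, $pass)
		or do { push @messages, "Unable to log into FTP site, aborting!\n"; return 0 };

	foreach my $file (@files) {
		if ($ftp->put($file)) {
			# move to archive folder if the upload worked
			move($file, $archive);
			$uploaded++;
		}
		else {
			push @messages, "Unable to upload $file\n";
		}
	}
	$ftp->quit;
	return $uploaded;
}

Then the main loop just calls $numtrades += upload_files($ftpsite, $ftpusername, $ftppassword, $dirarchive, @files); whenever glob finds something.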

Steve

[small]"Every program can be reduced by one instruction, and every program has at least one bug. Therefore, any program can be reduced to one instruction which doesn't work." (Object::PerlDesignPatterns)[/small]
 
Thanks for that. Checking the time is back inside the loop and the sleep statement is uncommented again.

I've moved the FTP upload into a subroutine, with its variables declared using 'my'.

I'm now using undef to trash @files & $ftp.

Memory usage is still creeping up though each time files are located and uploaded.
 
Any possibility of using the Windows scheduler to run the task every minute from, say, 08:00 through 18:00? Then you wouldn't need the loop, and you'd get a new process each time, so your memory problem would go away.

You might need some kind of lock file, so if the previous one was still running, the new one could exit immediately.
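
A simple way to do the lock is flock on a lock file at the top of the script - something like this (untested, the path is just an example):

Code:
use Fcntl qw(:flock);

# if a previous run still holds the lock, bail out straight away
open(LOCK, '>', 'd:\\folder\\upload.lock') or die "Cannot open lock file: $!";
unless (flock(LOCK, LOCK_EX | LOCK_NB)) {
	exit 0;
}
# ... do the uploads; the lock is released when the script exits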

I know it doesn't fix the problem, but it might give you a work-around.

Steve

[small]"Every program can be reduced by one instruction, and every program has at least one bug. Therefore, any program can be reduced to one instruction which doesn't work." (Object::PerlDesignPatterns)[/small]
 
I thought about doing that. The thing is, though, that while we aren't doing many of these transfers each day right now, if things go well there will be a very large quantity of files to upload.

If the problem is related to how many files are uploaded in one session (which it seems to be), then I'm possibly just delaying the problem until the process becomes heavily utilised and business critical.
 
Is there a reason why glob is being done twice?

Code:
	@files = glob "*.swf";
#	printf "Waiting for folder contents to stabilise..\n";
	sleep (5);
	@files = glob "*.swf";

I would also recommend using warnings or -w. Also, try using eval to see if that makes a difference, as arrays use a lot of memory.
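
For example, at the top of the script:

Code:
#!/usr/bin/perl
use strict;     # catches undeclared or misspelled variables
use warnings;   # flags suspect constructs at compile and run time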
 