I've got a script that has been throwing an intermittent error for the past month. Before that, it ran fine for over two years. The script connects to an FTP server, retrieves a batch of binary files, and processes them into our database. It runs once an hour, and it's now failing roughly 50% of the time.
The error message is 'Bad file number', and it fails while attempting to connect to the remote server. The section of code where it fails is:
Code:
my $ftp = Net::FTP->new( $ftpHost, Debug => 0 );
if( ! $ftp ) {
    LogUtils::error( "Couldn't connect to '$ftpHost' ($@ : $!)" );
    print( "Couldn't connect to '$ftpHost' ($@ : $!)\n" );
    exit( -1 );
}
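In case it helps narrow things down, here's an untested sketch of the same connect with Net::FTP's built-in debug tracing turned on, an explicit timeout, and a simple retry; the Debug/Timeout values and the retry count are my own additions, not settings from the production script. Logging $@ and $! separately should make it clearer whether the failure report is coming from Net::FTP itself or straight from the OS:

```perl
use strict;
use warnings;
use Net::FTP;

my $ftpHost = 'ftp.meteorlogix.com';
my $ftp;

# Try the connect a few times; on each failure, log $@ (Net::FTP's own
# error string) and $! (the OS errno text, e.g. "Bad file number")
# separately so it's obvious which layer is reporting the problem.
for my $attempt ( 1 .. 3 ) {
    $ftp = Net::FTP->new( $ftpHost, Debug => 1, Timeout => 30 );
    last if $ftp;
    warn "Attempt $attempt: Net::FTP says '$@', errno says '$!'\n";
    sleep 10;
}

die "Couldn't connect to '$ftpHost' after 3 attempts\n" unless $ftp;
```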
And the actual message that gets logged is "Couldn't connect to 'ftp.meteorlogix.com' ( : Bad file number)".
The system it runs on is Solaris. Oddly, the exact same script runs fine in our dev environment (connecting to the same remote server, once an hour), and it has also run without errors every time I've run it manually.
Any ideas??
Dave