NeilFawcett
Programmer
I was playing around with ways to speed up my file updates.
My original code was along these lines:
open LOCK,'>lock.lock';
flock LOCK,2;
open AC,"test.txt";
flock AC,1;
@rec=<AC>;
close AC;
# Process @rec in someway!
open AC,">test.txt";
flock AC,2;
print AC @rec;
close AC;
close LOCK;
unlink('lock.lock');
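For clarity, here is the same lock-file logic with the Fcntl constants spelled out and error checks added; this is just an annotated sketch of the block above (the real script uses the numeric flock values):

use Fcntl qw(:flock);    # exports LOCK_SH and LOCK_EX

open LOCK, '>lock.lock'  or die "can't open lock.lock: $!";
flock LOCK, LOCK_EX      or die "can't lock lock.lock: $!";   # was: flock LOCK,2

open AC, 'test.txt'      or die "can't read test.txt: $!";
flock AC, LOCK_SH;       # shared lock while reading (was: flock AC,1)
@rec = <AC>;
close AC;

# Process @rec in some way!

open AC, '>test.txt'     or die "can't write test.txt: $!";   # this open truncates the file
flock AC, LOCK_EX;       # exclusive lock while writing (was: flock AC,2)
print AC @rec;
close AC;

close LOCK;
unlink('lock.lock');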
I tried replacing that block with this one, which runs almost twice as fast:
open AC,"+<test.txt" or open AC,"+>test.txt";
flock AC,2;
seek AC,0,0;
@rec=<AC>;
# Process @rec in someway!
seek AC,0,0;
print AC @rec;
truncate(AC, tell(AC));
close AC;
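With error checks added purely so a failed open or flock would show up, the same block reads like this (a sketch only; the logic is unchanged):

open AC, '+<test.txt'
    or open AC, '+>test.txt'
    or die "can't open test.txt either way: $!";
flock AC, 2              or die "can't lock test.txt: $!";    # 2 = LOCK_EX
seek AC, 0, 0;
@rec = <AC>;

# Process @rec in some way!

seek AC, 0, 0;
print AC @rec;
truncate(AC, tell(AC))   or die "can't truncate test.txt: $!";
close AC;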
Now, when I put this faster version into practice in a script that gets used quite heavily, the file would periodically get blanked out. Some debugging suggested that the first open statement was failing (even though the file had previously been created), so the "or" fell through to the second open statement, which creates a new, empty file.
I then replaced the open line with:
if(-e "test.txt"){open AC,"+<test.txt"}else{open AC,"+>test.txt";}
This seems far more stable. In fact, where the first version was nuking the file every 10-20 minutes, this one has now run for 20 hours without clearing the file.
Any ideas for this issue?
Also, should I put the "lock.lock" code back around this entire update? The idea is to optimise this logic, so if I can get rid of unnecessary stuff, good. But I don't want to risk file corruption!
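The kind of thing I'm hoping is safe (a rough sketch only, assuming sysopen with O_RDWR|O_CREAT really does open the file read-write without ever truncating it, and that flock on the data file itself covers the whole read-modify-write):

use Fcntl qw(:flock O_RDWR O_CREAT);

# Open read-write, creating the file if it doesn't exist, but never truncating it.
sysopen(AC, 'test.txt', O_RDWR | O_CREAT) or die "can't open test.txt: $!";
flock(AC, LOCK_EX)                        or die "can't lock test.txt: $!";

seek AC, 0, 0;
@rec = <AC>;

# Process @rec in some way!

seek AC, 0, 0;
print AC @rec;
truncate(AC, tell(AC));
close AC;    # closing the handle releases the lock

If that is sound it would remove both the "-e" test and the separate lock file, but I'd like a second opinion before trusting it.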
PS: This is on a Unix box!