
Edit file in place and force it to be read again

Status: Not open for further replies.

perlnewbie9292 (Programmer)
Jul 15, 2009
Hello Perl experts, I am not sure I am going about this the best way, and I am hoping that someone here can point me in the right direction. What I am trying to do, so far with no success, is this:

I have a .txt file which gets entries with directory paths inserted into it. What I would like to do is read that file and process each path one at a time (copying or moving the file at that path to another directory), then remove the entry I have just processed from the .txt file.

I have two questions:
With the code I have so far, I am not sure how to remove the line I have just finished processing from the .txt file. I also need the script to reread the file, so that when/if new lines get added I am sure I am reading from an updated file.

So, once again, in simpler terms this is what I am trying to do:

* read the file with the paths
* remove the /local/ from each path in the .txt file
* copy or move the file at each path to the new location, without the /local
* update the .txt file so the line I have just processed is removed


This is the code I have; any help would be greatly appreciated, as I am a bit stuck.

Code:
#!/usr/bin/perl
use strict;
use warnings;
use File::Path;

my $workingFile = "/tmp/test.txt"; # this file contains fully qualified paths

open(my $txt, '<', $workingFile) or die "Cannot open file $workingFile: $!";
while (my $line = <$txt>) {
    chomp $line;
    # the destination is the same path with the leading /local removed
    (my $copyTo = $line) =~ s/\/local//;
    if ( -d $copyTo ) {
        print "$copyTo exists\n";
    } else {
        print "$copyTo does not exist, creating now\n";
        mkpath($copyTo) or die "Could not create $copyTo using mkpath: $!\n";
    }
    print "Going to run the copy command to copy: $line to --> $copyTo\n";
    system("/bin/cp", "-rf", $line, $copyTo) == 0
        or die "Copy of $line failed: $?";
    print "Copy was successful, sleeping\n";
    sleep 100;
}
close($txt);
 
If you mean that the file may be written to by another process at the same time, then you need file locks or some other way of synchronising the other process with yours.
Also, the only way of doing what you want (which is exactly what Perl's in-place editor does) is to write line by line to a temporary file, discarding the lines that have to be discarded, then at the end close the two files and rename the temporary file over the original one.
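Something along these lines, as a minimal sketch: the flock() calls are an assumption about how the other process behaves (it must lock the file before appending too, or the lock buys you nothing), and process_path() is a placeholder for your copy/move logic.

Code:
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);
use File::Copy qw(move);

my $workingFile = '/tmp/test.txt';
my $tempFile    = "$workingFile.tmp";

# take an exclusive lock so the writer cannot append mid-rewrite
open(my $in, '+<', $workingFile) or die "Cannot open $workingFile: $!";
flock($in, LOCK_EX) or die "Cannot lock $workingFile: $!";

open(my $out, '>', $tempFile) or die "Cannot open $tempFile: $!";
while (my $line = <$in>) {
    chomp $line;
    next if process_path($line);   # processed OK: discard the line
    print $out "$line\n";          # keep lines we could not process
}
close($out);

# replace the original with the rewritten file while we still hold the lock
move($tempFile, $workingFile) or die "Cannot rename $tempFile: $!";
close($in);

sub process_path {
    my ($path) = @_;
    # ... copy/move the file as in your script; return true on success
    return 1;
}

Run that in a loop (or from cron) and each pass starts from a freshly updated file, which also answers your "reread the file" question.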

Franco
: Online engineering calculations
: Magnetic brakes for fun rides
: Air bearing pads
 
So what you are doing is using a text file as a work queue. Some other process(es) append data to the file, and your script consumes it.

As prex1 has noted, unless you take precautions your file will get corrupted.

Have you considered using a database table? The DBMS will handle all the serialisation for you without having to flock() about.
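As a rough sketch of what I mean, assuming DBD::SQLite is available (the queue table, column names and database path are all made up):

Code:
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:SQLite:dbname=/tmp/queue.db', '', '',
                       { RaiseError => 1, AutoCommit => 1 });

$dbh->do('CREATE TABLE IF NOT EXISTS queue (
              id   INTEGER PRIMARY KEY,
              path TEXT NOT NULL
          )');

# producers INSERT paths; this consumer grabs and deletes them one at a time
while (1) {
    my ($id, $path) = $dbh->selectrow_array(
        'SELECT id, path FROM queue ORDER BY id LIMIT 1');
    if (defined $id) {
        # ... copy/move $path here ...
        $dbh->do('DELETE FROM queue WHERE id = ?', undef, $id);
    } else {
        sleep 5;    # queue empty, poll again
    }
}

SQLite locks the database file for you, so the producers and the consumer don't trample on each other, and "remove the processed entry" becomes a one-line DELETE.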

Steve

[small]"Every program can be reduced by one instruction, and every program has at least one bug. Therefore, any program can be reduced to one instruction which doesn't work." (Object::perlDesignPatterns)[/small]
 
Named pipes are unpredictable when multiple processes are reading from and writing to them.

If two processes write to a named pipe at the same time, the process that reads from it will get both inputs, but it cannot tell which came from which process, and the order in which it receives the data is unpredictable.

If two processes read from it at the same time while a process writes to it, either one process gets all the data and the other gets none, or each gets parts of it; whatever data one process reads, the other will never see.

(just my observations from tinkering around when I discovered named pipes... mainly just using cat and echo on multiple terminals).

Cuvou.com | My personal homepage
Code:
perl -e '$|=$i=1;print" oo\n<|>\n_|_";x:sleep$|;print"\b",$i++%2?"/":"_";goto x;'
 
Admittedly I was assuming one reader and one writer, but even when that wasn't the case I had never encountered the problems you describe.

I just tried it in a Linux VM with a number of looping processes writing to the same pipe and one reader, and the written lines all seem to be handled atomically. If I introduce a second reader, though, the problem you describe occurs...
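On the reading side, the single-consumer pattern would look something like this in Perl (a sketch; /tmp/workqueue.fifo and the 0700 mode are assumptions):

Code:
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(mkfifo);

my $fifo = '/tmp/workqueue.fifo';
unless (-p $fifo) {
    mkfifo($fifo, 0700) or die "mkfifo failed: $!";
}

# open() blocks until a writer opens the other end
open(my $fh, '<', $fifo) or die "Cannot open $fifo: $!";
while (my $line = <$fh>) {
    chomp $line;
    print "got: $line\n";   # copy/move the path here
}
close($fh);

Note that when the last writer closes its end the reader sees EOF, so a long-running consumer would reopen the FIFO in an outer loop.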

Another option is to tail -f the input file and pipe that to the perl script's stdin...
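The Perl side then just reads STDIN; something like this sketch (consume.pl is a made-up name, invoked as tail -f /tmp/test.txt | ./consume.pl):

Code:
#!/usr/bin/perl
# consume.pl -- processes paths from stdin as tail -f feeds them in
use strict;
use warnings;

while (my $line = <STDIN>) {
    chomp $line;
    next unless length $line;
    (my $copyTo = $line) =~ s/\/local//;
    print "would copy $line to --> $copyTo\n";   # real copy/move goes here
}

The original file never gets rewritten this way, so the "remove the processed line" step disappears; you would need to truncate or rotate the file occasionally instead.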

Annihilannic.
 