It would depend on the type of script you have. If you have a script that makes heavy use of system commands, it would take considerably longer than if you just had something that, say, opens a text file and prints to it.

You might want to post your code.

Thanks.

-Vic
I don't see anything wrong with that. Have you tried to convert it and seen what happens? Do you get an error message? Otherwise, I don't see a problem with it.
I tried it on my machine, which is Win32 running ActiveState Perl, and it found nothing wrong with it.
I see a lot of people using syntax like this and I'm a bit puzzled:
[tt]
# read in the file
@input = <STDIN>;
# trim off the newline from the end of each line
chop (@input);
# process the file
while ($input[$count++] ne "") {
[tab]# ... process $input[$count - 1] here ...
}
[/tt]
to read through the contents of a file and process each line, and there's nothing wrong with it (it works fine). However...
It seems to me that if you have a large file, you will use a lot of memory that way; wouldn't it be better (so that you can deal with files of any size) to do it like this?
[tt]
while (<>) {
[tab]# process each line of the file
[tab]print; # print the processed line
}
[/tt]
What am I missing here? Is there some obvious advantage to [tt]@input=<STDIN>;while($input[$count++])[/tt] that I'm just not seeing?
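For concreteness, here is a minimal sketch of the streaming version I have in mind; the [tt]uc[/tt] call is just a made-up stand-in for whatever per-line processing you'd actually do:
[tt]
#!/usr/bin/perl -w

# read one line at a time; only the current line is ever held in memory
while (my $line = <STDIN>) {
[tab]chomp $line; # strip the trailing newline (safer than chop)
[tab]last if $line eq ""; # stop at an empty line, like the "" test above
[tab]print uc($line), "\n"; # placeholder processing: uppercase the line
}
[/tt]
This way Perl only ever holds one line at a time, so the size of the input file shouldn't matter.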