Tek-Tips is the largest IT community on the Internet today!


Tie::File and memory consumption on Windows-platform, any ideas?

Status
Not open for further replies.

Windrider

Programmer
Jul 26, 2002
8
FI
Hi,

I would like to ask a question about Tie::File. I'm using it with
ActivePerl on the Windows platform, trying to write an XML-based event log from
my application.

With a tied filehandle it's easy to add lines in the middle of the file, remove
lines, and so on. The well-formedness of the logfile can also be guaranteed.
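As a sketch of that idea (the filename and the <LOG>/<EVENT> tag names here are illustrative, not from the thread), each new entry can be spliced in just before the closing root tag, so the file is well-formed after every write:

```perl
use strict;
use warnings;
use Tie::File;

# Illustrative file and tag names. Splicing each entry in just before
# the closing root tag keeps the file well-formed at all times.
tie my @log, 'Tie::File', 'events.xml' or die "failed to tie: $!";

@log = ('<LOG>', '</LOG>') unless @log;    # create the root element once

# Insert before the last element (the closing tag).
splice @log, $#log, 0, '<EVENT>something happened</EVENT>';

untie @log;
```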

The problem is that, for some reason, the memory consumption of the logging feature
increases steadily, so if the application runs for a long time and there is a lot
to log, problems eventually occur.

I have started to suspect that the memory consumption has something to do with
splice, when lines are spliced into the file. Is this a problem only on the Windows
platform, and have you seen such behaviour?

Of course there are XML-writing modules available, but since this is a custom log and
some special functionality is needed, this approach would have been more flexible.

Here's a small script which should demonstrate the problem (while it runs, the
memory consumption of perl.exe can be watched in Task Manager). There is a sleep
call so it doesn't eat memory too quickly:

use strict;
use warnings;
use Tie::File;

my $row      = 0;
my $sleepCnt = 0;

tie my @file, 'Tie::File', 'test.xml' or die "\n\n-> failed to tie file\n\n";

splice @file, $row++, 0, "<TEST>";
splice @file, $row,   0, "</TEST>";

while (1) {
    # Each new entry is inserted just before the closing </TEST> tag.
    splice @file, $row++, 0, qq{<TAG id="testi tag">Some added text $row</TAG>};
    if (++$sleepCnt == 100) {
        $sleepCnt = 0;
        print "\nSleeping...";
        sleep(1);
    }
}
untie @file;    # never reached; the loop above runs forever
 
From what I gather, tie reads the whole file into memory by slurping it (thanks icrf). That would be less than ideal for logging in a busy, long-running application. Also, because it's in memory, the file on disk isn't necessarily up to date until after you untie it.

What sort of logging features do you have that you can't just write the events to a text file, or even a rolling set of text files?
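A minimal sketch of that append-only approach (the filename and the log_event helper are my own illustrative names, not from the thread) keeps memory flat because nothing is held between writes:

```perl
use strict;
use warnings;

# Hypothetical append-only event logger: open, append, close.
# Nothing is cached between calls, so memory use stays constant.
my $logfile = 'events.log';

sub log_event {
    my ($msg) = @_;
    open my $fh, '>>', $logfile or die "Cannot open $logfile: $!";
    print $fh scalar(localtime), " $msg\n";
    close $fh;
}

log_event('application started');
```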

--Paul
 
Tie::File doesn't slurp. From the Tie::File docs:

"The file is not loaded into memory, so this will work even for gigantic files."
 
Ish,

Well caught.

Windrider,
Apologies for misreading a post; it could have misdirected you.

thread219-818625

That thread was to do with a config file based on Tie::File.

Slurping notwithstanding, memory usage is still a problem here, especially for logging a busy, long-running application. Changes to the file have to be recorded somewhere (I'm thinking memory) and will be written to disk after the untie. Can you untie the file at regular intervals (after so many accesses)?

Again apologies for misleading anyone

--Paul
 
Ignore my last post; I still wasn't reading carefully enough.

Changes to the file are reflected immediately.

As far as the documentation goes, your problem wouldn't appear to be with Tie::File at all (that is, if it does what it says on the tin).

An implementation dependency, perhaps?

--Paul
 
. . . and it *does* have a very nice tin!

It seems that Dominus has gone to great lengths to make sure it doesn't eat too much memory. By default it uses at most about 2 mebibytes, which shouldn't trouble your system (unless it's a really old one).

Unfortunately, Windrider, I can't see anything in that code that would explain that memory consumption. All I can think of is to suggest you make sure that both Perl and Tie::File are latest/recent versions. Maybe someone else will have a better suggestion.
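For the record, that cache bound is tunable: Tie::File documents a `memory` option (in bytes; the default is about 2,000,000) that caps the read cache. A minimal sketch with an illustrative filename and cap of my own choosing:

```perl
use strict;
use warnings;
use Tie::File;

# Cap Tie::File's read cache explicitly via the documented 'memory'
# option (in bytes; the default is roughly 2 MB). 'capped.xml' and
# the 500_000-byte cap are illustrative choices.
tie my @file, 'Tie::File', 'capped.xml', memory => 500_000
    or die "failed to tie file: $!";

push @file, '<TEST>', '</TEST>';

untie @file;    # flush and release the tie
```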
 
