Hello rcsen,
do you mean from the browser's perspective or from the server's perspective? Is it a CGI-generated page?
This is easily done if it is a CGI-generated page. Just grab the time, using the time function, at the beginning of the code and again at the end and subtract. That will give you the number of seconds the program took to run.
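A minimal sketch of that start/end timing inside a hypothetical CGI script (the page body here is just a stand-in):

```perl
#!/usr/local/bin/perl
# grab the time at the start of the script
my $started = time;    # whole seconds since the epoch

print "Content-type: text/html\n\n";
# ... generate the actual page here ...
print "<html><body>Hello</body></html>\n";

# grab it again at the end and subtract
my $elapsed = time - $started;
print "<!-- generated in $elapsed second(s) -->\n";
```

Since time only has one-second resolution, a fast script will usually report 0 or 1.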
Hi goBoating,
What I intended to do was write a Perl script such that if I give any URL as a parameter, the code should be able to calculate the time to load. I should not have to add anything to the existing code. Could you give me some idea for this?
Well, you could get the file size of the page and of all its dependencies with Perl from a URL, then calculate the average time it would take at different connection speeds based on those sizes...
Or you could actually have your Perl program download these things, but that would measure the speed from the server your script is on, and it might be a little slower depending on processor speed and what not... adam@aauser.com
I think thendal might be barking up the right tree. You can use LWP to retrieve a page, but you have to get the graphics separately. I have not run this, but it might be close to what you are looking for......
#-- START CODE --#
#!/usr/local/bin/perl
use LWP::Simple;
$url = 'http://...';   # the original URL was cut off by the forum formatting
$started = time;
# this will get the HTML text
$content = get($url);
# parse the IMG SRC tags from the HTML
while ($content =~ /<IMG SRC="(.*?)"/gis)
{
    # retrieve each IMG SRC
    get($1); # I have not tried this, but, I don't see
             # why it would not work
}
$stopped = time;
$elapsed = $stopped - $started;
#--- END CODE --#
One possibility: if the images' locations are written as relative rather than absolute URLs, you'll have to strip the document name off the original URL and prepend the remaining path to each image path. You'll also have to check first whether each one is absolute or relative. Not too much trouble, though.
"If you think you're too small to make a difference, try spending a night in a closed tent with a mosquito."
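The relative-URL handling described above can be sketched with the URI module (it ships alongside LWP in the libwww bundle); the base page address below is just a placeholder:

```perl
#!/usr/local/bin/perl
use URI;

# hypothetical base page and some IMG SRC values pulled from its HTML
my $base = 'http://www.example.com/dir/page.html';
for my $src ('pics/logo.gif', '/shared/bar.gif',
             'http://other.example.com/x.gif') {
    # new_abs resolves a relative reference against the base URL
    # and leaves an already-absolute reference alone
    my $abs = URI->new_abs($src, $base);
    print "$src => $abs\n";
}
```

Each resolved URL can then be handed straight to get().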
$started = time;
print "Started: $started\n";
# this will get the HTML text
$content = get($url);
print "Got content\n";
# parse the IMG SRC tags from the HTML
while ($content =~ /<IMG.*?SRC="(.*?)"/gis)
{
    # retrieve each IMG SRC
    print "Getting $1\n";
    get($1); # I have not tried this, but, I don't see
             # why it would not work
}
$stopped = time;
print "Stopped: $stopped\n";
$elapsed = $stopped - $started;
print "Elapsed: $elapsed\n";
Hi goBoating,
Thank you very much. It works fine. Could you tell me what "nonleap seconds" means? The documentation for the time function says it returns nonleap seconds since Jan 1, 1970.
The "nonleap" refers to leap seconds, not leap years. Leap years (which have a February 29, for 366 days) are counted normally; what time ignores are the occasional leap seconds inserted into UTC. In other words, it treats every day as exactly 86,400 seconds.
Sorry for disturbing you again.
The elapsed time is returned in seconds. Is there any function to find the same in milliseconds? All my pages load in about 1 or 2 seconds.
Can I get the time function to measure down to milliseconds? The time function in Java gives its output in milliseconds.
Sorry, the time function only returns whole seconds. I guess you could run a series of loops on each page: maybe get each one 10 or 20 or 100 times and use the aggregate value. Doing this to your own server is reasonable, but I would [red]not[/red] do it to someone else's server. It would be a real annoyance.
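A rough sketch of that looping approach, assuming a page on a server you control (the URL below is a placeholder):

```perl
#!/usr/local/bin/perl
use LWP::Simple;

my $url  = 'http://www.example.com/';  # placeholder -- point at your own server
my $runs = 20;

my $started = time;
for (1 .. $runs) {
    get($url);    # fetch the same page $runs times
}
my $elapsed = time - $started;

# the aggregate smooths out the one-second resolution of time
printf "%d fetches in %d seconds, average %.3f s per fetch\n",
       $runs, $elapsed, $elapsed / $runs;
```

Twenty runs against a page that loads in 1-2 seconds gives you roughly tenth-of-a-second resolution on the average.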
If you're interested in time in increments of less than a second, the Perl Cookbook recipe 3.9, "High-Resolution Timers," suggests downloading the Time::HiRes module from CPAN - that is, if your system supports both "syscall" in Perl and a system call like gettimeofday(2). I just did "man gettimeofday" and a manpage popped up, so I guess I have gettimeofday - I'm on a Red Hat 6.1 Linux system.
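With Time::HiRes installed, the timing can be done at sub-second resolution; a minimal sketch (the URL is again a placeholder):

```perl
#!/usr/local/bin/perl
use LWP::Simple;
use Time::HiRes qw(gettimeofday tv_interval);

my $url = 'http://www.example.com/';  # placeholder

# [gettimeofday] captures [seconds, microseconds] as an array ref
my $t0 = [gettimeofday];
get($url);
# tv_interval returns the elapsed time as a floating-point number of seconds
my $elapsed = tv_interval($t0);

printf "Fetched in %.3f seconds (%.0f ms)\n", $elapsed, $elapsed * 1000;
```

tv_interval takes the starting timestamp and defaults the end point to "now," so no second gettimeofday call is needed.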