
Using Perl, can I pull all the links from a web page?


spewn (Programmer, joined May 7, 2001, 1,034 messages)
I want to run a search of all the links on a particular web page... is there a module or something else I can use to return all the links?

- g
 
CPAN is your friend. The following is from the EXAMPLES section of the HTML::TokeParser documentation:

This example extracts all links from a document. It will print one line for each link, containing the URL and the textual description between the <A>...</A> tags:

use strict;
use warnings;
use HTML::TokeParser;

# Parse the document named on the command line (defaults to index.html)
my $p = HTML::TokeParser->new(shift || "index.html")
    or die "Can't open document: $!\n";

# Visit each <a> tag and print its href and link text, tab-separated
while (my $token = $p->get_tag("a")) {
    my $url  = $token->[1]{href} || "-";
    my $text = $p->get_trimmed_text("/a");
    print "$url\t$text\n";
}
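
Since the question was about a live web page rather than a local file, here is a minimal sketch of the same idea applied to a fetched URL. It assumes LWP::Simple is installed alongside HTML::TokeParser, and the URL shown is just a placeholder:

use strict;
use warnings;
use LWP::Simple qw(get);
use HTML::TokeParser;

# Placeholder URL; pass your own on the command line
my $url  = shift || "http://www.example.com/";
my $html = get($url) or die "Couldn't fetch $url\n";

# HTML::TokeParser also accepts a reference to a string of HTML
my $p = HTML::TokeParser->new(\$html);

while (my $token = $p->get_tag("a")) {
    my $href = $token->[1]{href} || "-";
    my $text = $p->get_trimmed_text("/a");
    print "$href\t$text\n";
}

If you need absolute URLs, the URI module's new_abs method can resolve relative hrefs against the page's address.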

------------------------------------------
- Kevin, perl coder unexceptional! [wiggle]
 