
LWP Question


luckydexte (Programmer, joined Apr 26, 2001, 84 messages, US)
Hello All,

I need to write a Perl script that will send a URL to a server; the server will then process the URL and send a URL back to my server with information.

For example, I will send:

The server will send me back:

My script needs to run from the command line. It will send a URL to a server; at that point the server knows what to do. I then need to capture the URL sent back.

Any suggestions?

Thanks,

Brandon
 
Check out the recent post about "grabbing web pages..". It shows a simple example of using LWP and offers other options that will let you play with cookies.
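In case it helps, here is a minimal sketch of the cookie-aware route that post describes, using LWP::UserAgent with HTTP::Cookies; the URL and cookie file name are just placeholders, not anything from your setup:
Code:
#!/usr/bin/perl -w
# Minimal sketch: a GET request that keeps cookies between runs.
# Assumes LWP and HTTP::Cookies are installed; URL and file are placeholders.
use LWP::UserAgent;
use HTTP::Request;
use HTTP::Cookies;

my $ua = LWP::UserAgent->new;
$ua->cookie_jar(HTTP::Cookies->new(file => 'cookies.txt', autosave => 1));

my $req = HTTP::Request->new(GET => 'http://server/some_page');
my $res = $ua->request($req);
print $res->content if $res->is_success;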

HTH


keep the rudder amid ship and beware the odd typo
 
goBoating,

I checked out the post you suggested, but I think I am looking for something different. I am sending a server some parameters, along with a return URL; the server processes these parameters and sends them back to my server.

Example

This is sent to a server and processed, and then the return URL is sent back. I need a way to send a URL to this server from the command line and pick up the return URL; both ends will be Perl scripts, of course.

If you have any other ideas goBoating I would love to hear them.

Thanks,

Brandon
 
There's a thread in this forum about using a program called curl, which appears to do exactly what you want.
Tracy Dryden
tracy@bydisn.com

Meddle not in the affairs of dragons,
For you are crunchy, and good with mustard.
 
You shouldn't think in terms of getting a return URL, but in terms of getting a return text string, which might happen to contain a URL. (I know this is picky, but it might help to see things more clearly.) Also, when you say "send" a URL to the server, do you mean *request* that URL from the server, as in a GET request by a browser? In that case, the server will output the text string of the URL you want returned.

If this is the case, then try this (basically the script that goBoating mentioned):
Code:
#!/usr/bin/perl -w
use LWP::Simple;
$my_url = "[URL unfurl="true"]http://server?Param=func&fname=Brandon&returnURL=http://returnserver&lname=Goodman";[/URL]
$return_string = get($my_url);
// debug by printing out the return
print $return_string;

// or do whatever else you need to do with $return_string

The above script will run perfectly well from the command line. In fact, in its present form, you could execute the script in this manner:

./script_name > outputfile.txt

to capture that string to a text file. Or if you wanted the command-line utility to take the URL as an argument, you could just do something like:

Code:
#!/usr/bin/perl -w
use LWP::Simple;
$my_url=shift(@ARGV);
chomp($my_url);
$return_string = get($my_url);
# debug by printing out the return
print $return_string;

Then you could call it from the command line:

./script_name "http://server?Param=func&fname=Brandon&returnURL=http://returnserver&lname=Goodman" > outputfile.txt

This seems like a perfectly standard operation to me. If you *don't* mean to use that initial URL as a GET request, what do you mean? If you *don't* mean take that returned string as a variable and do something with it, then what do you mean?
 
rycamor,

Thanks for the response. Let me try to be more clear. I am sending a GET request to a server. However, I am not doing it from a browser; I would be doing it from a script run at the command line.

The script that goBoating supplied works, but I am receiving the whole HTML file. What I am trying to get at is the URL of that HTML file being returned (it contains some parameters). Actually, this script does work, because the data I need is in the HTML file; I can just parse out the information I need.

I have come across a new problem. The script above will work for what I need to do. However, I need to send the GET request in secure mode, meaning over https: as opposed to http:. It does not seem to like this.

The server is set up with an SSL certificate from Verisign. When I run the script and change the URL to https:, it does not even try to access the Internet. Any ideas or suggestions?

Thanks,

Brandon
 
For secure GET requests, use the Perl module SSLeay, or use cURL, which comes with a nice Perl module.
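If you go the SSLeay route, the nice thing is that LWP itself barely changes: once an SSL layer (for example Crypt::SSLeay, one common packaging) is installed, an https: GET is a sketch like this; the URL is a placeholder:
Code:
#!/usr/bin/perl -w
# Sketch only: an https: GET via LWP. Assumes an SSL layer such as
# Crypt::SSLeay is installed so LWP can handle https URLs.
use LWP::UserAgent;
use HTTP::Request;

my $ua = LWP::UserAgent->new;
my $req = HTTP::Request->new(GET => 'https://server?Param=func');
my $res = $ua->request($req);
if ($res->is_success) {
    print $res->content;    # the returned page text
} else {
    print "Request failed: ", $res->status_line, "\n";
}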

So if I understand right, you are getting the URL returned somewhere *inside* the text of the HTML page being returned. Then from there it's just a matter of using some regular expressions to isolate that string.
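For example (a sketch only; the exact pattern depends on how the page embeds the URL):
Code:
# Sketch: isolate the first http:// URL embedded in the returned page text.
# $return_string is the page fetched earlier; adjust the pattern to match
# the actual markup around the URL you want.
my ($found_url) = $return_string =~ m{(http://[^\s"'<>]+)};
print "$found_url\n" if defined $found_url;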

I think you will find that curl will do the job very nicely. In fact, you might not even need Perl if you just use curl from the command line and pipe the output through a shell regex. Something like:

curl {url_here} | grep http:// | {regex_here}
 