
How can I design a script to push or pull results from commands run on one server to another?

(OP)
For example, is there a way I can push the results from a script run on a local server into a script on a remote server that does something with the results, without having to land the results on disk?

Or is there a way I can pull the results from a script/command run on a remote server into a script on a local server that does something with the results without having to land the results on disk?

Can I use a pipe or something else to accomplish this? If so, could someone provide me with an example?

The problem I am trying to solve is a lack of disk space. I have data in a database on one server that I want to unload and load into a database on another server. I know how to run the unload command on the one server and the load command on the other server and I know how to put these in UNIX scripts. What I do not know is how to run the unload in a script that pushes the table output through to the remote server load program to load the results into the remote database without having to land the unloaded data on disk in an interim file during the process. I am fine to push or pull the data, whichever is easiest. If I have the option to do this either way, then I can decide which server has the most resources available to run this and use the push or pull approach based on that.
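In outline, the kind of thing I am after looks like this (a sketch only: `unload.sh` and `load.sh` are placeholder names, and it assumes key-based ssh between the two servers; the runnable last line uses stand-in commands to show the mechanics):

```shell
#!/bin/sh
# Push pattern: stream a local command's stdout straight into a remote
# command's stdin over ssh -- nothing is written to disk on either side.
#
#   ./unload.sh | ssh user@remotehost './load.sh'      # push from local
#   ssh user@remotehost './unload.sh' | ./load.sh      # pull to local
#
# Runnable stand-in showing the same mechanics with generic commands:
printf 'row1\nrow2\nrow3\n' | wc -l
```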

Thanks in advance.

RE: How can I design a script to push or pull results from commands run on one server to another?

You should look in the direction PHV is suggesting. In Oracle you can create a database link between two separate databases. Then your data transfer can be as simple as this...

CODE

INSERT INTO LOCAL_TABLE
SELECT * FROM REMOTE_TABLE; 

I'm sure other databases can do similar things.

RE: How can I design a script to push or pull results from commands run on one server to another?

(OP)
Netezza does not support access to remote databases the way Oracle and other databases do; it is not as sophisticated as many other relational databases in this respect. I am trying to find a way to migrate data into it without having to land the data in files. We do not have the storage for that.

RE: How can I design a script to push or pull results from commands run on one server to another?

OK, so Netezza is one of your databases; what is the other? Is it Oracle? If so, it looks like Oracle can integrate with Netezza fairly transparently. See the link.

http://docs.oracle.com/cd/E17904_01/integrate.1111...


RE: How can I design a script to push or pull results from commands run on one server to another?

Also take a look at the Netezza Data Loading Guide PDF provided by IBM. It looks like the nzload utility can pull data from an ODBC data source, which pretty much opens up most relational databases.

RE: How can I design a script to push or pull results from commands run on one server to another?

(OP)
The other database is DB2 LUW, so the Oracle information is not relevant at this time. We do have Oracle databases, though, and may need to load data from them at some point, so this is very helpful information that I may well need in the future. Thank you so much.

Yes, thanks, ODBC is an option, and I have been reading and working with the Netezza Data Loading Guide. Some of the tables involved in this project are quite large, with billions of rows. ODBC can be an option for small tables whose data types match, but I am running into issues getting columns with date or timestamp data types to work correctly. It is not a simple matter of INSERT INTO ... SELECT FROM, both because I cannot connect to both databases at the same time and because the date and timestamp formats of the two databases are different. I have, however, figured out that there are options I can use with the DB2 LUW EXPORT or DB2HPU (High Performance Unload) utility and the NZLOAD utility to work around the date and timestamp conversion issue.

Testing of the DB2 LUW unload utilities and the Netezza NZLOAD utility has shown them to be much faster than the ODBC interface. In addition, I think I may be able to use a pipe between the two utilities to avoid the huge storage requirement of landing billions of unloaded rows before loading them. Finally, if I use scripts to accomplish this, I can schedule them against our current production environment and have them stop and restart using date ranges by year when migrating huge amounts of data that span many years. That way, I should be able to manage this without a large impact on our current production operations.

The DB2HPU utility does not use the database engine to unload the data, so the combination of DB2HPU and NZLOAD looks like the best solution. I plan to search this web site and the web for a simple script example that uses a pipe from one server to another. The utilities let me connect to a local DB2 LUW database and unload data from a table, then use the NZLOAD utility and the Netezza ODBC interface you mention to load rows into tables in a remote Netezza database. I hope a pipe will work for all of this, so we can avoid acquiring additional storage to support the migration. Batch scripts can also be scheduled, and unload SQL parameters can break the huge logical unit of work into smaller pieces by year, giving me the flexibility to run all of this in a way that plays nicely with current production operational requirements and service level agreements.
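In outline, the named-pipe (FIFO) approach I am considering looks like this. The DB2HPU and nzload invocations in the comments are hypothetical (flags vary by version, so check the manuals), and since nzload is a client that connects to the Netezza host over the network, both ends of the pipe can run on the DB2 server. The runnable part below uses generic stand-in commands to show the pattern:

```shell
#!/bin/sh
# Named-pipe pattern: the loader reads the FIFO while the unloader writes
# to it, so the full data set never lands on disk as a file.
# The real invocations would look roughly like this (hypothetical flags):
#   mkfifo /tmp/unload.pipe
#   nzload -host nzhost -db TARGETDB -t MYTABLE -df /tmp/unload.pipe -delim '|' &
#   db2hpu -f unload_to_pipe.ctl   # control file directs output to the FIFO
#   wait; rm /tmp/unload.pipe
#
# Runnable stand-in with generic producer/consumer commands:
pipe=/tmp/demo.$$.pipe
count=/tmp/demo.$$.count
mkfifo "$pipe"
wc -l < "$pipe" > "$count" &        # consumer (stands in for nzload)
printf 'a\nb\nc\n' > "$pipe"        # producer (stands in for the unload)
wait
cat "$count"                        # number of rows that flowed through
rm -f "$pipe" "$count"
```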

Thanks for everyone's helpful information and ideas.
