Data from SQL Server to Teradata

TDKID (TechnicalUser) (OP)
28 Jul 03 5:44
Hi,
   I have my source data in a SQL Server DB and need to pump it into a Teradata OLAP DB. The initial load is about 150 GB, and the daily load will be about 5 GB.

Which loading approach should I use?

-Extract data from SQL Server through DTS into flat files and load with the Teradata utilities (FastLoad, MultiLoad)
-Extract and load the data directly using DTS
-Go with a third-party tool such as Informatica


It would help if someone could explain which of these would be the most efficient process.

-TIA
BillDHS (Programmer)
28 Jul 03 10:33
You can use a Teradata utility called the OLE DB Access Module to go directly from SQL Server to Teradata. It skips the export-to-flat-file step and uses FastLoad to load the table.  The newer version also generates MultiLoad and BTEQ scripts.

I just ran an export job on a 17-million-row, 1.6 GB table:
 FastExport to flat file: 70 minutes
 OLE DB Access Module, Teradata to SQL Server: 120 minutes
 DTS, Teradata to SQL Server: under 180 minutes
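For the flat-file route mentioned in the original question, the shape of a FastLoad job looks roughly like the sketch below. This is a minimal illustration only: the logon string, file path, table, and column names are all hypothetical, and the real script would need to match the actual DTS export layout.

```sql
/* Minimal FastLoad script sketch -- tdpid, user, table, and file names are
   hypothetical placeholders, not from the thread. */
SESSIONS 8;
LOGON tdpid/loaduser,password;

DROP TABLE stage_db.sales_err1;
DROP TABLE stage_db.sales_err2;

SET RECORD VARTEXT "|";      /* pipe-delimited flat file produced by DTS */

DEFINE
   sale_id  (VARCHAR(18)),
   sale_dt  (VARCHAR(10)),
   amount   (VARCHAR(18))
FILE = C:\extracts\sales.txt;

BEGIN LOADING stage_db.sales
   ERRORFILES stage_db.sales_err1, stage_db.sales_err2;

INSERT INTO stage_db.sales
VALUES (:sale_id, :sale_dt, :amount);

END LOADING;
LOGOFF;
```

Note that FastLoad requires an empty target table with no secondary indexes, which is one reason the staging-table pattern discussed below is common for incremental loads.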
  
steverael (TechnicalUser)
11 Aug 03 16:53
The OLE DB Access Module (aka OleLoad) in conjunction with FastLoad (for the initial load) will perform best. I have not tested it in a long time, but I was able to load over 20 GB per hour into Teradata from SQL Server. You should be able to get at least that on a dedicated Fast Ethernet connection.

For subsequent updates, either FastLoad into a staging table and perform an INSERT ... SELECT, or use MultiLoad.

Avoid DTS going into Teradata, as it inserts one row at a time.
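The staging-table approach mentioned above can be sketched as a single set-based statement on the Teradata side. The table and column names here are hypothetical, and the duplicate check is just one possible way to make the daily load re-runnable:

```sql
/* Hypothetical staging pattern: FastLoad fills stage_db.sales_stg,
   then one set-based INSERT ... SELECT moves the rows to the target. */
INSERT INTO prod_db.sales
SELECT s.sale_id,
       CAST(s.sale_dt AS DATE FORMAT 'YYYY-MM-DD'),
       CAST(s.amount AS DECIMAL(18,2))
FROM stage_db.sales_stg s
WHERE NOT EXISTS
   (SELECT 1 FROM prod_db.sales t
    WHERE t.sale_id = s.sale_id);

DELETE FROM stage_db.sales_stg;  /* empty the stage for the next FastLoad */
```

Emptying the staging table at the end matters because FastLoad will only load into an empty table.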
Smiles99 (TechnicalUser)
19 Feb 04 15:45
TDKID,

As BillDHS and steverael previously pointed out, using the OleLoad utility found in the Teradata OLE DB Access Module is a very efficient way to load Teradata from Microsoft SQL Server.  Since you mentioned DTS (Microsoft Data Transformation Services), I thought I'd point out that you can also perform a load with the OLE DB Access Module from DTS.  This could be handy if you needed to perform some transformation/cleansing/etc. of the data prior to loading.  In this case, the data flow will look something like:

               Microsoft DTS
                    |
                    V
        Teradata OLE DB Access Module
                    |
                    V
            Teradata FastLoad
                    |
                    V
              Teradata RDBMS

Here is a small experiment that you can use to verify that it is possible to load directly from DTS:

1. On a system with Microsoft SQL Server 2000 installed, click "Start", then "Programs", then "Microsoft SQL Server", then "Import and Export Data".  Follow the wizard steps to create a "Microsoft OLE DB Provider for SQL Server" to "Microsoft OLE DB Provider for SQL Server" package.  On the "Specify Table Copy or Query" step, choose "Copy table(s) and view(s) from the source database".  On the "Save, Schedule, and Replicate Package" step, uncheck all of the check boxes in the "When" group box, then check the "Save DTS Package" check box and select the "Structured Storage File" radio button in the "Save" group box.
2. Start "SQL Server Enterprise Manager" and find the "Data Transformation Services" icon.  Right-click it and choose "Open Package" to open the package you created with the "Import and Export Data" wizard.  Right-click the transform data task icon (it is an arrow) and click "Workflow Properties".  On the "Options" tab, in the "Execution" group box, check the "DSO rowset provider" check box.  Also copy down the ENTIRE "Name" of the transform data task.  It is important to get the entire name; it may not all be displayed, and you might have to click on it and then move the cursor all the way right and left over it to capture the whole name.
3. Save the package.
4. Start OleLoad and, in the "Select a source" group box, select the Microsoft OLE DB Provider for DTS Packages ("DTSPackageDSO") entry.  On the "All" tab of the "Data Link Properties" dialog box, set the value of the Provider String property to /Ffilename (where filename is the full pathname of your DTS package).
5. In the "Enter the name of the desired table" edit box, enter the name of the transform data task.

This is a very basic way of using this, but it at least shows that it works.  Of course in many cases it is probably better to load the data to Teradata first and then perform the needed transformation/cleansing/etc.

Keep smiling!
