Yes, I would recommend BCP. However, as an alternative I have used the following in the past, wrapped up in a Transact-SQL stored procedure:

1. Create a (temporary) table to hold the file, e.g.

create table #input_file (line varchar(255) NULL)

2. Use an extended procedure to run a command shell, issuing a 'type' command on the file:

insert into #input_file (line)
exec master..xp_cmdshell 'type c:\filename.csv'

This loads the output of the command shell (i.e. the file contents) into your table.

3. Now write code to do substring manipulation to extract the columns from your CSV data, e.g.

update target_table
set time_stamp = convert(datetime, substring(#input_file.line, 1, 15)),
    ...
from #input_file
etc etc
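
If the fields are comma-delimited rather than fixed-width, charindex can be used to locate the separators. A minimal sketch, assuming a line layout of 'timestamp,code,amount' (those column names and that layout are placeholders, not taken from the original file):

-- first field:  everything before the first comma
-- second field: between the first and second commas
-- third field:  everything after the second comma
select convert(datetime, substring(line, 1, charindex(',', line) - 1)) as time_stamp,
       substring(line, charindex(',', line) + 1,
                 charindex(',', line, charindex(',', line) + 1) - charindex(',', line) - 1) as code,
       convert(money, substring(line, charindex(',', line, charindex(',', line) + 1) + 1, 255)) as amount
from #input_file
where line is not null

The where clause skips any null rows that xp_cmdshell may return alongside the file contents.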

4. Finally, if you've created a temporary table, remember to clear it down and blow it away.
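
A minimal clean-up, assuming the #input_file table from step 1:

-- empty the work table, then remove it (a temporary table is also dropped
-- automatically when the session that created it ends)
truncate table #input_file
drop table #input_file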

One advantage of this method over BCP is that it fires any triggers that you've defined.
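
For comparison, a typical bcp command line for the same comma-delimited file might look something like this (the server name, database, and login details are placeholders):

bcp mydb..target_table in c:\filename.csv -c -t, -S myserver -U mylogin -P mypassword

Here -c loads the data as character fields and -t, sets a comma as the field terminator.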

Your DBA may complain about security - xp_cmdshell runs under the account that starts the database service; however, they can create a separate NT account just to run xp_cmdshell, should they feel strongly.
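
As an aside, on later SQL Server releases xp_cmdshell is switched off by default, so a sysadmin may first need to enable it (check your site's security policy before doing so):

exec sp_configure 'show advanced options', 1
reconfigure
exec sp_configure 'xp_cmdshell', 1
reconfigure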

Hope that this helps.

Incidentally, my original use for this was to execute a 'dir /s' command in the shell to recursively build up a file tree and store the file details in a DB table.
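
A rough sketch of that variant, assuming a work table with a single varchar column (the table and path names here are made up for illustration):

create table #file_tree (line varchar(255) NULL)

insert into #file_tree (line)
exec master..xp_cmdshell 'dir /s c:\data'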

Cheers,

Martin.Ramsay@company-net.co.uk