EBOUGHEY (Programmer) (OP)
15 Sep 03 21:04

I am taking over some code that someone has written.  I am getting an error when I try to update the table.

"Invalid delimited data:  text qualifier must be followed by a column delimiter (except the last column)."

The file is a tab-delimited file, and when I look at everything it looks fine.  Any quick info on this?
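The error message itself describes a testable condition: a closing text qualifier must be immediately followed by a column delimiter (or end of line). A minimal Python sketch along those lines can scan the file and report which physical lines break the rule, so you know where to look. The function name and the choice of tab/double-quote defaults are assumptions, not anything from the DTS package itself:

```python
# Sketch: flag lines where a closing text qualifier (") is not followed
# by the column delimiter (tab) or end of line -- the exact condition
# behind "text qualifier must be followed by a column delimiter".
# Function name and defaults are assumptions for illustration.

def find_bad_lines(lines, delim="\t", qualifier='"'):
    """Return (line_number, line) pairs that violate the qualifier rule."""
    bad = []
    for lineno, line in enumerate(lines, start=1):
        line = line.rstrip("\r\n")
        in_field = False
        ok = True
        for i, ch in enumerate(line):
            if ch != qualifier:
                continue
            if not in_field:
                in_field = True          # opening qualifier
            else:
                in_field = False         # closing qualifier:
                if i + 1 < len(line) and line[i + 1] != delim:
                    ok = False           # ...must be followed by delim/EOL
                    break
        if not ok or in_field:           # unterminated qualifier also fails
            bad.append((lineno, line))
    return bad
```

Running it over the raw file (e.g. `find_bad_lines(open("post.txt"))`) would print only the offending record numbers, which is usually a much shorter list than eyeballing the whole file.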



LogicalmanUS (Programmer)
15 Sep 03 21:10

 I take it this is a data import from a text file?

EBOUGHEY (Programmer) (OP)
15 Sep 03 21:17
Yes it is.
LogicalmanUS (Programmer)
15 Sep 03 21:22
 Check the file, and see if there are any spaces in the first row (column headers). For safety, there should never be any spaces in column names; use underscores (_) instead. In SQL, to reference a column name that has spaces (e.g. First Name), it would be referenced as [First Name].

EBOUGHEY (Programmer) (OP)
15 Sep 03 21:47
The file doesn't have a header row.  It imports as col001-col014. The DTS package renames the columns after they are imported. I don't have the file here (wish I did now).

I recall this happening before.  The programmer told me he had to open it up in excel and do something to one of the columns.  I just can't figure it out though.
LogicalmanUS (Programmer)
15 Sep 03 22:10

 Unfortunately, I think you need to get the file and import it into XL to see the offending column.

 By the sound of what the last programmer stated, it sounds like there may be one or more text data columns, and one contains either a control code or an extra comma.

 Sorry, I can't be of use without the file.

EBOUGHEY (Programmer) (OP)
15 Sep 03 22:12
No, thanks a lot for what you've given me.  I had no idea where to even start and now i do.

Thanks again,

LogicalmanUS (Programmer)
15 Sep 03 22:15

 Please post the result of what you find, I've got an interest in this now. You never know, I may get the same thing sooner, rather than later!

EBOUGHEY (Programmer) (OP)
16 Sep 03 14:18
Here is a sample of the data and the actual code the programmer used.  I hope this helps pinpoint the problem.  I noticed he did not use substring(ltrim(...)) on columns 1-3:

"17730454"    "1GNEK13Z83R129014"    "385005"    " RONALD CLYDE* GRAHAM"    "415 STILL FOREST TER"    "SANFORD"    "FL"    "32771-8367"    "4073284982"    "2003-03-07 00:00:00"    "2"    "1"    "9"    "2003"
"17767368"    "3GNFK16Z22G357492"    "385005"    " DIRK GLEN PRUSIA"    "6715 S SYLVAN LAKE DR"    "SANFORD"    "FL"    "32771-9052"    "4073228390"    "2003-04-25 00:00:00"    "2"    "1"    "9"    "2003"

insert into mastermailhistory(id, vin, clientnumber, buyername, address1, city, state, zip, phone, lastrepairorderstamp, dropweek, dropmonth, dropyear, droptype)
select col001, col002, col003, substring(ltrim(col004), 1, 60), substring(ltrim(col005), 1, 40), substring(ltrim(col006), 1, 30), substring(ltrim(col007), 1, 2), substring(ltrim(col008), 1, 10), substring(ltrim(col009), 1, 9),  substring(ltrim(col010), 1, 10), substring(ltrim(col011), 1, 1), substring(ltrim(col012), 1, 1), substring(ltrim(col013), 1, 2), substring(ltrim(col014), 1, 4)
from post where col002 <> vin and col013 <> dropmonth
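The two sample rows parse cleanly as quote-qualified, tab-separated data, which can be checked outside DTS. This Python sketch uses the standard `csv` module to confirm each row yields exactly 14 columns and applies the same ltrim-plus-truncate the SELECT above performs (the width list mirrors the `substring(..., 1, N)` calls; leaving the first three columns untrimmed matches the posted SQL, and `clean_row` is a hypothetical helper name):

```python
import csv
import io

# The two sample rows from the post, as raw tab-delimited text.
sample = (
    '"17730454"\t"1GNEK13Z83R129014"\t"385005"\t" RONALD CLYDE* GRAHAM"\t'
    '"415 STILL FOREST TER"\t"SANFORD"\t"FL"\t"32771-8367"\t"4073284982"\t'
    '"2003-03-07 00:00:00"\t"2"\t"1"\t"9"\t"2003"\n'
    '"17767368"\t"3GNFK16Z22G357492"\t"385005"\t" DIRK GLEN PRUSIA"\t'
    '"6715 S SYLVAN LAKE DR"\t"SANFORD"\t"FL"\t"32771-9052"\t"4073228390"\t'
    '"2003-04-25 00:00:00"\t"2"\t"1"\t"9"\t"2003"\n'
)

# Widths mirror the substring(..., 1, N) calls in the INSERT above;
# None means the column is passed through untrimmed, as in the posted SQL.
WIDTHS = [None, None, None, 60, 40, 30, 2, 10, 9, 10, 1, 1, 2, 4]

def clean_row(row):
    """Apply the same ltrim + truncate the SELECT performs."""
    out = []
    for value, width in zip(row, WIDTHS):
        if width is not None:
            value = value.lstrip()[:width]
        out.append(value)
    return out

reader = csv.reader(io.StringIO(sample), delimiter="\t", quotechar='"')
for row in reader:
    assert len(row) == 14, f"expected 14 columns, got {len(row)}"
    print(clean_row(row))
```

If these sample rows parse but the real file fails, the offending record is elsewhere in the file, for example an embedded quote or stray character after a closing qualifier.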
JayKusch (MIS)
16 Sep 03 14:26
A quick trick, if you have MS Access, is to import it using Access. In most cases it will import the good records and subsequently push the bad records, w/ a reason, to an error table. Look at the error table for hints and the known offending records. Had to use Access to clean Web Log recs I later imported into SQL Server.


J. Kusch

SQLSister (Programmer)
16 Sep 03 16:03
Some problems we had where we got this type of error converting were that the data had the actual delimiter character as part of the data, or had too many or not enough delimiters at the end of the line (I don't remember which it was) for it to recognize the end of a line. Sometimes we had line feeds in the data in a field and we had to strip those out, or the datatypes didn't match up. In the long run, to fix these problems with our import data, my programmers wrote a parser to clean up the records. I run that first, then my DTS package.
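A pre-DTS cleaner of the kind described above might look like the following sketch: it collapses line feeds embedded in quoted fields and separates rows with the wrong column count into a reject pile. Everything here is an assumption for illustration, including the function name, the tab delimiter, and the 14-column default taken from this thread's file:

```python
import csv
import io

def clean_file(raw_text, delimiter="\t", expected_cols=14):
    """Pre-import cleanup sketch: strip newlines embedded inside quoted
    fields and split rows into (good, rejects) by column count.
    expected_cols=14 is an assumption matching this thread's file."""
    reader = csv.reader(io.StringIO(raw_text), delimiter=delimiter,
                        quotechar='"')
    good, rejects = [], []
    for row in reader:
        # A line feed inside a quoted field arrives as '\n' in the value;
        # collapse it to a space so the row stays on one physical line.
        row = [field.replace("\n", " ").replace("\r", " ") for field in row]
        (good if len(row) == expected_cols else rejects).append(row)
    return good, rejects
```

The good rows can then be re-emitted as a fresh delimited file for DTS, and the rejects logged for hand inspection, which is roughly the run-the-parser-first workflow described above.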
EBOUGHEY (Programmer) (OP)
16 Sep 03 16:39
Access brought the file right in.  No errors.

I'm really not that well versed in SQL, but I wonder if having just col001 instead of:


makes a difference.  Would it?

EBOUGHEY (Programmer) (OP)
18 Sep 03 8:21
Does anyone have any other suggestions?  Can I import this differently?  Especially with me being such a novice?

I'm desperate for some help now.  I am running so far behind because I can't post back to the file.

Thanks for any info you may have,

JayKusch (MIS)
18 Sep 03 9:56
If it loaded into Access correctly, can you then port it from Access to SQL Server?


J. Kusch

EBOUGHEY (Programmer) (OP)
19 Sep 03 8:41
I've not done that before but I can try that.

Would it work if I added a header row to the text file with col001, col002, etc.?

JayKusch (MIS)
19 Sep 03 12:57
Adding a header row would only gain you more readable column names for your commands. There would be no difference in how the file is parsed and processed.


J. Kusch
