
Bulk Insert Help


Lez

Sep 9, 2009, 4:43:52 PM
Hi Guys,

Just trying to complete a Bulk Insert and am getting an error:

Incorrect syntax near 'FIELDTERMINATOR'.

WITH
(
FIRSTROW = 2
FIELDTERMINATOR = '"",""',
ROWTERMINATOR = '\r\n',
)

I am doing an import of data from a CSV file with a text delimiter of quotes
and a column delimiter of comma.

Can anyone advise how I can correct this?

Thanks

Russell Fields

Sep 9, 2009, 4:51:57 PM
Put a comma after FIRSTROW = 2. Then get rid of the comma at the end of the
ROWTERMINATOR line to avoid the next error.

WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '"",""',
ROWTERMINATOR = '\r\n'
)
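For completeness, the corrected clause in context would look something like this (the table and file names here are placeholders, not taken from the thread):

```sql
-- Sketch only: MyTable and D:\MyFile.csv are hypothetical names.
BULK INSERT MyTable
FROM 'D:\MyFile.csv'
WITH
(
    FIRSTROW = 2,                 -- skip the header line
    FIELDTERMINATOR = '"",""',
    ROWTERMINATOR = '\r\n'
)
```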

RLF

Erland Sommarskog

Sep 9, 2009, 6:16:12 PM
As Russell mentioned, you are missing a comma and have one too many in
another place.

But I also think you have the field terminator wrong. Does your file really
go like this:

Header"",""Header"",""Header
Data"",""Data"",""Data

I would rather guess it goes:

"Header","Header","Header"
"Data","Data","Data"

Or even

Header,Header,Header
"Data","Data","Data"

In this case you need to use a format file, because you have one more
field than you think. Assuming the example above, your format file
should go:

8.0
4
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 "\",\"" 1 col1 ""
3 SQLCHAR 0 0 "\",\"" 2 col1 ""
4 SQLCHAR 0 0 "\"\r\n" 3 col1 ""

And assuming that there are no quotes in the header, you should not
include FIRSTROW = 2.

Explaining all this in a hurry is not easy, but a fundamental thing to
know about BCP is that it reads a binary stream. Thus, it has no
particular notion of lines. It will start reading the file, until it
finds the terminator for the first field in the first record. That
is, until it finds the first quote. Since the target column for this
field is column 0, the value is discarded. Then it continues to the
next terminator, which is ",". This field has target column 1 and is
saved. Same goes for the next field. The last field has " + newline
as terminator. And then it starts over. The first field of each
following record will be empty and discarded.


--
Erland Sommarskog, SQL Server MVP, esq...@sommarskog.se

Links for SQL Server Books Online:
SQL 2008: http://msdn.microsoft.com/en-us/sqlserver/cc514207.aspx
SQL 2005: http://msdn.microsoft.com/en-us/sqlserver/bb895970.aspx
SQL 2000: http://www.microsoft.com/sql/prodinfo/previousversions/books.mspx

Lez

Sep 10, 2009, 3:44:57 AM
Hi Erland,

So can I ask: if we define the format first, how do I then run the insert
and get it to follow the format? Do I structure it as:

BULK INSERT PROD
FROM 'D:\PROD.CSV'

WITH
(
8.0
4
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 "\",\"" 1 col1 ""
3 SQLCHAR 0 0 "\",\"" 2 col1 ""
4 SQLCHAR 0 0 "\",\"" 3 col1 ""
)
GO
--Check the content of the table.
SELECT *
FROM dbo.PROD
GO
--Drop the table to clean up database.
SELECT *
FROM dbo.PROD
GO

Finally, not sure if you know the answer to this, but the bulk insert will
need to run every 24 hours; is there a method by which we can set this up?

Many thanks for your time and advice.

Lez



Lez

Sep 10, 2009, 4:56:15 AM
Ok, Erland, I have figured out that I have to save the format file separately,
then call the format file when we want to run the BULK INSERT.

This is a sample of the file I will be importing; I have added numbers to the
header to identify the columns:

"1SKU Code","2Manf. Ref","3Barcode","4Supplier Ref","5Product Ref","6Product
Name","7Box Qty","8Box Cost","9Unit RRP","10Unit Price","11Unit
Trade","12Unit Webprice","13Unit Staff Price","14VAT Rate","15Brand
Name","16Size Stylename","17Size Value","18Colour Stylename","19Colour
Value","20Other Stylename","21Other Value","22Created On","23Updated On"

This is the first row of Data:

"CGV4558","CGV4558",84591331,"ISL","","Adjustable Sea kayak Boat
Bag",1,17.80000000,44.99,44.99,38.30,44.99,23.15,15.00,"-","","","","","","","2005-06-14
17:34:49","2009-04-27 12:32:22"

This is my format file:

9.0
23
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 "\",\"" 1 col1 ""
3 SQLCHAR 0 0 "\",\"" 2 col2 ""
4 SQLCHAR 0 0 "\",\"" 3 col3 ""
5 SQLCHAR 0 0 "\",\"" 4 col4 ""
6 SQLCHAR 0 0 "\",\"" 5 col5 ""
7 SQLCHAR 0 0 ",\"" 6 col6 ""
8 SQLCHAR 0 0 ",\"" 7 col7 ""
9 SQLCHAR 0 0 ",\"" 8 col8 ""
10 SQLCHAR 0 0 ",\"" 9 col9 ""
11 SQLCHAR 0 0 ",\"" 10 col10 ""
12 SQLCHAR 0 0 ",\"" 11 col11 ""
13 SQLCHAR 0 0 ",\"" 12 col12 ""
14 SQLCHAR 0 0 ",\"" 13 col13 ""
15 SQLCHAR 0 0 "\",\"" 14 col14 ""
16 SQLCHAR 0 0 "\",\"" 15 col15 ""
17 SQLCHAR 0 0 "\",\"" 16 col16 ""
18 SQLCHAR 0 0 "\",\"" 17 col17 ""
19 SQLCHAR 0 0 "\",\"" 18 col18 ""
20 SQLCHAR 0 0 "\",\"" 19 col19 ""
21 SQLCHAR 0 0 "\",\"" 20 col20 ""
22 SQLCHAR 0 0 "\",\"" 21 col21 ""
23 SQLCHAR 0 0 "\",\"" 22 col22 ""
24 SQLCHAR 0 0 "\"\r\n" 23 col23 ""

When I execute the script I get this error message:

Msg 8152, Level 16, State 13, Line 1
String or binary data would be truncated.
The statement has been terminated.

(0 row(s) affected)

(0 row(s) affected)

Can you advise where I am going wrong? In the table it is being inserted into, all
fields are set as nvarchar(50) with the exception of col6, which is
nvarchar(255). Do I have to set the sizes of the fields? When I tried this I
just got more errors.

This is my script:

BULK INSERT PROD

FROM 'D:\PROD.CSV'

WITH ( formatfile = 'D:\BI_prod_format.txt')

GO
--Check the content of the table.
SELECT *
FROM dbo.PROD
GO
--Drop the table to clean up database.
SELECT *
FROM dbo.PROD
GO


Any suggestions would be appreciated.

Regards
Lez



Erland Sommarskog

Sep 10, 2009, 8:50:40 AM
Lez (lj_nospam_girvan@/noSpam\hotmail.co.uk) writes:
> Ok, Erland, I have figured out I have to save the format file separately,
> then call the format file when we want to run the BULK INSERT.

Yup.



> The is a sample of the file I will be importing, I have added numbers to
> the header to identify the columns:
>
> "1SKU Code","2Manf. Ref","3Barcode","4Supplier Ref","5Product
> Ref","6Product
> Name","7Box Qty","8Box Cost","9Unit RRP","10Unit Price","11Unit
> Trade","12Unit Webprice","13Unit Staff Price","14VAT Rate","15Brand
> Name","16Size Stylename","17Size Value","18Colour Stylename","19Colour
> Value","20Other Stylename","21Other Value","22Created On","23Updated On"
>
> This is the first row of Data:
>
> "CGV4558","CGV4558",84591331,"ISL","","Adjustable Sea kayak Boat
> Bag",1,17.80000000,44.99,44.99,38.30,44.99,23.15,15.00,"-
","","","","","","","2005-06-14
> 17:34:49","2009-04-27 12:32:22"

Crap. It's not going to work. BULK INSERT requires the delimiters to
be applied consistently. In this case, the headers are all quoted, but
of the data fields, only the string fields are quoted. I was thinking that
the headers were not quoted at all!

I did not check your format file in detail, but it seems that it would
match your data row. At least you seem to have been careful to use different
delimiters depending on whether a field is quoted or not.

But the format file will only work if all records are the same, and the
header line is not. Before I start to think of your options, I would
like you to check one thing: are all *data* rows consistently quoted?
Judging from the sample it seems so. Sometimes when they produce a CSV
they only add quotes when needed. This makes things completely hopeless for
a square tool like BCP.


> Finally, not sure if you know the answer to this, but the bulk insert
> will need to run every 24 hours, and would like to know if there is a
> method that we can set this up ?

You can set up a job from SQL Server Agent for this.
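A minimal sketch of such a job, using the job procedures in msdb (the job name, schedule time, database name, and file paths below are made-up placeholders, not details from the thread):

```sql
USE msdb;
GO
-- Create a job with one T-SQL step that runs the bulk load.
EXEC dbo.sp_add_job @job_name = N'Nightly PROD load';
EXEC dbo.sp_add_jobstep
    @job_name = N'Nightly PROD load',
    @step_name = N'Bulk insert',
    @subsystem = N'TSQL',
    @database_name = N'brook',
    @command = N'BULK INSERT dbo.PROD FROM ''D:\PROD.CSV''
                 WITH (FORMATFILE = ''D:\BI_prod_format.txt'')';
-- Run it once a day at 02:00.
EXEC dbo.sp_add_jobschedule
    @job_name = N'Nightly PROD load',
    @name = N'Daily',
    @freq_type = 4,               -- daily
    @freq_interval = 1,           -- every 1 day
    @active_start_time = 020000;  -- HHMMSS
-- Bind the job to the local server so Agent will run it.
EXEC dbo.sp_add_jobserver @job_name = N'Nightly PROD load';
```

You can also build the same job through the Agent node in Management Studio instead of calling the procedures directly.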


Lez

Sep 10, 2009, 3:23:18 PM
Hi Erland,

To confirm, my data is consistent, as it runs from a cron job on a daily
basis with a defined selection criterion.

Maybe SSIS might be an alternative solution. I have been able to
run this ok, but do not seem to be able to re-run it after it has been
created and saved.

Not sure if you are familiar with SSIS, but if you know how I can create an
SSIS template and run it as required, that too would be helpful.

Regards
Lez


Erland Sommarskog

Sep 10, 2009, 5:41:10 PM
Lez (lj_nospam_girvan@/noSpam\hotmail.co.uk) writes:
> To confirm, my data is consistent as it runs from a cron job on a daily
> basis to a defined selection criteria.

Yeah, but that is no guarantee that the program is consistent. But judging
from the sample, it does not seem to be "intelligent" and add quotes
only when needed.


> Maybe as a solution the SSIS might be an alternative. I have been able to
> run this ok, but do not seem to be able to re-run it after it has been
> created and saved.
>
> Not sure if you are familer with SSIS, but if you know how I can create a
> SSIS template and run it as required that to would be helpful.

No, I have not worked with SSIS, so I can't help on that part.

But I might have found a way to bulk-load this file after all!
Although I post it with some hesitation, because this goes way beyond
"newusers". This is really obscure. But here goes. I have this
table:

create table slask(col1 varchar(222),
col2 int,
col3 char(3))

I have this data file (indented only for formatting):

"My data","My other data","My last data"
"First col",12,"ggg"
"First col",19,"ppp"

The aim is to load the last two lines, but not the first. The trick to
ignore it in entirety is to specify this terminator for the first field:

\n"

That is newline followed by a double quote. Consequently, the terminator
for the last field must not include the \n, but be only "\r.

Here is the format file:

9.0
4
1 SQLCHAR 0 0 "\n\"" 0 "" ""
2 SQLCHAR 0 0 "\"," 1 col1 ""
3 SQLCHAR 0 0 ",\"" 2 col2 ""
4 SQLCHAR 0 0 "\"\r" 3 col3 ""

Now, the clever reader may notice that the file is likely to include a
final \n that is not part of any record. Will BULK INSERT ignore this?
Nope, you get this error message:

Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.

BUT! The command-line tool BCP is nice enough to ignore the extra \n:

C:\Temp>bcp tempdb..slask in slask.bcp -T -f slask.fmt

Starting copy...

2 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 47 Average : (42.55 rows per sec.)

Disclaimer: this is BCP from SQL 2008, but I would not expect this to
make a difference.

BCP and BULK INSERT follow the same principles, and work the same in
most cases. But one is implemented in ODBC and the other in OLE DB,
and there are some subtle differences.

You should be able to run BCP from a CmdLine job in SQL Server Agent.
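A hedged sketch of such a CmdExec job step (this assumes a job named 'Nightly PROD load' has already been created, and reuses the bcp command from above; both names are hypothetical):

```sql
-- Add a command-line step that runs BCP instead of a T-SQL step.
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Nightly PROD load',
    @step_name = N'Run bcp',
    @subsystem = N'CmdExec',
    @command   = N'bcp tempdb..slask in C:\Temp\slask.bcp -T -f C:\Temp\slask.fmt';
```

Note that with -T the step runs under the Agent service account (or its proxy), so that account needs rights to the files and the table.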

Final remark: you said the job was run from cron. That smells Unix.
In that case, I suspect that the line end is not \r\n, but only \n.
If I'm right, you should remove \r from the format file. It's good that
the last field in the file is quoted, else we would have had a problem!


Lez

Sep 23, 2009, 9:15:54 AM
Hi Erland,

Sorry for the delay in responding to this, just had other stuff to complete
also.

Ok, this is really starting to make my head hurt now; sorry, I have been
looking at this and reading it but it does not seem to be going in.

I am just struggling with the formatting:

First, to confirm: text-delimited fields should be defined as:

x SQLCHAR 0 0 "\",\"" x colx ""

and non-text-delimited fields should be defined as:

x SQLCHAR 0 0 ",\"" x colx ""

with the first line, to ignore the delimiters, being:

x SQLCHAR 0 0 "\n\"" 0 "" ""

and my final line, for end of row, being:

x SQLCHAR 0 0 "\"\r" x colx ""

You then mention "the command-line tool BCP"; what is this? Also, I do not have
Server Agent on my server-based SQL Server Management Studio; can this not be
run from a query window in Management Studio to test?

Can you explain the following: C:\Temp>bcp tempdb..slask in slask.bcp -T -f
slask.fmt. I gather that this is the database, then the table, but am not sure
what the rest of the line means?

Finally, do I need to save the format file with a .fmt extension rather than
.txt? Also, is there any preferred location to save the format file?

Many thanks
Lez



Erland Sommarskog

Sep 23, 2009, 5:54:11 PM
Lez (lez_nospam_@/noSpam\hotmail.co.uk) writes:
> Ok this is really starting to make my head hurt now, sorry I have been
> looking at this and reading it but it does not seem to be going in.

As I mentioned, the stuff I posted in my previous post is quite
advanced.



> I am just struggling with the formatting:
>
> first to confirm, text delimited fields should be defined as:
>
> x SQLCHAR 0 0 "\",\"" x colx ""
>
> and none text delimited fields should be defined as:?
>
> x SQLCHAR 0 0 ",\"" x colx ""

Not really. What you define is not *delimiters*. You define *terminators*.

So if you have a text field followed by another text field the terminator
should be

"\",\""

If you have a text field followed by a numeric field, the terminator
should be

"\","

And a numeric field followed by a text field is:

",\""

And a numeric field followed by another numeric field:

","

The last field is special. Your last field was a text field, thus:

"\"\r"

Or maybe only

"\""

If the file is a Unix file with only \n as line terminator.
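Putting those rules together: for a hypothetical header-less file whose records look like "a",1,2,"b" (text, numeric, numeric, text), the terminators would be as below. The column names are invented for illustration, and field 1 is the usual dummy field that swallows the opening quote:

```
9.0
5
1 SQLCHAR 0 0 "\""     0 ""   ""
2 SQLCHAR 0 0 "\","    1 txt1 ""
3 SQLCHAR 0 0 ","      2 num1 ""
4 SQLCHAR 0 0 ",\""    3 num2 ""
5 SQLCHAR 0 0 "\"\r\n" 4 txt2 ""
```

Reading the stream: the first " ends the dummy field, ", ends txt1, a bare comma ends num1, ," ends num2, and "\r\n ends txt2 and the record.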


> With the first line to ignore the Delimiters:?
>
> x SQLCHAR 0 0 "\n\"" 0 "" ""

Yes. Or more precisely, with that definition BCP/BULK INSERT thinks the
first field is terminated by a newline character and a quote.

> You then mention "The command-line tool BCP" what is this?

A command-line tool! You run command-line tools from a command-line
window. One way to open such a window is Start->Run and type cmd.exe.
You also find it under the Accessories program group in the Start menu.

> and I do not have server agent on my server based SQL Studio Management,

You said that you intended to run your BULK INSERT every 24 hours, so
I assumed that you already have Agent. Which Edition of SQL Server
did you say you were using?


> an you explain the follow: C:\Temp>bcp tempdb..slask in slask.bcp -T -f
> slask.fmt. I gather that this is the database, then the table, but not
> sure what the rest of the line means?

"in" means that you are bulking data in. You would use "out" to export
data. The third parameter is the name of the data file. -T specifies
that you log in with integrated security. The -f option specifies the
format file.

> Finally, do I need to save the format file with .fmt extension rather than
> .txt? also is there any location preferred to save the format file?

The extension does not matter. Save the format file where you have
the data file; that is probably the easiest.

Lez

Sep 30, 2009, 5:39:05 AM
Hi Erland,

Many thanks for that detailed explanation. Sorry about the cmd tool, I
thought it was another application, not just the cmd prompt..doh

I have now created the format file and attempted to run this as advised, but
am getting this error when I try to run it:


C:\>bcp brook PROD in PROD.bcp -T -f prodBI.txt

Copy direction must be either 'in', 'out' or 'format'.
usage: bcp {dbtable | query} {in | out | queryout | format} datafile

I then changed the cmd to:

C:\>bcp PROD in PROD.bcp -T -f prodBI.txt

SQLState = 28000, NativeError = 18456
Error = [Microsoft][SQL Native Client][SQL Server]Login failed for user
'admin_LT\XXXXX'.

So I assume I need to include the username and password, although the
authentication is normally done as server auth, not Windows?

Can you advise how I add the server login details as part of this process?

Thank you
Lez



Russell Fields

Sep 30, 2009, 2:13:25 PM
Lez (this is not Erland responding),

> C:\>bcp brook PROD in PROD.bcp -T -f prodBI.txt

In this case, the error is 'brook PROD', which I assume was meant to be
database name and table name. If so, you must remember to use dot notation,
namely 'brook.schema.PROD'. Resulting in:

bcp brook.dbo.PROD in PROD.bcp -T -f prodBI.txt

(I assume the dbo schema, but it could also be some other schema.)

However, since you are still not specifying the server, the assumption is
that it is the local default instance of SQL Server on the machine where you
are running bcp. So, in order to specify the server:

bcp brook.dbo.PROD in PROD.bcp -T -f prodBI.txt -S Servername\InstanceName


> Error = [Microsoft][SQL Native Client][SQL Server]Login failed for user
> 'admin_LT\XXXXX'.

Since you did not specify the server, it is possible that you were not
connecting to the server you expected to connect to. Therefore, the login
failed since admin_LT\XXXXX had no rights on that server. Trusted
connections work just fine if the rights exist.

However, if you truly need a SQL Server login then you need a couple more
parameters, all in one line below, no matter how it looks in the post.

bcp brook.dbo.PROD in PROD.bcp -f prodBI.txt -S Servername\InstanceName -U loginname -P password

(Note that -T is dropped here: -T requests a trusted connection, which conflicts
with supplying -U and -P.)

Have you looked at this link and the hotlinks at the bottom of this topic?
(This is the 2008 documentation, but you can click through to 2005 if that
is what you prefer.)
http://msdn.microsoft.com/en-us/library/ms162802.aspx

RLF

Lez

Oct 2, 2009, 9:23:54 AM
Hi Russell,

Many thanks for that, will try your suggestion and will read the
documentation and advise accordingly.

Regards
Lez


Lez

Oct 9, 2009, 9:25:30 AM
Hi Russell,

Sorry for the delay in getting back to you on this. I have tried to run this
on the server now using the cmdline tool, but keep getting this error
message:

SQLState = S1000, NativeError = 0
Error = [Microsoft][SQL Native Client]Unable to open BCP host data-file

I assume that this refers to the CSV file that contains the data? I have saved
the format file in the same location as the data file, and tried switching to
that folder in the cmd window and running it from there, to see if that was
the error.

This is the cmdline I am using:

D:\>bcp brook.dbo.PROD in PROD.bcp -T -f prodBI.fmt -S myserver\myinstance
-U myuserme -P mypassword

SQLState = S1000, NativeError = 0
Error = [Microsoft][SQL Native Client]Unable to open BCP host data-file

I have looked through the link you provided but cannot find anything there;
will search for the error and see if I can find anything.

Regards
Lez


Lez

Oct 9, 2009, 11:27:35 AM
OK, after further investigation I have moved on, and understand that I have
to name the full path for both the file that contains the data I want to
import and the format file used to handle the BCP.

I am now however getting the following error:

SQLState = S1000, NativeError = 0

Error = [Microsoft][SQL Native Client]Invalid buffer size for BCP column
property

So I assume I need to add a batch size or something to my cmdline?:

C:\>bcp brook.dbo.PROD in D:\PROD.CSV -f D:\prodBI.txt -S
MYSERVER\MYINSTANCE -U XXXXXXX -P XXXXXXXXXX

Does anyone know what I should add? I have tried adding -b 10000 with no luck.

Regards
Lez



Erland Sommarskog

Oct 9, 2009, 5:12:14 PM
Lez (lez_nospam_@/noSpam\hotmail.co.uk) writes:
> OK after further investigation, I have moved on and understand that I have
> to name the full path for both the file that contains the data I want to
> import and also the format file used to handle the BCP.
>
> I am now however getting the following error:
>
> SQLState = S1000, NativeError = 0
> Error = [Microsoft][SQL Native Client]Invalid buffer size for BCP column
> property
>
> So assume I need to add a batch size or something to my cmdline?:
>
> C:\>bcp brook.dbo.PROD in D:\PROD.CSV -f D:\prodBI.txt -S
> MYSERVER\MYINSTANCE -U XXXXXXX -P XXXXXXXXXX
>
> does any know what I should add, I have tried adding -b 10000 with no luck

Since it was a while since we spoke, could you post the current version
of the format file? It seems like there is a problem with it. (The errors from
BCP are often obscure.)

Lez

Oct 11, 2009, 8:58:19 AM
Hi Erland,

This is the format file content:

9.0
24
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 "\",\"" 1 col1 ""
3 SQLCHAR 0 0 "\",\"" 2 col2 ""
4 SQLCHAR 0 0 "\",\"" 3 col3 ""
5 SQLCHAR 0 0 "\",\"" 4 col4 ""
6 SQLCHAR 0 0 "\",\"" 5 col5 ""
7 SQLCHAR 0 0 "\"," 6 col6 ""
8 SQLCHAR 0 0 "," 7 col7 ""
9 SQLCHAR 0 0 "," 8 col8 "”
10 SQLCHAR 0 0 "," 9 col9 ""
11 SQLCHAR 0 0 "," 10 col10 ""
12 SQLCHAR 0 0 "," 11 col11 ""
13 SQLCHAR 0 0 "," 12 col12 ""
14 SQLCHAR 0 0 "," 13 col13 ""
15 SQLCHAR 0 0 ",\”" 14 col14 ""
16 SQLCHAR 0 0 "\",\"" 15 col15 ""
17 SQLCHAR 0 0 "\",\"" 16 col16 ""
18 SQLCHAR 0 0 "\",\"" 17 col17 ""
19 SQLCHAR 0 0 "\",\"" 18 col18 "”
20 SQLCHAR 0 0 "\",\"" 19 col19 ""
21 SQLCHAR 0 0 "\",\"" 20 col20 "”
22 SQLCHAR 0 0 "\",\"" 21 col21 ""
23 SQLCHAR 0 0 "\",\"" 22 col22 ""
24 SQLCHAR 0 0 "\"" 23 col23 "”

Regards
Lez



Erland Sommarskog

Oct 11, 2009, 10:26:48 AM

There is no newline in the format file, which means that any line
break will be taken as part of the data.

If memory serves, you had a header line which had a different structure
from the rest of the file. On top of all, your file came from Unix and
had only \n and not \r\n as the separator. Do I recall correctly?

In that case, the first field should read:

1 SQLCHAR 0 0 "\n\"" 0 "" ""

If you have \r\n as the line terminator in the file, the last line
should read:

24 SQLCHAR 0 0 "\"\r" 23 col23 ""

There is one more thing that is wrong, but this may be a mishap in your
newsreader or similar. But in the file some of the quotes are slanted
and not straight. If you actually have these in your file, be sure to
change them.

Lez

Oct 11, 2009, 2:11:06 PM
Hi Erland,

Thanks for that, have checked the file and corrected as suggested, but now
getting the following error:

SQLState = S1000, NativeError = 0

Error = [Microsoft][SQL Native Client]I/O error while reading BCP format
file

This is the file format now:

9.0
24
1 SQLCHAR 0 0 "\n\"" 0 "" ""
2 SQLCHAR 0 0 "\",\"" 1 col1 ""
3 SQLCHAR 0 0 "\",\"" 2 col2 ""
4 SQLCHAR 0 0 "\",\"" 3 col3 ""
5 SQLCHAR 0 0 "\",\"" 4 col4 ""
6 SQLCHAR 0 0 "\",\"" 5 col5 ""
7 SQLCHAR 0 0 "\"," 6 col6 ""
8 SQLCHAR 0 0 "," 7 col7 ""
9 SQLCHAR 0 0 "," 8 col8 ""
10 SQLCHAR 0 0 "," 9 col9 ""
11 SQLCHAR 0 0 "," 10 col10 ""
12 SQLCHAR 0 0 "," 11 col11 ""
13 SQLCHAR 0 0 "," 12 col12 ""
14 SQLCHAR 0 0 "," 13 col13 ""
15 SQLCHAR 0 0 ",\"" 14 col14 ""
16 SQLCHAR 0 0 "\",\"" 15 col15 ""
17 SQLCHAR 0 0 "\",\"" 16 col16 ""
18 SQLCHAR 0 0 "\",\"" 17 col17 ""
19 SQLCHAR 0 0 "\",\"" 18 col18 ""
20 SQLCHAR 0 0 "\",\"" 19 col19 ""
21 SQLCHAR 0 0 "\",\"" 20 col20 ""
22 SQLCHAR 0 0 "\",\"" 21 col21 ""
23 SQLCHAR 0 0 "\",\"" 22 col22 ""
24 SQLCHAR 0 0 "\"\r" 23 col23 ""

If you can suggest anything?

Regards
Lez


Erland Sommarskog

Oct 11, 2009, 3:43:28 PM
Lez (lez_nospam_@/noSpam\hotmail.co.uk) writes:
> Thanks for that, have checked the file and corrected as suggested, but now
> getting the following error:
>
> SQLState = S1000, NativeError = 0
> Error = [Microsoft][SQL Native Client]I/O error while reading BCP format
> file

Helpful error message isn't it? :-)

Probably something very basic, but what? The only thing I can think of
is that you have saved the file in a format that BCP does not understand.
For instance as a Unicode file, or with only \n as line terminator and
not \r\n. Unless there really is a bad sector on your disk, but that is
the least likely explanation. BCP is really good at poor error messages.

If everything else fails, put the file on a website somewhere, so we
can download and inspect it.

Erland Sommarskog

Oct 11, 2009, 4:02:58 PM
I found:

http://forums.databasejournal.com/showthread.php?t=18211

Here the author says that his problem was that the file did not have
a CR-LF for the last line.

Lez

Oct 12, 2009, 11:55:37 AM
Hi Erland,

Thanks for that, will give it a shot and let you know how I get on.

Just to go back to the earlier questions: if the file is quoted
consistently, i.e. headers all have "" and ALL data is "quoted", we can use
the bulk insert method rather than BCP?

I am beginning to wonder if it would be worth asking the data provider if
they can re-format the file to include all quotes, to use the bulk insert
via Studio manager?

Many thanks and will advise on progress.

Regards
Lez



Erland Sommarskog

Oct 12, 2009, 5:29:11 PM
Lez (lez_nospam_@/noSpam\hotmail.co.uk) writes:
> Just to go back to the earlier questions, if the file is quoted
> consistently i.e. headers all have "" and ALL data is "quoted" we can use
> the bulk insert method rather than BCP?

Yes. The reason we need to use BCP now is that we do a trick which
BULK INSERT cannot cope with, but BCP accepts.

And without that trick, the format file would be a little more
logical; the terminator for the first field will confuse anyone
who takes a look at the format file.



> I am beginning to wonder if it would be worth asking the data provider if
> they can re-format the file to included all quotes to use the Bulk insert
> via Studio manager?

Or just strip the header entirely. If they want to give you a self-
describing file, they should use XML.

Lez

Oct 13, 2009, 7:08:13 AM
Hi Erland,

If we get them to remove the header, what would I need to alter in the format
file to use the bulk insert command?

Do I simply remove the first line and leave the rest of the format file
as-is? This is the current format file:

Regards
Lez



Erland Sommarskog

Oct 13, 2009, 5:53:10 PM
Lez (lez_nospam_@/noSpam\hotmail.co.uk) writes:
> If we get them to remove the header what would I need to alter in the
> format file to use the bulk insert command?

You change these first and the last line to look like this:

1 SQLCHAR 0 0 "\"" 0 "" ""

24 SQLCHAR 0 0 "\"\r\n" 23 col23 ""

> Do I simply removed the first line and leave the rest of the file format
> as-is? this is the current format file:

No, you still need the dummy line, since the first field is quoted.
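Assembled against the 24-field format file posted earlier in the thread, the whole no-header version would then read as below. This is a sketch built from the pieces above; verify each middle line against your own file:

```
9.0
24
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 "\",\"" 1 col1 ""
3 SQLCHAR 0 0 "\",\"" 2 col2 ""
4 SQLCHAR 0 0 "\",\"" 3 col3 ""
5 SQLCHAR 0 0 "\",\"" 4 col4 ""
6 SQLCHAR 0 0 "\",\"" 5 col5 ""
7 SQLCHAR 0 0 "\"," 6 col6 ""
8 SQLCHAR 0 0 "," 7 col7 ""
9 SQLCHAR 0 0 "," 8 col8 ""
10 SQLCHAR 0 0 "," 9 col9 ""
11 SQLCHAR 0 0 "," 10 col10 ""
12 SQLCHAR 0 0 "," 11 col11 ""
13 SQLCHAR 0 0 "," 12 col12 ""
14 SQLCHAR 0 0 "," 13 col13 ""
15 SQLCHAR 0 0 ",\"" 14 col14 ""
16 SQLCHAR 0 0 "\",\"" 15 col15 ""
17 SQLCHAR 0 0 "\",\"" 16 col16 ""
18 SQLCHAR 0 0 "\",\"" 17 col17 ""
19 SQLCHAR 0 0 "\",\"" 18 col18 ""
20 SQLCHAR 0 0 "\",\"" 19 col19 ""
21 SQLCHAR 0 0 "\",\"" 20 col20 ""
22 SQLCHAR 0 0 "\",\"" 21 col21 ""
23 SQLCHAR 0 0 "\",\"" 22 col22 ""
24 SQLCHAR 0 0 "\"\r\n" 23 col23 ""
```

Only the first and last lines differ from the with-header version: the dummy field now eats just the opening quote, and the last field's terminator includes the full \r\n line ending.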

Lez

Oct 14, 2009, 8:41:12 AM
Hi Erland,

Ok I am trying this now with the header row removed, and just 2 fields of
data to add to see if I can hack my way through getting this to work.

I have tried:

9.0
3
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 "\"," 1 col1 ""
3 SQLCHAR 0 0 "\"\r\n" 2 col2 ""

And keep getting the error:

Msg 8152, Level 16, State 13, Line 1
String or binary data would be truncated.
The statement has been terminated.

(0 row(s) affected)

(0 row(s) affected)

I have altered the \r and \n thinking this might resolve the problem, with
no luck..argh. Sorry, any suggestions?

Regards
Lez


Erland Sommarskog

Oct 14, 2009, 5:20:59 PM
Lez (lez_nospam_@/noSpam\hotmail.co.uk) writes:
> Ok I am trying this now with the header row removed, and just 2 fields of
> data to add to see if I can hack my way through getting this to work.
>
> I have tried:
>
> 9.0
> 3
> 1 SQLCHAR 0 0 "\"" 0 "" ""
> 2 SQLCHAR 0 0 "\"," 1 col1 ""
> 3 SQLCHAR 0 0 "\"\r\n" 2 col2 ""
>
> And keep getting the error:
>
> Msg 8152, Level 16, State 13, Line 1
> String or binary data would be truncated.
> The statement has been terminated.
>
> (0 row(s) affected)
>
> (0 row(s) affected)
>
> I have altered the \r and \n thinking this might resolve the problem,
> with no luck..argh, sorry any suggestions?

The error means that you are trying to cram more characters into a
column than will fit. Since I don't know what the table looks like,
or how the data file looks, I cannot say much. But if your table is

CREATE TABLE tbl (a varchar(10) NOT NULL,
b varchar(10) NOT NULL)

And you have a data file that goes

"0123456879",0123456789"
"0123456789",0123456789"

the above format file will work. Hmm, there is an imbalance in quotes there.
That might be a clue.

Lez

Oct 18, 2009, 9:07:37 AM
Hi Erland,

Ok I have tried and tried to get this to work, and have reduced the file down
to a single line of data, and keep getting this error:

Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.

Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The
provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

I have tried changing the last field to every alternative you have
suggested. This is the structure of the table:

GO
/****** Object: Table [dbo].[PROD] Script Date: 10/18/2009 14:01:32
******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[PROD](
[SKU Code] [nvarchar](50) NULL,
[Manf Ref] [nvarchar](50) NULL,
[Barcode] [nvarchar](50) NULL,
[Supplier Ref] [nvarchar](50) NULL,
[Product Ref] [nvarchar](50) NULL,
[Product Name] [nvarchar](255) NULL,
[Box Qty] [nvarchar](50) NULL,
[Box Cost] [nvarchar](50) NULL,
[Unit RRP] [nvarchar](50) NULL,
[Unit Price] [nvarchar](50) NULL,
[Unit Trade] [nvarchar](50) NULL,
[Unit Webprice] [nvarchar](50) NULL,
[Unit Staff Price] [nvarchar](50) NULL,
[VAT Rate] [nvarchar](50) NULL,
[Brand Name] [nvarchar](50) NULL,
[Size Stylename] [nvarchar](50) NULL,
[Size Value] [nvarchar](50) NULL,
[Colour Stylename] [nvarchar](50) NULL,
[Colour Value] [nvarchar](50) NULL,
[Other Stylename] [nvarchar](50) NULL,
[Other Value] [nvarchar](50) NULL,
[Created On] [nvarchar](50) NULL,
[Updated On] [nvarchar](50) NULL
) ON [PRIMARY]

I have checked the field sizes; all are set at (50) with the exception of
col6, which is set to (255). As this is a temp table, all fields can have a
(NULL) value if required.

This is the single line CSV file :

"CGV4558","CGV4558",84591331,"ISL","","Adjustable Sea kayak Boat
Bag",1,17.80000000,44.99,44.99,38.30,44.99,23.15,15.00,"-","","","","","","","2005-06-14
17:34:49","2009-04-27 12:32:22"

This is my format file:

9.0
24
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 "\",\"" 1 col1 ""
3 SQLCHAR 0 0 "\",\"" 2 col2 ""
4 SQLCHAR 0 0 "\",\"" 3 col3 ""
5 SQLCHAR 0 0 "\",\"" 4 col4 ""
6 SQLCHAR 0 0 "\",\"" 5 col5 ""
7 SQLCHAR 0 0 "\"," 6 col6 ""
8 SQLCHAR 0 0 "," 7 col7 ""
9 SQLCHAR 0 0 "," 8 col8 ""
10 SQLCHAR 0 0 "," 9 col9 ""
11 SQLCHAR 0 0 "," 10 col10 ""
12 SQLCHAR 0 0 "," 11 col11 ""
13 SQLCHAR 0 0 "," 12 col12 ""
14 SQLCHAR 0 0 "," 13 col13 ""
15 SQLCHAR 0 0 ",\"" 14 col14 ""
16 SQLCHAR 0 0 "\",\"" 15 col15 ""
17 SQLCHAR 0 0 "\",\"" 16 col16 ""
18 SQLCHAR 0 0 "\",\"" 17 col17 ""
19 SQLCHAR 0 0 "\",\"" 18 col18 ""
20 SQLCHAR 0 0 "\",\"" 19 col19 ""
21 SQLCHAR 0 0 "\",\"" 20 col20 ""
22 SQLCHAR 0 0 "\",\"" 21 col21 ""
23 SQLCHAR 0 0 "\",\"" 22 col22 ""
24 SQLCHAR 0 0 "\"\r\n" 23 col23 ""

And this is the BULK INSERT script:

BULK INSERT PROD

FROM 'C:\brook\PROD2.csv'

WITH ( formatfile = 'C:\brook\prodBI.txt')

GO
--Check the content of the table.
SELECT *
FROM dbo.PROD
GO
--Drop the table to clean up database.
DROP TABLE dbo.PROD
GO

Really appreciate your help and patience with this...it is driving me nuts
trying to complete this

Regards
Lez


"Erland Sommarskog" <esq...@sommarskog.se> wrote in message

news:Xns9CA4ED8E5...@127.0.0.1...

Erland Sommarskog

unread,
Oct 18, 2009, 2:37:11 PM10/18/09
to
Lez (lez_nospam_@/noSpam\hotmail.co.uk) writes:
> Ok I have tried and tried to get this to work, and have reduced the file
> down to a single line of data and keep getting this error:
>
> Msg 4832, Level 16, State 1, Line 1
> Bulk load: An unexpected end of file was encountered in the data file.

Which most likely means that you have something wrong with the quotes
somewhere. Let's have a look:

>
> "CGV4558","CGV4558",84591331,"ISL","","Adjustable Sea kayak Boat
> Bag",1,17.80000000,44.99,44.99,38.30,44.99,23.15,15.00,"-
","","","","","","","2005-06-14
> 17:34:49","2009-04-27 12:32:22"
>
> This is my file format:
>
> 9.0
> 24
> 1 SQLCHAR 0 0 "\"" 0 "" ""
> 2 SQLCHAR 0 0 "\",\"" 1 col1 ""
> 3 SQLCHAR 0 0 "\",\"" 2 col2 ""
> 4 SQLCHAR 0 0 "\",\"" 3 col3 ""
> 5 SQLCHAR 0 0 "\",\"" 4 col4 ""

Oh-oh, the third field is not quoted, so you have some quotes too many here.
When I put 84591331 in quotes, I was able to load the file. The correct
fix is of course to modify the format file, but I was too lazy to do
that. After all, I need to leave some of the fun to you! :-)

Lez

unread,
Oct 19, 2009, 11:01:51 AM10/19/09
to
Hi Erland,

Thanks for that, and I have had some fun with this!!

Like yourself, I enclosed the 3rd column in quotes and got it to insert
fine. So then I set about changing the format file to match the data, and
changed the lines for columns 2, 3 and 4 to:

3 SQLCHAR 0 0 "\"," 2 col2 ""
4 SQLCHAR 0 0 "," 3 col3 ""
5 SQLCHAR 0 0 ",\"" 4 col4 ""

Then removed the quotes from around column 3 and the single row of data
imported. However, it did include quotes around the item in column 4.

So changed line 5 to:

5 SQLCHAR 0 0 "\",\"" 4 col4 ""

As stated for a text file, which meant that the field now had a single
quote character (") in column 4.

I thought OK, add another row and see what happens, and kept getting the
following error:

Msg 8152, Level 16, State 13, Line 1
String or binary data would be truncated.
The statement has been terminated.

(0 row(s) affected)

(0 row(s) affected)

I also changed the last row to:

24 SQLCHAR 0 0 "\"" 23 col23 ""

As I noticed that the field had a quote character (") at the end, so changed
that, and it seemed to sort it.

I have also looked further into the file that is being provided and can see
that the formatting is not consistent, as I am finding columns where the
data is quoted and then, further down the data, the same column is unquoted!!
Argh... so I am going to get back to the providers and get them to make this
data consistent, and then see how it all works.

Will post back in due course with an update.

Many thanks
Lez


"Erland Sommarskog" <esq...@sommarskog.se> wrote in message

news:Xns9CA8D1C78...@127.0.0.1...

Lez

unread,
Oct 20, 2009, 8:50:22 AM10/20/09
to
Hi Erland,

Further to my post from yesterday, I have today been able to complete the
Bulk Insert routine for this task, with one exception. But first, to explain
what I did.

The issue of quoted and unquoted fields was becoming too complicated, so I
tried to look at it in another, simpler way, getting the data provider to
modify the output file to use the tilde (~) character as the field separator.
This way all I needed to do was use this script:

BULK INSERT PROD
FROM 'C:\brook\prod.csv'
WITH
(
FIELDTERMINATOR = '~',
ROWTERMINATOR = '\n'
)

This worked perfectly. Before using this character, we searched the whole of
the data we are receiving for the tilde (~) character and it did not appear
in the data anywhere, so we requested that they send the data without headers,
simply using the tilde character to separate the fields.

This worked perfectly for 2 of the 3 files we get; however, for some reason,
when I run it on the SUPP.csv I get the following error:

Bulk load data conversion error (type mismatch or invalid character for the
specified codepage) for row 1, column 16 (Currency).

Table is created as:

[suppID] [nvarchar](50) NULL,
[Supplier Ref] [nvarchar](50) NULL,

[Account Ref] [nvarchar](50) NULL,
[Supplier Name] [nvarchar](50) NULL,
[Address Line 1] [nvarchar](50) NULL,
[Address Line 2] [nvarchar](50) NULL,
[City] [nvarchar](50) NULL,
[County] [nvarchar](50) NULL,
[Country] [nvarchar](50) NULL,
[Postcode] [nvarchar](50) NULL,
[Phone] [nvarchar](50) NULL,
[Fax] [nvarchar](50) NULL,
[Email] [nvarchar](50) NULL,
[Contact] [nvarchar](50) NULL,
[Vat Code] [nvarchar](50) NULL,
[Currency] [nvarchar](50) NULL

I then tried deleting the currency field from the table and also excluded it
from the supplied data, and ended up getting the same error for the VAT
CODE, so I assume it must be the row terminator now? Yet the same row
terminator works for the 2 previous inserts!!

So thanks for your help to date; a final thought or suggestion on
completing this last step would be much appreciated.

Regards
Lez

"Erland Sommarskog" <esq...@sommarskog.se> wrote in message

news:Xns9CA8D1C78...@127.0.0.1...

Erland Sommarskog

unread,
Oct 20, 2009, 5:46:23 PM10/20/09
to
Lez (lez_nospam_@/noSpam\hotmail.co.uk) writes:
> The issue of quoted and unquoted fields was becoming too complicated, so I
> tried to look at it in another, simpler way, getting the data provider to
> modify the output file to use the tilde (~) character as the field separator.

Wise move. :-)

I think I said this in the beginning of the thread: BCP cannot cope with
inconsistent quoting. It reads a binary stream and follows the instructions.

> Bulk load data conversion error (type mismatch or invalid character for
> the specified codepage) for row 1, column 16 (Currency).

Hm, wonder what that means. The column is nvarchar, so anything should
import. Hm, I would guess there is some junk character which is not
defined in your code page. Try adding CODEPAGE = 'RAW' to the BULK INSERT
command, and see what happens.
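Not tested here, but applied to the tilde-separated load from earlier in the
thread, it would look something like this (SUPP is just a guess at the table
name, since it wasn't posted):

```sql
-- Same tilde-separated load as before, but with CODEPAGE = 'RAW' so the
-- bytes in the file are taken as-is instead of being translated through
-- the server's code page.
BULK INSERT SUPP
FROM 'C:\brook\supp.csv'
WITH
(
    FIELDTERMINATOR = '~',
    ROWTERMINATOR = '\n',
    CODEPAGE = 'RAW'
)
```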

Erland Sommarskog

unread,
Oct 21, 2009, 6:01:46 PM10/21/09
to
Lez (lez_nospam_@/noSpam\hotmail.co.uk) writes:
> Table is created as:
>
> [suppID] [nvarchar](50) NULL,
> [Supplier Ref] [nvarchar](50) NULL,
> [Account Ref] [nvarchar](50) NULL,
> [Supplier Name] [nvarchar](50) NULL,
> [Address Line 1] [nvarchar](50) NULL,
> [Address Line 2] [nvarchar](50) NULL,
> [City] [nvarchar](50) NULL,
> [County] [nvarchar](50) NULL,
> [Country] [nvarchar](50) NULL,
> [Postcode] [nvarchar](50) NULL,
> [Phone] [nvarchar](50) NULL,
> [Fax] [nvarchar](50) NULL,
> [Email] [nvarchar](50) NULL,
> [Contact] [nvarchar](50) NULL,
> [Vat Code] [nvarchar](50) NULL,
> [Currency] [nvarchar](50) NULL

And all columns are really nullable? I ask, because I was reminded of
a bug that I submitted for BULK INSERT way back in 2007, where you
get precisely the error message you get. But that bug had to do with
columns that are NOT NULL. Whence my question.

Lez

unread,
Oct 23, 2009, 7:27:54 AM10/23/09
to
Hi Erland,

Yes, they are definitely all NULL; that was a sample of the table script which
I used to make sure the table was exactly that. I am now having a problem
with the data provider, as they are saying it is going to take a CCR request
to make the modification to the export file to change it to use the ~
character instead.

This could delay the process by many weeks and might not even get done if
they reject the CCR.

Can I ask you therefore: my CUST.csv file is consistent, I have checked all
the fields as well as I can, but am struggling with the format file. I don't
want to post a sample of the data as it includes personal details, but the
file is made up as:

Customer ID,"First Name","Last Name","Address Line 1","Address Line
2","City","County","Country","Postcode","Telephone","Fax","Email"

When I run the script:

BULK INSERT PROD

FROM 'C:\brook\CUST2.csv'

WITH ( formatfile = 'C:\brook\cust.txt')

GO
--Check the content of the table.
SELECT *
FROM dbo.CUST
GO
--Drop the table to clean up database.
DROP TABLE dbo.CUST
GO

It appears to work, yet adds no records:


(5 row(s) affected)

(0 row(s) affected)

(0 row(s) affected)

This is my format file:

9.0
13
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 ",\"" 1 col1 ""
3 SQLCHAR 0 0 "\",\"" 2 col2 ""
4 SQLCHAR 0 0 "\",\"" 3 col3 ""
5 SQLCHAR 0 0 "\",\"" 4 col4 ""
6 SQLCHAR 0 0 "\",\"" 5 col5 ""
7 SQLCHAR 0 0 "\",\"" 6 col6 ""
8 SQLCHAR 0 0 "\",\"" 7 col7 ""
9 SQLCHAR 0 0 "\",\"" 8 col8 ""
10 SQLCHAR 0 0 "\",\"" 9 col9 ""
11 SQLCHAR 0 0 "\",\"" 10 col10 ""
12 SQLCHAR 0 0 "\",\"" 11 col11 ""
13 SQLCHAR 0 0 "\"" 12 col12 ""


Again any suggestions would be most welcome.

Regards
Lez


"Erland Sommarskog" <esq...@sommarskog.se> wrote in message

news:Xns9CAC5AC4...@127.0.0.1...

Lez

unread,
Oct 23, 2009, 7:59:39 AM10/23/09
to
OK, an update on that: I just noticed that the data was going into the wrong
table... doh!!

So After correcting the target table, this is the error I am getting:

Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.

Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The
provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

I have changed the row terminator to "\"\r" and also tried "\"", neither of
which has worked.

Also tried changing the column 1 terminator to "," but this also failed, as
I expected it would as the next column is a text column.

This is the structure of the table the data is going into:

CREATE TABLE [dbo].[CUST](
[Customer ID] [nvarchar](50) NULL,
[First Name] [nvarchar](50) NULL,
[Last Name] [nvarchar](50) NULL,
[Address Line 1] [nvarchar](255) NULL,
[Address Line 2] [nvarchar](255) NULL,
[City] [nvarchar](50) NULL,
[County] [nvarchar](50) NULL,
[Country] [nvarchar](50) NULL,
[Postcode] [nvarchar](50) NULL,
[Telephone] [nvarchar](50) NULL,
[Fax] [nvarchar](50) NULL,
[Email] [nvarchar](255) NULL
) ON [PRIMARY]

And again, my format file:

9.0
13
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 ",\"" 1 col1 ""
3 SQLCHAR 0 0 "\",\"" 2 col2 ""
4 SQLCHAR 0 0 "\",\"" 3 col3 ""
5 SQLCHAR 0 0 "\",\"" 4 col4 ""
6 SQLCHAR 0 0 "\",\"" 5 col5 ""
7 SQLCHAR 0 0 "\",\"" 6 col6 ""
8 SQLCHAR 0 0 "\",\"" 7 col7 ""
9 SQLCHAR 0 0 "\",\"" 8 col8 ""
10 SQLCHAR 0 0 "\",\"" 9 col9 ""
11 SQLCHAR 0 0 "\",\"" 10 col10 ""
12 SQLCHAR 0 0 "\",\"" 11 col11 ""
13 SQLCHAR 0 0 "\"" 12 col12 ""

And finally my script:

BULK INSERT CUST

FROM 'C:\brook\CUST2.csv'

WITH ( formatfile = 'C:\brook\cust.txt')

GO
--Check the content of the table.
SELECT *
FROM dbo.CUST
GO
--Drop the table to clean up database.
DROP TABLE dbo.CUST
GO

Kindest regards
Lez


"Lez" <lez_nospam_@/noSpam\hotmail.co.uk> wrote in message

news:0CEE76D3-39DB-42FA...@microsoft.com...

Lez

unread,
Oct 23, 2009, 9:56:33 AM10/23/09
to
OK, I have now tried something to ensure all the data is consistent, to get
this to work in some way. I have created a file with a few records as:

"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"
"lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez","lez"

And using my format file:

9.0
13
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 "\",\"" 1 col1 ""
3 SQLCHAR 0 0 "\",\"" 2 col2 ""
4 SQLCHAR 0 0 "\",\"" 3 col3 ""
5 SQLCHAR 0 0 "\",\"" 4 col4 ""
6 SQLCHAR 0 0 "\",\"" 5 col5 ""
7 SQLCHAR 0 0 "\",\"" 6 col6 ""
8 SQLCHAR 0 0 "\",\"" 7 col7 ""
9 SQLCHAR 0 0 "\",\"" 8 col8 ""
10 SQLCHAR 0 0 "\",\"" 9 col9 ""
11 SQLCHAR 0 0 "\",\"" 10 col10 ""
12 SQLCHAR 0 0 "\",\"" 11 col11 ""
13 SQLCHAR 0 0 "\"" 12 col12 ""

Expected this to work, but still getting an error:

Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The
provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".


BULK INSERT CUST

FROM 'C:\brook\CUST2.csv'

WITH ( formatfile = 'C:\brook\cust.txt')

However, when I run this script the file imports perfectly

BULK INSERT CUST
FROM 'C:\brook\cust2.csv'
WITH
(
FIELDTERMINATOR = '","',
ROWTERMINATOR = '"\n"'
)

Kindest regards
Lez

"Lez" <lez_nospam_@/noSpam\hotmail.co.uk> wrote in message

news:C5F2B18A-5188-453A...@microsoft.com...

Erland Sommarskog

unread,
Oct 23, 2009, 5:48:45 PM10/23/09
to
Lez (lez_nospam_@/noSpam\hotmail.co.uk) writes:
> I have changed the row terminator from "\"\r" and also tried "\""
> neither of which has worked.

If it's a plain Windows file, the terminator for the last field
should be \r\n. (Or \"\r\n if the last field is enclosed in quotes.)
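For the 13-field CUST format file earlier in the thread, that would mean
changing the last line to something like this (assuming the file really does
end each line with CR-LF):

```
13 SQLCHAR 0 0 "\"\r\n" 12 col12 ""
```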
