
What does this BCP error message mean?


Richard Ray

Oct 31, 1997, 3:00:00 AM

I'm a newbie to MS SQL Server (Mac/4D/UNIX/Informix background). I'm
porting the data for an existing application (a Web front end to the data)
to MS SQL Server. Fairly large amounts of data will be imported into the
data file regularly and my reading suggests BCP as the way to go. I also
understand that I should do some option setting, drop the indices before
the import and rebuild the clustered indices before the non-clustered ones
when all is completed.

Two questions - one very specific - during the load of a few test rows I
get some of them into the data properly, but some are rejected with the
error message - "Attempt to bulk-copy an oversized row to the SQL Server."
That sounds like there are some rows that are somehow getting dumped to
the disk file with some extra field delimiters, but I really have no idea.
Any hints?

Question 2 - Is there a way to automate dropping the indices, running BCP
and rebuilding the indices? I guess I need to be able either to call
ISQL/W (or something similar) from the command line, or to call BCP from a
SQL script someplace. I don't know how to do either of these things.

--
Richard Ray
Excellence by Design, Inc.
Jackson Hole, WY

Dan Scheck

Nov 1, 1997, 3:00:00 AM

Richard,
See my responses below.

Dan

Richard Ray <richa...@xbd.com> wrote in article
<richard_ray-31...@jac-unl-4.wyoming.com>...


> I'm a newbie to MS SQL Server (Mac/4D/UNIX/Informix background). I'm
> porting the data for an existing application (a Web front end to the data)
> to MS SQL Server. Fairly large amounts of data will be imported into the
> data file regularly and my reading suggests BCP as the way to go. I also
> understand that I should do some option setting, drop the indices before
> the import and rebuild the clustered indices before the non-clustered ones
> when all is completed.
>
> Two questions - one very specific - during the load of a few test rows I
> get some of them into the data properly, but some are rejected with the
> error message - "Attempt to bulk-copy an oversized row to the SQL Server."
> That sounds like there are some rows that are somehow getting dumped to
> the disk file with some extra field delimiters, but I really have no idea.
> Any hints?

If you are not already using one, create a format file to use when running
BCP. Once you have created the format file, verify that the character
fields in your input file are no larger than the sizes of the columns you
have defined in your table. When I have run into this problem it has
usually been with character fields. You can also use the format file to
selectively import data from your file until you have determined which
file fields are larger than the columns you are trying to import them into.
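As a concrete sketch, a 6.x-style bcp format file for a hypothetical
three-column character table might look like the following (every table,
column, and file name here is made up for illustration; the fourth column
is the host-file field length, which is the number to check against your
table's column definitions):

    6.0
    3
    1   SQLCHAR   0   30   "\t"     1   cust_name
    2   SQLCHAR   0   50   "\t"     2   cust_addr
    3   SQLCHAR   0   12   "\r\n"   3   cust_phone

A bcp invocation using it might then be (server, database, and login are
placeholders):

    bcp mydb..customers in customers.txt -f customers.fmt -S myserver -U sa -P

Shrinking the host-file lengths one field at a time is one way to isolate
which field is overflowing its destination column.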

>
> Question 2 - Is there a way to automate dropping the indices, running BCP
> and rebuilding the indices? I guess I need to be able either to call
> ISQL/W (or something similar) from the command line, or to call BCP from a
> SQL script someplace. I don't know how to do either of these things.

I think that within SQL Executive you can schedule a .bat file to run.
Create a .bat file using isql command lines first to drop the indexes,
followed by bcp to load the data, then isql again to recreate the indexes.
If I'm not mistaken, you can then schedule this .bat file to run whenever
you like from within SQL Executive.
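A minimal sketch of such a .bat file (every name here - server, login,
database, and file names - is a placeholder, not something from this
thread):

    rem load_table.bat -- drop indexes, bulk load, recreate indexes
    isql -S myserver -U sa -P secret -i drop_indexes.sql
    bcp mydb..customers in customers.txt -f customers.fmt -S myserver -U sa -P secret
    isql -S myserver -U sa -P secret -i create_indexes.sql

Here drop_indexes.sql would contain the DROP INDEX statements and
create_indexes.sql the CREATE INDEX statements, with the clustered index
created before the non-clustered ones, as described above.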

Good Luck
dan
dsc...@ameritech.net

Rick Carroll

Nov 8, 1997, 3:00:00 AM

>I'm a newbie to MS SQL Server (Mac/4D/UNIX/Informix background). I'm
>porting the data for an existing application (a Web front end to the data)
>to MS SQL Server. Fairly large amounts of data will be imported into the
>data file regularly and my reading suggests BCP as the way to go. I also
>understand that I should do some option setting, drop the indices before
>the import and rebuild the clustered indices before the non-clustered ones
>when all is completed.
>
>Two questions - one very specific - during the load of a few test rows I
>get some of them into the data properly, but some are rejected with the
>error message - "Attempt to bulk-copy an oversized row to the SQL Server."
>That sounds like there are some rows that are somehow getting dumped to
>the disk file with some extra field delimiters, but I really have no idea.
>Any hints?

In general, bcp doesn't handle variable length fields very well. If
you can convert these to fixed-length bcp will work better.
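For instance, a format file can declare fixed host-file widths by using an
empty terminator, so bcp reads each field by position instead of by
delimiter (a hypothetical two-column sketch, not taken from the thread):

    6.0
    2
    1   SQLCHAR   0   30   ""       1   cust_name
    2   SQLCHAR   0   12   "\r\n"   2   cust_phone

This only works if the input file is actually padded to those fixed
widths, so it may require reformatting the export on the source system.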


Bill Altmann

Nov 10, 1997, 3:00:00 AM

It could also be that you have more columns in the input file than in the
destination table.


Rick Carroll <tra...@msn.com> wrote in article
<3467ccf3....@news.nac.net>...

Darrel Knight

Nov 13, 1997, 3:00:00 AM

Louis Lariviere wrote:

>
> Rick Carroll wrote:
> >
> > >I'm a newbie to MS SQL Server (Mac/4D/UNIX/Informix background). I'm
> > >porting the data for an existing application (a Web front end to the data)
> > >to MS SQL Server. Fairly large amounts of data will be imported into the
> > >data file regularly and my reading suggests BCP as the way to go. I also
> > >understand that I should do some option setting, drop the indices before
> > >the import and rebuild the clustered indices before the non-clustered ones
> > >when all is completed.
> > >
> > >Two questions - one very specific - during the load of a few test rows I
> > >get some of them into the data properly, but some are rejected with the
> > >error message - "Attempt to bulk-copy an oversized row to the SQL Server."
> > >That sounds like there are some rows that are somehow getting dumped to
> > >the disk file with some extra field delimiters, but I really have no idea.
> > >Any hints?
> >
> > In general, bcp doesn't handle variable length fields very well. If
> > you can convert these to fixed-length bcp will work better.
>
> I have had that exact message when trying to bcp a smalldatetime field
> with a value larger than the max allowed (June 6, 2079). I ended up
> changing my column to datetime (whose max is Dec 31, 9999).
I have seen this where the dates in the input file were in US format
(MM/DD/YY) and I was bcping into a server configured on an NT machine with
dates set to international/UK format (DD/MM/YY). That may be worth
checking.
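If the mismatch is at the session level, something like the following
before an isql-based load can help. Note, as an assumption to verify
rather than a confirmed fact, that bcp itself follows the server's default
language setting rather than SET DATEFORMAT, so this mainly applies to
loads driven through isql:

    set dateformat dmy
    go

Otherwise, reformatting the dates in the input file to match the server's
expected order may be the safer fix.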


Christopher M. Jones

Nov 14, 1997, 3:00:00 AM

The error message you're getting,

"Attempt to bulk-copy an oversized row to the SQL Server."

means that some of the field lengths in the data you are BCPing are
longer than the column lengths in the table receiving the data. For
example, sending "Customer" into a varchar(5) would result in the
above error.

You might want to try using the datalength function to see which columns
the loaded data is already filling to capacity, like this:

select max(datalength(column_name))
from table_name

If the output is the same length as the column size, you may want to
consider recreating the table with a larger column size. Hope this
helps.
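To compare that maximum against the defined column size without opening
the table definition, the system catalog can be queried as well (the table
name is a placeholder):

    select name, length
    from syscolumns
    where id = object_id('table_name')

If max(datalength(column_name)) equals the length reported here for that
column, the column is a likely candidate for widening before reloading.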

