Two questions - one very specific - during the load of a few test rows I
get some of them into the data properly, but some are rejected with the
error message - "Attempt to bulk-copy an oversized row to the SQL Server."
That sounds like there are some rows that are somehow getting dumped to
the disk file with some extra field delimiters, but I really have no idea.
Any hints?
Question 2 - Is there a way to automate dropping the indices, running BCP,
and rebuilding the indices? I guess I either need to be able to call
ISQL/W (or something similar) from the command line, or call BCP from a SQL
script someplace. I don't know how to do either of these things.
--
Richard Ray
Excellence by Design, Inc.
Jackson Hole, WY
Dan
Richard Ray <richa...@xbd.com> wrote in article
<richard_ray-31...@jac-unl-4.wyoming.com>...
> I'm a newbie to MS SQL Server (Mac/4D/UNIX/Informix background). I'm
> porting the data for an existing application (a Web front end to the data)
> to MS SQL Server. Fairly large amounts of data will be imported into the
> data file regularly and my reading suggests BCP as the way to go. I also
> understand that I should do some option setting, drop the indices before
> the import, and rebuild the clustered indices before the non-clustered ones
> when all is completed.
>
> Two questions - one very specific - during the load of a few test rows I
> get some of them into the data properly, but some are rejected with the
> error message - "Attempt to bulk-copy an oversized row to the SQL Server."
> That sounds like there are some rows that are somehow getting dumped to
> the disk file with some extra field delimiters, but I really have no idea.
> Any hints?
If you are not doing so already, create a format file to use when running
BCP. Once you have created a format file, verify that the character fields
in your input file are no larger than the sizes of the columns you have
defined in your table. When I have run into this problem it's usually with
character fields. You can use the format file to selectively import data
from your file until you have determined which file fields are larger than
the columns you are trying to import them into.
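For reference, a minimal 6.x-style bcp format file for a three-column,
tab-delimited file might look something like this (the column names and
widths here are made up for illustration - substitute your own):

```
6.0
3
1   SQLCHAR   0   10   "\t"     1   cust_id
2   SQLCHAR   0   25   "\t"     2   cust_name
3   SQLCHAR   0   5    "\r\n"   3   state
```

The fourth column is the host-file data length, and setting a field's
server column order (the sixth column) to 0 tells bcp to skip that field,
which is one way to do the selective importing described above.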
>
> Question 2 - Is there a way to automate dropping the indices, running BCP
> and rebuilding the indices? I guess I either need to be able to call
> ISQL/W (or something similar) from the command line, or call BCP from a
> SQL script someplace. I don't know how to do either of these things.
I think that within SQL Executive you can schedule a .bat file to run.
Create a .bat file using isql command lines first to drop the indexes,
followed by bcp to load, then isql again to recreate the indexes. If I'm
not mistaken you can then schedule that .bat file to run whenever you like
from within SQL Executive.
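As a rough sketch, such a .bat file could look like this (the server name,
login, paths, and script names below are placeholders, not anything from
your setup, and the SQL scripts themselves you would write separately):

```
rem nightly_load.bat -- drop indexes, bulk copy, recreate indexes
isql /S MYSERVER /U sa /P secret /i c:\scripts\drop_indexes.sql
bcp mydb..mytable in c:\data\mytable.dat /S MYSERVER /U sa /P secret /f c:\scripts\mytable.fmt /e c:\data\bcp_errors.log
isql /S MYSERVER /U sa /P secret /i c:\scripts\create_indexes.sql
```

The /e switch writes rejected rows to an error file, which is handy for
tracking down exactly which rows bcp is refusing.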
Good Luck
dan
dsc...@ameritech.net
In general, bcp doesn't handle variable-length fields very well. If
you can convert these to fixed-length fields, bcp will work better.
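One way to do that conversion outside the server is a small script. Here's
a sketch, assuming tab-delimited input; the column widths are made-up
examples, not anything from the original poster's schema:

```python
# Pad each tab-delimited field out to a fixed width, and complain loudly
# about any field that is already too long (the likely "oversized row"
# culprits). Widths are hypothetical.
COLUMN_WIDTHS = [10, 25, 5]  # e.g. cust_id, cust_name, state

def to_fixed_width(line, widths=COLUMN_WIDTHS):
    """Return one fixed-width record built from a tab-delimited line."""
    fields = line.rstrip("\n").split("\t")
    out = []
    for value, width in zip(fields, widths):
        if len(value) > width:
            raise ValueError("field %r exceeds width %d" % (value, width))
        out.append(value.ljust(width))  # pad with spaces to the column size
    return "".join(out)
```

Run the export file through this once and bcp gets uniform records with no
delimiter surprises.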
Rick Carroll <tra...@msn.com> wrote in article
<3467ccf3....@news.nac.net>...
"Attempt to bulk-copy an oversized row to the SQL Server."
means that some of the field lengths in the data you are BCPing are
longer than the column lengths in the table receiving the data. For
example, sending "Customer" into a varchar(5) would result in the
above error.
You might want to try using the datalength function to determine which
fields are being cut off, like this:
select max(datalength(column_name))
from table_name
If the output is the same length as the column size, you may want to
consider recreating the table with a larger column size. Hope this
helps.
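If you'd rather catch the long values before they ever reach the server,
you can also scan the bcp data file itself. A small sketch (tab delimiters
and the column widths are assumptions; adjust to your table):

```python
# Report every tab-delimited field in the data file that is longer than
# the width of the column it is headed for. Widths are hypothetical.
def find_oversized(lines, widths):
    """Yield (line_no, field_no, value) for each field exceeding its width."""
    for line_no, line in enumerate(lines, start=1):
        fields = line.rstrip("\n").split("\t")
        for field_no, (value, width) in enumerate(zip(fields, widths), start=1):
            if len(value) > width:
                yield (line_no, field_no, value)
```

Pointing this at the export file (e.g. `find_oversized(open("mytable.dat"),
[10, 25, 5])`) lists the exact rows and fields that would trigger the
oversized-row error.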
On Thu, 13 Nov 1997 10:35:36 +0000, Darrel Knight
<Darrel...@ccm.isw.intel.com> wrote: