Bulk insert fails, but no error thrown


sylvest...@gmail.com

Jun 5, 2015, 10:39:33 AM
to pyo...@googlegroups.com
Hi,
I am using Python 2.7.2 on Windows, pyodbc 2.1.8, and SQLAlchemy 0.7.6. SQLAlchemy is used only to obtain the raw cursor, which is then used to execute queries.

I just ran into the following issue with BULK INSERT statements:

I have a table with four columns and a primary key constraint. If the primary key constraint is violated, I get an error, as expected.
However, if I insert 'NaN' into a floating-point column, no error is thrown. The BULK INSERT is actually halted partway, and not all of the data is inserted. Here is my query (with sensitive info changed):

BULK INSERT DB_NAME..TBL_NAME FROM '\\file_on_server' WITH (FIELDTERMINATOR= ',',ROWTERMINATOR= '\n',FIRE_TRIGGERS,CHECK_CONSTRAINTS, MAXERRORS= 0, BATCHSIZE= 1000)

The offending row looks like this:
SOME_STR,SOME_DT,SOME_INT,nan

Here is the code that runs this query:
# Unwrap the SQLAlchemy connection down to the raw DB-API (pyodbc) connection
sa_conn = engine.connect()
raw_conn = sa_conn.connection.connection
cursor = raw_conn.cursor()

# Enable autocommit for the BULK INSERT, remembering the original setting
orig_autocommit = raw_conn.autocommit
raw_conn.autocommit = True
res = cursor.execute(sql)

If I run this in SQL Server Management Studio I get the following:

Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 388594, column 4 (COL_NAME).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

Why does the cursor.execute statement not throw this error? What can I do to get access to it?

I also posted this question on the sqlalchemy group earlier.

Thanks
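[Editor's note: a likely explanation, not confirmed in this thread, is that a multi-batch BULK INSERT (here, BATCHSIZE=1000) returns one result set per batch; pyodbc's execute() only consumes the first, so an error raised by a later batch does not surface until cursor.nextset() is called. A minimal sketch of draining the remaining result sets, with all names hypothetical:]

```python
# Sketch: step through every result set after executing a multi-batch
# statement, so deferred batch errors surface as Python exceptions.
# Assumes a DB-API-style cursor exposing nextset() (pyodbc provides this).

def drain_results(cursor):
    """Advance through all remaining result sets; returns how many there were.
    With pyodbc, an error from a later BULK INSERT batch is raised
    from nextset() rather than from the original execute() call."""
    count = 1  # execute() already consumed the first result set
    while cursor.nextset():  # True while more result sets remain
        count += 1
    return count

# Intended use against a real pyodbc cursor (hypothetical SQL):
#   cursor.execute("BULK INSERT ...")
#   drain_results(cursor)   # a pyodbc.Error from a failed batch is raised here
```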

cdo...@gmail.com

Feb 28, 2017, 6:17:07 PM
to pyodbc
Hi, I had exactly the same issue. I created a connection to SQL Server using pyodbc, then used a cursor to execute a BULK INSERT with a file containing 4000 rows. The insertion failed in the middle and only half the data was inserted, but cursor.execute() did not return any error or throw any exception. Executing the same command in freesqsh always shows the exception. Can some expert show me how to capture the insertion-failure exception? Thanks