Hi SQLAlchemy,

System information:
- Mac OS X v. 10.15.7
- Python v. 3.8.5
- SQLAlchemy v. 1.3.19
- MS SQL Server 2017 and 2019 (both Enterprise and Docker images e.g. mcr.microsoft.com/mssql/server:2019-latest)
Problem: I have an issue with inserting strings longer than 2000 characters into columns with a datatype of Text, VARCHAR, or NVARCHAR using MS SQL Server 2017 and 2019.

I've checked the server side, and SQL Server creates the column properly, e.g. with a datatype of varchar(max) for Text and VARCHAR, which should be able to store strings up to 2 GB in size according to Microsoft's documentation. Furthermore, I've tried using other collations, but I need _SC (supplementary character) support for my applications, so I cannot drop it, and adding _UTF8 (UTF-8) doesn't solve the problem either.

Why am I not allowed to store strings longer than 2000 characters?
And why is SQLAlchemy displaying that error message when trying to insert plain ASCII text ("AAAA...")?
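A minimal sketch of the failing pattern described above. The connection URL, driver name, and table name are all placeholders (assumptions), and the database work is wrapped in a function that is defined but not called, since it needs a live SQL Server:

```python
# Reproduction sketch: inserting a >2000-character string into a
# varchar(max) column via SQLAlchemy 1.3 + pyodbc.

PAYLOAD = "A" * 2001  # plain ASCII, one character past the failing threshold


def reproduce(url="mssql+pyodbc://user:pass@myserver/mydb?driver=ODBC+Driver+17+for+SQL+Server"):
    """Run the failing insert against a real SQL Server (not executed here)."""
    from sqlalchemy import create_engine  # lazy import; needs a live server

    engine = create_engine(url)
    with engine.connect() as conn:
        conn.execute("CREATE TABLE demo (body VARCHAR(max))")
        # Fails on SQL Server 2017/2019 once the parameter exceeds 2000 chars:
        conn.execute("INSERT INTO demo (body) VALUES (?)", (PAYLOAD,))
```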
--
SQLAlchemy - The Python SQL Toolkit and Object Relational Mapper

To post example code, please provide an MCVE: Minimal, Complete, and Verifiable Example. See http://stackoverflow.com/help/mcve for a full description.
---
You received this message because you are subscribed to the Google Groups "sqlalchemy" group.
To unsubscribe from this group and stop receiving emails from it, send an email to sqlalchemy+...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/sqlalchemy/ca549391-4360-480e-8c58-06577f6d92dan%40googlegroups.com.
Thank you, Mike - very much appreciated!

Just to be clear, pyodbc is not a driver, it's an ODBC-compliant DBAPI, right? I separately downloaded a driver for SQL Server from Microsoft, which pyodbc makes use of.
Do you suggest that changing pyodbc to another SQL Server DBAPI would solve the problem?
If so, can you recommend another DBAPI for SQL Server? Or do you think that the problem is caused by Microsoft’s driver?
I'm pretty sure SQL Server works fine when accessed using .NET; otherwise the Internet would be full of complaints about not being able to insert more than 2000 characters into a varchar(max).
Hi Mike,

I've now tested inserting strings with more than 2000 characters using Azure Data Studio (a SQL Server GUI) and everything works. Furthermore, I've tested pyodbc (the DBAPI) directly, and it only fails when inserting such strings using parameterised SQL queries (it succeeds without parameterised queries).
You can see my POC below, if you have any interest. I guess it should be submitted as a bug to pyodbc... Do you know if I can disable parameterisation for certain SQL queries in SQLAlchemy?
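A hedged sketch of the experiment described above (the table name and DSN are assumptions, and `run_poc()` is defined but not called since it needs a live SQL Server). Only the statement strings are built eagerly:

```python
# Parameterised vs. literal insert of a long string via pyodbc.

PAYLOAD = "A" * 3000  # plain ASCII, well past the 2000-character threshold

# Parameterised form -- the one that fails against SQL Server 2017/2019:
PARAMETERISED = "INSERT INTO demo (body) VALUES (?)"

# Literal form -- the same data succeeds when inlined into the statement
# (fine for a throwaway test; never build real queries this way):
LITERAL = "INSERT INTO demo (body) VALUES ('%s')" % PAYLOAD


def run_poc(connection_string="DSN=mssql"):
    import pyodbc  # imported lazily so the sketch loads without the driver

    cnxn = pyodbc.connect(connection_string, autocommit=True)
    cur = cnxn.cursor()
    cur.execute(PARAMETERISED, PAYLOAD)  # raises
    cur.execute(LITERAL)                 # works
```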
Hi Mike,
I have created an issue for pyodbc: https://github.com/mkleehammer/pyodbc/issues/835#issuecomment-708930870

I've gotten really good feedback there from Microsoft, and a fix has been proposed that works:

"You can try to use setinputsizes on your parameter to tell it to send as varchar(max): cursor.setinputsizes([(pyodbc.SQL_WVARCHAR,0,0),])"

I'm by no means an SQLAlchemy expert, but shouldn't the pyodbc dialect be updated to support varchar(max) using the proposed method? If not, how can I execute cursor.setinputsizes([(pyodbc.SQL_WVARCHAR,0,0),]) using SQLAlchemy, so that I can make use of varchar(max) in my application?
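One way to reach the raw pyodbc cursor from SQLAlchemy 1.3 is the `before_cursor_execute` connection event, which fires with the DBAPI cursor just before each statement runs. This is a workaround sketch, not the dialect-level fix discussed here; it assumes a single row of positional parameters (executemany would need different handling). `SQL_WVARCHAR` is the ODBC type code -9, equal to `pyodbc.SQL_WVARCHAR`, written as a literal here so the sketch loads without the driver:

```python
SQL_WVARCHAR = -9  # ODBC code for nvarchar; matches pyodbc.SQL_WVARCHAR


def send_as_nvarchar_max(conn, cursor, statement, parameters, context, executemany):
    # (SQL_WVARCHAR, 0, 0) tells the driver to bind each parameter as
    # nvarchar(max) instead of a length-limited nvarchar.
    cursor.setinputsizes([(SQL_WVARCHAR, 0, 0)] * len(parameters))


# Attach it to an engine (not executed here, since it needs a live engine):
# from sqlalchemy import event
# event.listen(engine, "before_cursor_execute", send_as_nvarchar_max)
```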
"_SC in my application, and it caused errors."

Can you be more specific about these errors? At the moment, this is suggesting a major architectural rework of the pyodbc dialect to support a use case which has other workarounds. The architecture of SQLAlchemy's set_input_sizes() hook has changed, and at best this would be part of 1.4, which is not in beta release yet; a production release is not for several months.
Is it really necessary to use your very subtle vendored version of the set_input_sizes() hook? Why use it compared to Simon King's simple version?
Yep, I misunderstood what setinputsizes was doing. I thought it told pyodbc how it should handle a particular datatype, rather than telling it how to handle the set of parameters it is about to receive in the next execute() call... Sorry for adding to the confusion,