Data from Sybase image column truncated at 32 KiB when retrieved via pyodbc
Question
I have PDF files stored as the image datatype (large binary data, as described in the docs) in a Sybase database table. I am trying to read one of those files from the database and write it to a file in a local folder using the Python pyodbc package, like in this example:
import pyodbc

driver = "FreeTDS"
prt = 'port'
db = 'db'
passwd = 'passwd'
usr = 'usr'
serv = 'serv'
conn = pyodbc.connect(driver=driver, server=serv, port=prt, uid=usr, pwd=passwd)
sql_query = (
"SELECT ARCH_DOC_DOC as file_content FROM table_name WHERE ARCH_DOC_ID = id"
)
cursor = conn.cursor()
cursor.execute(sql_query)
pdf_data = cursor.fetchone()[0]
with open('my_test_file.pdf', 'wb') as f:
f.write(pdf_data)
I am using the FreeTDS driver and running this code on a Debian GNU/Linux 11 machine:
Compile-time settings (established with the "configure" script)
Version: freetds v1.2.3
freetds.conf directory: /etc/freetds
MS db-lib source compatibility: no
Sybase binary compatibility: yes
Thread safety: yes
iconv library: yes
TDS version: auto
iODBC: no
unixodbc: yes
SSPI "trusted" logins: no
Kerberos: yes
OpenSSL: no
GnuTLS: yes
MARS: yes
The problem is that I end up with a corrupt file, and after testing a couple of files I noticed that I always get a file of about 33 KB. For example, the original file I am using to test is 90 KB in the database, but the file I get back is only 33 KB. So I am wondering whether the issue is in the database/driver configuration, or whether there is a limit on the size of data that I can read with pyodbc? And how can I fix that?
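A quick way to confirm that the files are truncated (cut off partway through) rather than corrupted in some other way is to check for the PDF header and trailer markers; a truncated download keeps the header but loses the trailer. This is a minimal sketch, and the byte strings below are illustrative stand-ins, not real PDF content:

```python
def looks_complete(pdf_bytes: bytes) -> bool:
    # A well-formed PDF starts with "%PDF" and carries a "%%EOF"
    # marker near the end; a file cut off mid-stream loses the marker.
    return pdf_bytes.startswith(b"%PDF") and b"%%EOF" in pdf_bytes[-1024:]

# Illustrative stand-ins for a complete and a truncated document.
complete = b"%PDF-1.4\n" + b"x" * 100 + b"\n%%EOF\n"
truncated = complete[:50]  # cut off mid-stream, like the 33 KB files

print(looks_complete(complete))   # True
print(looks_complete(truncated))  # False
```

If the 33 KB files fail this check while starting with `%PDF`, the data is being truncated on the way out of the database rather than garbled.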
Answer 1
Score: 1
This is a reproducible issue, discussed here:
https://github.com/mkleehammer/pyodbc/issues/1226
As a workaround, we can use JayDeBeApi and jTDS, like so:
import jaydebeapi
cnxn = jaydebeapi.connect(
"net.sourceforge.jtds.jdbc.Driver",
"jdbc:jtds:sybase://192.168.0.199:5000/mydb;useLOBs=false",
["sa", "myPassword"],
"/home/gord/Downloads/jtds-1.3.1.jar"
)
crsr = cnxn.cursor()
crsr.execute("SELECT ARCH_DOC_DOC FROM so76408133 WHERE ARCH_DOC_ID = 1")
pdf_data = crsr.fetchone()[0]
with open("test_pdf", "wb") as f:
f.write(pdf_data)
Note that this requires a Java Runtime Environment (JRE). On Ubuntu/Debian, it can be installed via:
sudo apt install default-jre
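Before switching drivers, it may also be worth ruling out a client-side cap: FreeTDS has a `text size` setting in freetds.conf that limits how many bytes of a text/image column are returned per read. This is a general FreeTDS option, not something the linked issue confirms as a fix for this particular truncation, so treat it as a sketch of one thing to try:

```ini
; /etc/freetds/freetds.conf (location taken from the build settings above)
[global]
    ; maximum bytes returned for text/image columns;
    ; raise it well above the largest stored PDF
    text size = 10485760
```

If the 33 KB cutoff persists after raising this limit and reconnecting, the cause is more likely the pyodbc-side behavior tracked in the issue above.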