Ansible...SQLParseError: PostgreSQL does not support table with more than 3 dots
Question
I am editing and extending an Ansible playbook built by a third party.
I added a task which runs a query on a DB and dumps the data into a CSV file:
- name: run query1 and dump data in csv
  community.postgresql.postgresql_copy:
    login_host: '{{ db_host }}'
    login_user: '{{ db_username }}'
    login_password: '{{ db_password }}'
    db: '{{ db_database }}'
    port: '{{ db_database_port }}'
    src: "{{ lookup('template', 'query1.sql.j2') }}"
    copy_to: "{{ log_base_path }}/results1.csv"
    options:
      format: csv
      delimiter: ';'
      header: yes
The content of query1.sql.j2 is:
SELECT
    KEY,
    idint,
    barcode,
    systemLV1,
    systemLV2,
    n.attribute1 AS attribute1_master,
    c.attribute1 AS attribute1_systemLV2,
    n.attribute2 AS attribute2_master,
    c.attribute2 AS attribute2_systemLV2,
    n.attribute3 AS attribute3_master,
    c.attribute3 AS attribute3_systemLV2,
    n.attribute1 = c.attribute1 AS check_attribute1_systemLV2,
    n.attribute2 = c.attribute2 AS check_attribute2_systemLV2,
    n.attribute3 = c.attribute3 AS check_attribute3_systemLV2
FROM
    check_items_stechsystem2_ctrl2 c
    LEFT JOIN check_items_stechsystem1_ctrl2 n USING (KEY, idint, barcode, systemLV1)
WHERE
    (c.attribute1, c.attribute2, c.attribute3) != (n.attribute1, n.attribute2, n.attribute3);
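A side note on the predicate above, unrelated to the Ansible error: since the attribute columns are nullable, a row-wise comparison with != evaluates to NULL whenever any operand is NULL, so such rows are silently excluded from the result. If mismatches involving NULL should also be reported, PostgreSQL's IS DISTINCT FROM treats NULL as a comparable value; a possible variant of the WHERE clause:

```sql
-- Variant that also reports rows where one side is NULL
WHERE (c.attribute1, c.attribute2, c.attribute3)
      IS DISTINCT FROM
      (n.attribute1, n.attribute2, n.attribute3);
```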
When this task is run, the following error is raised:
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: ansible_collections.community.postgresql.plugins.module_utils.database.SQLParseError: PostgreSQL does not support table with more than 3 dots
localhost failed | msg: MODULE FAILURE
See stdout/stderr for the exact error
What does this error mean?
I can find very little information about it online.
I read in this issue page that the error is somehow related to some non-ASCII character. If so, where should I look for the non-ASCII character that breaks my playbook?
Both tables in my query have the following structure:
   Column   |   Type   | Collation | Nullable | Default
------------+----------+-----------+----------+---------
 key        | text     |           | not null |
 idint      | text     |           |          |
 barcode    | text     |           |          |
 systemLV1  | integer  |           |          |
 systemLV2  | text     |           | not null |
 attribute1 | numeric  |           |          |
 attribute2 | smallint |           |          |
 attribute3 | bigint   |           |          |
But the text-type columns are populated from columns of the CSV file which contain stringified numeric (possibly zero-padded) data.
When I run the query manually on the DB, I get no error, and the data don't contain any strange characters.
Answer 1
Score: 1
Substituting all the multiple spaces and newline characters in the query solved the issue.
Yet this is very strange, since other tasks in that playbook run queries containing newline characters.
Since the traceback indicates the issue is a SQLParseError, I guess some "bad" newline/tab/space character got into the query string while I was formatting it in a text editor.
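A plausible mechanism, guessed from the error text rather than confirmed against the collection's source: when postgresql_copy does not recognize src as a query, it may treat the whole string as a relation name and run it through the collection's identifier quoting, which splits the name on dots and rejects anything with more than three of them. A minimal sketch of that kind of check (quote_identifier here is a hypothetical stand-in, not the actual module code):

```python
def quote_identifier(name: str) -> str:
    """Hypothetical sketch of dot-based identifier validation,
    mimicking the error text from module_utils/database.py."""
    parts = name.split(".")
    if len(parts) > 4:  # i.e. more than 3 dots
        raise ValueError(
            "PostgreSQL does not support table with more than 3 dots"
        )
    return ".".join('"{}"'.format(p.strip('"')) for p in parts)

# A plain table name passes validation:
print(quote_identifier("check_items_stechsystem2_ctrl2"))

# But a whole query mistaken for a table name fails, because every
# alias reference (n.attribute1, c.attribute1, ...) adds a dot:
query = "SELECT n.attribute1, c.attribute1, n.attribute2, c.attribute2 FROM t"
try:
    quote_identifier(query)
except ValueError as exc:
    print(exc)
```

Under this reading, stray whitespace would matter because it breaks whatever pattern the module uses to decide that src is a query rather than a table name.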
It is also possible that I copied parts of the query from a chat app such as MS Teams, which is notorious for mangling code.
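If whitespace is indeed the trigger, one defensive option is to collapse all whitespace runs in the rendered template to single spaces before handing it to the module. A sketch using the standard Ansible regex_replace and trim filters:

```yaml
    src: "{{ lookup('template', 'query1.sql.j2') | regex_replace('\\s+', ' ') | trim }}"
```

This keeps the template file readable while guaranteeing the module only ever sees plain single spaces.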