Spark SQL cannot use current date in subquery

Question


I have a process which posts logs into a Spark database:

[query omitted in source]

I'm trying to make the SQL more dynamic by supplying the current date in the query like so, but when I do it returns 0:

[query omitted in source]

Running the subquery on its own works:
[subquery omitted in source]

Can anyone help please?

Answer 1

Score: 0


The issue with the query is that you are adding single quotes around the pattern generated by the subquery. This means the LIKE operator will only match rows where the batch_insert_Date column starts with a literal single quote, followed by the date prefix generated by the subquery. To fix this, remove the single quotes from the pattern generated by the subquery. Here's how you can modify the query:

SELECT * FROM my_table
WHERE batch_insert_Date LIKE
(SELECT CONCAT(cast(date_format(current_date(),'yyyyMMdd') as string), '%') AS date_col)

In this modified query, the CONCAT() function is used to concatenate the date prefix generated by the subquery with the % wildcard character, which matches any number of characters. The LIKE operator is then used to match the batch_insert_Date column with the pattern generated by the subquery.

This query should return all rows from the table where the batch_insert_Date column starts with the current date in the yyyyMMdd format.
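Spark isn't needed to see why the quoted pattern returns 0 rows; the same behavior reproduces in any SQL engine. Below is a minimal sketch using Python's built-in sqlite3 module, where `||` stands in for Spark's `CONCAT()` and `strftime('%Y%m%d', ...)` (with a fixed date, for reproducibility) stands in for `date_format(current_date(), 'yyyyMMdd')`. The table rows are made up for illustration.

```python
import sqlite3

# Illustrative data: the table and column names come from the answer's
# query, the row values are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (batch_insert_Date TEXT, message TEXT)")
conn.executemany(
    "INSERT INTO my_table VALUES (?, ?)",
    [("20230714_010922", "today's batch"), ("20230601_120000", "older batch")],
)

# Broken: single quotes wrapped around the subquery's result become part of
# the pattern itself, so LIKE looks for rows starting with a literal quote.
broken = conn.execute(
    "SELECT count(*) FROM my_table WHERE batch_insert_Date LIKE "
    "(SELECT '''' || strftime('%Y%m%d', '2023-07-14') || '%' || '''')"
).fetchone()[0]

# Fixed: the pattern is just the date prefix plus the % wildcard.
fixed = conn.execute(
    "SELECT count(*) FROM my_table WHERE batch_insert_Date LIKE "
    "(SELECT strftime('%Y%m%d', '2023-07-14') || '%')"
).fetchone()[0]

print(broken, fixed)  # 0 1
```

The broken variant builds the pattern `'20230714%'` (quotes included), which matches nothing; dropping the quotes yields `20230714%`, which matches the current day's rows, mirroring the fix in the answer above.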


huangapple
  • Posted on 2023-07-14 01:09:22
  • Please keep this link when reposting: https://go.coder-hub.com/76681809.html