Delete rows from Azure Sql table using Azure Databricks with Pyspark
Question
Below is the snippet I am using:
azuresqlOptions = {"driver": jdbcDriver, "url": jdbcUrl, "user": username, "port": jdbcPort, "password": password}
query = "(DELETE cone.address WHERE Address_ID=756 ) ad1"
df1 = spark.read.format("jdbc").option("header", "true").options(**azuresqlOptions).option("dbtable", query).load()
display(df1)
I am getting the below error:
com.microsoft.sqlserver.jdbc.SQLServerException: A nested INSERT, UPDATE, DELETE, or MERGE statement must have an OUTPUT clause.
Can anyone help me with this?
Answer 1
Score: 0
When deleting rows from a SQL table in Azure Databricks with sample data, using the code below:
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
Host = "<serverName>.database.windows.net"
Port = 1433
Database = "<dbName>"
Username = "<userName>"
Password = "<password>"
Driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
table = "<table>"
Url = f"jdbc:sqlserver://{Host}:{Port};databaseName={Database}"
connectionProperties = {
"user": Username,
"password": Password,
"driver": Driver
}
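# Passing the DELETE as the "table" makes Spark wrap it in a SELECT subquery, which SQL Server rejects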
query = f"(Delete {table} where Id = 1) AS subquery"
df = spark.read.jdbc(url=Url, table=query, properties=connectionProperties)
df.show()
I got the same error, because the Spark JDBC reader wraps whatever is passed as the table in a SELECT subquery, so the DELETE ends up as a nested DML statement without an OUTPUT clause.
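As a quick sanity check, the same condition is accepted when it is written as a SELECT instead. A minimal sketch reusing the variables above (the Id column is only an assumed example):

check_query = f"(SELECT * FROM {table} WHERE Id = 1) AS subquery"
check_df = spark.read.jdbc(url=Url, table=check_query, properties=connectionProperties)
check_df.show()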
To delete the row, I tried the following procedure instead: I used the filter function to drop the row from the DataFrame:
from pyspark.sql.functions import *
df2 = df.filter(col("<condition>"))  # keep only the rows that should remain, e.g. col("Id") != 1
df2.show()
I then wrote the DataFrame back to the SQL table using the code below:
df2.write.format("jdbc").mode("overwrite").option("url", "<Url>").option("dbtable", "<table>").option("user", "<Username>").option("password", "<Password>").save()
The table was updated successfully.
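Putting the steps together, a minimal end-to-end sketch of this workaround: read the table itself (not the DELETE query), drop the row with Id = 1, and write the result back. The Id column is an assumed example, and overwrite rewrites the whole table, so this only suits small tables:

from pyspark.sql.functions import col

full_df = spark.read.jdbc(url=Url, table=table, properties=connectionProperties)
remaining_df = full_df.filter(col("Id") != 1)  # every row except the one to delete
(remaining_df.write
    .format("jdbc")
    .mode("overwrite")
    .option("truncate", "true")  # keep the existing table definition instead of dropping it
    .option("url", Url)
    .option("dbtable", table)
    .option("user", Username)
    .option("password", Password)
    .save())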
Update:
You can use the code below to execute the delete query directly:
import pyodbc

conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};'
                      'SERVER=<serverName>.database.windows.net;'
                      'DATABASE=<db>;UID=<userName>;'
                      'PWD=<password>')
conn.execute('DELETE <tableName> WHERE <condition>')
conn.commit()  # pyodbc does not autocommit by default, so commit the delete
conn.close()
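A slightly safer variant of the same idea uses a cursor and a parameterised query, so the value is sent separately from the SQL text (server, database, credentials and table name are placeholders; the Id column is an assumed example):

import pyodbc

conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};'
                      'SERVER=<serverName>.database.windows.net;'
                      'DATABASE=<db>;UID=<userName>;PWD=<password>')
cursor = conn.cursor()
cursor.execute('DELETE FROM <tableName> WHERE Id = ?', (1,))  # parameterised value
conn.commit()
conn.close()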