How to export SQL files in Synapse to a sandbox environment, or directly access these SQL files via notebooks?

Question

Is it possible to export published SQL files in your Synapse workspace to your sandbox environment via code and without the use of pipelines?
If not, is it somehow possible to access your published SQL files via a notebook with e.g. pySpark, Scala, SparkSQL, C#, etc.?
Answer 1
Score: 1
> If not is it somehow possible to access your published SQL files via a notebook with e.g. pySpark, Scala, SparkSQL, C# etc?

You can get the list of SQL scripts from the Synapse workspace using the following REST API:
https://<synapse_workspace_name>.dev.azuresynapse.net/sqlScripts?api-version=2020-12-01
Use this REST API in a Synapse notebook (PySpark).
First, create a service principal and a client secret, then grant that service principal access to Synapse by following the steps below.
Here, these are my SQL scripts in the workspace, which is attached to a dedicated SQL pool named rakeshdedipool.
Generate a bearer token for the service principal. I followed the code in this SO answer by @Saideep Arikontham, which uses msal to generate the bearer token.
If you prefer, you can also use Postman to obtain the bearer token.
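If msal or Postman is not available in your environment, the underlying OAuth2 client-credentials flow can also be sketched with the standard library alone. This is a minimal, hedged sketch: tenant_id, client_id, and client_secret are placeholders for your own service principal, and the function names are illustrative, not part of any Azure SDK.

```python
import json
import urllib.parse
import urllib.request

def build_token_request(tenant_id, client_id, client_secret):
    """Build the Microsoft identity platform token-endpoint URL and the
    form-encoded body for the client-credentials grant."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # token must be scoped to the Synapse development endpoint
        "scope": "https://dev.azuresynapse.net/.default",
    }).encode()
    return url, body

def get_bearer_token(tenant_id, client_id, client_secret):
    """POST the request and return the access token from the JSON response."""
    url, body = build_token_request(tenant_id, client_id, client_secret)
    with urllib.request.urlopen(urllib.request.Request(url, data=body)) as resp:
        return json.load(resp)["access_token"]
```

The token returned by get_bearer_token can then be used in place of result['access_token'] in the snippet below.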
Now, use the bearer token in PySpark to list the SQL scripts:

import requests

# API endpoint for listing the SQL scripts in the workspace
URL = "https://rakeshsynapse.dev.azuresynapse.net/sqlScripts?api-version=2020-12-01"

# send a GET request with the bearer token and save the response
# (`result` holds the token response from the msal step above)
r = requests.get(url=URL, headers={"Authorization": f"Bearer {result['access_token']}"})
print(r.json())
You can then extract the scripts from the response and use them as needed in the notebook:

for i in r.json()['value']:
    print("script : ", i['properties']['content']['query'])
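To get the scripts into the notebook's sandbox rather than only printing them, one option is to write each query out as a local .sql file. A minimal sketch, assuming the response shape shown above; the helper name and output folder are illustrative, not part of any Synapse API:

```python
import os

def save_sql_scripts(payload: dict, out_dir: str) -> list:
    """Write each SQL script in the sqlScripts REST response to
    <out_dir>/<script name>.sql and return the paths written."""
    os.makedirs(out_dir, exist_ok=True)
    written = []
    for item in payload.get("value", []):
        name = item["name"]
        query = item["properties"]["content"]["query"]
        path = os.path.join(out_dir, f"{name}.sql")
        with open(path, "w", encoding="utf-8") as f:
            f.write(query)
        written.append(path)
    return written
```

For example, save_sql_scripts(r.json(), "/tmp/sql_scripts") would drop one .sql file per published script into the notebook's local storage.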
(OR) Use the PowerShell cmdlet Export-AzSynapseSqlScript to export the script files to a storage account; you can also try the Python SDK.