Azure Mapping Data Flow : Not able to use blob storage dataset as a source
I added an Azure Blob dataset as a source to an Azure mapping data flow, but I am not able to view the preview because it shows the error below:
> Dataset is using 'AzureStorage' linked service type, which is not supported in data flow.
Given below is the dataset JSON:

```json
{
    "name": "PIT_Input",
    "properties": {
        "linkedServiceName": {
            "referenceName": "data_staging",
            "type": "LinkedServiceReference"
        },
        "annotations": [],
        "type": "DelimitedText",
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "dataflowpoc"
            },
            "columnDelimiter": ",",
            "escapeChar": "\\",
            "firstRowAsHeader": true,
            "quoteChar": "\""
        },
        "schema": []
    }
}
```
data_staging is a linked service of type Azure Storage.
The documentation states that Azure Blob datasets can be used as a source.
Please tell me what I'm doing wrong here.
Answer 1 (Score: 1)
According to your dataset JSON, you only chose the container dataflowpoc and didn't specify a file.
In the dataset's Preview data, you can only preview file data whose type is "DelimitedText", and we cannot preview all the data if the files in the container have different schemas:
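For reference, the dataset's location block can also point at a single file via folder and file name properties; a sketch is below (the folder and file names are illustrative, not from the question):

```json
"location": {
    "type": "AzureBlobStorageLocation",
    "container": "dataflowpoc",
    "folderPath": "input",
    "fileName": "data.csv"
}
```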
Dataset JSON:

```json
{
    "name": "DelimitedText1",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureBlobStorage1",
            "type": "LinkedServiceReference"
        },
        "annotations": [],
        "type": "DelimitedText",
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "containerleon"
            },
            "columnDelimiter": ",",
            "escapeChar": "\\",
            "firstRowAsHeader": false,
            "quoteChar": "\""
        },
        "schema": []
    },
    "type": "Microsoft.DataFactory/factories/datasets"
}
```
But in the Data Flow's Data Preview, we can see all the data in the files.
I think your error just happened by accident; please refresh the Data Factory and try again.
Update:
The error is solved: "I changed the type of the Linked Service from Azure Storage to Azure Blob Storage, and it worked."
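To illustrate the fix: mapping data flows accept the "AzureBlobStorage" linked service type but not the legacy "AzureStorage" type that the original data_staging service used. A minimal sketch of a corrected linked-service definition is below (the connection string is a placeholder, not a real value):

```json
{
    "name": "data_staging",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
        }
    }
}
```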
Hope this helps.