Error running a U-SQL activity in an Azure Data Factory pipeline

I am getting the following error when running a U-SQL activity in a pipeline in ADF:

Activity error:

{"errorId":"E_CSC_USER_SyntaxERROR","severity":"Error","component":"CSC","source":"USER","message":"Syntax error.
     Final statement did not end with a semicolon","details":"at token 'txt',line 3\r\nnear the ###:\r\n**************\r\nDECLARE @in string = \"/demo/SearchLog.txt\";\nDECLARE @out string = \"/scripts/Result.txt\";\nSearchLogProcessing.txt ### \n","description":"Invalid Syntax found in the script.","resolution":"Correct the script Syntax,using expected token(s) as a guide.","helpLink":"","filePath":"","lineNumber":3,"startOffset":109,"endOffset":112}].

Here is the code for the output dataset, the pipeline, and the U-SQL script that I am trying to execute in the pipeline.

OutputDataset:

{
    "name": "OutputDataLakeTable",
    "properties": {
        "published": false,
        "type": "AzureDataLakeStore",
        "linkedServiceName": "LinkedServiceDestination",
        "typeProperties": {
            "folderPath": "scripts/"
        },
        "availability": {
            "frequency": "Hour",
            "interval": 1
        }
    }
}
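The pipeline also references an InputDataLakeTable dataset that is not shown in the question. For context, a minimal sketch of what such a dataset could look like, mirroring the output dataset above (the folderPath, fileName, and linked service name here are assumptions, not values from the question):

```json
{
    "name": "InputDataLakeTable",
    "properties": {
        "published": false,
        "type": "AzureDataLakeStore",
        "linkedServiceName": "AzureDataLakeStoreLinkedService",
        "typeProperties": {
            "folderPath": "demo/",
            "fileName": "SearchLog.txt"
        },
        "availability": {
            "frequency": "Hour",
            "interval": 1
        },
        "external": true
    }
}
```

In ADF v1, "external": true marks a dataset that is produced outside the data factory rather than by another pipeline, which is required for the first input in a chain.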

Pipeline:

{
    "name": "ComputeEventsByRegionPipeline",
    "properties": {
        "description": "This is a pipeline to compute events for en-gb locale and date less than 2012/02/19.",
        "activities": [
            {
                "type": "DataLakeAnalyticsU-sql",
                "typeProperties": {
                    "script": "SearchLogProcessing.txt",
                    "scriptPath": "scripts\\",
                    "degreeOfParallelism": 3,
                    "priority": 100,
                    "parameters": {
                        "in": "/demo/SearchLog.txt",
                        "out": "/scripts/Result.txt"
                    }
                },
                "inputs": [
                    { "name": "InputDataLakeTable" }
                ],
                "outputs": [
                    { "name": "OutputDataLakeTable" }
                ],
                "policy": {
                    "timeout": "06:00:00",
                    "concurrency": 1,
                    "executionPriorityOrder": "NewestFirst",
                    "retry": 1
                },
                "scheduler": {
                    "frequency": "Minute",
                    "interval": 15
                },
                "name": "CopybyU-sql",
                "linkedServiceName": "AzureDataLakeAnalyticsLinkedService"
            }
        ],
        "start": "2017-01-03T12:01:05.53Z",
        "end": "2017-01-03T13:01:05.53Z",
        "isPaused": false,
        "hubName": "denojaidbfactory_hub",
        "pipelineMode": "Scheduled"
    }
}

Here is my U-SQL script, which I am trying to execute using the "DataLakeAnalyticsU-sql" activity type.

@searchlog =
    EXTRACT UserId          int,
            Start           DateTime,
            Region          string,
            Query           string,
            Duration        int?,
            Urls            string,
            ClickedUrls     string
    FROM @in
    USING Extractors.Text(delimiter:'|');

@rs1 =
    SELECT Start, Region, Duration
    FROM @searchlog
    WHERE Region == "kota";

OUTPUT @rs1
    TO @out
    USING Outputters.Text(delimiter:'|');

Please suggest how I can resolve this issue.

Solution

Your script is missing the scriptLinkedService property. You also (currently) need to place the U-SQL script in Azure Blob Storage to run it successfully, so you also need an AzureStorage linked service, for example:
{
    "name": "StorageLinkedService",
    "properties": {
        "description": "",
        "type": "AzureStorage",
        "typeProperties": {
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=myAzureBlobStorageAccount;AccountKey=**********"
        }
    }
}

Create this linked service, replacing the Blob Storage account name myAzureBlobStorageAccount with your relevant Blob Storage account, then place the U-SQL script (SearchLogProcessing.txt) in a container there and try again. In my example pipeline below, I have a container named adlascripts in my Blob Storage, and the script is in there:

As Alexandre said, make sure the scriptPath is complete. The start of the pipeline:

{
    "name": "ComputeEventsByRegionPipeline",
    ...
    "typeProperties": {
        "scriptPath": "adlascripts\\SearchLogProcessing.txt",
        "scriptLinkedService": "StorageLinkedService",
        "parameters": {
            "in": "/input/SearchLog.tsv",
            "out": "/output/Result.tsv"
        }
    },
    ...

The input and output .tsv files can live in the Data Lake, accessed via the AzureDataLakeStoreLinkedService linked service.
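That AzureDataLakeStoreLinkedService is not shown in the question either. In ADF v1, a Data Lake Store linked service using service principal authentication has roughly this shape (every placeholder value below is an assumption to be filled in, not something from the question):

```json
{
    "name": "AzureDataLakeStoreLinkedService",
    "properties": {
        "type": "AzureDataLakeStore",
        "typeProperties": {
            "dataLakeStoreUri": "https://<accountname>.azuredatalakestore.net/webhdfs/v1",
            "servicePrincipalId": "<service principal application id>",
            "servicePrincipalKey": "<service principal key>",
            "tenant": "<tenant id>",
            "subscriptionId": "<subscription id>",
            "resourceGroupName": "<resource group name>"
        }
    }
}
```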

I can see you are trying to follow this demo: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-usql-activity#script-definition. It is not the most intuitive demo and there appear to be some issues with it, such as: where is the definition of StorageLinkedService? Where is SearchLogProcessing.txt? OK, I found it by googling, but there should be a link on the web page. I got it working, but it felt a bit like Harry Potter in the Half-Blood Prince.
