
A DELETE statement in the preparation step fails for a ClickHouse data source in Dataphin


Problem description

In Dataphin, a DELETE statement configured as the preparation statement (preSql) of a ClickHouse data source fails during execution. The task log is as follows:

2023-01-04 17:20:29.324 ---------------- voldemort task initiating ----------------

2023-01-04 17:20:29.324 Tenant id: 300008401
2023-01-04 17:20:29.324 DataphinEnvironment: PROD
2023-01-04 17:20:29.324 Node id: n_4288633799794098176
2023-01-04 17:20:29.324 Task id: t_4288633799794098176_20230103_4348265228811960320
2023-01-04 17:20:29.324 Task Type: NORMAL
2023-01-04 17:20:29.324 Taskrun id: tr_4288633799794098176_20230103_4348294331510358016
2023-01-04 17:20:29.324 ExecutableTaskrun id: 0000001089091
2023-01-04 17:20:29.324 Taskrun priority is MIDDLE
2023-01-04 17:20:29.324 Taskrun was due to execute at 2023-01-04 00:00:00
2023-01-04 17:20:29.324 Taskrun was ready to execute at 2023-01-04 17:20:21
2023-01-04 17:20:29.324 Taskrun delay time is: 8324ms
2023-01-04 17:20:29.324 Current Taskrun has been dispatched to agent: 
2023-01-04 17:20:29.324 Begin to execute DLINK task.
2023-01-04 17:20:29.324 Current task status: RUNNING
2023-01-04 17:20:29.324 --------------------------------
2023-01-04 17:20:29.324 List of task parameters: 
2023-01-04 17:20:29.324 bizdate=20230103
2023-01-04 17:20:29.324 nodeid=n_4288633799794098176
2023-01-04 17:20:29.324 taskid=t_4288633799794098176_20230103_4348265228811960320
2023-01-04 17:20:29.324 taskrunid=tr_4288633799794098176_20230103_4348294331510358016
2023-01-04 17:20:29.324 --------------------------------
2023-01-04 17:20:29.324 =================================================================

Dlink (1.0.0), From Alibaba !
Copyright (C) 2019-2029, Alibaba Group. All Rights Reserved.


Dlink command: java -server -Xms2048m -Xmx2048m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/dataphin/dlink/log -Xmx1024m -Xms1024m  -Dloglevel=info -Dfile.encoding=UTF-8 -Dlogback.statusListenerClass=ch.qos.logback.core.status.NopStatusListener -Djava.security.egd=file:///dev/urandom -Ddlink.home=/dataphin/dlink -Ddatax.home=/dataphin/dlink -Dlogback.configurationFile=/dataphin/dlink/conf/logback.xml -classpath /dataphin/dlink/lib/*:.  -Dlog.file.name=0000001089091 com.alibaba.dt.dlink.core.Engine --job /mnt/mesos/sandbox/1089091.json --jobid 0000001089091 --jobmark -1  --mode standalone --startcode 0
2023-01-04 17:20:32.621 [main] INFO  Engine - 
{
 "job":{
  "order":{
   "hops":[
    {
     "from":"Hive_2",
     "to":"ClickHouse_2"
    }
   ]
  },
  "setting":{
   "engine":{
    "name":"dlink"
   },
   "errorLimit":{
    "record":0
   },
   "jvmOption":"",
   "requiredResource":{
    "cpus":0.5,
    "memoryInMb":1024
   },
   "speed":{
    "concurrent":3
   }
  },
  "stepErrorHandlers":[],
  "steps":[
   {
    "category":"writer",
    "columnMapping":[
     {
      "dstColName":"Plant",
      "sourceColName":"plant"
     },
     {
      "dstColName":"CompanyCode",
      "sourceColName":"companycode"
     },
     {
      "dstColName":"ProfitCenter",
      "sourceColName":"profitcenter"
     },
     {
      "dstColName":"CostCenter",
      "sourceColName":"costcenter"
     },
     {
      "dstColName":"MaterialNo",
      "sourceColName":"materialno"
     },
     {
      "dstColName":"Qty",
      "sourceColName":"qty"
     },
     {
      "dstColName":"Amount",
      "sourceColName":"amount"
     },
     {
      "dstColName":"MoveType",
      "sourceColName":"movetype"
     },
     {
      "dstColName":"PostingDate",
      "sourceColName":"postingdate"
     },
     {
      "dstColName":"DocDate",
      "sourceColName":"docdate"
     },
     {
      "dstColName":"RecordeDate",
      "sourceColName":"recordedate"
     },
     {
      "dstColName":"YearMonth",
      "sourceColName":"yearmonth"
     },
     {
      "dstColName":"InventoryLocation",
      "sourceColName":"inventorylocation"
     },
     {
      "dstColName":"FiscalYear",
      "sourceColName":"fiscalyear"
     },
     {
      "dstColName":"SpecialStock",
      "sourceColName":"specialstock"
     },
     {
      "dstColName":"UPDATETIME",
      "sourceColName":"updatetime"
     },
     {
      "dstColName":"ds",
      "sourceColName":"ds"
     }
    ],
    "distribute":true,
    "name":"ClickHouse_2",
    "parameter":{
     "batchByteSize":67108864,
     "batchSize":65536,
     "column":[
      "Plant",
      "CompanyCode",
      "ProfitCenter",
      "CostCenter",
      "MaterialNo",
      "Qty",
      "Amount",
      "MoveType",
      "PostingDate",
      "DocDate",
      "RecordeDate",
      "YearMonth",
      "InventoryLocation",
      "FiscalYear",
      "SpecialStock",
      "UPDATETIME",
      "ds"
     ],
     "connection":[
      {
       "jdbcUrl":"jdbc:clickhouse://10.XX.65:8123/dataphin",
       "table":[
        "ads_warehouse_materialdoc_1d"
       ]
      }
     ],
     "dsId":"6989155252931475136",
     "dsName":"ClickHouse",
     "dsType":"CLICKHOUSE",
     "guid":"ads_warehouse_materialdoc_1d",
     "password":"*******",
     "postSql":[],
     "preSql":[
      "delete from dataphin.ads_warehouse_materialdoc_1d where ds='20230103'"
     ],
     "username":"default",
     "writeMode":"insert"
    },
    "stepType":"clickhouse"
   },
   {
    "category":"reader",
    "distribute":true,
    "name":"Hive_2",
    "parameter":{
     "column":[
      {
       "index":0,
       "name":"plant",
       "type":"String"
      },
      {
       "index":1,
       "name":"companycode",
       "type":"String"
      },
      {
       "index":2,
       "name":"profitcenter",
       "type":"String"
      },
      {
       "index":3,
       "name":"costcenter",
       "type":"String"
      },
      {
       "index":4,
       "name":"materialno",
       "type":"String"
      },
      {
       "index":5,
       "name":"qty",
       "type":"Double"
      },
      {
       "index":6,
       "name":"amount",
       "type":"Double"
      },
      {
       "index":7,
       "name":"movetype",
       "type":"String"
      },
      {
       "index":8,
       "name":"postingdate",
       "type":"String"
      },
      {
       "index":9,
       "name":"docdate",
       "type":"String"
      },
      {
       "index":10,
       "name":"recordedate",
       "type":"String"
      },
      {
       "index":11,
       "name":"yearmonth",
       "type":"String"
      },
      {
       "index":12,
       "name":"inventorylocation",
       "type":"String"
      },
      {
       "index":13,
       "name":"fiscalyear",
       "type":"String"
      },
      {
       "index":14,
       "name":"specialstock",
       "type":"String"
      },
      {
       "index":15,
       "name":"updatetime",
       "type":"String"
      },
      {
       "index":16,
       "name":"ds",
       "type":"String"
      }
     ],
     "connection":[
      {
       "jdbcUrl":[
        "jdbc:hive2://cdhtn01.XX.com:10000/HDB_ADS_DSV_DC;principal=hive/cdhtn01.XX.com@DESAY.COM"
       ],
       "table":"ads_warehouse_materialdoc_1d"
      }
     ],
     "defaultFS":"hdfs://10.XX.77:8020",
     "dsId":"6984797310576834240",
     "dsName":"HIVE_ADS_DSV_DC",
     "dsType":"HIVE",
     "fieldDelimiter":"\u0001",
     "fileType":"text",
     "hadoopConfig":{
      "dfs.data.transfer.protection":"integrity"
     },
     "hadoopNameNodes":"[{\"host\":\"10.XX.77\",\"webUiPort\":\"50070\",\"ipcPort\":\"8020\"},{\"host\":\"10.XX.76\",\"webUiPort\":\"50070\",\"ipcPort\":\"8020\"}]",
     "hadoopSitePaths":"[\"/mnt/mesos/sandbox/resources/0000001089091/resource.fs_f92b663a-48f3-4b4c-98eb-0e31625b46e3\",\"/mnt/mesos/sandbox/resources/0000001089091/resource.fs_5e61771d-0b42-41d5-b44e-a9b448b549de\"]",
     "haveKerberos":"true",
     "kdcAddress":"cdhtn02.XX.com",
     "kerberosConfigFilePath":"/mnt/mesos/sandbox/reader_krb5_6464063891574463.conf",
     "kerberosKeytabFilePath":"/mnt/mesos/sandbox/resources/0000001089091/resource.fs_68c64c16-e6bd-4c14-8e61-64cca3d71316",
     "kerberosPrincipal":"XX@DESAY.COM",
     "nullFormat":"\\N",
     "partition":"ds=20230103",
     "path":[
      "/user/hive/warehouse/hdb_ads_dsv_dc.db/ads_warehouse_materialdoc_1d/ds=20230103"
     ],
     "providedKerberosConfigFilePath":"fs_null",
     "useKerberosConf":"false"
    },
    "stepType":"hdfs"
   }
  ]
 }
}

2023-01-04 17:20:32.630 [main] INFO  VMInfo - VMInfo# operatingSystem class => sun.management.OperatingSystemImpl
2023-01-04 17:20:32.635 [main] INFO  Engine - the machine info  => 

 osInfo: "Alibaba" 1.8 25.152-b211
 jvmInfo: Linux amd64 3.10.0-1160.el7.x86_64
 cpu num: 24

 totalPhysicalMemory: 94.26G
 freePhysicalMemory: 51.33G
 maxFileDescriptorCount: 1048576
 currentOpenFileDescriptorCount: 418

 GC Names [ParNew, ConcurrentMarkSweep]

 MEMORY_NAME                    | allocation_size                | init_size                      
 Par Survivor Space             | 34.13MB                        | 34.13MB                        
 Code Cache                     | 240.00MB                       | 2.44MB                         
 Compressed Class Space         | 1,024.00MB                     | 0.00MB                         
 Metaspace                      | -0.00MB                        | 0.00MB                         
 Par Eden Space                 | 273.06MB                       | 273.06MB                       
 CMS Old Gen                    | 682.69MB                       | 682.69MB                       


2023-01-04 17:20:32.750 [main] INFO  ConcurrentFileSystemManager - Using "/tmp/vfs_cache" as temporary files store.
2023/01/04 17:20:32 - General - Plugin class com.alibaba.dt.dlink.core.trans.adaptor.ReaderStepMetaAdaptor registered for plugin type 'Step'
2023/01/04 17:20:32 - General - Plugin class com.alibaba.dt.dlink.core.trans.adaptor.WriterStepMetaAdaptor registered for plugin type 'Step'
2023/01/04 17:20:32 - General - Plugin class com.alibaba.dt.dlink.core.trans.adaptor.TransStepMetaAdaptor registered for plugin type 'Step'
2023/01/04 17:20:32 - General - Plugin class com.alibaba.dt.dlink.core.trans.adaptor.FilterStepMetaAdaptor registered for plugin type 'Step'
2023/01/04 17:20:32 - General - Plugin class com.alibaba.dt.dlink.core.trans.adaptor.streamperf.StreamPerfReaderStepMeta registered for plugin type 'Step'
2023/01/04 17:20:32 - General - Plugin class com.alibaba.dt.dlink.core.trans.adaptor.streamperf.StreamPerfWriterStepMeta registered for plugin type 'Step'
2023-01-04 17:20:33.143 [main] INFO  ColumnCast - writerTimeZone is set to: [GMT+8]
2023-01-04 17:20:33.153 [main] INFO  CalendarFactory - set CalendarFactory's timeZone to GMT+08:00
2023-01-04 17:20:33.154 [main] INFO  Engine - Dlink 开始运行...
2023-01-04 17:20:33.190 [main] INFO  DlinkTrans - Dlink Trans begin init ...
2023-01-04 17:20:33.192 [main] INFO  DlinkTrans - begin start dlink ...
2023-01-04 17:20:33.192 [main] INFO  DlinkTrans - dlink starts to do init ...
2023-01-04 17:20:33.193 [job-1089091] INFO  DlinkTransRunner - Dlink Reader.Job [Hive_2] do init work .
2023-01-04 17:20:33.193 [job-1089091] INFO  HdfsReader$Job - init() begin...
2023-01-04 17:20:33.457 [job-1089091] INFO  BaseDfsUtil - start to login for user XX@DESAY.COM using keytab file /mnt/mesos/sandbox/resources/0000001089091/resource.fs_68c64c16-e6bd-4c14-8e61-64cca3d71316
2023-01-04 17:20:33.467 [job-1089091] INFO  BaseDfsUtil -  kerberos login with kerberosAuthInfo =com.alibaba.dt.kerberos.support.common.KerberosAuthInfo@e19bb76[keytabFile=com.alibaba.dt.kerberos.support.common.KeytabFile@512535ff[keytabFilePath=/mnt/mesos/sandbox/resources/0000001089091/resource.fs_68c64c16-e6bd-4c14-8e61-64cca3d71316],principal=dsv_dc@DESAY.COM,kdcAddress=cdhtn02.desay.com,providedKerberosConfigFile=com.alibaba.dt.kerberos.support.common.ProvidedKerberosConfigFile@71529963[localPath=/mnt/mesos/sandbox/reader_krb5_6464063891574463.conf,fileContent=[libdefaults]
kdc_timeout = 5000
default_realm = DESAY.COM
max_retries = 3

[realms]
DESAY.COM = {
  kdc = cdhtn02.desay.com
}
]]
log4j:WARN No appenders could be found for logger (org.apache.htrace.core.Tracer).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
2023-01-04 17:20:34.459 [job-1089091] WARN  NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2023-01-04 17:20:34.673 [job-1089091] INFO  BaseDfsUtil - Login successful for user dsv_dc@DESAY.COM using keytab file /mnt/mesos/sandbox/resources/0000001089091/resource.fs_68c64c16-e6bd-4c14-8e61-64cca3d71316
2023-01-04 17:20:34.674 [job-1089091] INFO  HdfsReader$Job - init() ok and end...
2023-01-04 17:20:34.675 [job-1089091] INFO  DlinkTransRunner - Dlink Writer.Job [ClickHouse_2] do init work .
2023-01-04 17:20:34.700 [job-1089091] INFO  DBUtil - start to open connection:jdbc:clickhouse://10.XX.65:8123/dataphin
obproxy Druid LogFactory, userDefinedLogType=null, logInfo=public com.alipay.oceanbase.obproxy.druid.support.logging.Log4j2Impl(java.lang.String)
2023-01-04 17:20:35.104 [job-1089091] INFO  ClickHouseDriver - Driver registered
2023-01-04 17:20:35.293 [job-1089091] INFO  DBUtil - opened connection for jdbc:clickhouse://10.XX.65:8123/dataphin
2023-01-04 17:20:35.294 [job-1089091] INFO  DBUtil - start to get columns: select * from ads_warehouse_materialdoc_1d where 1=2
2023-01-04 17:20:35.302 [job-1089091] INFO  DBUtil - succeed to get columns for table:ads_warehouse_materialdoc_1d
2023-01-04 17:20:35.303 [job-1089091] INFO  OriginalConfPretreatmentUtil - table:[ads_warehouse_materialdoc_1d] all columns:[
Plant,CompanyCode,ProfitCenter,CostCenter,MaterialNo,Qty,Amount,MoveType,PostingDate,DocDate,RecordeDate,YearMonth,InventoryLocation,FiscalYear,SpecialStock,UPDATETIME,ds
].
2023-01-04 17:20:35.304 [job-1089091] INFO  DBUtil - start to open connection:jdbc:clickhouse://10.XX.65:8123/dataphin
2023-01-04 17:20:35.311 [job-1089091] INFO  DBUtil - opened connection for jdbc:clickhouse://10.XX.65:8123/dataphin
2023-01-04 17:20:35.311 [job-1089091] INFO  DBUtil - getColumnMetaData sql: select /*+read_consistency(weak) query_timeout(100000000)*/ Plant,CompanyCode,ProfitCenter,CostCenter,MaterialNo,Qty,Amount,MoveType,PostingDate,DocDate,RecordeDate,YearMonth,InventoryLocation,FiscalYear,SpecialStock,UPDATETIME,ds from ads_warehouse_materialdoc_1d where 1=2
2023-01-04 17:20:35.329 [job-1089091] INFO  OriginalConfPretreatmentUtil - Write data [
insert INTO %s (Plant,CompanyCode,ProfitCenter,CostCenter,MaterialNo,Qty,Amount,MoveType,PostingDate,DocDate,RecordeDate,YearMonth,InventoryLocation,FiscalYear,SpecialStock,UPDATETIME,ds) VALUES(?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)
], which jdbcUrl like:[jdbc:clickhouse://10.XX.65:8123/dataphin]
2023-01-04 17:20:35.330 [job-1089091] INFO  DlinkTrans - dlink starts to do prepare ...
2023-01-04 17:20:35.330 [job-1089091] INFO  DlinkTransRunner - Dlink Reader.Job [Hive_2] do prepare work .
2023-01-04 17:20:35.330 [job-1089091] INFO  HdfsReader$Job - prepare(), start to getAllFiles...
2023-01-04 17:20:35.330 [job-1089091] INFO  HdfsHelper - get HDFS all files in path = [/user/hive/warehouse/hdb_ads_dsv_dc.db/ads_warehouse_materialdoc_1d/ds=20230103]
2023-01-04 17:20:35.493 [job-1089091] INFO  HdfsHelper - [hdfs://nameservice1/user/hive/warehouse/hdb_ads_dsv_dc.db/ads_warehouse_materialdoc_1d/ds=20230103/000000_0]是[text]类型的文件, 将该文件加入source files列表
2023-01-04 17:20:35.495 [job-1089091] INFO  HdfsReader$Job - 您即将读取的文件数为: [1], 列表为: [hdfs://nameservice1/user/hive/warehouse/hdb_ads_dsv_dc.db/ads_warehouse_materialdoc_1d/ds=20230103/000000_0]
2023-01-04 17:20:35.495 [job-1089091] INFO  DlinkTransRunner - Dlink Writer.Job [ClickHouse_2] do prepare work .
2023-01-04 17:20:35.496 [job-1089091] INFO  DBUtil - start to open connection:jdbc:clickhouse://10.XX.65:8123/dataphin
2023-01-04 17:20:35.502 [job-1089091] INFO  DBUtil - opened connection for jdbc:clickhouse://10.XX.65:8123/dataphin
2023-01-04 17:20:35.502 [job-1089091] INFO  CommonRdbmsWriter$Job - Begin to execute preSqls:[delete from dataphin.ads_warehouse_materialdoc_1d where ds='20230103']. context info:jdbc:clickhouse://10.133.35.65:8123/dataphin. in transaction:false.
2023-01-04 17:20:35.515 [job-1089091] ERROR DlinkTrans - Exception when job run
com.alibaba.dt.pipeline.plugin.center.exception.DataXException: Code:[DBUtilErrorCode-06], Description:[执行数据库 Sql 失败, 请检查您的配置的 column/table/where/querySql或者向 DBA 寻求帮助.].  - 执行的SQL为: delete from dataphin.ads_warehouse_materialdoc_1d where ds='20230103' 具体错误信息为:ru.yandex.clickhouse.except.ClickHouseException: ClickHouse exception, code: 62, host: 10.133.35.65, port: 8123; Code: 62, e.displayText() = DB::Exception: Syntax error: failed at position 1 ('delete'): delete from dataphin.ads_warehouse_materialdoc_1d where ds='20230103'. Expected one of: ALTER query, Query with output, ALTER PROFILE, RENAME DATABASE, SHOW PRIVILEGES query, TRUNCATE, KILL, KILL QUERY query, SELECT query, possibly with UNION, list of union elements, ALTER ROLE, SELECT subquery, DESCRIBE query, SELECT query, subquery, possibly with UNION, SHOW GRANTS, SHOW CREATE, WATCH, CREATE SETTINGS PROFILE or ALTER SETTINGS PROFILE query, SHOW PROCESSLIST query, ALTER POLICY, ALTER USER, CREATE VIEW query, CHECK TABLE, SET ROLE, SELECT query, SELECT, REVOKE, CREATE USER, CREATE DICTIONARY, CREATE PROFILE, SET ROLE DEFAULT, EXPLAIN, ALTER SETTINGS PROFILE, SYSTEM, ALTER LIVE VIEW, RENAME TABLE, DROP query, SHOW ACCESS, OPTIMIZE query, USE, DROP access entity query, RENAME DICTIONARY, DETACH, SET, SHOW, DESC, OPTIMIZE TABLE, CREATE ROW POLICY, SET DEFAULT ROLE, EXCHANGE DICTIONARIES, CREATE POLICY, ALTER ROW POLICY, INSERT INTO, INSERT query, SHOW [TEMPORARY] TABLES|DATABASES|CLUSTERS|CLUSTER 'name' [[NOT] [I]LIKE 'str'] [LIMIT expr], GRANT, RENAME query, SHOW GRANTS query, SHOW PRIVILEGES, EXISTS, DROP, SYSTEM query, CREATE LIVE VIEW query, CREATE ROW POLICY or ALTER ROW POLICY query, CREATE QUOTA or ALTER QUOTA query, SHOW PROCESSLIST, ALTER QUOTA, CREATE QUOTA, CREATE DATABASE query, SET query, Query, CREATE, WITH, CREATE ROLE or ALTER ROLE query, EXTERNAL DDL FROM, EXCHANGE TABLES, EXISTS or SHOW CREATE query, WATCH query, REPLACE, CREATE ROLE, CREATE SETTINGS PROFILE, SET ROLE or SET DEFAULT ROLE query, 
CREATE USER or ALTER USER query, EXTERNAL DDL query, SHOW ACCESS query, SHOW CREATE QUOTA query, USE query, ATTACH, DESCRIBE, ALTER TABLE, ShowAccessEntitiesQuery, GRANT or REVOKE query, CREATE TABLE or ATTACH TABLE query (version 21.6.6.51 (official build))

 at com.alibaba.dt.pipeline.plugin.center.exception.DataXException.asDataXException(DataXException.java:40) ~[plugin.center.base-v2.9.5.3-2.RELEASE.jar:na]
 at com.alibaba.datax.plugin.rdbms.util.RdbmsException.asQueryException(RdbmsException.java:97) ~[plugin-rdbms-util-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.datax.plugin.rdbms.writer.util.WriterUtil.executeSqlsAutoCommit(WriterUtil.java:127) ~[plugin-rdbms-util-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.datax.plugin.rdbms.writer.util.WriterUtil.executeSqls(WriterUtil.java:107) ~[plugin-rdbms-util-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.datax.plugin.rdbms.writer.CommonRdbmsWriter$Job.prepare(CommonRdbmsWriter.java:125) ~[plugin-rdbms-util-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.datax.plugin.writer.clickhousewriter.ClickhouseWriter$Job.prepare(ClickhouseWriter.java:63) ~[clickhousewriter-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.trans.DlinkTransRunner.prepareJobWriter(DlinkTransRunner.java:95) ~[dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.doPrepare(DlinkTrans.java:279) ~[dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.start(DlinkTrans.java:111) ~[dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.Engine.runTrans(Engine.java:89) [dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.Engine.entry(Engine.java:172) [dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.Engine.main(Engine.java:246) [dlink-engine-0.0.1-SNAPSHOT.jar:na]
2023-01-04 17:20:35.530 [job-1089091] INFO  DlinkTrans - dlink starts to do destroy ...
2023-01-04 17:20:35.530 [job-1089091] INFO  DlinkTransRunner - Dlink Reader.Job [Hive_2] do destroy work .
2023-01-04 17:20:35.633 [job-1089091] INFO  DlinkTransRunner - Dlink Writer.Job [ClickHouse_2] do destroy work .
2023-01-04 17:20:35.637 [job-1089091] INFO  DlinkTrans - 
  [total cpu info] => 
  averageCpu                     | maxDeltaCpu                    | minDeltaCpu                    
  15.89%                         | 15.89%                         | 15.89%
                        

  [total memory info] => 
   NAME                           | max_used_size                  | max_percent                    
   Par Survivor Space             | 26.38MB                        | 77.29%                         
   Code Cache                     | 10.37MB                        | 88.30%                         
   Compressed Class Space         | 4.41MB                         | 94.14%                         
   Metaspace                      | 38.22MB                        | 97.80%                         
   Par Eden Space                 | 89.43MB                        | 32.75%                         
   CMS Old Gen                    | 0.00MB                         | 0.00%                          

  [total gc info] => 
   NAME                 | totalGCCount       | maxDeltaGCCount    | minDeltaGCCount    | totalGCTime        | maxDeltaGCTime     | minDeltaGCTime     
   ParNew               | 2                  | 2                  | 2                  | 0.043s             | 0.043s             | 0.043s             
   ConcurrentMarkSweep  | 0                  | 0                  | 0                  | 0.003s             | 0.003s             | 0.003s             

2023-01-04 17:20:35.668 [job-1089091] ERROR Engine - 

经Dlink智能分析,该任务最可能的错误原因是:
com.alibaba.dt.dlink.core.exception.DlinkException: Code:[Framework-02], Description:[Dlink引擎运行过程出错,具体原因请参看Dlink运行结束时的错误诊断信息  .].  - com.alibaba.dt.pipeline.plugin.center.exception.DataXException: Code:[DBUtilErrorCode-06], Description:[执行数据库 Sql 失败, 请检查您的配置的 column/table/where/querySql或者向 DBA 寻求帮助.].  - 执行的SQL为: delete from dataphin.ads_warehouse_materialdoc_1d where ds='20230103' 具体错误信息为:ru.yandex.clickhouse.except.ClickHouseException: ClickHouse exception, code: 62, host: 10.133.35.65, port: 8123; Code: 62, e.displayText() = DB::Exception: Syntax error: failed at position 1 ('delete'): delete from dataphin.ads_warehouse_materialdoc_1d where ds='20230103'. Expected one of: ALTER query, Query with output, ALTER PROFILE, RENAME DATABASE, SHOW PRIVILEGES query, TRUNCATE, KILL, KILL QUERY query, SELECT query, possibly with UNION, list of union elements, ALTER ROLE, SELECT subquery, DESCRIBE query, SELECT query, subquery, possibly with UNION, SHOW GRANTS, SHOW CREATE, WATCH, CREATE SETTINGS PROFILE or ALTER SETTINGS PROFILE query, SHOW PROCESSLIST query, ALTER POLICY, ALTER USER, CREATE VIEW query, CHECK TABLE, SET ROLE, SELECT query, SELECT, REVOKE, CREATE USER, CREATE DICTIONARY, CREATE PROFILE, SET ROLE DEFAULT, EXPLAIN, ALTER SETTINGS PROFILE, SYSTEM, ALTER LIVE VIEW, RENAME TABLE, DROP query, SHOW ACCESS, OPTIMIZE query, USE, DROP access entity query, RENAME DICTIONARY, DETACH, SET, SHOW, DESC, OPTIMIZE TABLE, CREATE ROW POLICY, SET DEFAULT ROLE, EXCHANGE DICTIONARIES, CREATE POLICY, ALTER ROW POLICY, INSERT INTO, INSERT query, SHOW [TEMPORARY] TABLES|DATABASES|CLUSTERS|CLUSTER 'name' [[NOT] [I]LIKE 'str'] [LIMIT expr], GRANT, RENAME query, SHOW GRANTS query, SHOW PRIVILEGES, EXISTS, DROP, SYSTEM query, CREATE LIVE VIEW query, CREATE ROW POLICY or ALTER ROW POLICY query, CREATE QUOTA or ALTER QUOTA query, SHOW PROCESSLIST, ALTER QUOTA, CREATE QUOTA, CREATE DATABASE query, SET query, Query, CREATE, WITH, CREATE ROLE or ALTER ROLE query, EXTERNAL DDL FROM, EXCHANGE 
TABLES, EXISTS or SHOW CREATE query, WATCH query, REPLACE, CREATE ROLE, CREATE SETTINGS PROFILE, SET ROLE or SET DEFAULT ROLE query, CREATE USER or ALTER USER query, EXTERNAL DDL query, SHOW ACCESS query, SHOW CREATE QUOTA query, USE query, ATTACH, DESCRIBE, ALTER TABLE, ShowAccessEntitiesQuery, GRANT or REVOKE query, CREATE TABLE or ATTACH TABLE query (version 21.6.6.51 (official build))

 at com.alibaba.dt.pipeline.plugin.center.exception.DataXException.asDataXException(DataXException.java:40)
 at com.alibaba.datax.plugin.rdbms.util.RdbmsException.asQueryException(RdbmsException.java:97)
 at com.alibaba.datax.plugin.rdbms.writer.util.WriterUtil.executeSqlsAutoCommit(WriterUtil.java:127)
 at com.alibaba.datax.plugin.rdbms.writer.util.WriterUtil.executeSqls(WriterUtil.java:107)
 at com.alibaba.datax.plugin.rdbms.writer.CommonRdbmsWriter$Job.prepare(CommonRdbmsWriter.java:125)
 at com.alibaba.datax.plugin.writer.clickhousewriter.ClickhouseWriter$Job.prepare(ClickhouseWriter.java:63)
 at com.alibaba.dt.dlink.core.trans.DlinkTransRunner.prepareJobWriter(DlinkTransRunner.java:95)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.doPrepare(DlinkTrans.java:279)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.start(DlinkTrans.java:111)
 at com.alibaba.dt.dlink.core.Engine.runTrans(Engine.java:89)
 at com.alibaba.dt.dlink.core.Engine.entry(Engine.java:172)
 at com.alibaba.dt.dlink.core.Engine.main(Engine.java:246)
 - com.alibaba.dt.pipeline.plugin.center.exception.DataXException: Code:[DBUtilErrorCode-06], Description:[执行数据库 Sql 失败, 请检查您的配置的 column/table/where/querySql或者向 DBA 寻求帮助.].  - 执行的SQL为: delete from dataphin.ads_warehouse_materialdoc_1d where ds='20230103' 具体错误信息为:ru.yandex.clickhouse.except.ClickHouseException: ClickHouse exception, code: 62, host: 10.133.35.65, port: 8123; Code: 62, e.displayText() = DB::Exception: Syntax error: failed at position 1 ('delete'): delete from dataphin.ads_warehouse_materialdoc_1d where ds='20230103'. Expected one of: ALTER query, Query with output, ALTER PROFILE, RENAME DATABASE, SHOW PRIVILEGES query, TRUNCATE, KILL, KILL QUERY query, SELECT query, possibly with UNION, list of union elements, ALTER ROLE, SELECT subquery, DESCRIBE query, SELECT query, subquery, possibly with UNION, SHOW GRANTS, SHOW CREATE, WATCH, CREATE SETTINGS PROFILE or ALTER SETTINGS PROFILE query, SHOW PROCESSLIST query, ALTER POLICY, ALTER USER, CREATE VIEW query, CHECK TABLE, SET ROLE, SELECT query, SELECT, REVOKE, CREATE USER, CREATE DICTIONARY, CREATE PROFILE, SET ROLE DEFAULT, EXPLAIN, ALTER SETTINGS PROFILE, SYSTEM, ALTER LIVE VIEW, RENAME TABLE, DROP query, SHOW ACCESS, OPTIMIZE query, USE, DROP access entity query, RENAME DICTIONARY, DETACH, SET, SHOW, DESC, OPTIMIZE TABLE, CREATE ROW POLICY, SET DEFAULT ROLE, EXCHANGE DICTIONARIES, CREATE POLICY, ALTER ROW POLICY, INSERT INTO, INSERT query, SHOW [TEMPORARY] TABLES|DATABASES|CLUSTERS|CLUSTER 'name' [[NOT] [I]LIKE 'str'] [LIMIT expr], GRANT, RENAME query, SHOW GRANTS query, SHOW PRIVILEGES, EXISTS, DROP, SYSTEM query, CREATE LIVE VIEW query, CREATE ROW POLICY or ALTER ROW POLICY query, CREATE QUOTA or ALTER QUOTA query, SHOW PROCESSLIST, ALTER QUOTA, CREATE QUOTA, CREATE DATABASE query, SET query, Query, CREATE, WITH, CREATE ROLE or ALTER ROLE query, EXTERNAL DDL FROM, EXCHANGE TABLES, EXISTS or SHOW CREATE query, WATCH query, REPLACE, CREATE ROLE, CREATE SETTINGS PROFILE, SET ROLE or SET DEFAULT ROLE 
query, CREATE USER or ALTER USER query, EXTERNAL DDL query, SHOW ACCESS query, SHOW CREATE QUOTA query, USE query, ATTACH, DESCRIBE, ALTER TABLE, ShowAccessEntitiesQuery, GRANT or REVOKE query, CREATE TABLE or ATTACH TABLE query (version 21.6.6.51 (official build))

 at com.alibaba.dt.pipeline.plugin.center.exception.DataXException.asDataXException(DataXException.java:40)
 at com.alibaba.datax.plugin.rdbms.util.RdbmsException.asQueryException(RdbmsException.java:97)
 at com.alibaba.datax.plugin.rdbms.writer.util.WriterUtil.executeSqlsAutoCommit(WriterUtil.java:127)
 at com.alibaba.datax.plugin.rdbms.writer.util.WriterUtil.executeSqls(WriterUtil.java:107)
 at com.alibaba.datax.plugin.rdbms.writer.CommonRdbmsWriter$Job.prepare(CommonRdbmsWriter.java:125)
 at com.alibaba.datax.plugin.writer.clickhousewriter.ClickhouseWriter$Job.prepare(ClickhouseWriter.java:63)
 at com.alibaba.dt.dlink.core.trans.DlinkTransRunner.prepareJobWriter(DlinkTransRunner.java:95)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.doPrepare(DlinkTrans.java:279)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.start(DlinkTrans.java:111)
 at com.alibaba.dt.dlink.core.Engine.runTrans(Engine.java:89)
 at com.alibaba.dt.dlink.core.Engine.entry(Engine.java:172)
 at com.alibaba.dt.dlink.core.Engine.main(Engine.java:246)

 at com.alibaba.dt.dlink.core.exception.DlinkException.asDlinkException(DlinkException.java:48)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.start(DlinkTrans.java:134)
 at com.alibaba.dt.dlink.core.Engine.runTrans(Engine.java:89)
 at com.alibaba.dt.dlink.core.Engine.entry(Engine.java:172)
 at com.alibaba.dt.dlink.core.Engine.main(Engine.java:246)
Caused by: com.alibaba.dt.pipeline.plugin.center.exception.DataXException: Code:[DBUtilErrorCode-06], Description:[执行数据库 Sql 失败, 请检查您的配置的 column/table/where/querySql或者向 DBA 寻求帮助.].  - 执行的SQL为: delete from dataphin.ads_warehouse_materialdoc_1d where ds='20230103' 具体错误信息为:ru.yandex.clickhouse.except.ClickHouseException: ClickHouse exception, code: 62, host: 10.133.35.65, port: 8123; Code: 62, e.displayText() = DB::Exception: Syntax error: failed at position 1 ('delete'): delete from dataphin.ads_warehouse_materialdoc_1d where ds='20230103'. Expected one of: ALTER query, Query with output, ALTER PROFILE, RENAME DATABASE, SHOW PRIVILEGES query, TRUNCATE, KILL, KILL QUERY query, SELECT query, possibly with UNION, list of union elements, ALTER ROLE, SELECT subquery, DESCRIBE query, SELECT query, subquery, possibly with UNION, SHOW GRANTS, SHOW CREATE, WATCH, CREATE SETTINGS PROFILE or ALTER SETTINGS PROFILE query, SHOW PROCESSLIST query, ALTER POLICY, ALTER USER, CREATE VIEW query, CHECK TABLE, SET ROLE, SELECT query, SELECT, REVOKE, CREATE USER, CREATE DICTIONARY, CREATE PROFILE, SET ROLE DEFAULT, EXPLAIN, ALTER SETTINGS PROFILE, SYSTEM, ALTER LIVE VIEW, RENAME TABLE, DROP query, SHOW ACCESS, OPTIMIZE query, USE, DROP access entity query, RENAME DICTIONARY, DETACH, SET, SHOW, DESC, OPTIMIZE TABLE, CREATE ROW POLICY, SET DEFAULT ROLE, EXCHANGE DICTIONARIES, CREATE POLICY, ALTER ROW POLICY, INSERT INTO, INSERT query, SHOW [TEMPORARY] TABLES|DATABASES|CLUSTERS|CLUSTER 'name' [[NOT] [I]LIKE 'str'] [LIMIT expr], GRANT, RENAME query, SHOW GRANTS query, SHOW PRIVILEGES, EXISTS, DROP, SYSTEM query, CREATE LIVE VIEW query, CREATE ROW POLICY or ALTER ROW POLICY query, CREATE QUOTA or ALTER QUOTA query, SHOW PROCESSLIST, ALTER QUOTA, CREATE QUOTA, CREATE DATABASE query, SET query, Query, CREATE, WITH, CREATE ROLE or ALTER ROLE query, EXTERNAL DDL FROM, EXCHANGE TABLES, EXISTS or SHOW CREATE query, WATCH query, REPLACE, CREATE ROLE, CREATE SETTINGS PROFILE, SET ROLE or SET DEFAULT 
ROLE query, CREATE USER or ALTER USER query, EXTERNAL DDL query, SHOW ACCESS query, SHOW CREATE QUOTA query, USE query, ATTACH, DESCRIBE, ALTER TABLE, ShowAccessEntitiesQuery, GRANT or REVOKE query, CREATE TABLE or ATTACH TABLE query (version 21.6.6.51 (official build))

 at com.alibaba.dt.pipeline.plugin.center.exception.DataXException.asDataXException(DataXException.java:40)
 at com.alibaba.datax.plugin.rdbms.util.RdbmsException.asQueryException(RdbmsException.java:97)
 at com.alibaba.datax.plugin.rdbms.writer.util.WriterUtil.executeSqlsAutoCommit(WriterUtil.java:127)
 at com.alibaba.datax.plugin.rdbms.writer.util.WriterUtil.executeSqls(WriterUtil.java:107)
 at com.alibaba.datax.plugin.rdbms.writer.CommonRdbmsWriter$Job.prepare(CommonRdbmsWriter.java:125)
 at com.alibaba.datax.plugin.writer.clickhousewriter.ClickhouseWriter$Job.prepare(ClickhouseWriter.java:63)
 at com.alibaba.dt.dlink.core.trans.DlinkTransRunner.prepareJobWriter(DlinkTransRunner.java:95)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.doPrepare(DlinkTrans.java:279)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.start(DlinkTrans.java:111)
 ... 3 more

child_process.returncode: 1
2023-01-04 17:20:36.558 Pipeline callback - postTaskrunInfo request: http://pipelineEndpoint/api/pipeline/dlink/metric, retry count:0
2023-01-04 17:20:36.582 Pipeline callback - postTaskrunInfo response: {}
2023-01-04 17:20:36.596 No outputData produced.

2023-01-04 17:20:36.549 Dlink command exit with code: 1

2023-01-04 17:20:36.671 =================================================================
2023-01-04 17:20:36.671 Current task status: FAILED
2023-01-04 17:20:36.671 Elapsed time: 7.691 s( Estimated: 10m )
2023-01-04 17:20:36.671 ---------------- voldemort task ends ----------------

Cause

ClickHouse does not support the standard `DELETE FROM ... WHERE ...` syntax, so the statement submitted over JDBC is rejected by the server with syntax error code 62 ("Syntax error: failed at position 1 ('delete')", as shown in the log above).

Solution

Rewrite the preSql using ClickHouse's alternative syntax for deleting data:

 alter table dataphin.ads_warehouse_materialdoc_1d delete where ds='20230103'

For details, see the ALTER TABLE ... DELETE statement in the ClickHouse documentation.
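Since the preSql is just a string in the sync-task configuration, the fix amounts to rewriting that string. The sketch below (a hypothetical helper, not part of Dataphin; table and partition values taken from this task's log) builds the ClickHouse-compatible mutation from the same table name and partition filter:

```python
def clickhouse_partition_delete(table: str, ds: str, sync: bool = False) -> str:
    """Build a ClickHouse-compatible delete for one ds partition.

    ClickHouse rejects `DELETE FROM ... WHERE ...` (syntax error code 62);
    the delete must instead be expressed as an ALTER TABLE ... DELETE mutation.
    """
    stmt = f"alter table {table} delete where ds='{ds}'"
    if sync:
        # Mutations run asynchronously by default; this setting makes the
        # statement block until the mutation completes.
        stmt += " SETTINGS mutations_sync = 1"
    return stmt

# The preSql from the failing task, rewritten:
print(clickhouse_partition_delete("dataphin.ads_warehouse_materialdoc_1d", "20230103"))
```

Note that ALTER TABLE ... DELETE is a mutation and executes asynchronously in the background by default, so rows may not be gone the instant the statement returns; if the subsequent insert must not overlap with undeleted rows, consider the `mutations_sync` setting shown above.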

Applies to

  • Dataphin