
Dataphin: integrating a CSV file into Hive fails with "Code:[HdfsWriter-02], Description:[您填写的参数值不合法.]. - 仅仅支持单字符切分, 您配置的切分为 : [ SOH]"


Overview

This article describes how to resolve the error "Code:[HdfsWriter-02], Description:[您填写的参数值不合法.]. - 仅仅支持单字符切分, 您配置的切分为 : [ SOH]" (roughly: "The parameter value you entered is invalid. Only single-character delimiters are supported; the delimiter you configured is: [ SOH]"), which occurs when a data integration task writes to a Hive database.

Problem description

When a CSV file is integrated into a Hive database, the task fails with the following error:

经Dlink智能分析,该任务最可能的错误原因是:

com.alibaba.dt.dlink.core.exception.DlinkException: Code:[Framework-02], Description:[Dlink引擎运行过程出错,具体原因请参看Dlink运行结束时的错误诊断信息 .]. 
 - com.alibaba.dt.pipeline.plugin.center.exception.DataXException: Code:[HdfsWriter-02], Description:[您填写的参数值不合法.].
- 仅仅支持单字符切分, 您配置的切分为 : [ SOH]

The full log follows (log content has been redacted):

Dlink command: java -server -Xms2048m -Xmx2048m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/dataphin/dlink/log -Xmx1024m -Xms1024m  -Dloglevel=info -Dfile.encoding=UTF-8 -Dlogback.statusListenerClass=ch.qos.logback.core.status.NopStatusListener -Djava.security.egd=file:///dev/urandom -Ddlink.home=/dataphin/dlink -Ddatax.home=/dataphin/dlink -Dlogback.configurationFile=/dataphin/dlink/conf/logback.xml -classpath /dataphin/dlink/lib/*:.  -Dlog.file.name=0000001015968 com.alibaba.dt.dlink.core.Engine --job /mnt/mesos/sandbox/1015968.json --jobid 0000001015968 --jobmark -1  --mode standalone --startcode 0

2021-07-01 16:56:36.766 [main] INFO  Engine - 
{
 "job":{
  "order":{
   "hops":[
...
2021-07-01 16:56:37.232 [job-1015968] ERROR DlinkTrans - Exception when job run
com.alibaba.dt.pipeline.plugin.center.exception.DataXException: Code:[HdfsWriter-02], Description:[您填写的参数值不合法.]. - 仅仅支持单字符切分, 您配置的切分为 : [ ]
 at com.alibaba.dt.pipeline.plugin.center.exception.DataXException.asDataXException(DataXException.java:40) ~[plugin.center.base-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.datax.plugin.writer.hdfswriter.HdfsWriter$Job.validateParameter(HdfsWriter.java:116) ~[hdfswriter-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.datax.plugin.writer.hdfswriter.HdfsWriter$Job.init(HdfsWriter.java:52) ~[hdfswriter-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.trans.DlinkTransRunner.initJobWriter(DlinkTransRunner.java:77) ~[dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.doInit(DlinkTrans.java:284) ~[dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.start(DlinkTrans.java:132) ~[dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.Engine.runTrans(Engine.java:78) [dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.Engine.entry(Engine.java:158) [dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.Engine.main(Engine.java:170) [dlink-engine-0.0.1-SNAPSHOT.jar:na]
2021-07-01 16:56:37.236 [job-1015968] INFO  DlinkTrans - dlink starts to do destroy ...
2021-07-01 16:56:37.236 [job-1015968] INFO  DlinkTransRunner - Dlink Reader.Job [本地CSV_1] do destroy work .
2021-07-01 16:56:37.237 [job-1015968] INFO  DlinkTransRunner - Dlink Writer.Job [Hive_1] do destroy work .
2021-07-01 16:56:37.237 [job-1015968] ERROR Engine - 

经Dlink智能分析,该任务最可能的错误原因是:
com.alibaba.dt.dlink.core.exception.DlinkException: Code:[Framework-02], Description:[Dlink引擎运行过程出错,具体原因请参看Dlink运行结束时的错误诊断信息  .].  - com.alibaba.dt.pipeline.plugin.center.exception.DataXException: Code:[HdfsWriter-02], Description:[您填写的参数值不合法.]. - 仅仅支持单字符切分, 您配置的切分为 : [ ]
 at com.alibaba.dt.pipeline.plugin.center.exception.DataXException.asDataXException(DataXException.java:40)
 at com.alibaba.datax.plugin.writer.hdfswriter.HdfsWriter$Job.validateParameter(HdfsWriter.java:116)
 at com.alibaba.datax.plugin.writer.hdfswriter.HdfsWriter$Job.init(HdfsWriter.java:52)
 at com.alibaba.dt.dlink.core.trans.DlinkTransRunner.initJobWriter(DlinkTransRunner.java:77)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.doInit(DlinkTrans.java:284)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.start(DlinkTrans.java:132)
 at com.alibaba.dt.dlink.core.Engine.runTrans(Engine.java:78)
 at com.alibaba.dt.dlink.core.Engine.entry(Engine.java:158)
 at com.alibaba.dt.dlink.core.Engine.main(Engine.java:170)
 - com.alibaba.dt.pipeline.plugin.center.exception.DataXException: Code:[HdfsWriter-02], Description:[您填写的参数值不合法.]. - 仅仅支持单字符切分, 您配置的切分为 : [ ]
 at com.alibaba.dt.pipeline.plugin.center.exception.DataXException.asDataXException(DataXException.java:40)
 at com.alibaba.datax.plugin.writer.hdfswriter.HdfsWriter$Job.validateParameter(HdfsWriter.java:116)
 at com.alibaba.datax.plugin.writer.hdfswriter.HdfsWriter$Job.init(HdfsWriter.java:52)
 at com.alibaba.dt.dlink.core.trans.DlinkTransRunner.initJobWriter(DlinkTransRunner.java:77)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.doInit(DlinkTrans.java:284)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.start(DlinkTrans.java:132)
 at com.alibaba.dt.dlink.core.Engine.runTrans(Engine.java:78)
 at com.alibaba.dt.dlink.core.Engine.entry(Engine.java:158)
 at com.alibaba.dt.dlink.core.Engine.main(Engine.java:170)

 at com.alibaba.dt.dlink.core.exception.DlinkException.asDlinkException(DlinkException.java:48)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.start(DlinkTrans.java:159)
 at com.alibaba.dt.dlink.core.Engine.runTrans(Engine.java:78)
 at com.alibaba.dt.dlink.core.Engine.entry(Engine.java:158)
 at com.alibaba.dt.dlink.core.Engine.main(Engine.java:170)
Caused by: com.alibaba.dt.pipeline.plugin.center.exception.DataXException: Code:[HdfsWriter-02], Description:[您填写的参数值不合法.]. - 仅仅支持单字符切分, 您配置的切分为 : [ ]
 at com.alibaba.dt.pipeline.plugin.center.exception.DataXException.asDataXException(DataXException.java:40)
 at com.alibaba.datax.plugin.writer.hdfswriter.HdfsWriter$Job.validateParameter(HdfsWriter.java:116)
 at com.alibaba.datax.plugin.writer.hdfswriter.HdfsWriter$Job.init(HdfsWriter.java:52)
 at com.alibaba.dt.dlink.core.trans.DlinkTransRunner.initJobWriter(DlinkTransRunner.java:77)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.doInit(DlinkTrans.java:284)
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.start(DlinkTrans.java:132)
 ... 3 more

child_process.returncode: 1


2021-07-01 16:56:38 No outputData produced.

2021-07-01 16:56:38 Dlink command exit with code: 1

Cause

The field delimiter \u0001 specified in the output configuration of the pipeline task has a space before and after it. SOH is the non-printable control character U+0001 (Ctrl-A), Hive's default field delimiter; with the surrounding spaces the configured delimiter is no longer a single character, so HdfsWriter rejects it during parameter validation. The leading space visible inside the brackets of the error message ("[ SOH]") is one of these extra characters.
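For illustration, a minimal sketch of the writer section of the generated job JSON in the failing state. The parameter key fieldDelimiter follows the open-source DataX HdfsWriter named in the stack trace; the surrounding structure is abbreviated and the other values are hypothetical:

"writer": {
    "name": "hdfswriter",
    "parameter": {
        "fileType": "text",
        "fieldDelimiter": " \u0001 "
    }
}

In the JSON string " \u0001 ", the escape \u0001 resolves to the single SOH character, but the spaces on either side make the delimiter three characters long, which fails the single-character check in HdfsWriter$Job.validateParameter.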

Solution

Remove the spaces before and after the delimiter \u0001 in the output configuration of the pipeline task, then rerun the task. The error no longer occurs.
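The corrected value, in the same hypothetical sketch as above, contains no surrounding spaces, so the delimiter resolves to exactly one character:

"fieldDelimiter": "\u0001"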

Related documents

Dataphin: after a CSV file is synced to a Hive database, the field values in the target Hive table are NULL

Applies to

  • Product: Dataphin
  • Module: Data Integration