You can call the GetJobAttemptLog operation to query the log of the specified Spark job attempt.
Debugging
You can run this operation in OpenAPI Explorer without having to calculate signatures. After the operation is run, OpenAPI Explorer automatically generates SDK code samples.
Request parameters

Name | Type | Required | Example | Description
---|---|---|---|---
Action | String | Yes | GetJobAttemptLog | The operation that you want to perform. Set the value to GetJobAttemptLog.
JobAttemptId | String | Yes | j202105272322hangzhou5d64f1560000128-0001 | The ID of the Spark job attempt.
JobId | String | Yes | j202105272322hangzhou5d64f1560000128 | The ID of the Spark job. You can view the job ID on the job management page.
VcName | String | Yes | release-test | The name of the virtual cluster on which the Spark job runs. You can view the cluster name on the virtual cluster management page.
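The following sketch shows one possible way to call this operation with the Alibaba Cloud Python core SDK (aliyun-python-sdk-core), letting the SDK sign the request. The endpoint and API version below are assumptions and should be replaced with the actual values for your region and product; the credentials and region ID are placeholders.

```python
# -*- coding: utf-8 -*-
# Minimal sketch: calling GetJobAttemptLog through the generic CommonRequest
# interface of aliyun-python-sdk-core, which handles request signing.
from aliyunsdkcore.client import AcsClient
from aliyunsdkcore.request import CommonRequest

# Placeholder credentials and region ID.
client = AcsClient('<access-key-id>', '<access-key-secret>', 'cn-hangzhou')

request = CommonRequest()
request.set_domain('openanalytics.cn-hangzhou.aliyuncs.com')  # assumed service endpoint
request.set_version('2018-08-01')                             # assumed API version
request.set_action_name('GetJobAttemptLog')
request.set_accept_format('json')

# Required parameters from the table above.
request.add_query_param('VcName', 'release-test')
request.add_query_param('JobId', 'j202105272322hangzhou5d64f1560000128')
request.add_query_param('JobAttemptId', 'j202105272322hangzhou5d64f1560000128-0001')

# do_action_with_exception returns the raw response body.
response = client.do_action_with_exception(request)
print(response)
```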
Response parameters

Name | Type | Example | Description
---|---|---|---
Data | String | local:///opt/spark/jars/offline-sql.jar, main_file\n+ exec /usr/bin/tini -s -- /jdk/jdk8/bin/java -cp '/opt/tools/exec-wrapper.jar:.:::/opt/spark/jars/*' com.aliyun.dla.spark.SparkJobWrapper /opt/spark/bin/spark-submit --conf spark.driver.host=172.16.6.205 --conf spark.ui.port=4040 --conf 'spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///opt/spark/log-conf/log4j.properties -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/dump.hprof -XX:OnOutOfMemoryError=\"bash /opt/tools/oss-cp.sh /tmp/dump.hprof oss://dla-test-cn-hangzhou/spark-logs/release-test/j202105272322hangzhou5d64f1560000128-0001/driver/dump.hprof; bash /opt/tools/job-stop.sh\" ' | The log content that is returned.
RequestId | String | 9CE8F271-F918-43B6-8F58-F9F1C2DCFDB8 | The unique ID of the request.
Examples
Sample requests
http(s)://[Endpoint]/?Action=GetJobAttemptLog
&JobAttemptId=j202105272322hangzhou5d64f1560000128-0001
&JobId=j202105272322hangzhou5d64f1560000128
&VcName=release-test
&<Common request parameters>
Sample success responses
XML format
<GetJobAttemptLogResponse>
	  <RequestId>9CE8F271-F918-43B6-8F58-F9F1C2DCFDB8</RequestId>
	  <Data>local:///opt/spark/jars/offline-sql.jar, main_file\n+ exec /usr/bin/tini -s -- /jdk/jdk8/bin/java -cp '/opt/tools/exec-wrapper.jar:.:::/opt/spark/jars/*' com.aliyun.dla.spark.SparkJobWrapper /opt/spark/bin/spark-submit --conf spark.driver.host=172.16.6.205 --conf spark.ui.port=4040 --conf 'spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///opt/spark/log-conf/log4j.properties -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/dump.hprof -XX:OnOutOfMemoryError=\"bash /opt/tools/oss-cp.sh /tmp/dump.hprof oss://dla-test-cn-hangzhou/spark-logs/release-test/j202105272322hangzhou5d64f1560000128-0001/driver/dump.hprof; bash /opt/tools/job-stop.sh\" '</Data>
</GetJobAttemptLogResponse>
JSON format
{"RequestId":"9CE8F271-F918-43B6-8F58-F9F1C2DCFDB8","Data":"local:///opt/spark/jars/offline-sql.jar, main_file\\n+ exec /usr/bin/tini -s -- /jdk/jdk8/bin/java -cp '/opt/tools/exec-wrapper.jar:.:::/opt/spark/jars/*' com.aliyun.dla.spark.SparkJobWrapper /opt/spark/bin/spark-submit --conf spark.driver.host=172.16.6.205 --conf spark.ui.port=4040 --conf 'spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///opt/spark/log-conf/log4j.properties -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/dump.hprof -XX:OnOutOfMemoryError=\\\"bash /opt/tools/oss-cp.sh /tmp/dump.hprof oss://dla-test-cn-hangzhou/spark-logs/release-test/j202105272322hangzhou5d64f1560000128-0001/driver/dump.hprof; bash /opt/tools/job-stop.sh\\\" '"}
Error codes
For a list of error codes, visit the Error Center.