GetSparkAppMetrics - Query Spark application metrics

Queries the metric data of a Spark application.

API description

  • Public endpoint of a region: adb.<region-id>.aliyuncs.com. Example: adb.cn-hangzhou.aliyuncs.com
  • VPC endpoint of a region: adb-vpc.<region-id>.aliyuncs.com. Example: adb-vpc.cn-hangzhou.aliyuncs.com
Note: If a 409 error occurs when you send requests from the China (Qingdao), China (Shenzhen), China (Guangzhou), or China (Hong Kong) region, contact technical support.
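The endpoint pattern above is mechanical, so it can be derived from the region ID. The helper below is a minimal sketch (the function name is our own, not part of any SDK):

```python
def build_endpoint(region_id: str, use_vpc: bool = False) -> str:
    """Build the AnalyticDB for MySQL endpoint for a region.

    Public endpoint: adb.<region-id>.aliyuncs.com
    VPC endpoint:    adb-vpc.<region-id>.aliyuncs.com
    """
    prefix = "adb-vpc" if use_vpc else "adb"
    return f"{prefix}.{region_id}.aliyuncs.com"

print(build_endpoint("cn-hangzhou"))                # adb.cn-hangzhou.aliyuncs.com
print(build_endpoint("cn-hangzhou", use_vpc=True))  # adb-vpc.cn-hangzhou.aliyuncs.com
```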

Debugging

You can run this API directly in OpenAPI Explorer, which saves you the trouble of calculating signatures. After the call succeeds, OpenAPI Explorer can automatically generate SDK code examples.

Authorization information

No authorization information is currently disclosed for this API.

Request parameters

| Name | Type | Required | Description | Example |
|---|---|---|---|---|
| AppId | string | | The application ID. | s202204221525hzca7d8140000003 |
| DBClusterId | string | | The ID of the Enterprise Edition, Basic Edition, or Data Lakehouse Edition cluster. Note: You can call the DescribeDBClusters operation to query the IDs of all clusters in a region. | amv-bp1ggnu61d77**** |
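A request to this operation carries the parameters above as query parameters alongside the common parameters (version, signature, and so on), which an SDK normally fills in. The helper below is an illustrative sketch only, not part of any official SDK:

```python
def build_get_spark_app_metrics_params(app_id, db_cluster_id=None):
    """Assemble the operation-specific query parameters for GetSparkAppMetrics.

    The Action name and parameter names come from this API reference;
    signature-related common parameters are intentionally omitted.
    """
    params = {
        "Action": "GetSparkAppMetrics",
        "AppId": app_id,
    }
    if db_cluster_id is not None:
        params["DBClusterId"] = db_cluster_id
    return params

params = build_get_spark_app_metrics_params(
    "s202204221525hzca7d8140000003", "amv-bp1ggnu61d77****")
print(params["Action"])  # GetSparkAppMetrics
```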

Response parameters

| Name | Type | Description | Example |
|---|---|---|---|
| | object | The response struct. | |
| RequestId | string | The request ID. | 1AD222E9-E606-4A42-BF6D-8A4442913CEF |
| Data | object | The returned data. | |
| AppId | string | The application ID. | s202302051515shfa865f8000**** |
| AttemptId | string | The retry (attempt) ID of the Spark application. | s202301061000hz57d797b0000201-**** |
| Finished | boolean | Indicates whether the parsing is finished. Valid values: true (finished) and false (not finished). | True |
| EventLogPath | string | The path of the event log. | oss://path/to/eventLog |
| ScanMetrics | object | The metric data. | |
| OutputRowsCount | long | The number of scanned rows. | 1000 |
| TotalReadFileSizeInByte | long | The number of scanned bytes. | 10000 |

Examples

Sample success response

JSON format

{
  "RequestId": "1AD222E9-E606-4A42-BF6D-8A4442913CEF",
  "Data": {
    "AppId": "s202302051515shfa865f8000****",
    "AttemptId": "s202301061000hz57d797b0000201-****",
    "Finished": true,
    "EventLogPath": "oss://path/to/eventLog",
    "ScanMetrics": {
      "OutputRowsCount": 1000,
      "TotalReadFileSizeInByte": 10000
    }
  }
}
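The sample response above can be consumed as plain JSON. The snippet below is a minimal sketch that extracts the scan metrics and derives an average bytes-per-row figure (the derived figure is our own illustration, not a field returned by the API):

```python
import json

# Sample success response copied from this API reference.
response_text = """
{
  "RequestId": "1AD222E9-E606-4A42-BF6D-8A4442913CEF",
  "Data": {
    "AppId": "s202302051515shfa865f8000****",
    "AttemptId": "s202301061000hz57d797b0000201-****",
    "Finished": true,
    "EventLogPath": "oss://path/to/eventLog",
    "ScanMetrics": {
      "OutputRowsCount": 1000,
      "TotalReadFileSizeInByte": 10000
    }
  }
}
"""

response = json.loads(response_text)
data = response["Data"]
scan = data["ScanMetrics"]

# Only read the metrics once parsing has finished (Finished == true).
if data["Finished"]:
    avg_bytes_per_row = scan["TotalReadFileSizeInByte"] / scan["OutputRowsCount"]
    print(f"app {data['AppId']}: {scan['OutputRowsCount']} rows, "
          f"{scan['TotalReadFileSizeInByte']} bytes "
          f"({avg_bytes_per_row:.1f} bytes/row)")
```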

Error codes

| HTTP status code | Error code | Error message | Description |
|---|---|---|---|
| 400 | Spark.InvalidState | The object of the operation is in an invalid state: %s | The operation object is in an invalid state. |
| 400 | Spark.InvalidParameter | Invalid parameter value: %s | The input parameter is invalid. |
| 400 | Spark.AnalyzeTask.AppStateNotAccepted | Only Spark applications in the terminated state can be analyzed. The specified application %s does not meet the requirement. | Only Spark applications in the terminated state can be analyzed. The specified application does not meet this requirement. |
| 400 | Spark.AnalyzeTask.FailedToKill | Failed to terminate the Spark log analysis task %s, because the task status has changed. Obtain the latest status and try again. | The request to kill the Spark log analysis task failed because the task state has changed. Obtain the latest state of the analysis task and try again. |
| 400 | Spark.AnalyzeTask.InvalidStateWhenAnalyzingApp | Only logs of Spark applications in the terminated state can be analyzed. The specified application %s does not meet the requirement. | Log analysis can only be performed on Spark applications in the terminated state. The current application does not meet this requirement. |
| 400 | Spark.App.InvalidResourceSpec | The requested resource type is not supported:\n %s | - |
| 400 | Spark.App.KillOperationFailed | Failed to kill the application %s, please retry in a few seconds. | Failed to kill the Spark application. Try again later. |
| 400 | Spark.App.ParameterConflict | Conflicting parameters submitted:\n %s | - |
| 400 | Spark.CancelNotebookStatementFailed | Failed to cancel the execution of notebook statements. | Failed to cancel the execution of notebook statements. |
| 400 | Spark.CloseNotebookKernelFailed | Failed to disable the notebook kernel. | Failed to shut down the notebook kernel. |
| 400 | Spark.Config.invalidConnectors | The spark.adb.connectors configuration is invalid: %s | The spark.adb.connectors configuration is invalid. |
| 400 | Spark.Config.RoleArnVerifyFailed | RoleARN parameter verification failed. Error msg: %s when verify RoleArn %s | RoleARN parameter verification failed. |
| 400 | Spark.ExecuteNotebookStatementFailed | Failed to execute notebook statements. | Failed to execute notebook statements. |
| 400 | Spark.GetNotebookKernelFailed | Failed to start the notebook kernel. Try again later. | Failed to start the notebook kernel. Try again later. |
| 400 | Spark.GetNotebookKernelStateFailed | Failed to obtain the status of the notebook kernel. | Failed to query the status of the notebook kernel. |
| 400 | Spark.GetNotebookStatementLogFailed | Failed to obtain notebook statement logs. | Failed to query the execution logs of notebook statements. |
| 400 | Spark.GetNotebookStatementResultFailed | Failed to obtain notebook statement execution results. | Failed to query the execution results of notebook statements. |
| 400 | Spark.GetNotebookStatementUiFailed | Failed to obtain notebook statement UI. | Failed to query the notebook statement UI. |
| 400 | Spark.Log.IllegalPath | Invalid job log URI: %s. | The configured Spark log path is invalid: %s. |
| 400 | Spark.Log.InvalidState | Failed to obtain the logs of the Spark job %s in the %s state. | The logs of the specified Spark job cannot be obtained. |
| 400 | Spark.NotebookKernelBusy | Too many jobs in the notebook kernel. Try again later. | Too many jobs in the notebook kernel. Try again later. |
| 400 | Spark.NotebookKernelExpired | The notebook kernel has expired. Restart the kernel. | The notebook kernel has expired. Restart the kernel. |
| 400 | Spark.NotebookKernelInvalidStatus | The status of the notebook kernel is invalid. | The status of the notebook kernel is invalid. Contact the administrator. |
| 400 | Spark.NotebookKernelNotStartup | The notebook kernel has not been started | The notebook kernel has not been started. Start it first. |
| 400 | Spark.NotebookKernelStarting | The notebook kernel is starting. Try again later. | The notebook kernel is starting. Try again later. |
| 400 | Spark.NotebookNamingDuplicate | The name of the notebook is duplicated, please change the name | The notebook name is duplicated. Rename the notebook. |
| 400 | Spark.NotebookNotFound | The notebook is not found | The notebook does not exist. Contact the administrator. |
| 400 | Spark.NotebookParagraphMissingProgramCode | The notebook paragraph program code is missing | The notebook paragraph contains no executable code. |
| 400 | Spark.NotebookParagraphNotFound | The notebook paragraph is not found | The notebook paragraph does not exist. Contact the administrator. |
| 400 | Spark.NotebookParagraphNotRunning | The notebook paragraph has not been running | The notebook paragraph is not running. Execute the statement first. |
| 400 | Spark.Oss.InternalError | An OSS internal error occurred: %s | An OSS internal error occurred. The request failed. |
| 400 | Spark.RoleArn.Invalid | %s is not found, or the RAM role has not been authorized. | The RoleArn does not exist, or the RAM user has not been authorized. |
| 400 | Spark.SQL.NotFoundExecutableSQLError | No executable statements are submitted. Please check the input SQL. | The SQL of the Spark job contains no executable statements. |
| 400 | Spark.SQL.ParserError | Failed to parse the SQL %s. Error message: %s. | The SQL in the Spark job information cannot be parsed. |
| 400 | Spark.TemplateFile.BadFileType | The requested template %s is not a file. | The requested template file ID is not of the file type. |
| 403 | Spark.Forbidden | No permissions to access the resources: %s | Insufficient permissions. Access to the resources is denied. The information you requested access to is: %s. |
| 404 | Spark.AnalyzeTask.NotFound | The requested analysis task %s is not found. | The requested log analysis task %s does not exist. |
| 404 | Spark.App.ContentNotFound | The requested content %s of the Spark application is not found. | The submitted content of the specified Spark job cannot be found. |
| 404 | Spark.Log.PodLogNotFound | Can't find logs of the pod by podName[%s] in the namespace[%s]. | The logs of the pod with the specified podName cannot be found. |
| 404 | Spark.ObjectNotFound | The object is not found. More information: %s | The operation object does not exist. Related information: %s. |
| 404 | Spark.TemplateFile.FileNotFound | The template file %s is not found. | The specified template file cannot be found. |
| 404 | Spark.TemplateFile.TemplateNotFound | The template %s is not found. | The template file cannot be found. |
| 406 | Spark.App.KillNotAcceptable | Can't kill the application %s in %s state. | A Spark application in a non-killable state cannot be killed. |
| 500 | Spark.ServerError | The Spark control component system encountered an error, please create a ticket to solve the problem or concat the supported engineer on duty. Error message: %s | The Spark control component encountered a system error. Submit a ticket or contact the on-duty engineer. |
| 500 | Spark.Resources.LoadFileFromClasspathFailed | Can't load the content from file: %s | Failed to load the resource file. |
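Several of the messages above explicitly suggest retrying ("Try again later"). As an illustrative sketch only (the retryable set below is our own reading of this table, not an official classification), a caller might triage errors like this:

```python
# Error codes whose messages in the table above explicitly suggest retrying.
# This set is an assumption derived from the message text, not an SDK constant.
RETRYABLE_CODES = {
    "Spark.App.KillOperationFailed",
    "Spark.GetNotebookKernelFailed",
    "Spark.NotebookKernelBusy",
    "Spark.NotebookKernelStarting",
}

def should_retry(http_status: int, error_code: str) -> bool:
    """Decide whether a failed call is worth retrying.

    Server-side errors (HTTP 5xx, e.g. Spark.ServerError) are treated as
    transient; 4xx errors are retried only for the codes above.
    """
    if http_status >= 500:
        return True
    return error_code in RETRYABLE_CODES

print(should_retry(400, "Spark.NotebookKernelBusy"))  # True
print(should_retry(404, "Spark.ObjectNotFound"))      # False
```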

Visit the Error Center to view more error codes.

Change history

| Change time | Summary of changes | Operation |
|---|---|---|
| 2023-11-24 | The error codes and request parameters of the OpenAPI changed | View change details |
| 2023-06-28 | The error codes of the OpenAPI changed | View change details |