ListTracesDatas - Query traces by criteria

Each trace data record is called a trace. A trace contains a group of span records. This API searches at trace granularity and returns a list of traces that match the specified criteria.

Debugging

You can run this API directly in OpenAPI Explorer, which spares you the trouble of calculating signatures. After a successful run, OpenAPI Explorer automatically generates SDK code samples.

Authorization information

The information below lists the authorization details for this API. You can use them in the Action element of a RAM permission policy statement to grant a RAM user or RAM role permission to call this API. Field descriptions:

  • Operation: the specific permission point.
  • Access level: the access level of each operation; one of Write, Read, or List.
  • Resource type: the resource types on which the operation supports authorization:
    • A required resource type is marked with a leading *.
    • For operations that do not support resource-level authorization, "All resources" is shown.
  • Condition key: condition keys defined by the cloud service itself.
  • Dependent operation: other permissions required for the operation to succeed. The caller must also hold the dependent operations' permissions for the call to succeed.

Operation: paillmtrace:ListTracesDatas
Access level: list
Resource type: * (All resources)
Condition key: none
Dependent operation: none

Request syntax

GET /api/v1/PAILLMTrace/TracesDatas HTTP/1.1
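For illustration, the raw request shape can be sketched as below. The endpoint host is a placeholder, and the POP signature headers that a real call requires are omitted; use OpenAPI Explorer or an official SDK for actual calls.

```python
from urllib.parse import urlencode

def build_list_traces_url(host, params):
    """Assemble the GET URL for ListTracesDatas.

    `host` is a hypothetical endpoint; a real request must also
    carry the POP signature headers, which this sketch omits.
    """
    return f"https://{host}/api/v1/PAILLMTrace/TracesDatas?{urlencode(params)}"

url = build_list_traces_url(
    "paillmtrace.example.aliyuncs.com",  # placeholder host
    {"PageNumber": 1, "PageSize": 10},
)
```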

Request parameters

The parameters below are listed as: name (type), description, and example value.

TraceIds (array of string)

Array of trace IDs; each element is one trace ID.

Example: 4d097fb3d451bad0148feefb4ab95
SpanIds (array of string)

Array of span IDs; each element is one span ID. A trace record contains one or more spans.

Example: 17d4ef87a1fe8
SessionId (string)

The value of the attributes.gen_ai.session.id field in the trace record. Empty by default.

Example: e5895063-9b29-4ce4-b0fd-a018bfa11111
MinTime (string)

Lower bound of the search time range, in UTC, in YYYY-MM-DD or YYYY-MM-DD HH:mm:ss format. Default: current time - 2 days.

Example: 2024-01-31 or 2024-12-31 23:59:59

MaxTime (string)

Upper bound of the search time range, in UTC, in YYYY-MM-DD or YYYY-MM-DD HH:mm:ss format. Default: current time + 10 minutes.

Example: 2024-01-31 or 2024-12-31 23:59:59
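The documented defaults can be reproduced client-side; a minimal sketch of computing the default time window in the required UTC format:

```python
from datetime import datetime, timedelta, timezone

def default_time_window(now=None):
    """Reproduce the documented defaults: MinTime = now - 2 days,
    MaxTime = now + 10 minutes, formatted as UTC YYYY-MM-DD HH:mm:ss."""
    now = now or datetime.now(timezone.utc)
    fmt = "%Y-%m-%d %H:%M:%S"
    return ((now - timedelta(days=2)).strftime(fmt),
            (now + timedelta(minutes=10)).strftime(fmt))

min_time, max_time = default_time_window(
    datetime(2024, 12, 31, 12, 0, 0, tzinfo=timezone.utc))
```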
PageNumber (integer)

Page number for paged queries. Starts at 1. Default: 1.

Example: 1

PageSize (integer)

Page size. Default: 10; maximum: 50.

Example: 10
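As a quick arithmetic check of the paging behaviour (a sketch, not SDK code):

```python
import math

def page_count(total_count, page_size=10):
    """Number of pages needed to fetch total_count traces,
    given PageSize (default 10, capped at the documented maximum 50)."""
    page_size = min(page_size, 50)
    return math.ceil(total_count / page_size)

# e.g. 22 matching traces at the default page size need 3 requests,
# with PageNumber = 1, 2, 3.
```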
SpanName (string)

Filter condition: return traces that contain a span with this name.

Example: ChatFlow.astream_chat
LlmAppName (string)

The value of the resources.service.app.name field in the trace record. Allowed characters: a-z, A-Z, 0-9, dot, hyphen, underscore. Must be an exact match. Empty by default.

Example: My.super_LLM-app2
MinDuration (float)

Filter condition: minimum trace duration, in seconds (s).

Example: 2

MaxDuration (float)

Filter condition: maximum trace duration, in seconds (s).

Example: 5
OpentelemetryCompatible (boolean)

Whether the returned JSON data can be converted directly into an OpenTelemetry TracesData protobuf object. Default: False. OpenTelemetry-compatible JSON has a more complex structure; unless you need to build OpenTelemetry protobuf objects, you generally do not need it.

Example: False
HasStatusMessage (boolean)

Whether to return only traces that contain at least one span with a non-empty statusMessage. For example, if a trace has 3 spans and this is True, the trace qualifies as long as any one of the 3 spans has a non-empty statusMessage. Default: False (no filtering on statusMessage).

Example: False

HasEvents (boolean)

Whether to return only traces that contain at least one span with non-empty events. For example, if a trace has 3 spans and this is True, the trace qualifies as long as any one of the 3 spans has non-empty events. Default: False (no filtering on events).

Example: False
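The any-span semantics of HasStatusMessage and HasEvents can be sketched client-side as follows; the span dicts mimic the non-OpenTelemetry response shape described under Response parameters.

```python
def trace_matches(spans, has_status_message=False, has_events=False):
    """A trace qualifies when ANY of its spans has a non-empty
    statusMessage (if has_status_message is set) and ANY span has
    non-empty events (if has_events is set)."""
    if has_status_message and not any(s.get("statusMessage") for s in spans):
        return False
    if has_events and not any(s.get("events") for s in spans):
        return False
    return True

spans = [
    {"statusMessage": "", "events": []},         # span 1: nothing set
    {"statusMessage": "timeout", "events": []},  # span 2: non-empty statusMessage
]
```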
OwnerSubId (string)

The value of the resources.service.owner.sub_id field in the trace record. Allowed characters: a-z, A-Z, 0-9, dot, hyphen, underscore. Empty by default.

Example: 123456789

EndUserId (string)

The value of the attributes.service.app.user_id field in the trace record. Allowed characters: a-z, A-Z, 0-9, dot, hyphen, underscore. Empty by default.

Example: end-user.12345
Filters (array of object)

Additional filter parameters. Each element is a filter object with the following fields.

Key (string)

Filter parameter name. Supports a set of predefined parameters as well as user-defined parameters.

  1. Predefined parameters (case-insensitive): 'serviceid', 'servicename', 'input', 'output', 'status', 'tracetype', 'tracename'. They map to span attributes as follows:

     serviceid: resources.service.id
     servicename: resources.service.name
     input: attributes.input.value
     output: attributes.output.value
     status: statusCode
     tracetype: attributes.gen_ai.span.kind of the span whose parentSpanId is 0
     tracename: spanName of the span whose parentSpanId is 0

  2. User-defined parameters (case-sensitive): you can define your own parameter name. The name becomes the suffix of a JSON path that starts with attributes, and the server filters spans at that JSON path according to the operator and value.

     For example, for the custom key foo.bar, the server filters at attributes.foo.bar; for the custom key gen_ai.FOO.BAR, it filters at attributes.gen_ai.FOO.BAR.

Enumeration values: Status, SpanName, Input, TraceType, SpanType, ServiceName, Output, TraceName, ServiceId

Example: output
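A sketch of how a Key maps to the JSON path the server filters on, per the rules above. The root-span-only keys tracetype and tracename are left out for brevity, and the mapping table here is an illustration, not server code.

```python
def filter_json_path(key):
    """Map a Filters.Key to the span JSON path the server evaluates.
    Predefined keys (case-insensitive) have fixed paths; any other
    key is appended under 'attributes', keeping its case."""
    predefined = {
        "serviceid": "resources.service.id",
        "servicename": "resources.service.name",
        "input": "attributes.input.value",
        "output": "attributes.output.value",
        "status": "statusCode",
    }
    k = key.lower()
    if k in predefined:
        return predefined[k]
    return f"attributes.{key}"  # custom keys are case-sensitive
```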
Operator (string)

Parameter operator, case-insensitive. Currently supported: '=', 'contains', 'startswith'.

Enumeration values: contains, =, startsWith

Example: contains
Value (string)

Filter parameter value. Case-sensitive for the contains operator; case-insensitive for the other operators.

Example: 智能填写
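Putting Key, Operator, and Value together, a Filters array would look like the sketch below. The field casing follows the parameter table above, and the attribute name in the second entry is hypothetical.

```python
filters = [
    # Predefined key: match traces whose output contains the text.
    {"Key": "output", "Operator": "contains", "Value": "pai-llm-trace"},
    # Custom key (hypothetical attribute): exact match at attributes.gen_ai.FOO.BAR.
    {"Key": "gen_ai.FOO.BAR", "Operator": "=", "Value": "some-value"},
]
```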
TraceReduceMethod (string)

Method for reducing the content of the returned trace data, to cut down the response size:

  • REMOVE_EMBEDDING: clears the contents of all embedding arrays.
  • ROOT_ONLY: returns only the root span of each trace; the root span content is also processed with REMOVE_EMBEDDING.
  • Unset: the data is returned unmodified.

Example: REMOVE_EMBEDDING or ROOT_ONLY
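To illustrate what REMOVE_EMBEDDING does, here is a client-side analogue that recursively blanks every vector field; the server's exact behaviour may differ.

```python
def remove_embeddings(node):
    """Client-side analogue of REMOVE_EMBEDDING: recursively blank
    every 'vector' field in a nested span-attributes structure.
    Illustration only; not the server implementation."""
    if isinstance(node, dict):
        return {k: ("" if k == "vector" else remove_embeddings(v))
                for k, v in node.items()}
    if isinstance(node, list):
        return [remove_embeddings(v) for v in node]
    return node

span_attrs = {"embedding": {"embeddings": [
    {"embedding": {"text": "q", "vector": [0.1, 0.2]}}]}}
slim = remove_embeddings(span_attrs)
```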
SortBy (string)

Sort field for the results. Currently supported: trace start time, trace duration, total trace token count, and trace ID.

Enumeration values: StartTime, Duration

Example: StartTime, Duration, TotalTokens, or TraceId
SortOrder (string)

Sort direction. Valid values:

  • ASC: ascending
  • DESC (default): descending

Example: DESC or ASC

All parameters are combined with AND. The multiple IDs in SpanIds are combined with OR, as are the multiple IDs in TraceIds. A trace matches the search criteria if any one of its spans matches.

If you do not need OpenTelemetry-format data, we recommend setting the OpentelemetryCompatible parameter to False, which simplifies the structure of the returned JSON strings.

Response parameters

The fields below are listed as: name (type), description, and example value.

(object)

Schema of the response.

TotalCount (integer)

Total number of traces that match the criteria.

Example: 22

RequestId (string)

POP request ID.

Example: 6A87228C-969A-1381-98CF-AE07AE630FA5

Code (string)

Internal error code. Set only when the response is an error.

Example: ExecutionFailure

Message (string)

Response error message. Set only when the response is an error.

Example: failed to get trace data

Traces (array)

JSON array in which each element is the JSON string of one trace. The array length is less than or equal to the PageSize parameter value.

Each element (any)

The JSON string representing one trace.

open telemetry compatible: {"resource_spans":[{"resource":{"attributes":[{"key":"telemetry.sdk.language","value":{"string_value":"python"}},{"key":"telemetry.sdk.name","value":{"string_value":"opentelemetry"}},{"key":"telemetry.sdk.version","value":{"string_value":"1.25.0"}},{"key":"service.name","value":{"string_value":"llm_trace_llamaindex_test_template"}},{"key":"service.version","value":{"string_value":"0.0.1"}},{"key":"deployment.environment","value":{"string_value":"cn-hangzhou"}},{"key":"service.app.name","value":{"string_value":"llm_trace_llamaindex_test"}},{"key":"service.owner.id","value":{"string_value":"177393915"}},{"key":"service.owner.sub_id","value":{"string_value":"230589443368"}}]},"scope_spans":[{"spans":[{"trace_id":"ae40025eae77fbab21687bd7e41c","span_id":"e09710fbd3c6","parent_span_id":"0","name":"query","kind":"SPAN_KIND_INTERNAL","start_time_unix_nano":"1718910212440262841","end_time_unix_nano":"1718910215403326721","attributes":[{"key":"input.value","value":{"string_value":"Question 1: what is pai-llm-trace."}},{"key":"openinference.span.kind","value":{"string_value":"CHAIN"}},{"key":"output.value","value":{"string_value":"The term \"pai-llm-trace\" refers to a component or module within the project or codebase being worked on."}},{"key":"pop.request.id","value":{"string_value":"1C714E-057D-1216-835A-06F722E4F3"}}],"status":{"code":"STATUS_CODE_OK"}},{"trace_id":"444bae400277fbab21687bd7e41c","span_id":"5686fa7a6b85b","parent_span_id":"e0a5920fbd3c6","name":"retrieve","kind":"SPAN_KIND_INTERNAL","start_time_unix_nano":"1718910212440884059","end_time_unix_nano":"1718910213181800657","attributes":[{"key":"input.value","value":{"string_value":"Question 1: what is 
pai-llm-trace."}},{"key":"openinference.span.kind","value":{"string_value":"RETRIEVER"}},{"key":"pop.request.id","value":{"string_value":"E8A1F8D9-D35A-1F9F-B724-5DA12E6F612"}},{"key":"retrieval.documents","value":{"array_value":{"values":[{"kvlist_value":{"values":[{"key":"document.content","value":{"string_value":".PHONY: clean package test\n\nclean:\n\trm -rf build dist src/*.egg-info src/pai/llm_trace/__pycache__ src/pai/llm_trace/__pycache__\n\npackage: clean\n\tpip install build && python -m build\n\nrun:\n\tbash tools/run_test.sh"}},{"key":"document.id","value":{"string_value":"c3bd0f7e-2-42f7-964c-2fcc698657db"}},{"key":"document.metadata","value":{"string_value":"{\"file_path\": \"/mnt/disk1t/weidan.kong/work/llm_trace/client/makefile\", \"file_name\": \"makefile\", \"file_size\": 213, \"creation_date\": \"2024-05-10\", \"last_modified_date\": \"2024-05-10\"}"}},{"key":"document.score","value":{"double_value":0.592955599044689}},{"key":"vector_index","value":{"int_value":"0"}}]}}]}}}],"status":{"code":"STATUS_CODE_OK"}},{"trace_id":"444bae4e77fbab21687bd7e41c","span_id":"f2fc6f6acbc7","parent_span_id":"f08a7a6b85b","name":"embedding","kind":"SPAN_KIND_INTERNAL","start_time_unix_nano":"1718910212441468561","end_time_unix_nano":"1718910212617063167","attributes":[{"key":"embedding.embeddings","value":{"array_value":{"values":[{"kvlist_value":{"values":[{"key":"embedding.text","value":{"string_value":"Question 1: what is pai-llm-trace."}},{"key":"embedding.vector","value":{"string_value":"PAI LLM Trace system hid 1536 actual 
items"}},{"key":"vector_index","value":{"int_value":"0"}}]}}]}}},{"key":"embedding.model_name","value":{"string_value":"text-embedding-ada-002"}},{"key":"openinference.span.kind","value":{"string_value":"EMBEDDING"}},{"key":"pop.request.id","value":{"string_value":"56B9CCC3-CC84-0ED-53F60DC4A318"}}],"status":{"code":"STATUS_CODE_OK"}},{"trace_id":"ae25eae77fbab21687bd7e41c","span_id":"f2ebc6f12b27","parent_span_id":"e0a710fbd3c6","name":"synthesize","kind":"SPAN_KIND_INTERNAL","start_time_unix_nano":"1718910213465461860","end_time_unix_nano":"1718910215131655555","attributes":[{"key":"input.value","value":{"string_value":"Question 1: what is pai-llm-trace."}},{"key":"openinference.span.kind","value":{"string_value":"CHAIN"}},{"key":"output.value","value":{"string_value":"The term \"pai-llm-trace\" refers to a component or module within the project or codebase being worked on."}},{"key":"pop.request.id","value":{"string_value":"67D34C0D-1CD5-11B7-BEE2-05649"}}],"status":{"code":"STATUS_CODE_OK"}},{"trace_id":"21687bd7e41c","span_id":"d8f3fc7c","parent_span_id":"f2ebc12b27","name":"chunking","kind":"SPAN_KIND_INTERNAL","start_time_unix_nano":"1718910213467525240","end_time_unix_nano":"1718910213467894216","attributes":[{"key":"openinference.span.kind","value":{"string_value":"CHAIN"}},{"key":"pop.request.id","value":{"string_value":"8E1625-B8D1-EA177F9FC69D"}}],"status":{"code":"STATUS_CODE_OK"}},{"trace_id":"5eae77fbab7bd7e41c","span_id":"0ab8a7c75","parent_span_id":"f2ebcf12b27","name":"chunking","kind":"SPAN_KIND_INTERNAL","start_time_unix_nano":"1718910213733013448","end_time_unix_nano":"1718910213733446902","attributes":[{"key":"openinference.span.kind","value":{"string_value":"CHAIN"}},{"key":"pop.request.id","value":{"string_value":"14D0D5-1675-BCA7-AF320E26C1A4"}}],"status":{"code":"STATUS_CODE_OK"}},{"trace_id":"25eae77fbab87bd7e41c","span_id":"2cb6c51c5fb2","parent_span_id":"f2ebcf12b27","name":"llm","kind":"SPAN_KIND_INTERNAL","start_time_unix_nano":"171891
0214008467118","end_time_unix_nano":"1718910214849631714","attributes":[{"key":"llm.completions","value":{"array_value":{"values":[{"kvlist_value":{"values":[{"key":"message.content","value":{"string_value":"The term \"pai-llm-trace\" refersto a component or module within the project or codebase being worked on."}},{"key":"message.role","value":{"string_value":"assistant"}},{"key":"vector_index","value":{"int_value":"0"}}]}}]}}},{"key":"llm.invocation_parameters","value":{"string_value":"{\"temperature\": 0.1, \"model\": \"gpt-3.5-turbo\"}"}},{"key":"llm.model_name","value":{"string_value":"gpt-3.5-turbo"}},{"key":"llm.prompt_template.template","value":{"string_value":"system: You are an expert Q&A system that is trusted around the world.\nAlways answer the query using the provided context information, and not prior knowledge.\nSome rules to follow:\n1. Never directly reference thegiven context in your answer.\n2. Avoid statements like 'Based on the context, ...' or 'The context information ...' 
or anything along those lines.\nuser: Context information is below.\n---------------------\n{context_str}\n---------------------\nGiven the context information and not prior knowledge, answer the query.\nQuery: {query_str}\nAnswer: \nassistant: "}},{"key":"llm.prompt_template.variables","value":{"string_value":"{\"context_str\": \"file_path: /mnt/disk1t/weidan.kong/work/llm_trace/client/makefile\\n\\n.PHONY: clean package test\\n\\nclean:\\n\\trm -rf build dist src/*.egg-info src/pai/llm_trace/__pycache__ src/pai/llm_trace/__pycache__\\n\\npackage: clean\\n\\tpip install build && python -m build\\n\\nrun:\\n\\tbash tools/run_test.sh\", \"query_str\": \"Question 1: what is pai-llm-trace.\"}"}},{"key":"llm.prompts","value":{"array_value":{"values":[{"kvlist_value":{"values":[{"key":"message.content","value":{"string_value":"You are an expert Q&A system that is trusted around the world.\nAlways answer the query using the provided context information, and not prior knowledge.\nSome rules to follow:\n1. Never directly reference the given context in your answer.\n2. Avoid statements like 'Based on the context, ...' or 'The context information ...' 
or anything along those lines."}},{"key":"message.role","value":{"string_value":"system"}},{"key":"vector_index","value":{"int_value":"0"}}]}},{"kvlist_value":{"values":[{"key":"message.content","value":{"string_value":"Context information is below.\n---------------------\nfile_path: /mnt/disk1t/weidan.kong/work/llm_trace/client/makefile\n\n.PHONY: clean package test\n\nclean:\n\trm -rf build dist src/*.egg-info src/pai/llm_trace/__pycache__ src/pai/llm_trace/__pycache__\n\npackage: clean\n\tpip install build && python -m build\n\nrun:\n\tbash tools/run_test.sh\n---------------------\nGiven the context information and not prior knowledge, answer the query.\nQuery: Question 1:what is pai-llm-trace.\nAnswer: "}},{"key":"message.role","value":{"string_value":"user"}},{"key":"vector_index","value":{"int_value":"1"}}]}}]}}},{"key":"llm.token_count.completion","value":{"string_value":"26"}},{"key":"llm.token_count.prompt","value":{"string_value":"210"}},{"key":"llm.token_count.total","value":{"string_value":"236"}},{"key":"openinference.span.kind","value":{"string_value":"LLM"}},{"key":"output.value","value":{"string_value":"The term \"pai-llm-trace\" refers to a component or module within the project or codebase being worked on."}},{"key":"pop.request.id","value":{"string_value":"AA68F16D-A8B7-1BB0-1BD1E47B5"}}],"status":{"code":"STATUS_CODE_OK"}}]}]}]} open telemetry incompatible: [{"aliyunUid": "1773939154", "hostname": "qlc-ark2", "resources": {"deployment": {"environment": "cn-hangzhou"}, "service": {"app": {"name": "llm_trace_llamaindex_test"}, "name": "llm_trace_llamaindex_test_template", "owner": {"id": "177393915", "sub_id": "2305894433"}, "version": "0.0.1"}, "telemetry": {"sdk": {"language": "python", "name": "opentelemetry", "version": "1.25.0"}}}, "traceId": "444bae77fbab7bd7e41c", "spanId": "e0a0fbd3c6", "parentSpanId": "0", "kind": "SPAN_KIND_INTERNAL", "spanName": "query", "links": [], "events": [], "traceState": "", "startTime": 1718910212440262841, 
"endTime": 1718910215403326721, "duration": 2963063880, "attributes": {"input": {"value": "Question 1: what is pai-llm-trace."}, "openinference": {"span": {"kind": "CHAIN"}},"output": {"value": "The term \\"pai-llm-trace\\" refers to a component or module within the project or codebase being worked on."}, "pop": {"request": {"id": "1C70E14E-06F7D4F3"}}}, "statusCode": "STATUS_CODE_OK", "statusMessage": ""}, {"aliyunUid": "773939154", "hostname": "qlc-ark2", "resources": {"deployment": {"environment": "cn-hangzhou"}, "service": {"app": {"name": "llm_trace_llamaindex_test"}, "name": "llm_trace_llamaindex_test_template", "owner": {"id": "1773939", "sub_id": "2305819539"}, "version": "0.0.1"}, "telemetry": {"sdk": {"language": "python", "name": "opentelemetry", "version": "1.25.0"}}}, "traceId": "baeeae77fbab21687bd7e41c", "spanId": "ff08a7a6b85b", "parentSpanId": "e0a0fbd3c6", "kind": "SPAN_KIND_INTERNAL", "spanName": "retrieve", "links": [], "events": [], "traceState": "","startTime": 1718910212440884059, "endTime": 1718910213181800657, "duration": 740916598, "attributes": {"input": {"value": "Question 1: what is pai-llm-trace."}, "openinference": {"span": {"kind": "RETRIEVER"}}, "pop": {"request": {"id": "E8A1F8D9-D35A-1F9F2E6F612"}}, "retrieval": {"documents": [{"document": {"content": ".PHONY: clean package test\\n\\nclean:\\n\\trm -rf build dist src/*.egg-info src/pai/llm_trace/__pycache__ src/pai/llm_trace/__pycache__\\n\\npackage: clean\\n\\tpip install build && python -m build\\n\\nrun:\\n\\tbash tools/run_test.sh", "id": "c3bd0f7e-7--2fcc8657db", "metadata": "{\\"file_path\\": \\"/mnt/disk1t/weidan.kong/work/llm_trace/client/makefile\\", \\"file_name\\": \\"makefile\\", \\"file_size\\": 213, \\"creation_date\\": \\"2024-05-10\\", \\"last_modified_date\\": \\"2024-05-10\\"}", "score": 0.592955599044689}}]}}, "statusCode": "STATUS_CODE_OK", "statusMessage": ""}, {"aliyunUid": "1773939154062345", "hostname": "qlc-ark2", "resources": {"deployment": 
{"environment": "cn-hangzhou"}, "service": {"app": {"name": "llm_trace_llamaindex_test"}, "name": "llm_trace_llamaindex_test_template", "owner": {"id": "1793915", "sub_id": "58944336"}, "version": "0.0.1"}, "telemetry": {"sdk": {"language": "python", "name": "opentelemetry", "version": "1.25.0"}}}, "traceId": "444baebd7e41c", "spanId": "fc6f6acbc7", "parentSpanId": "f08a7a6b85b", "kind": "SPAN_KIND_INTERNAL", "spanName": "embedding", "links": [],"events": [], "traceState": "", "startTime": 1718910212441468561, "endTime": 1718910212617063167, "duration": 175594606, "attributes": {"embedding": {"embeddings": [{"embedding": {"text": "Question 1: what is pai-llm-trace.", "vector": "PAI LLM Trace system hid 1536 actual items"}}], "model_name": "text-embedding-ada-002"}, "openinference": {"span": {"kind": "EMBEDDING"}}, "pop": {"request": {"id": "56B9CCC3-CC84-80ED-53F60DC4A"}}}, "statusCode": "STATUS_CODE_OK", "statusMessage": ""}, {"aliyunUid": "1773939", "hostname": "qlc-ark2", "resources": {"deployment": {"environment": "cn-hangzhou"}, "service": {"app": {"name": "llm_trace_llamaindex_test"}, "name": "llm_trace_llamaindex_test_template", "owner": {"id": "17739391540", "sub_id": "23058944336"}, "version": "0.0.1"}, "telemetry": {"sdk": {"language": "python", "name": "opentelemetry", "version": "1.25.0"}}}, "traceId": "eae77fbab21687bd7e41c", "spanId": "f2ebcf12b27", "parentSpanId": "e0afbd3c6", "kind": "SPAN_KIND_INTERNAL", "spanName": "synthesize", "links": [], "events": [], "traceState": "", "startTime": 1718910213465461860, "endTime": 1718910215131655555, "duration": 1666193695, "attributes": {"input": {"value": "Question 1: what is pai-llm-trace."}, "openinference": {"span": {"kind": "CHAIN"}}, "output": {"value": "The term \\"pai-llm-trace\\" refers to a component or module within the project or codebase being worked on."}, "pop": {"request": {"id": "67D34C0D-1CD5-11B7-BEE2-0F90DCC"}}}, "statusCode": "STATUS_CODE_OK", "statusMessage": ""}, {"aliyunUid": 
"1773939", "hostname": "qlc-ark2", "resources": {"deployment": {"environment": "cn-hangzhou"}, "service": {"app": {"name": "llm_trace_llamaindex_test"}, "name": "llm_trace_llamaindex_test_template", "owner": {"id": "177393915", "sub_id": "23058944336"}, "version": "0.0.1"}, "telemetry": {"sdk": {"language": "python", "name": "opentelemetry", "version": "1.25.0"}}}, "traceId": "ae0ae77fbab87bd7e41c", "spanId": "d8f3fc6d47c", "parentSpanId": "f2ebcf12b27", "kind": "SPAN_KIND_INTERNAL", "spanName": "chunking", "links": [], "events": [], "traceState": "", "startTime": 1718910213467525240, "endTime": 1718910213467894216, "duration": 368976, "attributes": {"openinference": {"span": {"kind": "CHAIN"}}, "pop": {"request": {"id": "8EA31C-5-B8D1-EA177F9FC69D"}}}, "statusCode": "STATUS_CODE_OK", "statusMessage": ""}, {"aliyunUid": "1773939", "hostname": "qlc-ark2", "resources": {"deployment": {"environment": "cn-hangzhou"}, "service": {"app": {"name":"llm_trace_llamaindex_test"}, "name": "llm_trace_llamaindex_test_template", "owner": {"id": "17739391", "sub_id": "23058944"}, "version": "0.0.1"}, "telemetry": {"sdk": {"language": "python", "name":"opentelemetry", "version": "1.25.0"}}}, "traceId": "444babd7e41c", "spanId": "0aba7c75", "parentSpanId": "f2ebc6f12b27", "kind": "SPAN_KIND_INTERNAL", "spanName": "chunking", "links": [], "events": [], "traceState": "", "startTime": 1718910213733013448, "endTime": 1718910213733446902, "duration": 433454, "attributes": {"openinference": {"span": {"kind": "CHAIN"}}, "pop": {"request": {"id": "14D0D75-BCA7-AFE26C1A4"}}}, "statusCode": "STATUS_CODE_OK", "statusMessage": ""}, {"aliyunUid": "1773939", "hostname": "qlc-ark2", "resources": {"deployment": {"environment": "cn-hangzhou"}, "service": {"app": {"name": "llm_trace_llamaindex_test"}, "name": "llm_trace_llamaindex_test_template", "owner": {"id": "177393915", "sub_id": "2305894433"}, "version": "0.0.1"}, "telemetry": {"sdk": {"language": "python", "name": "opentelemetry", "version": 
"1.25.0"}}}, "traceId": "ae025eae77fbab7bd7e41c", "spanId": "2cb6c51c5fb2", "parentSpanId": "f2ebc6f12b27", "kind": "SPAN_KIND_INTERNAL", "spanName": "llm","links": [], "events": [], "traceState": "", "startTime": 1718910214008467118, "endTime": 1718910214849631714, "duration": 841164596, "attributes": {"llm": {"completions": [{"message": {"content": "The term \\"pai-llm-trace\\" refers to a component or module within the project or codebase being worked on.", "role": "assistant"}}], "invocation_parameters": "{\\"temperature\\": 0.1, \\"model\\": \\"gpt-3.5-turbo\\"}", "model_name": "gpt-3.5-turbo", "prompt_template": {"template": "system: You are an expert Q&A system that is trusted around the world.\\nAlways answer the query using the provided context information, and not prior knowledge.\\nSome rules to follow:\\n1. Never directly reference the given context in your answer.\\n2. Avoid statements like \'Based on the context, ...\' or \'The context information ...\' or anything along those lines.\\nuser: Contextinformation is below.\\n---------------------\\n{context_str}\\n---------------------\\nGiven the context information and not prior knowledge, answer the query.\\nQuery: {query_str}\\nAnswer: \\nassistant: ", "variables": "{\\"context_str\\": \\"file_path: /mnt/disk1t/weidan.kong/work/llm_trace/client/makefile\\\\n\\\\n.PHONY: clean package test\\\\n\\\\nclean:\\\\n\\\\trm -rf build dist src/*.egg-info src/pai/llm_trace/__pycache__src/pai/llm_trace/__pycache__\\\\n\\\\npackage: clean\\\\n\\\\tpip install build && python -m build\\\\n\\\\nrun:\\\\n\\\\tbash tools/run_test.sh\\", \\"query_str\\": \\"Question 1: what is pai-llm-trace.\\"}"}, "prompts": [{"message": {"content": "You are an expert Q&A system that is trusted around the world.\\nAlways answer the query using the provided context information, and not prior knowledge.\\nSome rules to follow:\\n1. Never directly reference the given context in your answer.\\n2. 
Avoid statements like \'Based on the context, ...\' or \'The context information ...\' or anything along those lines.", "role": "system"}}, {"message": {"content": "Context information is below.\\n---------------------\\nfile_path: /mnt/disk1t/weidan.kong/work/llm_trace/client/makefile\\n\\n.PHONY: clean package test\\n\\nclean:\\n\\trm -rf build dist src/*.egg-info src/pai/llm_trace/__pycache__ src/pai/llm_trace/__pycache__\\n\\npackage: clean\\n\\tpip install build && python -m build\\n\\nrun:\\n\\tbash tools/run_test.sh\\n---------------------\\nGiven the context information and notprior knowledge, answer the query.\\nQuery: Question 1: what is pai-llm-trace.\\nAnswer: ", "role": "user"}}], "token_count": {"completion": "26", "prompt": "210", "total": "236"}}, "openinference": {"span": {"kind": "LLM"}}, "output": {"value": "The term \\"pai-llm-trace\\" refers to a component or module within the project or codebase being worked on."}, "pop": {"request": {"id": "AA68F16D-A8B7-1BB0-1BD1E47B5"}}}, "statusCode": "STATUS_CODE_OK", "statusMessage": ""}]
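Since each element of Traces is itself a JSON string, clients must decode twice: once for the response body and once per trace element. A minimal sketch, using a made-up trace string:

```python
import json

def parse_traces(response):
    """Decode each Traces element (a JSON string) into Python objects.

    With OpentelemetryCompatible=False each element decodes to a
    list of span dicts, as in the non-compatible example above.
    """
    return [json.loads(t) for t in response.get("Traces", [])]

resp = {"Traces": ['[{"traceId": "abc123", "spanName": "query", "parentSpanId": "0"}]']}
spans_per_trace = parse_traces(resp)
```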

Examples

Sample success response

JSON format

{
  "TotalCount": 22,
  "RequestId": "6A87228C-969A-1381-98CF-AE07AE630FA5",
  "Code": "ExecutionFailure",
  "Message": "failed to get trace data",
  "Traces": [
    "open telemetry compatible:\n{\"resource_spans\":[{\"resource\":{\"attributes\":[{\"key\":\"telemetry.sdk.language\",\"value\":{\"string_value\":\"python\"}},{\"key\":\"telemetry.sdk.name\",\"value\":{\"string_value\":\"opentelemetry\"}},{\"key\":\"telemetry.sdk.version\",\"value\":{\"string_value\":\"1.25.0\"}},{\"key\":\"service.name\",\"value\":{\"string_value\":\"llm_trace_llamaindex_test_template\"}},{\"key\":\"service.version\",\"value\":{\"string_value\":\"0.0.1\"}},{\"key\":\"deployment.environment\",\"value\":{\"string_value\":\"cn-hangzhou\"}},{\"key\":\"service.app.name\",\"value\":{\"string_value\":\"llm_trace_llamaindex_test\"}},{\"key\":\"service.owner.id\",\"value\":{\"string_value\":\"177393915\"}},{\"key\":\"service.owner.sub_id\",\"value\":{\"string_value\":\"230589443368\"}}]},\"scope_spans\":[{\"spans\":[{\"trace_id\":\"ae40025eae77fbab21687bd7e41c\",\"span_id\":\"e09710fbd3c6\",\"parent_span_id\":\"0\",\"name\":\"query\",\"kind\":\"SPAN_KIND_INTERNAL\",\"start_time_unix_nano\":\"1718910212440262841\",\"end_time_unix_nano\":\"1718910215403326721\",\"attributes\":[{\"key\":\"input.value\",\"value\":{\"string_value\":\"Question 1: what is pai-llm-trace.\"}},{\"key\":\"openinference.span.kind\",\"value\":{\"string_value\":\"CHAIN\"}},{\"key\":\"output.value\",\"value\":{\"string_value\":\"The term \\\"pai-llm-trace\\\" refers to a component or module within the project or codebase being worked on.\"}},{\"key\":\"pop.request.id\",\"value\":{\"string_value\":\"1C714E-057D-1216-835A-06F722E4F3\"}}],\"status\":{\"code\":\"STATUS_CODE_OK\"}},{\"trace_id\":\"444bae400277fbab21687bd7e41c\",\"span_id\":\"5686fa7a6b85b\",\"parent_span_id\":\"e0a5920fbd3c6\",\"name\":\"retrieve\",\"kind\":\"SPAN_KIND_INTERNAL\",\"start_time_unix_nano\":\"1718910212440884059\",\"end_time_unix_nano\":\"1718910213181800657\",\"attributes\":[{\"key\":\"input.value\",\"value\":{\"string_value\":\"Question 1: what is 
pai-llm-trace.\"}},{\"key\":\"openinference.span.kind\",\"value\":{\"string_value\":\"RETRIEVER\"}},{\"key\":\"pop.request.id\",\"value\":{\"string_value\":\"E8A1F8D9-D35A-1F9F-B724-5DA12E6F612\"}},{\"key\":\"retrieval.documents\",\"value\":{\"array_value\":{\"values\":[{\"kvlist_value\":{\"values\":[{\"key\":\"document.content\",\"value\":{\"string_value\":\".PHONY: clean package test\\n\\nclean:\\n\\trm -rf build dist src/*.egg-info src/pai/llm_trace/__pycache__ src/pai/llm_trace/__pycache__\\n\\npackage: clean\\n\\tpip install build && python -m build\\n\\nrun:\\n\\tbash tools/run_test.sh\"}},{\"key\":\"document.id\",\"value\":{\"string_value\":\"c3bd0f7e-2-42f7-964c-2fcc698657db\"}},{\"key\":\"document.metadata\",\"value\":{\"string_value\":\"{\\\"file_path\\\": \\\"/mnt/disk1t/weidan.kong/work/llm_trace/client/makefile\\\", \\\"file_name\\\": \\\"makefile\\\", \\\"file_size\\\": 213, \\\"creation_date\\\": \\\"2024-05-10\\\", \\\"last_modified_date\\\": \\\"2024-05-10\\\"}\"}},{\"key\":\"document.score\",\"value\":{\"double_value\":0.592955599044689}},{\"key\":\"vector_index\",\"value\":{\"int_value\":\"0\"}}]}}]}}}],\"status\":{\"code\":\"STATUS_CODE_OK\"}},{\"trace_id\":\"444bae4e77fbab21687bd7e41c\",\"span_id\":\"f2fc6f6acbc7\",\"parent_span_id\":\"f08a7a6b85b\",\"name\":\"embedding\",\"kind\":\"SPAN_KIND_INTERNAL\",\"start_time_unix_nano\":\"1718910212441468561\",\"end_time_unix_nano\":\"1718910212617063167\",\"attributes\":[{\"key\":\"embedding.embeddings\",\"value\":{\"array_value\":{\"values\":[{\"kvlist_value\":{\"values\":[{\"key\":\"embedding.text\",\"value\":{\"string_value\":\"Question 1: what is pai-llm-trace.\"}},{\"key\":\"embedding.vector\",\"value\":{\"string_value\":\"PAI LLM Trace system hid 1536 actual 
items\"}},{\"key\":\"vector_index\",\"value\":{\"int_value\":\"0\"}}]}}]}}},{\"key\":\"embedding.model_name\",\"value\":{\"string_value\":\"text-embedding-ada-002\"}},{\"key\":\"openinference.span.kind\",\"value\":{\"string_value\":\"EMBEDDING\"}},{\"key\":\"pop.request.id\",\"value\":{\"string_value\":\"56B9CCC3-CC84-0ED-53F60DC4A318\"}}],\"status\":{\"code\":\"STATUS_CODE_OK\"}},{\"trace_id\":\"ae25eae77fbab21687bd7e41c\",\"span_id\":\"f2ebc6f12b27\",\"parent_span_id\":\"e0a710fbd3c6\",\"name\":\"synthesize\",\"kind\":\"SPAN_KIND_INTERNAL\",\"start_time_unix_nano\":\"1718910213465461860\",\"end_time_unix_nano\":\"1718910215131655555\",\"attributes\":[{\"key\":\"input.value\",\"value\":{\"string_value\":\"Question 1: what is pai-llm-trace.\"}},{\"key\":\"openinference.span.kind\",\"value\":{\"string_value\":\"CHAIN\"}},{\"key\":\"output.value\",\"value\":{\"string_value\":\"The term \\\"pai-llm-trace\\\" refers to a component or module within the project or codebase being worked on.\"}},{\"key\":\"pop.request.id\",\"value\":{\"string_value\":\"67D34C0D-1CD5-11B7-BEE2-05649\"}}],\"status\":{\"code\":\"STATUS_CODE_OK\"}},{\"trace_id\":\"21687bd7e41c\",\"span_id\":\"d8f3fc7c\",\"parent_span_id\":\"f2ebc12b27\",\"name\":\"chunking\",\"kind\":\"SPAN_KIND_INTERNAL\",\"start_time_unix_nano\":\"1718910213467525240\",\"end_time_unix_nano\":\"1718910213467894216\",\"attributes\":[{\"key\":\"openinference.span.kind\",\"value\":{\"string_value\":\"CHAIN\"}},{\"key\":\"pop.request.id\",\"value\":{\"string_value\":\"8E1625-B8D1-EA177F9FC69D\"}}],\"status\":{\"code\":\"STATUS_CODE_OK\"}},{\"trace_id\":\"5eae77fbab7bd7e41c\",\"span_id\":\"0ab8a7c75\",\"parent_span_id\":\"f2ebcf12b27\",\"name\":\"chunking\",\"kind\":\"SPAN_KIND_INTERNAL\",\"start_time_unix_nano\":\"1718910213733013448\",\"end_time_unix_nano\":\"1718910213733446902\",\"attributes\":[{\"key\":\"openinference.span.kind\",\"value\":{\"string_value\":\"CHAIN\"}},{\"key\":\"pop.request.id\",\"value\":{\"string_value\":\"
14D0D5-1675-BCA7-AF320E26C1A4\"}}],\"status\":{\"code\":\"STATUS_CODE_OK\"}},{\"trace_id\":\"25eae77fbab87bd7e41c\",\"span_id\":\"2cb6c51c5fb2\",\"parent_span_id\":\"f2ebcf12b27\",\"name\":\"llm\",\"kind\":\"SPAN_KIND_INTERNAL\",\"start_time_unix_nano\":\"1718910214008467118\",\"end_time_unix_nano\":\"1718910214849631714\",\"attributes\":[{\"key\":\"llm.completions\",\"value\":{\"array_value\":{\"values\":[{\"kvlist_value\":{\"values\":[{\"key\":\"message.content\",\"value\":{\"string_value\":\"The term \\\"pai-llm-trace\\\" refersto a component or module within the project or codebase being worked on.\"}},{\"key\":\"message.role\",\"value\":{\"string_value\":\"assistant\"}},{\"key\":\"vector_index\",\"value\":{\"int_value\":\"0\"}}]}}]}}},{\"key\":\"llm.invocation_parameters\",\"value\":{\"string_value\":\"{\\\"temperature\\\": 0.1, \\\"model\\\": \\\"gpt-3.5-turbo\\\"}\"}},{\"key\":\"llm.model_name\",\"value\":{\"string_value\":\"gpt-3.5-turbo\"}},{\"key\":\"llm.prompt_template.template\",\"value\":{\"string_value\":\"system: You are an expert Q&A system that is trusted around the world.\\nAlways answer the query using the provided context information, and not prior knowledge.\\nSome rules to follow:\\n1. Never directly reference thegiven context in your answer.\\n2. Avoid statements like 'Based on the context, ...' or 'The context information ...' 
or anything along those lines.\\nuser: Context information is below.\\n---------------------\\n{context_str}\\n---------------------\\nGiven the context information and not prior knowledge, answer the query.\\nQuery: {query_str}\\nAnswer: \\nassistant: \"}},{\"key\":\"llm.prompt_template.variables\",\"value\":{\"string_value\":\"{\\\"context_str\\\": \\\"file_path: /mnt/disk1t/weidan.kong/work/llm_trace/client/makefile\\\\n\\\\n.PHONY: clean package test\\\\n\\\\nclean:\\\\n\\\\trm -rf build dist src/*.egg-info src/pai/llm_trace/__pycache__ src/pai/llm_trace/__pycache__\\\\n\\\\npackage: clean\\\\n\\\\tpip install build && python -m build\\\\n\\\\nrun:\\\\n\\\\tbash tools/run_test.sh\\\", \\\"query_str\\\": \\\"Question 1: what is pai-llm-trace.\\\"}\"}},{\"key\":\"llm.prompts\",\"value\":{\"array_value\":{\"values\":[{\"kvlist_value\":{\"values\":[{\"key\":\"message.content\",\"value\":{\"string_value\":\"You are an expert Q&A system that is trusted around the world.\\nAlways answer the query using the provided context information, and not prior knowledge.\\nSome rules to follow:\\n1. Never directly reference the given context in your answer.\\n2. Avoid statements like 'Based on the context, ...' or 'The context information ...' 
or anything along those lines.\"}},{\"key\":\"message.role\",\"value\":{\"string_value\":\"system\"}},{\"key\":\"vector_index\",\"value\":{\"int_value\":\"0\"}}]}},{\"kvlist_value\":{\"values\":[{\"key\":\"message.content\",\"value\":{\"string_value\":\"Context information is below.\\n---------------------\\nfile_path: /mnt/disk1t/weidan.kong/work/llm_trace/client/makefile\\n\\n.PHONY: clean package test\\n\\nclean:\\n\\trm -rf build dist src/*.egg-info src/pai/llm_trace/__pycache__ src/pai/llm_trace/__pycache__\\n\\npackage: clean\\n\\tpip install build && python -m build\\n\\nrun:\\n\\tbash tools/run_test.sh\\n---------------------\\nGiven the context information and not prior knowledge, answer the query.\\nQuery: Question 1:what is pai-llm-trace.\\nAnswer: \"}},{\"key\":\"message.role\",\"value\":{\"string_value\":\"user\"}},{\"key\":\"vector_index\",\"value\":{\"int_value\":\"1\"}}]}}]}}},{\"key\":\"llm.token_count.completion\",\"value\":{\"string_value\":\"26\"}},{\"key\":\"llm.token_count.prompt\",\"value\":{\"string_value\":\"210\"}},{\"key\":\"llm.token_count.total\",\"value\":{\"string_value\":\"236\"}},{\"key\":\"openinference.span.kind\",\"value\":{\"string_value\":\"LLM\"}},{\"key\":\"output.value\",\"value\":{\"string_value\":\"The term \\\"pai-llm-trace\\\" refers to a component or module within the project or codebase being worked on.\"}},{\"key\":\"pop.request.id\",\"value\":{\"string_value\":\"AA68F16D-A8B7-1BB0-1BD1E47B5\"}}],\"status\":{\"code\":\"STATUS_CODE_OK\"}}]}]}]}\n\nopen telemetry incompatible:\n[{\"aliyunUid\": \"1773939154\", \"hostname\": \"qlc-ark2\", \"resources\": {\"deployment\": {\"environment\": \"cn-hangzhou\"}, \"service\": {\"app\": {\"name\": \"llm_trace_llamaindex_test\"}, \"name\": \"llm_trace_llamaindex_test_template\", \"owner\": {\"id\": \"177393915\", \"sub_id\": \"2305894433\"}, \"version\": \"0.0.1\"}, \"telemetry\": {\"sdk\": {\"language\": \"python\", \"name\": \"opentelemetry\", \"version\": \"1.25.0\"}}}, 
\"traceId\": \"444bae77fbab7bd7e41c\", \"spanId\": \"e0a0fbd3c6\", \"parentSpanId\": \"0\", \"kind\": \"SPAN_KIND_INTERNAL\", \"spanName\": \"query\", \"links\": [], \"events\": [], \"traceState\": \"\", \"startTime\": 1718910212440262841, \"endTime\": 1718910215403326721, \"duration\": 2963063880, \"attributes\": {\"input\": {\"value\": \"Question 1: what is pai-llm-trace.\"}, \"openinference\": {\"span\": {\"kind\": \"CHAIN\"}},\"output\": {\"value\": \"The term \\\\\"pai-llm-trace\\\\\" refers to a component or module within the project or codebase being worked on.\"}, \"pop\": {\"request\": {\"id\": \"1C70E14E-06F7D4F3\"}}}, \"statusCode\": \"STATUS_CODE_OK\", \"statusMessage\": \"\"}, {\"aliyunUid\": \"773939154\", \"hostname\": \"qlc-ark2\", \"resources\": {\"deployment\": {\"environment\": \"cn-hangzhou\"}, \"service\": {\"app\": {\"name\": \"llm_trace_llamaindex_test\"}, \"name\": \"llm_trace_llamaindex_test_template\", \"owner\": {\"id\": \"1773939\", \"sub_id\": \"2305819539\"}, \"version\": \"0.0.1\"}, \"telemetry\": {\"sdk\": {\"language\": \"python\", \"name\": \"opentelemetry\", \"version\": \"1.25.0\"}}}, \"traceId\": \"baeeae77fbab21687bd7e41c\", \"spanId\": \"ff08a7a6b85b\", \"parentSpanId\": \"e0a0fbd3c6\", \"kind\": \"SPAN_KIND_INTERNAL\", \"spanName\": \"retrieve\", \"links\": [], \"events\": [], \"traceState\": \"\",\"startTime\": 1718910212440884059, \"endTime\": 1718910213181800657, \"duration\": 740916598, \"attributes\": {\"input\": {\"value\": \"Question 1: what is pai-llm-trace.\"}, \"openinference\": {\"span\": {\"kind\": \"RETRIEVER\"}}, \"pop\": {\"request\": {\"id\": \"E8A1F8D9-D35A-1F9F2E6F612\"}}, \"retrieval\": {\"documents\": [{\"document\": {\"content\": \".PHONY: clean package test\\\\n\\\\nclean:\\\\n\\\\trm -rf build dist src/*.egg-info src/pai/llm_trace/__pycache__ src/pai/llm_trace/__pycache__\\\\n\\\\npackage: clean\\\\n\\\\tpip install build && python -m build\\\\n\\\\nrun:\\\\n\\\\tbash tools/run_test.sh\", \"id\": 
\"c3bd0f7e-7--2fcc8657db\", \"metadata\": \"{\\\\\"file_path\\\\\": \\\\\"/mnt/disk1t/weidan.kong/work/llm_trace/client/makefile\\\\\", \\\\\"file_name\\\\\": \\\\\"makefile\\\\\", \\\\\"file_size\\\\\": 213, \\\\\"creation_date\\\\\": \\\\\"2024-05-10\\\\\", \\\\\"last_modified_date\\\\\": \\\\\"2024-05-10\\\\\"}\", \"score\": 0.592955599044689}}]}}, \"statusCode\": \"STATUS_CODE_OK\", \"statusMessage\": \"\"}, {\"aliyunUid\": \"1773939154062345\", \"hostname\": \"qlc-ark2\", \"resources\": {\"deployment\": {\"environment\": \"cn-hangzhou\"}, \"service\": {\"app\": {\"name\": \"llm_trace_llamaindex_test\"}, \"name\": \"llm_trace_llamaindex_test_template\", \"owner\": {\"id\": \"1793915\", \"sub_id\": \"58944336\"}, \"version\": \"0.0.1\"}, \"telemetry\": {\"sdk\": {\"language\": \"python\", \"name\": \"opentelemetry\", \"version\": \"1.25.0\"}}}, \"traceId\": \"444baebd7e41c\", \"spanId\": \"fc6f6acbc7\", \"parentSpanId\": \"f08a7a6b85b\", \"kind\": \"SPAN_KIND_INTERNAL\", \"spanName\": \"embedding\", \"links\": [],\"events\": [], \"traceState\": \"\", \"startTime\": 1718910212441468561, \"endTime\": 1718910212617063167, \"duration\": 175594606, \"attributes\": {\"embedding\": {\"embeddings\": [{\"embedding\": {\"text\": \"Question 1: what is pai-llm-trace.\", \"vector\": \"PAI LLM Trace system hid 1536 actual items\"}}], \"model_name\": \"text-embedding-ada-002\"}, \"openinference\": {\"span\": {\"kind\": \"EMBEDDING\"}}, \"pop\": {\"request\": {\"id\": \"56B9CCC3-CC84-80ED-53F60DC4A\"}}}, \"statusCode\": \"STATUS_CODE_OK\", \"statusMessage\": \"\"}, {\"aliyunUid\": \"1773939\", \"hostname\": \"qlc-ark2\", \"resources\": {\"deployment\": {\"environment\": \"cn-hangzhou\"}, \"service\": {\"app\": {\"name\": \"llm_trace_llamaindex_test\"}, \"name\": \"llm_trace_llamaindex_test_template\", \"owner\": {\"id\": \"17739391540\", \"sub_id\": \"23058944336\"}, \"version\": \"0.0.1\"}, \"telemetry\": {\"sdk\": {\"language\": \"python\", \"name\": \"opentelemetry\", 
\"version\": \"1.25.0\"}}}, \"traceId\": \"eae77fbab21687bd7e41c\", \"spanId\": \"f2ebcf12b27\", \"parentSpanId\": \"e0afbd3c6\", \"kind\": \"SPAN_KIND_INTERNAL\", \"spanName\": \"synthesize\", \"links\": [], \"events\": [], \"traceState\": \"\", \"startTime\": 1718910213465461860, \"endTime\": 1718910215131655555, \"duration\": 1666193695, \"attributes\": {\"input\": {\"value\": \"Question 1: what is pai-llm-trace.\"}, \"openinference\": {\"span\": {\"kind\": \"CHAIN\"}}, \"output\": {\"value\": \"The term \\\\\"pai-llm-trace\\\\\" refers to a component or module within the project or codebase being worked on.\"}, \"pop\": {\"request\": {\"id\": \"67D34C0D-1CD5-11B7-BEE2-0F90DCC\"}}}, \"statusCode\": \"STATUS_CODE_OK\", \"statusMessage\": \"\"}, {\"aliyunUid\": \"1773939\", \"hostname\": \"qlc-ark2\", \"resources\": {\"deployment\": {\"environment\": \"cn-hangzhou\"}, \"service\": {\"app\": {\"name\": \"llm_trace_llamaindex_test\"}, \"name\": \"llm_trace_llamaindex_test_template\", \"owner\": {\"id\": \"177393915\", \"sub_id\": \"23058944336\"}, \"version\": \"0.0.1\"}, \"telemetry\": {\"sdk\": {\"language\": \"python\", \"name\": \"opentelemetry\", \"version\": \"1.25.0\"}}}, \"traceId\": \"ae0ae77fbab87bd7e41c\", \"spanId\": \"d8f3fc6d47c\", \"parentSpanId\": \"f2ebcf12b27\", \"kind\": \"SPAN_KIND_INTERNAL\", \"spanName\": \"chunking\", \"links\": [], \"events\": [], \"traceState\": \"\", \"startTime\": 1718910213467525240, \"endTime\": 1718910213467894216, \"duration\": 368976, \"attributes\": {\"openinference\": {\"span\": {\"kind\": \"CHAIN\"}}, \"pop\": {\"request\": {\"id\": \"8EA31C-5-B8D1-EA177F9FC69D\"}}}, \"statusCode\": \"STATUS_CODE_OK\", \"statusMessage\": \"\"}, {\"aliyunUid\": \"1773939\", \"hostname\": \"qlc-ark2\", \"resources\": {\"deployment\": {\"environment\": \"cn-hangzhou\"}, \"service\": {\"app\": {\"name\":\"llm_trace_llamaindex_test\"}, \"name\": \"llm_trace_llamaindex_test_template\", \"owner\": {\"id\": \"17739391\", \"sub_id\": 
\"23058944\"}, \"version\": \"0.0.1\"}, \"telemetry\": {\"sdk\": {\"language\": \"python\", \"name\":\"opentelemetry\", \"version\": \"1.25.0\"}}}, \"traceId\": \"444babd7e41c\", \"spanId\": \"0aba7c75\", \"parentSpanId\": \"f2ebc6f12b27\", \"kind\": \"SPAN_KIND_INTERNAL\", \"spanName\": \"chunking\", \"links\": [], \"events\": [], \"traceState\": \"\", \"startTime\": 1718910213733013448, \"endTime\": 1718910213733446902, \"duration\": 433454, \"attributes\": {\"openinference\": {\"span\": {\"kind\": \"CHAIN\"}}, \"pop\": {\"request\": {\"id\": \"14D0D75-BCA7-AFE26C1A4\"}}}, \"statusCode\": \"STATUS_CODE_OK\", \"statusMessage\": \"\"}, {\"aliyunUid\": \"1773939\", \"hostname\": \"qlc-ark2\", \"resources\": {\"deployment\": {\"environment\": \"cn-hangzhou\"}, \"service\": {\"app\": {\"name\": \"llm_trace_llamaindex_test\"}, \"name\": \"llm_trace_llamaindex_test_template\", \"owner\": {\"id\": \"177393915\", \"sub_id\": \"2305894433\"}, \"version\": \"0.0.1\"}, \"telemetry\": {\"sdk\": {\"language\": \"python\", \"name\": \"opentelemetry\", \"version\": \"1.25.0\"}}}, \"traceId\": \"ae025eae77fbab7bd7e41c\", \"spanId\": \"2cb6c51c5fb2\", \"parentSpanId\": \"f2ebc6f12b27\", \"kind\": \"SPAN_KIND_INTERNAL\", \"spanName\": \"llm\",\"links\": [], \"events\": [], \"traceState\": \"\", \"startTime\": 1718910214008467118, \"endTime\": 1718910214849631714, \"duration\": 841164596, \"attributes\": {\"llm\": {\"completions\": [{\"message\": {\"content\": \"The term \\\\\"pai-llm-trace\\\\\" refers to a component or module within the project or codebase being worked on.\", \"role\": \"assistant\"}}], \"invocation_parameters\": \"{\\\\\"temperature\\\\\": 0.1, \\\\\"model\\\\\": \\\\\"gpt-3.5-turbo\\\\\"}\", \"model_name\": \"gpt-3.5-turbo\", \"prompt_template\": {\"template\": \"system: You are an expert Q&A system that is trusted around the world.\\\\nAlways answer the query using the provided context information, and not prior knowledge.\\\\nSome rules to follow:\\\\n1. 
Never directly reference the given context in your answer.\\\\n2. Avoid statements like \\'Based on the context, ...\\' or \\'The context information ...\\' or anything along those lines.\\\\nuser: Contextinformation is below.\\\\n---------------------\\\\n{context_str}\\\\n---------------------\\\\nGiven the context information and not prior knowledge, answer the query.\\\\nQuery: {query_str}\\\\nAnswer: \\\\nassistant: \", \"variables\": \"{\\\\\"context_str\\\\\": \\\\\"file_path: /mnt/disk1t/weidan.kong/work/llm_trace/client/makefile\\\\\\\\n\\\\\\\\n.PHONY: clean package test\\\\\\\\n\\\\\\\\nclean:\\\\\\\\n\\\\\\\\trm -rf build dist src/*.egg-info src/pai/llm_trace/__pycache__src/pai/llm_trace/__pycache__\\\\\\\\n\\\\\\\\npackage: clean\\\\\\\\n\\\\\\\\tpip install build && python -m build\\\\\\\\n\\\\\\\\nrun:\\\\\\\\n\\\\\\\\tbash tools/run_test.sh\\\\\", \\\\\"query_str\\\\\": \\\\\"Question 1: what is pai-llm-trace.\\\\\"}\"}, \"prompts\": [{\"message\": {\"content\": \"You are an expert Q&A system that is trusted around the world.\\\\nAlways answer the query using the provided context information, and not prior knowledge.\\\\nSome rules to follow:\\\\n1. Never directly reference the given context in your answer.\\\\n2. 
Avoid statements like \\'Based on the context, ...\\' or \\'The context information ...\\' or anything along those lines.\", \"role\": \"system\"}}, {\"message\": {\"content\": \"Context information is below.\\\\n---------------------\\\\nfile_path: /mnt/disk1t/weidan.kong/work/llm_trace/client/makefile\\\\n\\\\n.PHONY: clean package test\\\\n\\\\nclean:\\\\n\\\\trm -rf build dist src/*.egg-info src/pai/llm_trace/__pycache__ src/pai/llm_trace/__pycache__\\\\n\\\\npackage: clean\\\\n\\\\tpip install build && python -m build\\\\n\\\\nrun:\\\\n\\\\tbash tools/run_test.sh\\\\n---------------------\\\\nGiven the context information and notprior knowledge, answer the query.\\\\nQuery: Question 1: what is pai-llm-trace.\\\\nAnswer: \", \"role\": \"user\"}}], \"token_count\": {\"completion\": \"26\", \"prompt\": \"210\", \"total\": \"236\"}}, \"openinference\": {\"span\": {\"kind\": \"LLM\"}}, \"output\": {\"value\": \"The term \\\\\"pai-llm-trace\\\\\" refers to a component or module within the project or codebase being worked on.\"}, \"pop\": {\"request\": {\"id\": \"AA68F16D-A8B7-1BB0-1BD1E47B5\"}}}, \"statusCode\": \"STATUS_CODE_OK\", \"statusMessage\": \"\"}]"
  ]
}
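The flat-format spans in the example above carry `startTime`, `endTime`, and `duration` fields. A minimal client-side sketch shows how these might be turned into human-readable timing. Assumption not stated in the docs: the values appear to be nanosecond Unix-epoch timestamps, which the magnitudes in the example suggest (`endTime - startTime` equals `duration` for every span shown).

```python
# Hedged sketch: derive timing info from a span record in the flat
# (OpentelemetryCompatible=False) response format shown above.
# Assumption: startTime/endTime/duration are nanosecond Unix-epoch values,
# as the example magnitudes suggest; the docs do not state the unit.
from datetime import datetime, timezone

NS_PER_SECOND = 1_000_000_000

def duration_seconds(span: dict) -> float:
    """Span duration in seconds, derived from its start/end timestamps."""
    return (span["endTime"] - span["startTime"]) / NS_PER_SECOND

def start_datetime(span: dict) -> datetime:
    """Convert the nanosecond start timestamp to an aware UTC datetime."""
    return datetime.fromtimestamp(span["startTime"] / NS_PER_SECOND, tz=timezone.utc)

# Values copied from the "query" span in the example response above.
span = {
    "spanName": "query",
    "startTime": 1718910212440262841,
    "endTime": 1718910215403326721,
    "duration": 2963063880,
}

# The payload's own duration field is consistent with end - start.
assert span["endTime"] - span["startTime"] == span["duration"]
print(f"{span['spanName']}: {duration_seconds(span):.3f}s")  # query: 2.963s
```

This also explains how the `MinDuration`/`MaxDuration` request parameters (in seconds) relate to the nanosecond-scale `duration` values returned in the payload.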

Error codes

Visit the Error Center to view more error codes.