Resolving the OpenAI Agents SDK unsupported_country_region_territory Error

Published: 2025-03-20

1. Error Message

When using the OpenAI Agents SDK, the following error is reported:

Tracing client error 403: {"error":{"code":"unsupported_country_region_territory","message":"Country, region, or territory not supported","param":null,"type":"request_forbidden"}}

2. Cause

The Agents SDK enables tracing by default and uploads traces by calling the https://api.openai.com/v1/traces/ingest API; in unsupported countries or regions that request is rejected with a 403, as the debug log below shows.
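
For reference, the HTTP-level detail in the log below can be surfaced by turning on DEBUG logging (this is an assumption about how the output was captured; httpx/httpcore emit these connect/request/response events at DEBUG level):

import logging

# Assumption: DEBUG-level logging makes httpx/httpcore print the
# connect / TLS / request / response events seen in the log below.
logging.basicConfig(
    level=logging.DEBUG,
    format="%(asctime)s %(levelname)s [%(filename)s:%(lineno)d] %(message)s",
)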

2025-03-19 20:46:13 DEBUG [_trace.py:47] connect_tcp.started host='api.openai.com' port=443 local_address=None timeout=5.0 socket_options=None
2025-03-19 20:46:13 DEBUG [_trace.py:47] connect_tcp.complete return_value=<httpcore._backends.sync.SyncStream object at 0x7a35cfcf8220>
2025-03-19 20:46:13 DEBUG [_trace.py:47] start_tls.started ssl_context=<ssl.SSLContext object at 0x7a35cda61b40> server_hostname='api.openai.com' timeout=5.0
2025-03-19 20:46:13 DEBUG [_trace.py:47] start_tls.complete return_value=<httpcore._backends.sync.SyncStream object at 0x7a35cfcf81f0>
2025-03-19 20:46:13 DEBUG [_trace.py:47] send_request_headers.started request=<Request [b'POST']>
2025-03-19 20:46:13 DEBUG [_trace.py:47] send_request_headers.complete
2025-03-19 20:46:13 DEBUG [_trace.py:47] send_request_body.started request=<Request [b'POST']>
2025-03-19 20:46:13 DEBUG [_trace.py:47] send_request_body.complete
2025-03-19 20:46:13 DEBUG [_trace.py:47] receive_response_headers.started request=<Request [b'POST']>
2025-03-19 20:46:13 DEBUG [_trace.py:47] receive_response_headers.complete return_value=(b'HTTP/1.1', 403, b'Forbidden', [(b'Date', b'Wed, 19 Mar 2025 12:46:15 GMT'), (b'Content-Type', b'application/json'), (b'Transfer-Encoding', b'chunked'), (b'Connection', b'keep-alive'), (b'Vary', b'Accept-Encoding'), (b'Strict-Transport-Security', b'max-age=31536000; includeSubDomains; preload'), (b'X-Content-Type-Options', b'nosniff'), (b'Server', b'cloudflare'), (b'CF-RAY', b'922d1250ce9f332a-HKG'), (b'Content-Encoding', b'gzip'), (b'alt-svc', b'h3=":443"; ma=86400')])
2025-03-19 20:46:13 INFO [_client.py:1025] HTTP Request: POST https://api.openai.com/v1/traces/ingest "HTTP/1.1 403 Forbidden"
2025-03-19 20:46:13 DEBUG [_trace.py:47] receive_response_body.started request=<Request [b'POST']>
2025-03-19 20:46:13 DEBUG [_trace.py:47] receive_response_body.complete
2025-03-19 20:46:13 DEBUG [_trace.py:47] response_closed.started
2025-03-19 20:46:13 DEBUG [_trace.py:47] response_closed.complete
Tracing client error 403: {"error":{"code":"unsupported_country_region_territory","message":"Country, region, or territory not supported","param":null,"type":"request_forbidden"}}

3. Solutions

Method 1:

Set the environment variable OPENAI_AGENTS_DISABLE_TRACING=1.
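
For example (a minimal sketch, assuming the variable is set before the SDK is imported so it is already in effect when tracing initializes):

import os

# Assumption: set the flag before importing the agents SDK so tracing is
# disabled by the time the SDK initializes.
os.environ["OPENAI_AGENTS_DISABLE_TRACING"] = "1"

from agents import Agent, Runner  # imported after the flag is set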

Method 2:

Set agents.run.RunConfig.tracing_disabled to True:

    # RunConfig is imported from the agents package: from agents import RunConfig
    result = Runner.run_sync(
        agent,
        "Write a code example of recursion in programming.",
        run_config=RunConfig(tracing_disabled=True),
    )

Method 3:

Use LangSmith:

Install "langsmith[openai-agents]":

pip install "langsmith[openai-agents]"

Example code:

import os
import asyncio
from agents import Agent, Runner, set_trace_processors
from langsmith.wrappers import OpenAIAgentsTracingProcessor

os.environ["LANGSMITH_API_KEY"] = "lsv2_pt_******"
os.environ["LANGSMITH_ENDPOINT"] = "https://api.smith.langchain.com"

async def main():
    agent = Agent(
        name="Captain Obvious",
        instructions="You are Captain Obvious, the world's most literal technical support agent.",
    )

    question = "Why is my code failing when I try to divide by zero? I keep getting this error message."
    result = await Runner.run(agent, question)
    print(result.final_output)

if __name__ == "__main__":
    # Route traces to LangSmith instead of the default OpenAI exporter
    set_trace_processors([OpenAIAgentsTracingProcessor()])
    asyncio.run(main())

Traces can then be viewed on the LangSmith platform.


Method 4:

Building on Method 3, add Langfuse.

Without first switching to LangSmith's trace processor (Method 3), the SDK would still call the https://api.openai.com/v1/traces/ingest API to upload traces and hit the same 403.

Install dependencies:

pip install openai-agents
pip install nest_asyncio
pip install pydantic-ai[logfire]

Set the environment variables and Langfuse credentials:

import os
import base64
 
# Replace with your Langfuse keys.
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."  
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."  
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # or 'https://us.cloud.langfuse.com'
 
# Build Basic Auth header.
LANGFUSE_AUTH = base64.b64encode(
    f"{os.environ.get('LANGFUSE_PUBLIC_KEY')}:{os.environ.get('LANGFUSE_SECRET_KEY')}".encode()
).decode()
 
# Configure OpenTelemetry endpoint & headers
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = os.environ.get("LANGFUSE_HOST") + "/api/public/otel"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = f"Authorization=Basic {LANGFUSE_AUTH}"

Instrument the agent:

import nest_asyncio
nest_asyncio.apply()
import logfire
 
# Configure logfire instrumentation.
logfire.configure(
    service_name='my_agent_service',
    send_to_logfire=False,  # export only via OTLP (to Langfuse), not to the Logfire platform
)
# This method automatically patches the OpenAI Agents SDK to send logs via OTLP to Langfuse.
logfire.instrument_openai_agents()

Example:

import asyncio
from agents import Agent, Runner, set_trace_processors
from langsmith.wrappers import OpenAIAgentsTracingProcessor
 
async def main():
    agent = Agent(
        name="Assistant",
        instructions="You only respond in haikus.",
    )
 
    result = await Runner.run(agent, "Tell me about recursion in programming.")
    print(result.final_output)
 
if __name__ == "__main__":
    set_trace_processors([OpenAIAgentsTracingProcessor()])
    asyncio.run(main())

This way, traces are also uploaded to Langfuse.

