
asyncopenai failed #2116

Open · 1 task done
willy808 opened this issue Feb 13, 2025 · 1 comment
Labels
bug Something isn't working

Comments

@willy808
Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

async with PROCESS_VLM_CALLING_SEMAPHORE:
    client = openai.AsyncOpenAI(
        base_url=vlm_url,
        api_key="EMPTY",
    )

    chat_response = await client.chat.completions.create(
        model=model_name,
        messages=messages,
        temperature=0.0,
        top_p=0.1,
        frequency_penalty=0.2,
        n=1,
        stream=True,
        max_tokens=1024,
    )

    async for chunk in chat_response:
        ...

Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/openai/_base_client.py", line 1589, in _request
response = await self._client.send(
File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1629, in send
response = await self._send_handling_auth(
File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1657, in _send_handling_auth
response = await self._send_handling_redirects(
File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1694, in _send_handling_redirects
response = await self._send_single_request(request)
File "/usr/local/lib/python3.10/site-packages/httpx/_client.py", line 1730, in _send_single_request
response = await transport.handle_async_request(request)
File "/usr/local/lib/python3.10/site-packages/httpx/_transports/default.py", line 394, in handle_async_request
resp = await self._pool.handle_async_request(req)
File "/usr/local/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 256, in handle_async_request
raise exc from None
File "/usr/local/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 236, in handle_async_request
response = await connection.handle_async_request(
File "/usr/local/lib/python3.10/site-packages/httpcore/_async/connection.py", line 101, in handle_async_request
raise exc
File "/usr/local/lib/python3.10/site-packages/httpcore/_async/connection.py", line 78, in handle_async_request
stream = await self._connect(request)
File "/usr/local/lib/python3.10/site-packages/httpcore/_async/connection.py", line 124, in _connect
stream = await self._network_backend.connect_tcp(**kwargs)
File "/usr/local/lib/python3.10/site-packages/httpcore/_backends/auto.py", line 31, in connect_tcp
return await self._backend.connect_tcp(
File "/usr/local/lib/python3.10/site-packages/httpcore/_backends/anyio.py", line 115, in connect_tcp
stream: anyio.abc.ByteStream = await anyio.connect_tcp(
File "/usr/local/lib/python3.10/site-packages/anyio/_core/_sockets.py", line 227, in connect_tcp
async with create_task_group() as tg:
RuntimeError: Task got bad yield: True

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/app/src/app.py", line 2106, in doc_analysis
File "/app/src/app.py", line 1450, in call_vlm
File "/usr/local/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 1727, in create
return await self._post(
File "/usr/local/lib/python3.10/site-packages/openai/_base_client.py", line 1856, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
File "/usr/local/lib/python3.10/site-packages/openai/_base_client.py", line 1550, in request
return await self._request(
File "/usr/local/lib/python3.10/site-packages/openai/_base_client.py", line 1613, in _request
return await self._retry_request(
File "/usr/local/lib/python3.10/site-packages/openai/_base_client.py", line 1683, in _retry_request
return await self._request(
File "/usr/local/lib/python3.10/site-packages/openai/_base_client.py", line 1613, in _request
return await self._retry_request(
File "/usr/local/lib/python3.10/site-packages/openai/_base_client.py", line 1683, in _retry_request
return await self._request(
File "/usr/local/lib/python3.10/site-packages/openai/_base_client.py", line 1623, in _request
raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.

To Reproduce

Run the code shown under "Describe the bug" above; the same traceback is produced.

Code snippets

OS

Ubuntu 22.04

Python version

Python 3.10

Library version

openai v1.61.0

@willy808 willy808 added the bug Something isn't working label Feb 13, 2025
@Programmer-RD-AI
Hi,
I think this issue is caused by a network error. You can check https://help.openai.com/en/articles/6897191-apiconnectionerror for troubleshooting steps.
Hope this helps!
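If the failures are transient (the traceback shows the client already retried internally twice before raising APIConnectionError), an application-level backoff wrapper around the call can help. A minimal sketch, not tied to the reporter's setup; the `ConnectionError` stand-in below takes the place of `openai.APIConnectionError` purely for illustration, and the `call_with_retries` helper and `flaky` function are hypothetical names:

```python
import asyncio


async def call_with_retries(fn, *, retries=3, base_delay=1.0,
                            retryable=(ConnectionError,)):
    """Await an async callable, retrying on transient connection
    errors with exponential backoff between attempts."""
    for attempt in range(retries):
        try:
            return await fn()
        except retryable:
            if attempt == retries - 1:
                raise  # out of attempts: surface the original error
            await asyncio.sleep(base_delay * 2 ** attempt)


# Demo with a stand-in for openai.APIConnectionError: the first
# two attempts fail, the third succeeds.
calls = {"n": 0}


async def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("connection error")
    return "ok"


result = asyncio.run(call_with_retries(flaky, base_delay=0.01))
print(result, calls["n"])  # ok 3
```

In a real deployment you would pass `retryable=(openai.APIConnectionError,)` and wrap the `client.chat.completions.create(...)` call; note the SDK also exposes a `max_retries` setting on the client itself.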
