
Trouble setting up AI

I'm having trouble setting up AI. I've tried multiple models: gpt-4o-mini fails with "unsupported model" (see the error message below), and gpt-4o is even worse, failing with "The requested model 'gpt-4o' does not exist". gpt-4-turbo works, though. Is it not possible to use gpt-4o-mini or gpt-4o?

The console shows the following error message:

> Starting AI Chat "C:\Program Files\Wing Pro 10\bin\__os__\win32\runtime-python3.11\python.exe" "C:\Users\gemis\AppData\Roaming\Wing Pro 10\updates\from_10.0.2.0\10.0.5.0\plugins\ai\_openai_chat.py" WINGHOME=C:\Program Files\Wing Pro 10 WINGSETTINGSDIR=C:\Users\gemis\AppData\Roaming\Wing Pro 10 STARTED=127.0.0.1:57015
> 
> Could not create AI Assistant:
> Traceback (most recent call last):
>   File "C:\Users\gemis\AppData\Roaming\Wing Pro 10\updates\from_10.0.2.0\10.0.5.0\plugins\ai\_openai_chat.py", line 94, in create
>     self.assistant = client.beta.assistants.create(
>                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>   File "C:\Program Files\Wing Pro 10\bin\__os__\win32\runtime-pip-openai-py3.11\lib\site-packages\openai\resources\beta\assistants\assistants.py", line 95, in create
>     return self._post(
>            ^^^^^^^^^^^
>   File "C:\Program Files\Wing Pro 10\bin\__os__\win32\runtime-pip-openai-py3.11\lib\site-packages\openai\_base_client.py", line 1088, in post
>     return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
>                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>   File "C:\Program Files\Wing Pro 10\bin\__os__\win32\runtime-pip-openai-py3.11\lib\site-packages\openai\_base_client.py", line 853, in request
>     return self._request(
>            ^^^^^^^^^^^^^^
>   File "C:\Program Files\Wing Pro 10\bin\__os__\win32\runtime-pip-openai-py3.11\lib\site-packages\openai\_base_client.py", line 930, in _request
>     raise self._make_status_error_from_response(err.response) from None
> openai.BadRequestError: Error code: 400 - {'error': {'message': "The requested model 'gpt-4o-mini' cannot be used with the Assistants API in v1. Follow the migration guide to upgrade to v2: https://platform.openai.com/docs/assistants/migration.", 'type': 'invalid_request_error', 'param': 'model', 'code': 'unsupported_model'}}
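For what it's worth, the key field in that 400 payload is `'code': 'unsupported_model'` together with the v2 migration hint in the message: the server is rejecting the model because the request was made against Assistants API v1. Here is a small sketch that parses the literal error dict from the log above to pick that case out; `needs_v2_migration` is a name of my own invention for illustration, not part of Wing or the openai SDK (my understanding is that newer openai SDK releases send the `OpenAI-Beta: assistants=v2` header automatically, while older ones pin v1):

```python
# The 400 payload copied verbatim from the traceback above.
error_payload = {
    'error': {
        'message': "The requested model 'gpt-4o-mini' cannot be used with the "
                   "Assistants API in v1. Follow the migration guide to upgrade "
                   "to v2: https://platform.openai.com/docs/assistants/migration.",
        'type': 'invalid_request_error',
        'param': 'model',
        'code': 'unsupported_model',
    }
}

def needs_v2_migration(payload: dict) -> bool:
    """Hypothetical helper: True when a 400 response is the Assistants
    v1-only 'unsupported_model' case that asks for a v2 upgrade."""
    err = payload.get('error', {})
    return err.get('code') == 'unsupported_model' and 'v2' in err.get('message', '')

print(needs_v2_migration(error_payload))  # → True
```

So the newer models aren't unavailable as such; the client library bundled with the IDE appears to be speaking Assistants v1, which gpt-4-turbo still tolerates but gpt-4o/gpt-4o-mini do not.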
