Use a third-party OpenAI provider.

It seems that the latest Wing Pro version still doesn't let me use a third-party OpenAI-compatible provider, say, this one.

hongyi-zhao asked 2024-04-09 04:40:56 -0500
Wingware Support updated 2024-04-11 09:59:42 -0500

1 Answer

No, indeed, we haven't had time to work on this or to add support for any other AI provider yet. Do you happen to know if there's a way to trick the official OpenAI API into using the provider you want? If so, it might not be hard to modify the files in plugin/ai in your Wing installation. If you send us those mods we can try to get them into our code base in some form.
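
For reference, the official openai Python package (v1 and later) accepts a custom base_url, so redirecting it to an OpenAI-compatible provider might be as simple as the following sketch. The endpoint URL, API key, and model name here are placeholders, not tested values:

    # Sketch: point the official OpenAI Python client (openai>=1.0) at a
    # third-party OpenAI-compatible endpoint. The base_url and model below
    # are placeholders -- substitute your provider's actual values.
    from openai import OpenAI

    client = OpenAI(
        base_url="https://example-provider.com/v1",  # hypothetical endpoint
        api_key="YOUR_PROVIDER_KEY",                 # hypothetical key
    )

    response = client.chat.completions.create(
        model="provider-model-name",  # placeholder model id
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(response.choices[0].message.content)

The v1 client also reads the OPENAI_BASE_URL and OPENAI_API_KEY environment variables, so if the plugin uses that client unmodified, setting those variables might redirect it without any code changes at all.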

Wingware Support answered 2024-04-11 09:59:31 -0500

Comments

hongyi-zhao (2024-04-11 21:19:16 -0500)

When you look at adding other AI providers, could you consider starting with Ollama? It's trivial to install, offers both a simple OpenAI-compatible REST interface and a command line, and, perhaps most importantly, allows the LLM to run on the user's machine. I can see many people preferring to run their own LLM so that their code isn't being transferred to a third party. If you use the REST interface and allow the user to change the machine and port it uses (see the sketch below), then a company could even set up Ollama on one machine and share it with all programmers with practically no effort.

Ian_S (2024-05-07 09:32:19 -0500)
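
As an illustration of that setup, here's a minimal sketch using only the Python standard library, assuming Ollama's documented OpenAI-compatible endpoint (/v1/chat/completions on the default port 11434); the host, port, and model name are placeholders to be pointed at wherever Ollama actually runs:

    # Sketch: call an Ollama server's OpenAI-compatible REST endpoint using
    # only the Python standard library. Host, port, and model are
    # placeholders -- point them at wherever Ollama is running.
    import json
    import urllib.request

    HOST = "localhost"   # could be a shared company server instead
    PORT = 11434         # Ollama's default port
    MODEL = "llama3"     # any model previously pulled with `ollama pull`

    payload = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": "Write hello-world in Python."}],
        "stream": False,
    }).encode("utf-8")

    req = urllib.request.Request(
        f"http://{HOST}:{PORT}/v1/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
    print(reply["choices"][0]["message"]["content"])

Because only HOST and PORT change, the same code works against a model running locally or against one shared Ollama instance for a whole team.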

Yes, that's at the top of our list for other AI providers, for the reasons you highlight. Thanks!

Wingware Support (2024-05-07 15:45:12 -0500)

Any updates on this? It would be a game-changer :)

Joril (2025-03-09 13:50:41 -0500)

We did look at doing this, but the results obtained from locally run models were so bad that we paused work on it until they improve. Have you had decent results from any of the locally run models? If so, which one?

Wingware Support (2025-03-09 16:14:48 -0500)