Use a third-party OpenAI provider
It seems that the latest Wing Pro version still doesn't let me use a third-party OpenAI provider, say, this one.
No, indeed, we haven't had time to work on this or to add support for any other AI provider yet. Do you happen to know if there's a way to trick the official OpenAI API into using the provider you want? If so, it might not be hard to modify the files in plugin/ai in your Wing installation. If you send us those mods, we can try to get them into our code base in some form.
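For what it's worth, here is a minimal sketch of that kind of redirect, assuming the code in plugin/ai goes through the official openai Python package (v1+), which accepts an alternate endpoint via base_url; the endpoint, key, and model name below are placeholders, not Wing's actual configuration:

```python
# Sketch only: point the official openai client at an OpenAI-compatible
# third-party endpoint instead of api.openai.com.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example-provider.com/v1",  # hypothetical third-party endpoint
    api_key="YOUR_PROVIDER_KEY",                     # placeholder key
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # whichever model name your provider exposes
    messages=[{"role": "user", "content": "Hello from Wing"}],
)
print(response.choices[0].message.content)
```

The same package also honors the OPENAI_BASE_URL environment variable, so depending on how plugin/ai constructs its client, setting that variable alone might be enough to test a third-party provider.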
When you look at adding other AI providers, could you consider starting with Ollama? It's trivial to install, offers both a simple OpenAI-compatible REST interface and a command line, and, perhaps most importantly, allows the LLM to run on the user's machine. I can see many people preferring to run their own LLM so that their code isn't transferred to a third party. If you use the REST interface and let the user change the machine and port it connects to, then a company could even set up Ollama on one machine and share it with all its programmers with practically no effort.
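To illustrate, here is a rough sketch of talking to Ollama through its OpenAI-compatible /v1 endpoint with the official openai package; the host, port, and model name are exactly the pieces a user (or a company) would want to configure:

```python
# Sketch: query a local or shared Ollama server through its
# OpenAI-compatible REST endpoint (Ollama's default port is 11434).
from openai import OpenAI

OLLAMA_HOST = "localhost"   # could be a shared machine on the LAN
OLLAMA_PORT = 11434         # Ollama's default port

client = OpenAI(
    base_url=f"http://{OLLAMA_HOST}:{OLLAMA_PORT}/v1",
    api_key="ollama",  # Ollama ignores the key, but the client requires a value
)

response = client.chat.completions.create(
    model="llama3",  # any model already pulled with `ollama pull`
    messages=[{"role": "user", "content": "Explain Python list comprehensions"}],
)
print(response.choices[0].message.content)
```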
Yes, that's at the top of our list for other AI providers, for the reasons you highlight. Thanks!