Hey HExLEsEO,
Indeed, getting the various language model WebUIs out there to work properly can be quite a pain. When I originally built AutoComplete, which feels like not that long ago, following the instructions on the oobabooga WebUI repository worked fine with GPT-4chan.
Nowadays, it should still work with Wurst 7.42 and oobabooga's new OpenAI-compatible API, but I don't know for sure. The settings would be as follows:
| Setting | Value | Notes |
| --- | --- | --- |
| Custom model | `gpt-4chan` | |
| Custom model type | Legacy | |
| OpenAI chat endpoint | `http://127.0.0.1:5000/v1/chat/completions` | Replace 5000 with your WebUI port if it's different. |
| OpenAI legacy endpoint | `http://127.0.0.1:5000/v1/completions` | Replace 5000 with your WebUI port if it's different. |
| Environment variable `WURST_OPENAI_KEY` | `sk-123` | Can't be blank, but the value is otherwise ignored. |
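If you want to check that the WebUI's API is reachable before troubleshooting the mod itself, a request like the sketch below should work against the legacy endpoint from the table. This is just an illustration, not part of AutoComplete: the payload fields are the standard OpenAI legacy-completion parameters, and the port, model name, and prompt are assumptions you'd adjust to your setup.

```python
import json
import urllib.request

# Endpoint and key mirror the settings in the table above; the local
# WebUI ignores the key's value, but the header must not be empty.
ENDPOINT = "http://127.0.0.1:5000/v1/completions"
API_KEY = "sk-123"

# Standard OpenAI legacy-completion payload; "model" matches the
# "Custom model" setting. Prompt and max_tokens are arbitrary here.
payload = {
    "model": "gpt-4chan",
    "prompt": "Hello, ",
    "max_tokens": 16,
}

request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)

# Uncomment once the WebUI is running to see the raw completion:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response)["choices"][0]["text"])
```

If that request returns a completion but the mod still fails, the problem is on the Minecraft side; if it fails, the WebUI's OpenAI extension probably isn't enabled or is on a different port.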
Regarding gpt4all, it doesn't seem to support the OpenAI API format, and its own API differs from the old oobabooga API as well, so I'm afraid that one just won't work. I know this sucks, but there are so many different WebUIs and hosting providers out there, each with its own competing API format, that I just can't support all of them.
I hope that makes sense.