get rid of openrouter proxying via llama-swap
All checks were successful
ci/woodpecker/push/flux-reconcile-source Pipeline was successful
@@ -17,13 +17,6 @@ macros:
   thinking_on: "--chat-template-kwargs '{\"enable_thinking\": true}'"
   thinking_off: "--chat-template-kwargs '{\"enable_thinking\": false}'"
 
-peers:
-  openrouter:
-    proxy: https://openrouter.ai/api
-    apiKey: ${env.OPENROUTER_API_KEY}
-    models:
-      - z-ai/glm-5
-
 hooks:
   on_startup:
     preload:
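For reference, a minimal sketch of how the affected section of the llama-swap config reads after this removal, reconstructed only from the hunk context above; the file name, exact indentation, and anything outside the hunk are assumptions:

macros:
  # context lines from the hunk: flags that toggle thinking mode via chat template kwargs
  thinking_on: "--chat-template-kwargs '{\"enable_thinking\": true}'"
  thinking_off: "--chat-template-kwargs '{\"enable_thinking\": false}'"

# the peers/openrouter proxy block (proxy URL, apiKey, z-ai/glm-5 model entry) previously sat here
hooks:
  on_startup:
    preload:
      # preload entries are cut off by the hunk boundary and are unchanged by this commit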