ad3b2229c2  get rid of openrouter proxying via llama-swap  (2026-04-04 02:39:26 +02:00)
e923fc3c30  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v199-vulkan-b8637  (2026-04-04 00:00:54 +00:00)
4e30c9b94d  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v199-vulkan-b8606  (2026-04-03 00:00:32 +00:00)
3d53b4b10b  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v199-vulkan-b8589  (2026-04-02 00:00:30 +00:00)
e485a4fc7f  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v199-vulkan-b8576  (2026-03-30 00:00:49 +00:00)
99bc04b76a  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v199-vulkan-b8562  (2026-03-29 00:00:50 +00:00)
cb53301926  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v199-vulkan-b8547  (2026-03-27 17:42:04 +00:00)
66cb3c9d82  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v199  (2026-03-27 00:00:28 +00:00)
9a1fe1f740  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v198-vulkan-b8508  (2026-03-26 00:00:49 +00:00)
8cf02fea0e  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v198-vulkan-b8496  (2026-03-25 00:00:29 +00:00)
1d85bf3a88  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v198-vulkan-b8477  (2026-03-24 00:00:39 +00:00)
bfede17c87  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v198-vulkan-b8468  (2026-03-23 00:00:21 +00:00)
471c0ba62d  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v198-vulkan-b8461  (2026-03-22 00:00:23 +00:00)
8717526358  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v198-vulkan-b8445  (2026-03-20 22:31:36 +00:00)
73d6d1f15a  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v198-vulkan-b8400  (2026-03-19 00:00:34 +00:00)
8d994e7aa1  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v198-vulkan-b8390  (2026-03-18 00:00:28 +00:00)
82864a4738  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v198-vulkan-b8369  (2026-03-17 00:00:58 +00:00)
afbcea4e82  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v198-vulkan-b8352  (2026-03-15 17:40:26 +00:00)
4b4cec10be  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v198  (2026-03-15 00:00:34 +00:00)
f219abb74f  chore(deps): update ghcr.io/mostlygeek/llama-swap docker tag to v197-vulkan-b8248  (2026-03-13 04:00:10 +01:00)
0130991c74  refactor: move llama-swap package config to renovate.json  (2026-03-13 04:00:10 +01:00)
966d2c50c0  update renovate comment for llama-swap image tag management  (2026-03-13 04:00:10 +01:00)
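The two renovate-related commits above cover having Renovate bump the llama-swap image tag via a marker comment. A minimal sketch of what such a custom regex manager in `renovate.json` could look like — the file path, comment format, and datasource are assumptions, not taken from this repository:

```json
{
  "customManagers": [
    {
      "customType": "regex",
      "fileMatch": ["^apps/llama-swap/.+\\.yaml$"],
      "matchStrings": [
        "# renovate: datasource=(?<datasource>\\S+) depName=(?<depName>\\S+)\\n\\s*image: .+:(?<currentValue>\\S+)"
      ]
    }
  ]
}
```

With a config like this, Renovate scans the matched YAML files for the marker comment and opens the `chore(deps)` PRs seen throughout this log whenever `ghcr.io/mostlygeek/llama-swap` publishes a new tag.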
e72a79be8f  add glm-5 from openrouter to llama-swap  (2026-03-13 04:00:10 +01:00)
88a73cbb41  set strategy to recreate on llama-swap deployment  (2026-03-13 04:00:10 +01:00)
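Switching the Deployment to the `Recreate` strategy means the old pod is stopped before the new one starts. That matters for a workload like llama-swap that typically claims an exclusive resource (a GPU, or a ReadWriteOnce models volume), where the default `RollingUpdate` would briefly run old and new pods side by side. A sketch of the relevant fields — the name and labels are assumptions:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: llama-swap   # assumed name
spec:
  replicas: 1
  strategy:
    type: Recreate   # terminate the old pod before creating the replacement
  selector:
    matchLabels:
      app: llama-swap
```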
8d7cf402fd  manually update llama-swap image tag  (2026-03-13 04:00:10 +01:00)
ec038d7154  fix models mount  (2026-03-13 04:00:10 +01:00)
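The "fix models mount" commit, together with the later moves of the models directory between SSD and LVM HDD storage, suggests the model files live on a node-local path mounted into the pod. A hypothetical excerpt of such a mount — the paths and names are illustrative only:

```yaml
# Hypothetical Deployment excerpt: exposing a host models directory to llama-swap.
spec:
  template:
    spec:
      containers:
        - name: llama-swap
          volumeMounts:
            - name: models
              mountPath: /models
      volumes:
        - name: models
          hostPath:
            path: /mnt/ssd/llama-models   # assumed path; the log shows it moved between ssd and lvm hdd
            type: Directory
```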
1ddef7951a  update llama-swap image  (2026-03-13 04:00:10 +01:00)
9f55d67ffa  migrate llama models to ssd  (2026-03-13 04:00:10 +01:00)
a3f30873f9  switch llama models dir to lvm hdd  (2026-03-13 04:00:09 +01:00)
e3325670de  fix cache location after llama-swap update  (2026-03-13 04:00:08 +01:00)
b9200d3a4c  update llama-swap  (2026-03-13 04:00:08 +01:00)
8063cbaf80  update llama-swap docker image  (2026-03-13 04:00:08 +01:00)
feaf805208  update llama-swap  (2026-03-13 04:00:07 +01:00)
6f3e612dde  move llama models to ssd  (2026-03-13 04:00:07 +01:00)
5813db75dc  gpu offload in llama.cpp  (2026-03-13 04:00:07 +01:00)
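GPU offload in llama.cpp is controlled by the `--n-gpu-layers` (`-ngl`) flag of `llama-server`, which in a llama-swap setup is passed through the `cmd` of a model entry in its config file. A sketch under that assumption — the model name and file paths are hypothetical:

```yaml
# Hypothetical llama-swap config.yaml entry; ${PORT} is substituted by llama-swap.
models:
  "example-model":
    cmd: >
      /app/llama-server
      --model /models/example-model-q4_k_m.gguf
      --n-gpu-layers 99
      --port ${PORT}
```

Setting `--n-gpu-layers` higher than the model's layer count simply offloads all layers, which is a common way to say "everything on the GPU" without counting layers per model.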
af6545444b  llama-swap  (2026-03-13 04:00:07 +01:00)