16 Commits

Author SHA1 Message Date
08c47a16a0 Fix operations tests 2025-10-27 18:35:16 +01:00
5ef0654cdd Use %w for error wrapping in log messages across multiple files 2025-10-27 17:54:39 +01:00
58c8899fd9 Update import path for API documentation 2025-10-26 14:08:48 +01:00
836e918fc5 Rename ProxyToInstance to InstanceProxy for clarity in routing 2025-10-26 10:22:37 +01:00
a7593e9a58 Split LlamaCppProxy handler 2025-10-26 10:21:40 +01:00
16b28bac05 Merge branch 'main' into feat/multi-host 2025-10-07 18:04:24 +02:00
Anuruth Lertpiya fa43f9e967 Added support for proxying llama.cpp native API endpoints via /llama-cpp/{name}/ 2025-10-05 14:28:33 +00:00
Anuruth Lertpiya 0e1bc8a352 Added support for configuring CORS headers 2025-10-04 09:13:40 +00:00
da56456504 Add node management endpoints to handle listing and retrieving node details 2025-10-02 22:51:41 +02:00
30e40ecd30 Refactor API endpoints to use /backends/llama-cpp path and update related documentation 2025-09-23 21:27:58 +02:00
4df02a6519 Initial vLLM backend support 2025-09-19 18:05:12 +02:00
154b754aff Add MLX command parsing and routing support 2025-09-16 21:39:08 +02:00
323056096c Implement llama-server command parsing and add UI components for command input 2025-09-15 21:04:14 +02:00
afef3d0180 Update import path for API documentation to use apidocs 2025-08-07 19:48:28 +02:00
e2b64620b5 Expose version endpoint 2025-08-07 19:10:06 +02:00
6a7a9a2d09 Split large package into subpackages 2025-08-04 19:23:56 +02:00
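Several of the commits above touch the proxy routing layer (a7593e9a58, 836e918fc5, fa43f9e967) and the error-wrapping cleanup (5ef0654cdd). As a general illustration only — none of this is taken from the repository's actual code, and the instance lookup, ports, and handler names are hypothetical placeholders — a name-prefixed reverse proxy route such as /llama-cpp/{name}/ with %w error wrapping is typically wired up in Go roughly like this:

```go
package main

import (
	"errors"
	"fmt"
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"strings"
)

// errInstanceNotFound is a sentinel error; callers can detect it with errors.Is.
var errInstanceNotFound = errors.New("instance not found")

// lookupInstance is a hypothetical placeholder: it maps an instance name taken
// from the URL to the upstream address its llama-server is listening on.
func lookupInstance(name string) (*url.URL, error) {
	upstreams := map[string]string{
		"default": "http://127.0.0.1:8081",
	}
	addr, ok := upstreams[name]
	if !ok {
		// %w wraps the sentinel so callers can unwrap or match it later.
		return nil, fmt.Errorf("lookup %q: %w", name, errInstanceNotFound)
	}
	return url.Parse(addr)
}

// llamaCppProxy handles paths of the form /llama-cpp/{name}/..., strips the
// prefix, and forwards the rest so the upstream sees its native API paths.
func llamaCppProxy(w http.ResponseWriter, r *http.Request) {
	rest := strings.TrimPrefix(r.URL.Path, "/llama-cpp/")
	name, _, _ := strings.Cut(rest, "/")
	target, err := lookupInstance(name)
	if err != nil {
		http.Error(w, err.Error(), http.StatusNotFound)
		return
	}
	proxy := httputil.NewSingleHostReverseProxy(target)
	http.StripPrefix("/llama-cpp/"+name, proxy).ServeHTTP(w, r)
}

func main() {
	http.HandleFunc("/llama-cpp/", llamaCppProxy)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```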