| Commit | Message | Date |
| --- | --- | --- |
| 38790aa507 | Support llama.cpp router mode for openai endpoints | 2025-12-21 23:32:33 +01:00 |
| 4f8f4b96cd | Fix docker_enabled inconsistency | 2025-11-14 23:41:16 +01:00 |
| 7544fbb1ce | Refactor JSON marshaling in Options to improve thread safety | 2025-11-14 21:50:58 +01:00 |
| 511889e56d | Implement per instance command override on backend | 2025-11-14 18:38:31 +01:00 |
| f9eb424690 | Fix concurrent map write issue in MarshalJSON by initializing BackendOptions | 2025-10-27 20:36:42 +01:00 |
| bd6436840e | Implement common ParseCommand interface | 2025-10-25 18:41:46 +02:00 |
| b25ad48605 | Refactor backend options marshaling/unmarshaling | 2025-10-19 20:48:05 +02:00 |
| d8e0da9cf8 | Refactor backend options to implement common interface and streamline validation | 2025-10-19 20:36:57 +02:00 |
| 55f671c354 | Refactor backend options handling and validation | 2025-10-19 17:41:08 +02:00 |
| 2a7010d0e1 | Flatten backends package structure | 2025-10-19 15:50:42 +02:00 |
| 4df02a6519 | Initial vLLM backend support | 2025-09-19 18:05:12 +02:00 |
| 988c4aca40 | Add MLX backend config options | 2025-09-16 21:14:19 +02:00 |
| d9542ba117 | Refactor instance management to support backend types and options | 2025-09-01 21:59:18 +02:00 |