Unified management and routing for llama.cpp, MLX, and vLLM models, with a web dashboard.
Topics: llama-cpp, llamacpp, llama-server, llm, llm-inference, llm-router, localllama, localllm, mlx, mlx-lm, openai-api, self-hosted, vllm
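The `openai-api` and `llm-router` tags suggest the router exposes an OpenAI-compatible endpoint and dispatches requests to the matching backend by model name. A minimal sketch of the standard chat-completions payload such a router would accept — the endpoint URL and model name here are hypothetical placeholders, not taken from this project:

```python
import json

def chat_completion_request(model: str, prompt: str) -> dict:
    # Standard OpenAI-style chat-completions payload; an OpenAI-compatible
    # router is expected to dispatch on the "model" field to the matching
    # llama.cpp, MLX, or vLLM backend.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

# Hypothetical endpoint; adjust host/port to your deployment.
URL = "http://localhost:8080/v1/chat/completions"

payload = chat_completion_request("llama-3.1-8b", "Hello!")
print(json.dumps(payload))
```

Because the request shape is the stock OpenAI format, any existing OpenAI client library should be able to target the router by overriding its base URL.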
Updated 2025-12-12 09:43:33 +00:00
Small Kubernetes cluster managed by Flux
Updated 2025-12-11 17:13:54 +00:00
Yet another note-taking app
Updated 2025-12-08 18:35:25 +00:00