Unified management and routing for llama.cpp, MLX, and vLLM models, with a web dashboard.
Topics: llama-cpp, llamacpp, llama-server, llm, llm-inference, llm-router, localllama, localllm, mlx, mlx-lm, openai-api, self-hosted, vllm
Updated 2025-12-21 22:32:33 +00:00
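The entry above describes a router that fronts llama.cpp, MLX, and vLLM backends behind an OpenAI-compatible API. A minimal client sketch follows, assuming the router listens on localhost and exposes a /v1/chat/completions endpoint; the base URL, port, and model id are illustrative assumptions, not values taken from the project.

    # Hypothetical example: sending a chat completion request to an
    # OpenAI-compatible self-hosted router. URL, port, and model id are
    # assumptions for illustration; check the project's dashboard/docs
    # for the actual values.
    import json
    import urllib.request

    BASE_URL = "http://localhost:8080/v1/chat/completions"  # assumed router address

    payload = {
        "model": "llama-3.1-8b-instruct",  # assumed model id registered with the router
        "messages": [
            {"role": "user", "content": "Hello from a self-hosted router!"}
        ],
    }

    req = urllib.request.Request(
        BASE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

    # Parse the JSON response and print the first completion's text.
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
        print(body["choices"][0]["message"]["content"])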
Small Kubernetes cluster managed by Flux
Updated 2025-12-21 18:52:32 +00:00
Yet another note-taking app
Updated 2025-12-15 20:36:41 +00:00