Improve index.md

2025-08-31 15:50:45 +02:00
parent b51974bbf7
commit 92af14b350


@@ -1,16 +1,27 @@
 # Llamactl Documentation
 
-Welcome to the Llamactl documentation! Llamactl is a powerful management tool for llama-server instances that provides both a web interface and REST API for managing large language models.
+Welcome to the Llamactl documentation! **Management server and proxy for multiple llama.cpp instances with OpenAI-compatible API routing.**
+
+![Dashboard Screenshot](images/screenshot.png)
 
 ## What is Llamactl?
 
-Llamactl is designed to simplify the deployment and management of llama-server instances. It provides:
+Llamactl is designed to simplify the deployment and management of llama-server instances. It provides a modern solution for running multiple large language models with centralized management.
 
-- **Instance Management**: Start, stop, and monitor multiple llama-server instances
-- **Web UI**: User-friendly interface for managing your models
-- **REST API**: Programmatic access to all functionality
-- **Health Monitoring**: Real-time status and health checks
-- **Configuration Management**: Easy setup and configuration options
+## Why llamactl?
+
+🚀 **Multiple Model Serving**: Run different models simultaneously (7B for speed, 70B for quality)
+🔗 **OpenAI API Compatible**: Drop-in replacement - route requests by model name
+🌐 **Web Dashboard**: Modern React UI for visual management (unlike CLI-only tools)
+🔐 **API Key Authentication**: Separate keys for management vs inference access
+📊 **Instance Monitoring**: Health checks, auto-restart, log management
+**Smart Resource Management**: Idle timeout, LRU eviction, and configurable instance limits
+💡 **On-Demand Instance Start**: Automatically launch instances upon receiving OpenAI-compatible API requests
+💾 **State Persistence**: Ensure instances remain intact across server restarts
+
+**Choose llamactl if**: You need authentication, health monitoring, auto-restart, and centralized management of multiple llama-server instances
+**Choose Ollama if**: You want the simplest setup with strong community ecosystem and third-party integrations
+**Choose LM Studio if**: You prefer a polished desktop GUI experience with easy model management
 
 ## Key Features
 
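The "OpenAI API Compatible" and "On-Demand Instance Start" bullets above are easiest to see with a concrete request. Below is a minimal sketch using the standard `openai` Python client pointed at a llamactl server; the base URL, port, API key, and instance name are illustrative placeholders rather than values defined by llamactl, so adjust them to your own deployment.

```python
# Minimal sketch: calling llamactl's OpenAI-compatible endpoint with the
# standard `openai` client. All values below are placeholders (assumed host,
# port, key, and instance name) -- substitute your own configuration.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # assumed llamactl address and API prefix
    api_key="sk-inference-key",           # inference key (separate from the management key)
)

# The "model" field names the managed instance that should serve the request;
# with on-demand start enabled, llamactl can launch it if it is not running yet.
response = client.chat.completions.create(
    model="llama2-7b",  # hypothetical instance name
    messages=[{"role": "user", "content": "Hello from llamactl!"}],
)

print(response.choices[0].message.content)
```

Because routing is keyed on the model name, the same client code works against any OpenAI-compatible server; only `base_url` and `api_key` need to change.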
@@ -24,9 +35,13 @@ Llamactl is designed to simplify the deployment and management of llama-server i
 ## Quick Links
 
 - [Installation Guide](getting-started/installation.md) - Get Llamactl up and running
+- [Configuration Guide](getting-started/configuration.md) - Detailed configuration options
 - [Quick Start](getting-started/quick-start.md) - Your first steps with Llamactl
 - [Web UI Guide](user-guide/web-ui.md) - Learn to use the web interface
+- [Managing Instances](user-guide/managing-instances.md) - Instance lifecycle management
 - [API Reference](user-guide/api-reference.md) - Complete API documentation
+- [Monitoring](advanced/monitoring.md) - Health checks and monitoring
+- [Backends](advanced/backends.md) - Backend configuration options
 
 ## Getting Help
 
@@ -34,7 +49,8 @@ If you need help or have questions:
 - Check the [Troubleshooting](advanced/troubleshooting.md) guide
 - Visit the [GitHub repository](https://github.com/lordmathis/llamactl)
+- Review the [Configuration Guide](getting-started/configuration.md) for advanced settings
 
----
-
-Ready to get started? Head over to the [Installation Guide](getting-started/installation.md)!
+## License
+
+MIT License - see the [LICENSE](https://github.com/lordmathis/llamactl/blob/main/LICENSE) file.