Add create instance screenshot and update managing instances documentation

2025-09-03 23:23:55 +02:00
parent 5eada9b6ce
commit 8c121dd28c
3 changed files with 7 additions and 16 deletions

Binary image added: create instance screenshot (69 KiB).


@@ -2,13 +2,13 @@
Welcome to the Llamactl documentation! **Management server and proxy for multiple llama.cpp instances with OpenAI-compatible API routing.**
![Dashboard Screenshot](images/screenshot.png)
![Dashboard Screenshot](images/dashboard.png)
## What is Llamactl?
Llamactl is designed to simplify the deployment and management of llama-server instances. It provides a modern solution for running multiple large language models with centralized management.
## Why llamactl?
## Features
🚀 **Multiple Model Serving**: Run different models simultaneously (7B for speed, 70B for quality)
🔗 **OpenAI API Compatible**: Drop-in replacement - route requests by model name (see the sketch after this list)
@@ -19,19 +19,6 @@ Llamactl is designed to simplify the deployment and management of llama-server instances.
💡 **On-Demand Instance Start**: Automatically launch instances upon receiving OpenAI-compatible API requests
💾 **State Persistence**: Ensure instances remain intact across server restarts
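
As a concrete illustration of the OpenAI-compatible routing and on-demand start listed above, here is a minimal sketch of a chat completion request sent through llamactl. The port (8080), the `/v1/chat/completions` path, the instance name `my-model`, and the API key are illustrative assumptions, not details taken from this commit.

```python
# Minimal sketch: send an OpenAI-compatible chat completion through llamactl.
# Assumptions: llamactl listens on http://localhost:8080, exposes the standard
# /v1/chat/completions route, and "my-model" matches an instance name. The
# Authorization header is only needed if API authentication is enabled.
import requests

response = requests.post(
    "http://localhost:8080/v1/chat/completions",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={
        "model": "my-model",  # llamactl routes by this name; the instance may be started on demand
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    timeout=120,
)
print(response.json()["choices"][0]["message"]["content"])
```

Because the request shape matches the OpenAI API, existing OpenAI client libraries can generally be pointed at llamactl by changing only the base URL.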
**Choose llamactl if**: You need authentication, health monitoring, auto-restart, and centralized management of multiple llama-server instances
**Choose Ollama if**: You want the simplest setup with strong community ecosystem and third-party integrations
**Choose LM Studio if**: You prefer a polished desktop GUI experience with easy model management
## Key Features
- 🚀 **Easy Setup**: Quick installation and configuration
- 🌐 **Web Interface**: Intuitive web UI for model management
- 🔧 **REST API**: Full API access for automation
- 📊 **Monitoring**: Real-time health and status monitoring
- 🔒 **Security**: Authentication and access control
- 📱 **Responsive**: Works on desktop and mobile devices
## Quick Links
- [Installation Guide](getting-started/installation.md) - Get Llamactl up and running


@@ -9,6 +9,8 @@ Llamactl provides two ways to manage instances:
- **Web UI**: Accessible at `http://localhost:8080` with an intuitive dashboard
- **REST API**: Programmatic access for automation and integration (see the sketch below)
![Dashboard Screenshot](../images/dashboard.png)
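
For automation, the REST API mentioned above can be scripted directly; the sketch below lists instances over HTTP. The `/api/v1/instances` path and the Bearer-style API key header are assumptions for illustration and may differ from the routes in the actual API reference.

```python
# Minimal sketch: query llamactl's management REST API from a script.
# The endpoint path and auth header shape are assumptions, not confirmed here.
import requests

BASE_URL = "http://localhost:8080"
HEADERS = {"Authorization": "Bearer YOUR_MANAGEMENT_API_KEY"}  # omit if authentication is disabled

# List configured instances and print whatever the server returns for each
response = requests.get(f"{BASE_URL}/api/v1/instances", headers=HEADERS, timeout=10)
response.raise_for_status()
for instance in response.json():
    print(instance)
```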
### Authentication
If authentication is enabled:
@@ -33,7 +35,9 @@ Each instance is displayed as a card showing:
### Via Web UI
1. Click the **"Add Instance"** button on the dashboard
![Create Instance Screenshot](../images/create_instance.png)
1. Click the **"Create Instance"** button on the dashboard
2. Enter a unique **Name** for your instance (only required field)
3. Configure model source (choose one):
- **Model Path**: Full path to your downloaded GGUF model file