Update README.md for data persistence features

README.md
@@ -11,6 +11,7 @@ A control server for managing multiple Llama Server instances with a web-based d
- **Auto-restart**: Configurable automatic restart on instance failure
- **Instance Monitoring**: Real-time health checks and status monitoring
- **Log Management**: View, search, and download instance logs
- **Data Persistence**: Persistent storage of instance state
- **REST API**: Full API for programmatic control
- **OpenAI Compatible**: Route requests to instances by instance name
- **Configuration Management**: Comprehensive llama-server parameter support
@@ -79,7 +80,6 @@ go build -o llamactl ./cmd/server

## Configuration

llamactl can be configured via configuration files or environment variables. Configuration is loaded in the following order of precedence:

1. Hardcoded defaults
@@ -89,19 +89,23 @@ llamactl can be configured via configuration files or environment variables. Con

### Configuration Files

#### Configuration File Locations

Configuration files are searched in the following locations (in order of precedence):

**Linux/macOS:**
- `./llamactl.yaml` or `./config.yaml` (current directory)
- `~/.config/llamactl/config.yaml`
- `$HOME/.config/llamactl/config.yaml`
- `/etc/llamactl/config.yaml`

**Windows:**
- `./llamactl.yaml` or `./config.yaml` (current directory)
- `%APPDATA%\llamactl\config.yaml`
- `%USERPROFILE%\llamactl\config.yaml`
- `%PROGRAMDATA%\llamactl\config.yaml`

You can specify the path to the config file with the `LLAMACTL_CONFIG_PATH` environment variable.
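
As a minimal sketch of that override (the path below is only an illustration, and running the `llamactl` binary with no arguments is an assumption based on the build step above):

```bash
# Point llamactl at a config file outside the default search paths.
# /opt/llamactl/config.yaml is an illustrative path, not a documented default.
export LLAMACTL_CONFIG_PATH=/opt/llamactl/config.yaml
./llamactl   # assumes the binary built with `go build -o llamactl ./cmd/server`
```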

## API Key Authentication

@@ -117,6 +121,7 @@ llamactl now supports API Key authentication for both management and inference (

### Configuration Options

#### Server Configuration

```yaml
@@ -133,12 +138,16 @@ server:
- `LLAMACTL_ALLOWED_ORIGINS` - Comma-separated CORS origins
- `LLAMACTL_ENABLE_SWAGGER` - Enable Swagger UI (true/false)
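
A quick illustration of these two variables; the origin URLs are placeholders, not defaults:

```bash
# Allow two browser origins and expose the Swagger UI.
export LLAMACTL_ALLOWED_ORIGINS="https://example.com,https://app.example.com"
export LLAMACTL_ENABLE_SWAGGER=true
```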

#### Instance Configuration

```yaml
instances:
  port_range: [8000, 9000]                          # Port range for instances (default: [8000, 9000])
  data_dir: "~/.local/share/llamactl"               # Directory for all llamactl data (default varies by OS)
  configs_dir: "~/.local/share/llamactl/instances"  # Directory for instance configs (default: data_dir/instances)
  logs_dir: "~/.local/share/llamactl/logs"          # Directory for instance logs (default: data_dir/logs)
  auto_create_dirs: true                            # Automatically create data/config/logs directories (default: true)
  max_instances: -1                                 # Maximum instances (-1 = unlimited)
  llama_executable: "llama-server"                  # Path to llama-server executable
  default_auto_restart: true                        # Default auto-restart setting

@@ -148,13 +157,17 @@ instances:

**Environment Variables:**
- `LLAMACTL_INSTANCE_PORT_RANGE` - Port range (format: "8000-9000" or "8000,9000")
- `LLAMACTL_DATA_DIRECTORY` - Data directory path
- `LLAMACTL_INSTANCES_DIR` - Instance configs directory path
- `LLAMACTL_LOGS_DIR` - Log directory path
- `LLAMACTL_AUTO_CREATE_DATA_DIR` - Auto-create data/config/logs directories (true/false)
- `LLAMACTL_MAX_INSTANCES` - Maximum number of instances
- `LLAMACTL_LLAMA_EXECUTABLE` - Path to llama-server executable
- `LLAMACTL_DEFAULT_AUTO_RESTART` - Default auto-restart setting (true/false)
- `LLAMACTL_DEFAULT_MAX_RESTARTS` - Default maximum restarts
- `LLAMACTL_DEFAULT_RESTART_DELAY` - Default restart delay in seconds
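
A minimal sketch of overriding the persistence-related settings through the environment; the directory paths are placeholders, not recommended locations:

```bash
# Keep all llamactl state under a custom root and let the server
# create the config/log subdirectories itself.
export LLAMACTL_DATA_DIRECTORY=/srv/llamactl            # placeholder path
export LLAMACTL_INSTANCES_DIR=/srv/llamactl/instances   # placeholder path
export LLAMACTL_LOGS_DIR=/srv/llamactl/logs             # placeholder path
export LLAMACTL_AUTO_CREATE_DATA_DIR=true
```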

#### Auth Configuration

```yaml
@@ -171,6 +184,7 @@ auth:
- `LLAMACTL_REQUIRE_MANAGEMENT_AUTH` - Require auth for management endpoints (true/false)
- `LLAMACTL_MANAGEMENT_KEYS` - Comma-separated management API keys
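
For example (the key below is a made-up placeholder, not a required key format):

```bash
# Require API keys on the management endpoints and register one key.
export LLAMACTL_REQUIRE_MANAGEMENT_AUTH=true
export LLAMACTL_MANAGEMENT_KEYS="sk-management-abc123"   # placeholder key
```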

### Example Configuration

```yaml
@@ -180,13 +194,16 @@ server:

instances:
  port_range: [8001, 8100]
  data_dir: "/var/lib/llamactl"
  configs_dir: "/var/lib/llamactl/instances"
  logs_dir: "/var/log/llamactl"
  auto_create_dirs: true
  max_instances: 10
  llama_executable: "/usr/local/bin/llama-server"
  default_auto_restart: true
  default_max_restarts: 5
  default_restart_delay: 10

auth:
  require_inference_auth: true
  inference_keys: ["sk-inference-abc123"]
```
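
Assuming this example is saved as `/etc/llamactl/config.yaml` (one of the documented Linux search paths), a rough sanity check that the configured persistence directories get created might look like the sketch below; the bare `./llamactl` invocation, the backgrounding, and the `sleep` are assumptions, not documented usage:

```bash
# Start llamactl (it should pick up /etc/llamactl/config.yaml from its
# default search locations) and check that auto_create_dirs produced the
# directories named in the example config.
./llamactl &        # assumes the binary built via `go build -o llamactl ./cmd/server`
sleep 2             # illustrative pause while the server starts
ls -ld /var/lib/llamactl /var/lib/llamactl/instances /var/log/llamactl
```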