Mirror of https://github.com/lordmathis/llamactl.git, synced 2025-11-06 00:54:23 +00:00
Fix typos and consistent naming for Llamactl across documentation
@@ -1,10 +1,10 @@
 # Backends
 
-LlamaCtl supports multiple backends for running large language models. This guide covers the available backends and their configuration.
+Llamactl supports multiple backends for running large language models. This guide covers the available backends and their configuration.
 
 ## Llama.cpp Backend
 
-The primary backend for LlamaCtl, providing robust support for GGUF models.
+The primary backend for Llamactl, providing robust support for GGUF models.
 
 ### Features
 
@@ -107,7 +107,7 @@ options:
 
 ## Future Backends
 
-LlamaCtl is designed to support multiple backends. Planned additions:
+Llamactl is designed to support multiple backends. Planned additions:
 
 ### vLLM Backend
 
@@ -137,7 +137,7 @@ Integration with Ollama for easy model management:
 
 ### Automatic Detection
 
-LlamaCtl can automatically detect the best backend:
+Llamactl can automatically detect the best backend:
 
 ```yaml
 backends:
@@ -1,10 +1,10 @@
 # Monitoring
 
-Comprehensive monitoring setup for LlamaCtl in production environments.
+Comprehensive monitoring setup for Llamactl in production environments.
 
 ## Overview
 
-Effective monitoring of LlamaCtl involves tracking:
+Effective monitoring of Llamactl involves tracking:
 
 - Instance health and performance
 - System resource usage
@@ -15,7 +15,7 @@ Effective monitoring of LlamaCtl involves tracking:
 
 ### Health Checks
 
-LlamaCtl provides built-in health monitoring:
+Llamactl provides built-in health monitoring:
 
 ```bash
 # Check overall system health
@@ -45,7 +45,7 @@ curl http://localhost:8080/metrics
 
 ### Configuration
 
-Add LlamaCtl as a Prometheus target:
+Add Llamactl as a Prometheus target:
 
 ```yaml
 # prometheus.yml
@@ -59,7 +59,7 @@ scrape_configs:
 
 ### Custom Metrics
 
-Enable additional metrics in LlamaCtl:
+Enable additional metrics in Llamactl:
 
 ```yaml
 # config.yaml
@@ -76,7 +76,7 @@ monitoring:
 
 ## Grafana Dashboards
 
-### LlamaCtl Dashboard
+### Llamactl Dashboard
 
 Import the official Grafana dashboard:
 
@@ -135,7 +135,7 @@ groups:
     labels:
       severity: critical
     annotations:
-      summary: "LlamaCtl instance {{ $labels.instance_name }} is down"
+      summary: "Llamactl instance {{ $labels.instance_name }} is down"
 
   - alert: HighMemoryUsage
     expr: llamactl_instance_memory_percent > 90
@@ -170,7 +170,7 @@ receivers:
     slack_configs:
       - api_url: 'https://hooks.slack.com/services/...'
         channel: '#alerts'
-        title: 'LlamaCtl Alert'
+        title: 'Llamactl Alert'
         text: '{{ range .Alerts }}{{ .Annotations.summary }}{{ end }}'
 ```
 
@@ -373,7 +373,7 @@ curl http://localhost:8080/metrics | grep rate_limit
 **Metrics not appearing:**
 1. Check Prometheus configuration
 2. Verify network connectivity
-3. Review LlamaCtl logs for errors
+3. Review Llamactl logs for errors
 
 **High memory usage:**
 1. Check for memory leaks in profiles
@@ -1,6 +1,6 @@
 # Troubleshooting
 
-Common issues and solutions for LlamaCtl deployment and operation.
+Common issues and solutions for Llamactl deployment and operation.
 
 ## Installation Issues
 
@@ -29,7 +29,7 @@ Common issues and solutions for LlamaCtl deployment and operation.
 
 ### Permission Denied
 
-**Problem:** Permission errors when starting LlamaCtl
+**Problem:** Permission errors when starting Llamactl
 
 **Solutions:**
 1. Check file permissions:
@@ -269,7 +269,7 @@ Common issues and solutions for LlamaCtl deployment and operation.
 
 ### High CPU Usage
 
-**Problem:** LlamaCtl consuming excessive CPU
+**Problem:** Llamactl consuming excessive CPU
 
 **Diagnostic Steps:**
 1. Identify CPU-intensive processes:
@@ -302,7 +302,7 @@ Common issues and solutions for LlamaCtl deployment and operation.
 
 ### Connection Refused
 
-**Problem:** Cannot connect to LlamaCtl web interface
+**Problem:** Cannot connect to Llamactl web interface
 
 **Diagnostic Steps:**
 1. Check if service is running:
@@ -1,6 +1,6 @@
 # Building from Source
 
-This guide covers building LlamaCtl from source code for development and production deployment.
+This guide covers building Llamactl from source code for development and production deployment.
 
 ## Prerequisites
 
@@ -261,7 +261,7 @@ LDFLAGS := -s -w -X main.version=$(VERSION) -X main.buildTime=$(BUILD_TIME)
 .PHONY: build clean test install
 
 build:
-	@echo "Building LlamaCtl..."
+	@echo "Building Llamactl..."
 	@cd webui && npm run build
 	@go build -ldflags="$(LDFLAGS)" -o llamactl cmd/server/main.go
 
@@ -423,7 +423,7 @@ Create a systemd service:
 ```ini
 # /etc/systemd/system/llamactl.service
 [Unit]
-Description=LlamaCtl Server
+Description=Llamactl Server
 After=network.target
 
 [Service]
@@ -1,6 +1,6 @@
 # Contributing
 
-Thank you for your interest in contributing to LlamaCtl! This guide will help you get started with development and contribution.
+Thank you for your interest in contributing to Llamactl! This guide will help you get started with development and contribution.
 
 ## Development Setup
 
@@ -370,4 +370,4 @@ Contributors are recognized in:
 - Documentation credits
 - Annual contributor highlights
 
-Thank you for contributing to LlamaCtl!
+Thank you for contributing to Llamactl!
@@ -1,6 +1,6 @@
 # Configuration
 
-LlamaCtl can be configured through various methods to suit your needs.
+Llamactl can be configured through various methods to suit your needs.
 
 ## Configuration File
 
@@ -39,7 +39,7 @@ limits:
 
 ## Environment Variables
 
-You can also configure LlamaCtl using environment variables:
+You can also configure Llamactl using environment variables:
 
 ```bash
 # Server settings
@@ -1,10 +1,10 @@
 # Installation
 
-This guide will walk you through installing LlamaCtl on your system.
+This guide will walk you through installing Llamactl on your system.
 
 ## Prerequisites
 
-Before installing LlamaCtl, ensure you have:
+Before installing Llamactl, ensure you have:
 
 - Go 1.19 or later
 - Git
@@ -52,4 +52,4 @@ llamactl --version
 
 ## Next Steps
 
-Now that LlamaCtl is installed, continue to the [Quick Start](quick-start.md) guide to get your first instance running!
+Now that Llamactl is installed, continue to the [Quick Start](quick-start.md) guide to get your first instance running!
@@ -1,16 +1,16 @@
 # Quick Start
 
-This guide will help you get LlamaCtl up and running in just a few minutes.
+This guide will help you get Llamactl up and running in just a few minutes.
 
-## Step 1: Start LlamaCtl
+## Step 1: Start Llamactl
 
-Start the LlamaCtl server:
+Start the Llamactl server:
 
 ```bash
 llamactl
 ```
 
-By default, LlamaCtl will start on `http://localhost:8080`.
+By default, Llamactl will start on `http://localhost:8080`.
 
 ## Step 2: Access the Web UI
 
@@ -20,7 +20,7 @@ Open your web browser and navigate to:
 http://localhost:8080
 ```
 
-You should see the LlamaCtl web interface.
+You should see the Llamactl web interface.
 
 ## Step 3: Create Your First Instance
 
@@ -1,10 +1,10 @@
-# LlamaCtl Documentation
+# Llamactl Documentation
 
-Welcome to the LlamaCtl documentation! LlamaCtl is a powerful management tool for Llama.cpp instances that provides both a web interface and REST API for managing large language models.
+Welcome to the Llamactl documentation! Llamactl is a powerful management tool for Llama.cpp instances that provides both a web interface and REST API for managing large language models.
 
-## What is LlamaCtl?
+## What is Llamactl?
 
-LlamaCtl is designed to simplify the deployment and management of Llama.cpp instances. It provides:
+Llamactl is designed to simplify the deployment and management of Llama.cpp instances. It provides:
 
 - **Instance Management**: Start, stop, and monitor multiple Llama.cpp instances
 - **Web UI**: User-friendly interface for managing your models
@@ -23,8 +23,8 @@ LlamaCtl is designed to simplify the deployment and management of Llama.cpp inst
 
 ## Quick Links
 
-- [Installation Guide](getting-started/installation.md) - Get LlamaCtl up and running
-- [Quick Start](getting-started/quick-start.md) - Your first steps with LlamaCtl
+- [Installation Guide](getting-started/installation.md) - Get Llamactl up and running
+- [Quick Start](getting-started/quick-start.md) - Your first steps with Llamactl
 - [Web UI Guide](user-guide/web-ui.md) - Learn to use the web interface
 - [API Reference](user-guide/api-reference.md) - Complete API documentation
 
@@ -34,7 +34,7 @@ If you need help or have questions:
 
 - Check the [Troubleshooting](advanced/troubleshooting.md) guide
 - Visit our [GitHub repository](https://github.com/lordmathis/llamactl)
-- Read the [Contributing guide](development/contributing.md) to help improve LlamaCtl
+- Read the [Contributing guide](development/contributing.md) to help improve Llamactl
 
 ---
 
@@ -1,6 +1,6 @@
 # API Reference
 
-Complete reference for the LlamaCtl REST API.
+Complete reference for the Llamactl REST API.
 
 ## Base URL
 
@@ -314,7 +314,7 @@ GET /api/system/info
 
 ### Get Configuration
 
-Get current LlamaCtl configuration.
+Get current Llamactl configuration.
 
 ```http
 GET /api/config
@@ -322,7 +322,7 @@ GET /api/config
 
 ### Update Configuration
 
-Update LlamaCtl configuration (requires restart).
+Update Llamactl configuration (requires restart).
 
 ```http
 PUT /api/config
@@ -1,6 +1,6 @@
 # Managing Instances
 
-Learn how to effectively manage your Llama.cpp instances with LlamaCtl.
+Learn how to effectively manage your Llama.cpp instances with Llamactl.
 
 ## Instance Lifecycle
 
@@ -1,6 +1,6 @@
 # Web UI Guide
 
-The LlamaCtl Web UI provides an intuitive interface for managing your Llama.cpp instances.
+The Llamactl Web UI provides an intuitive interface for managing your Llama.cpp instances.
 
 ## Overview
 
@@ -169,7 +169,7 @@ If authentication is enabled:
 ### Common UI Issues
 
 **Page won't load:**
-- Check if LlamaCtl server is running
+- Check if Llamactl server is running
 - Verify the correct URL and port
 - Check browser console for errors
 
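A casing rename like the one in this commit is purely mechanical, so it can be reproduced with a recursive search-and-replace rather than edited by hand. The sketch below demonstrates the approach on a scratch directory standing in for the repo's docs tree; it assumes GNU `sed` (the `-i` in-place flag without a backup suffix), and the file paths are illustrative, not the commit's actual file list.

```shell
# Build a small stand-in for the docs/ tree with the old casing
mkdir -p /tmp/llamactl-docs/getting-started
printf '# LlamaCtl Documentation\nWelcome to LlamaCtl.\n' \
  > /tmp/llamactl-docs/index.md
printf 'This guide will walk you through installing LlamaCtl.\n' \
  > /tmp/llamactl-docs/getting-started/installation.md

# The rename itself: rewrite every occurrence of the old casing in place
grep -rl 'LlamaCtl' /tmp/llamactl-docs | xargs sed -i 's/LlamaCtl/Llamactl/g'

# Confirm nothing was missed; prints 'rename complete' when grep finds no matches
grep -rn 'LlamaCtl' /tmp/llamactl-docs || echo 'rename complete'
```

Because the replacement is case-sensitive, lowercase identifiers such as the `llamactl` binary name and the `llamactl_instance_memory_percent` metric are untouched, which matches what this diff shows.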