Mathis/llamactl
Mirror of https://github.com/lordmathis/llamactl.git (synced 2025-11-06 09:04:27 +00:00)
327 commits · 2 branches · 19 tags
Ref: 92cb57e8162eb5a5e69274fc145a8149b7ffb9b6
Commit Graph

10 Commits

Author         SHA1        Date                        Message
LordMathis     1fbf809a2d  2025-09-28 14:42:10 +02:00  Add EnvironmentVariablesInput component and integrate into InstanceSettingsCard
LordMathis     b665194307  2025-09-21 20:58:43 +02:00  Add vLLM backend support to webui
Matus Namesny  5121f0e302  2025-09-17 21:59:55 +02:00  Remove PythonPath references from MlxServerOptions and related configurations
LordMathis     587be68077  2025-09-16 22:38:39 +02:00  Add MLX backend support with configuration and parsing enhancements
LordMathis     0fd3613798  2025-09-02 21:19:22 +02:00  Refactor backend type from LLAMA_SERVER to LLAMA_CPP across components and tests
LordMathis     4f6bb6292e  2025-09-02 21:12:14 +02:00  Implement backend configuration options and refactor related components
LordMathis     8265a94bf7  2025-08-20 14:56:11 +02:00  Add on-demand start configuration to instance options and basic fields
LordMathis     1aaab96cec  2025-08-19 19:24:54 +02:00  Add idle timeout configuration to instance options and basic fields
LordMathis     a26d853ad5  2025-08-06 18:40:05 +02:00  Fix missing or wrong llama server options on frontend
LordMathis     f337a3efe2  2025-07-26 11:37:28 +02:00  Refactor project structure