{% extends "admin/base.html" %}

{% block title %}Server Management{% endblock %}
{% block header_title %}Server Management{% endblock %}

{% block content %}
**Ollama** — The industry standard for local LLMs. Download it from ollama.com. For Ollama Cloud, use `https://ollama.com/api` with your Ollama Cloud API key. The `/api/tags` and `/api/chat` endpoints are used automatically.

**OpenLLM** — High-throughput serving for production. Install with `pip install openllm` or use our auto-installer, then start a model with `openllm start llama3`. Default port: 3000.

**Connecting to other AI frontends** — use `http://127.0.0.1:3000/api`.

| Name | URL | Type | API Key | Status | Models | Actions |
|---|---|---|---|---|---|---|
{% for server in servers %}
| {{ server.name }} | {{ server.url }} | {{ server.server_type | upper }} | {% if server.has_api_key %} Configured {% else %} - {% endif %} | {% if not server.is_active %} Disabled {% elif server.last_error %} Error {% else %} Active {% endif %} | {{ server.available_models|length if server.available_models is not none else 'N/A' }} model(s) {% if server.models_last_updated %} (updated {{ server.models_last_updated.strftime('%Y-%m-%d %H:%M') }}) {% endif %} | |
{% else %}
| No servers configured. | | | | | | |
{% endfor %}
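The Models column is rendered from `server.available_models`, which a backend would typically populate from the server's model-listing endpoint (`/api/tags` for Ollama). A minimal Python sketch of that lookup, assuming Ollama's default local port 11434 and its documented `{"models": [...]}` response shape; the helper names here are illustrative, not part of this template:

```python
import json
from urllib.parse import urljoin

def tags_url(base_url: str) -> str:
    # Ollama lists installed models at /api/tags; normalize trailing slashes.
    return urljoin(base_url.rstrip("/") + "/", "api/tags")

def count_models(tags_response: str) -> int:
    # /api/tags returns JSON like {"models": [{"name": "llama3:latest"}, ...]}.
    return len(json.loads(tags_response).get("models", []))

sample = '{"models": [{"name": "llama3:latest"}, {"name": "mistral:7b"}]}'
print(tags_url("http://127.0.0.1:11434"))  # http://127.0.0.1:11434/api/tags
print(count_models(sample))  # 2
```

A real backend would fetch `tags_url(server.url)` over HTTP and store the count plus a `models_last_updated` timestamp for the table above.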