{% extends "admin/base.html" %} {% block title %}Ollama Instance Manager{% endblock %} {% block header_title %}Local Instance Manager{% endblock %} {% block content %}

Ollama

{% if is_installed %} Detected {% else %} Not Found {% endif %}

vLLM

{% if is_vllm_installed %}Detected{% else %}Optional{% endif %}


OpenLLM

{% if is_openllm_installed %}Detected{% else %}Optional{% endif %}


Binary Hub

Internal llama.cpp/vLLM engines.

Integrated Model Downloader

Search Hugging Face and download GGUF or Safetensors directly to your server.

Deploy New Instance

{% if discovered %}

Unmanaged Local Instances Detected

The following Ollama instances are running locally but are not yet managed by the Fortress supervisor:

{% for found in discovered %}
PORT {{ found.port }}
{% endfor %}
{% endif %}

Managed Instances

Name Port Config Status Actions
{% for item in instances %}
{{ item.config.name }} {{ item.config.port }} GPU: {{ item.config.gpu_ids or 'All' }} | Alive: {{ item.config.keep_alive }} {% if item.state == 'RUNNING' %} MANAGED {% elif item.state == 'SYSTEM' %} SYSTEM SERVICE {% elif item.state == 'CONFLICT' %} PORT CONFLICT {% else %} OFFLINE {% endif %} {% if item.state == 'SYSTEM' %} Manage via OS {% endif %}
{% endfor %}
{% endblock %}