MVP of an idea using multiple local LLM models to simulate and play D&D
Updated Nov 3, 2024 - Python
Handy tool to measure the performance and efficiency of LLM workloads.
AURORA (Artificial Unified Responsive Optimized Reasoning Agent) uses lobes and web research for RAG-based memory and learning.
Ollama with Let's Encrypt Using Docker Compose
Issue report classification demo with SetFit and Ollama for NASA's Flight System software repositories
This repo collects numerous use cases for the open-source Ollama.
A Streamlit and Ollama application for producing summarized text with open LLM models.
Effortlessly rename files using local AI - No tokens, No API. Based on Ollama.
A compact LLM research tool for rapid experimentation, powered by open source!
Automatically install and run a public API service for Ollama with any model (library).
Run Ollama models anywhere easily
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
Simple Llama UI is a comprehensive chat application utilizing AI language models with streaming support.
Choose the model that's right for you
CLI tool for chatting with LLMs running via Ollama, written with modern Python tooling. The tool focuses on enhancing an nvim-based coding workflow for free, because the author is a damn cheapskate.
macOS app for interacting with and proxying Ollama traffic. Focused on chat/instruct applications.
Welcome to the Llama-3 Chatbot project! This chatbot allows you to interact with the Llama-3 model via a simple command-line interface. Type your messages, and receive responses from Llama-3.
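A command-line chat loop like the one this project describes can be sketched with the `ollama` Python package. This is a minimal, hypothetical sketch, not the project's actual code; it assumes the `ollama` package is installed, a local Ollama server is running, and a model named "llama3" has been pulled:

```python
def build_history(history, role, content):
    """Return a new history list with one chat turn appended,
    in the {"role", "content"} message format Ollama expects."""
    return history + [{"role": role, "content": content}]


def chat_loop(model="llama3"):
    """Read user messages from stdin and print model replies,
    carrying the full conversation history on every request."""
    import ollama  # requires a running local Ollama server

    history = []
    while True:
        user_msg = input("you> ")
        if user_msg in ("exit", "quit"):
            break
        history = build_history(history, "user", user_msg)
        reply = ollama.chat(model=model, messages=history)
        text = reply["message"]["content"]
        print(f"{model}> {text}")
        history = build_history(history, "assistant", text)
```

Calling `chat_loop()` then gives the type-a-message, get-a-response interaction the description outlines; keeping the assistant turns in `history` is what makes the conversation multi-turn rather than stateless.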