Ollama Client – Chat with Local LLMs Inside Your Browser. A lightweight, privacy-first Chrome extension for chatting with locally hosted Ollama LLM models such as LLaMA 2, Mistral, and CodeLLaMA. Supports streaming, stop/regenerate, and easy model switching — all without cloud APIs or data leaks.
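Streaming responses from a local Ollama server arrive as one JSON object per line (NDJSON) from the `/api/chat` endpoint at `http://localhost:11434`. A minimal sketch of how a client could assemble those chunks into displayable text — the helper name and the simulated chunks below are illustrative, not part of the extension's actual code:

```python
import json

def extract_stream_text(ndjson_lines):
    """Concatenate the incremental `message.content` pieces from streamed
    Ollama chat chunks, stopping at the final chunk (done=true)."""
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):  # final chunk signals the stream is complete
            break
    return "".join(parts)

# Simulated stream of three chunks, shaped like the server's NDJSON output:
sample = [
    '{"message": {"role": "assistant", "content": "Hel"}, "done": false}',
    '{"message": {"role": "assistant", "content": "lo!"}, "done": false}',
    '{"message": {"role": "assistant", "content": ""}, "done": true}',
]
print(extract_stream_text(sample))  # -> Hello!
```

Rendering each piece as it arrives is what makes stop/regenerate responsive: the client can abort the HTTP stream mid-response without losing the text already shown.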
A Streamlit-powered platform to create, configure, and interact with customizable AI avatars using Ollama and LLMs. This modular system auto-generates modelfiles, stores avatar metadata in SQLite, and streams real-time responses via a local Ollama server.
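Auto-generating a modelfile amounts to rendering an avatar's configuration into Ollama's Modelfile format (`FROM`, `SYSTEM`, and `PARAMETER` are documented Modelfile directives). A hedged sketch, assuming avatar fields like a base model and system prompt — the function and field names are illustrative, not the project's actual API:

```python
def build_modelfile(base_model, system_prompt, temperature=0.7):
    """Render an avatar's settings as an Ollama Modelfile string.

    FROM selects the base model, SYSTEM sets the persona prompt, and
    PARAMETER tunes sampling; the avatar fields are assumptions here.
    """
    return (
        f"FROM {base_model}\n"
        f'SYSTEM """{system_prompt}"""\n'
        f"PARAMETER temperature {temperature}\n"
    )

mf = build_modelfile("llama2", "You are a cheerful pirate.")
print(mf)
```

The resulting text could then be registered with `ollama create <avatar-name> -f <modelfile>`, while the avatar's metadata (name, base model, prompt) is persisted separately in SQLite.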