# ollama-chatbot

Here are 3 public repositories matching this topic...


Ollama Client – Chat with Local LLMs Inside Your Browser. A lightweight, privacy-first Ollama Chrome extension for chatting with locally hosted LLMs such as LLaMA 2, Mistral, and CodeLLaMA. Supports streaming, stop/regenerate, and easy model switching, all without cloud APIs or data leaks.

  • Updated Sep 12, 2025
  • TypeScript
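
The extension above talks to a locally running Ollama server. For readers unfamiliar with that API, here is a minimal TypeScript sketch (not taken from the extension's source) of streaming a chat completion from Ollama's /api/chat endpoint. It assumes Ollama is listening on its default port 11434, that a model named "mistral" has already been pulled, and that it runs under Node 18+ (an extension would write to its UI instead of stdout).

```typescript
// Minimal sketch: stream a chat reply from a local Ollama server.
async function streamChat(prompt: string): Promise<void> {
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "mistral", // assumption: any locally pulled model name works here
      messages: [{ role: "user", content: prompt }],
      stream: true, // tokens arrive as newline-delimited JSON chunks
    }),
  });

  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let buffered = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });

    // Each complete line is one JSON object carrying a partial message.
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) process.stdout.write(chunk.message.content);
      if (chunk.done) console.log();
    }
  }
}

streamChat("Explain what a local LLM is in one sentence.");
```

Because the request never leaves localhost, no prompt or response data reaches a cloud API, which is the privacy property the extension advertises.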

Improve this page

Add a description, image, and links to the ollama-chatbot topic page so that developers can more easily learn about it.


Add this topic to your repo

To associate your repository with the ollama-chatbot topic, visit your repo's landing page and select "manage topics."
