
streamlit_agent_chat
Playing around with Streamlit, Ollama, Agents, and MCP
Repository Info
About This Server
Playing around with Streamlit, Ollama, Agents, and MCP
Model Context Protocol (MCP) - This server can be integrated with AI applications to provide additional context and capabilities, enabling enhanced AI interactions and functionality.
Documentation
# Streamlit LLM Chat with Ollama

A Streamlit application that provides a chat interface with Gemma3:1b via Ollama, using Langgraph for agent-based workflows.

## Setup

1. Make sure you have Python 3.11 installed
2. Install and run Ollama: https://ollama.com/
3. Pull the Gemma3:1b model: `ollama pull gemma3:1b`
4. Set up the Python environment:
   ```bash
   python3.11 -m venv venv
   source venv/bin/activate
   pip install -r requirements.txt
   ```

## Running the Application

```bash
streamlit run app/main.py
```

## Features

- Chat with Gemma3:1b via Ollama
- Structured agent workflow using Langgraph
- Expandable for tool calling capabilities
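To illustrate the kind of agent workflow the README describes, here is a minimal sketch of a LangGraph graph that routes chat messages to Gemma3:1b via Ollama. This is a hypothetical example, not the repository's actual `app/main.py`; it assumes the `langgraph` and `langchain-ollama` packages from `requirements.txt` and that the model has been pulled with `ollama pull gemma3:1b`.

```python
# Minimal LangGraph + Ollama chat loop (illustrative sketch, not the repo's code).
from langchain_ollama import ChatOllama
from langgraph.graph import StateGraph, MessagesState, START, END

llm = ChatOllama(model="gemma3:1b")  # assumes the model was pulled locally

def chat_node(state: MessagesState) -> dict:
    """Send the running conversation to the model and append its reply."""
    reply = llm.invoke(state["messages"])
    return {"messages": [reply]}

# A single-node graph; tool-calling nodes could be added later, as the README notes.
graph = StateGraph(MessagesState)
graph.add_node("chat", chat_node)
graph.add_edge(START, "chat")
graph.add_edge("chat", END)
agent = graph.compile()

result = agent.invoke({"messages": [("user", "Hello!")]})
print(result["messages"][-1].content)
```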
Quick Start
Clone the repository
git clone https://github.com/bryangsmith/streamlit_agent_chat
cd streamlit_agent_chat
Install dependencies
pip install -r requirements.txt
(Run this inside a Python 3.11 virtual environment, as described in the README above.)
Follow the documentation
Check the repository's README.md file for specific installation and usage instructions.
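For orientation, the following is a hypothetical sketch of what a Streamlit chat front end over Ollama could look like. It is not the repository's actual `app/main.py`; the widget layout and message handling are assumptions based on the features listed in the README.

```python
# Hypothetical Streamlit chat page backed by Ollama (illustrative sketch only).
import streamlit as st
from langchain_ollama import ChatOllama

llm = ChatOllama(model="gemma3:1b")  # assumes `ollama pull gemma3:1b` has been run

st.title("Chat with Gemma3:1b")

# Keep the conversation in session state so it survives Streamlit reruns.
if "history" not in st.session_state:
    st.session_state.history = []

# Replay earlier turns.
for role, text in st.session_state.history:
    with st.chat_message(role):
        st.markdown(text)

# Handle a new user message and display the model's reply.
if prompt := st.chat_input("Say something"):
    st.session_state.history.append(("user", prompt))
    with st.chat_message("user"):
        st.markdown(prompt)
    reply = llm.invoke(st.session_state.history).content
    st.session_state.history.append(("assistant", reply))
    with st.chat_message("assistant"):
        st.markdown(reply)
```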
Recommended MCP Servers
Discord MCP
Enable AI assistants to seamlessly interact with Discord servers, channels, and messages.
Knit MCP
Connect AI agents to 200+ SaaS applications and automate workflows.
Apify MCP Server
Deploy and interact with Apify actors for web scraping and data extraction.
BrowserStack MCP
BrowserStack MCP Server for automated testing across multiple browsers.
Zapier MCP
A Zapier server that provides automation capabilities for various apps.