GraphRAG + AutoGen + Ollama + Chainlit UI = Local Multi-Agent RAG Superbot

[Graphical abstract]

This application integrates GraphRAG with AutoGen agents, powered by local LLMs from Ollama, for free and offline embedding and inference. Key highlights include:

  • Agentic-RAG: integrates GraphRAG's knowledge-search methods with an AutoGen agent via function calling (see the sketch after this list).
  • Offline LLM support: configures GraphRAG (local and global search) to use local models from Ollama for inference and embedding.
  • Non-OpenAI function calling: extends AutoGen to support function calling with non-OpenAI LLMs from Ollama via a Lite-LLM proxy server.
  • Interactive UI: deploys a Chainlit UI to handle continuous conversations, multi-threading, and user input settings.
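
How the pieces fit together, as a minimal sketch: a GraphRAG search function is registered with an AutoGen agent pair as a callable tool, with the LLM served through the Lite-LLM proxy from step 6 of the setup. Here run_global_search() is a hypothetical stand-in for the GraphRAG query code that lives in appUI.py, and the base_url assumes Lite-LLM's current default port (4000); adjust if your version differs:

    # Sketch of the Agentic-RAG wiring; run_global_search() is a hypothetical
    # stand-in for the GraphRAG query logic implemented in appUI.py.
    import autogen

    llm_config = {
        "config_list": [{
            "model": "ollama_chat/llama3",      # the model name given to litellm in step 6
            "base_url": "http://0.0.0.0:4000",  # Lite-LLM proxy; adjust if your port differs
            "api_key": "not-needed",            # the local proxy ignores the key
        }],
        "timeout": 120,
    }

    def run_global_search(query: str) -> str:
        """Hypothetical wrapper around GraphRAG's global search over ./output."""
        ...

    assistant = autogen.AssistantAgent(name="graphrag_assistant", llm_config=llm_config)
    user_proxy = autogen.UserProxyAgent(
        name="user", human_input_mode="NEVER", code_execution_config=False
    )

    # Expose the GraphRAG search to the LLM as a callable tool.
    autogen.register_function(
        run_global_search,
        caller=assistant,
        executor=user_proxy,
        name="run_global_search",
        description="Answer a question from the GraphRAG knowledge graph.",
    )

In the app, a call like user_proxy.initiate_chat(assistant, message="...") then starts a conversation in which the agent can invoke the registered search function.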

[Screenshots: main interface and widget settings]

Useful Links 🔗

  • Full guide: Microsoft's GraphRAG + AutoGen + Ollama + Chainlit = Fully Local & Free Multi-Agent RAG Superbot on Medium.com 📚

📦 Installation and Setup (Linux)

Follow these steps to set up and run AutoGen GraphRAG Local with Ollama and Chainlit UI:

  1. Install LLMs:

    Visit Ollama's website for installation files.

    ollama pull mistral
    ollama pull nomic-embed-text
    ollama pull llama3
    ollama serve
  2. Create conda environment and install packages:

    conda create -n RAG_agents python=3.12
    conda activate RAG_agents
    git clone https://github.com/karthik-codex/autogen_graphRAG.git
    cd autogen_graphRAG
    pip install -r requirements.txt
  3. Initialize the GraphRAG root folder (the mv overwrites the auto-generated settings.yaml with the repo's Ollama-ready configuration):

    mkdir -p ./input
    python -m graphrag.index --init --root .
    mv ./utils/settings.yaml ./
  4. Replace embedding.py and openai_embeddings_llm.py in the installed GraphRAG package with the versions from the ./utils folder. Locate the installed files with the commands below, then overwrite each one with its ./utils counterpart (the Windows section shows the package-relative destination paths; a sketch of what the replacements change follows this list):

    sudo find / -name openai_embeddings_llm.py
    sudo find / -name embedding.py
  5. Create embeddings and knowledge graph:

    python -m graphrag.index --root .
  6. Start the Lite-LLM proxy server (a quick smoke test for the proxy follows this list):

    litellm --model ollama_chat/llama3
  7. Run the app:

    chainlit run appUI.py
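
Why step 4 is needed: GraphRAG's stock embedding classes call the OpenAI API, and the replacement files in ./utils reroute those calls to Ollama. A hedged sketch of the kind of change they make (the files in ./utils are the authoritative versions):

    # Sketch of the Ollama rerouting applied by the replacement files;
    # see ./utils for the authoritative versions.
    import ollama

    def embed_with_ollama(texts: list[str]) -> list[list[float]]:
        """Embed each chunk locally instead of calling the OpenAI endpoint."""
        embeddings = []
        for text in texts:
            # nomic-embed-text was pulled in step 1
            result = ollama.embeddings(model="nomic-embed-text", prompt=text)
            embeddings.append(result["embedding"])
        return embeddings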

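Before launching the app, you can check that the proxy from step 6 is reachable with a quick OpenAI-client smoke test (assuming Lite-LLM's current default port, 4000; adjust base_url if your version listens elsewhere):

    # Smoke test against the Lite-LLM proxy started in step 6.
    from openai import OpenAI

    client = OpenAI(base_url="http://0.0.0.0:4000", api_key="not-needed")
    response = client.chat.completions.create(
        model="ollama_chat/llama3",  # the model name passed to litellm
        messages=[{"role": "user", "content": "Reply with OK if you can hear me."}],
    )
    print(response.choices[0].message.content)
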
📦 Installation and Setup (Windows)

Follow these steps to set up and run AutoGen GraphRAG Local with Ollama and Chainlit UI on Windows:

  1. Install LLMs:

    Visit Ollama's website for installation files.

    ollama pull mistral
    ollama pull nomic-embed-text
    ollama pull llama3
    ollama serve
  2. Clone the repository, create a Python virtual environment, and install packages:

    git clone https://github.com/karthik-codex/autogen_graphRAG.git
    cd autogen_graphRAG
    python -m venv venv
    ./venv/Scripts/activate
    pip install -r requirements.txt
  3. Initialize the GraphRAG root folder (the cp overwrites the auto-generated settings.yaml with the repo's Ollama-ready configuration):

    mkdir input
    python -m graphrag.index --init --root .
    cp ./utils/settings.yaml ./
  4. Replace embedding.py and openai_embeddings_llm.py in the installed GraphRAG package with the versions from the ./utils folder:

    cp ./utils/openai_embeddings_llm.py .\venv\Lib\site-packages\graphrag\llm\openai\openai_embeddings_llm.py
    cp ./utils/embedding.py .\venv\Lib\site-packages\graphrag\query\llm\oai\embedding.py
  5. Create embeddings and knowledge graph:

    python -m graphrag.index --root .
  6. Start Lite-LLM proxy server:

    litellm --model ollama_chat/llama3
  7. Run the app (a minimal sketch of the Chainlit handler pattern follows this list):

    chainlit run appUI.py
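
Both setups finish by launching appUI.py with Chainlit. As a minimal sketch of the handler pattern a Chainlit app follows (the real appUI.py additionally manages settings widgets and conversation threads; ask_agents() is a hypothetical stand-in for the AutoGen round-trip):

    # Minimal Chainlit handler sketch; appUI.py in this repo is the real thing.
    import chainlit as cl

    def ask_agents(question: str) -> str:
        """Hypothetical: forward the question to the AutoGen agents, return the reply."""
        ...

    @cl.on_message
    async def on_message(message: cl.Message):
        answer = ask_agents(message.content)
        await cl.Message(content=answer).send()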
