llm-ollama-llamaindex-bootstrap
Designed for offline use, this RAG application template offers a starting point for building your own local RAG pipeline, independent of online APIs and cloud-based LLM services like OpenAI.