
Ollama Deep Researcher


TypeScript Edition

This repo is a TypeScript edition of the Ollama Deep Researcher, and the repo structure is inspired by the langgraphjs-starter-template.

Ollama Deep Researcher is a fully local web research assistant that uses any LLM hosted by Ollama. Give it a topic and it will generate a web search query, gather web search results (via Tavily), summarize the results of web search, reflect on the summary to examine knowledge gaps, generate a new search query to address the gaps, search, and improve the summary for a user-defined number of cycles. It will provide the user a final markdown summary with all sources used.

Short summary: see the overview video (Ollama.Deep.Researcher.Overview-enhanced-v2-90p.mp4).

📺 Video Tutorials

Want to see it in action or build it yourself? Check out these helpful video tutorials:

🚀 Quickstart

Configuration

In ./src/agent/configuration.ts you can set the following options:

  • The name of the local LLM to use with Ollama (defaults to llama3.2)
  • The depth of the research iterations (defaults to 3)
  • The base URL of your LLM (defaults to http://localhost:11434)
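The defaults above can be sketched as a small configuration object. This is a minimal illustration; the field names (localLlm, maxResearchLoops, ollamaBaseUrl) are hypothetical stand-ins, and the actual names in ./src/agent/configuration.ts may differ:

```typescript
// Hypothetical shape of the researcher configuration; the real
// interface in src/agent/configuration.ts may use different names.
interface ResearcherConfig {
  localLlm: string;         // name of the Ollama model to use
  maxResearchLoops: number; // depth of the research iterations
  ollamaBaseUrl: string;    // base URL of the Ollama server
}

const defaultConfig: ResearcherConfig = {
  localLlm: "llama3.2",
  maxResearchLoops: 3,
  ollamaBaseUrl: "http://localhost:11434",
};
```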

Mac

  1. Download the Ollama app for Mac here.

  2. Pull a local LLM from Ollama. As an example:

ollama pull deepseek-r1:8b

  3. Set up the web search tool by setting the corresponding environment variable via a .env file, which you can copy from .env.example:

TAVILY_API_KEY=<your_tavily_api_key>

  4. Clone the repository and launch the assistant with the LangGraph server:

# Clone the repository and start the LangGraph server
git clone https://github.com/pacovk/ollama-deep-researcher-ts.git
cd ollama-deep-researcher-ts
yarn install
docker compose up -d

Using the LangGraph Studio UI

Docker Compose

You can use the provided docker-compose.yml file to run LangGraph Studio with the Ollama Deep Researcher assistant.

  1. Start the LangGraph server with the Ollama Deep Researcher assistant:
docker compose up -d

Visit LangGraph Studio Web UI to interact with the assistant.

Give the assistant a topic for research, and you can visualize its process!

ℹ️ There is also an Ollama Web UI available at http://localhost:2024 to interact with the assistant.

How it works

Ollama Deep Researcher is inspired by IterDRAG. That approach decomposes a query into sub-queries, retrieves documents for each one, answers the first sub-query, and then builds on that answer by retrieving documents for the next sub-query. Here, we do something similar:

  • Given a user-provided topic, it uses a local LLM (via Ollama) to generate a web search query
  • It uses a search engine (configured for Tavily) to find relevant sources
  • It uses the LLM to summarize the web search findings related to the user-provided research topic
  • It then uses the LLM to reflect on the summary and identify knowledge gaps
  • It generates a new search query to address those gaps
  • The process repeats, with the summary iteratively updated with new information from web search
  • It continues down the research rabbit hole for a configurable number of iterations (see the Configuration section)
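The loop above can be sketched in TypeScript. All of the node functions here (generateQuery, webSearch, summarize, reflect) are deterministic stubs standing in for the LLM- and Tavily-backed graph nodes; the repo's actual node names and state shape may differ:

```typescript
// Illustrative sketch of the iterative research loop. The node
// functions are hypothetical stubs, not the repo's real API.
interface ResearchState {
  topic: string;
  query: string;
  sources: string[];
  summary: string;
  loop: number;
}

// Stub: the LLM would turn the topic (or a knowledge gap) into a query.
const generateQuery = (s: ResearchState): ResearchState =>
  ({ ...s, query: `${s.topic} (round ${s.loop + 1})` });

// Stub: Tavily would return real source URLs for the query.
const webSearch = (s: ResearchState): ResearchState =>
  ({ ...s, sources: [...s.sources, `source-for: ${s.query}`] });

// Stub: the LLM would fold the new results into the running summary.
const summarize = (s: ResearchState): ResearchState =>
  ({ ...s, summary: `${s.summary} +findings(${s.loop + 1})`.trim() });

// Stub: the LLM would reflect on the summary and surface gaps.
const reflect = (s: ResearchState): ResearchState =>
  ({ ...s, loop: s.loop + 1 });

function research(topic: string, maxLoops: number): ResearchState {
  let state: ResearchState = { topic, query: "", sources: [], summary: "", loop: 0 };
  while (state.loop < maxLoops) {
    state = generateQuery(state);
    state = webSearch(state);
    state = summarize(state);
    state = reflect(state);
  }
  return state;
}

const result = research("local LLM research agents", 3);
console.log(result.sources.length); // prints 3: one batch of sources per loop
```

The key design point this mirrors is that every node takes the whole graph state and returns an updated copy, which is how LangGraph-style graphs pass data between steps.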

Outputs

The output of the graph is a markdown file containing the research summary, with citations to the sources used.
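A markdown report like that could be assembled from the summary and sources roughly as follows. This is a hedged sketch; renderReport is a hypothetical helper, and the repo's actual formatting may differ:

```typescript
// Hypothetical helper that renders the final markdown report from the
// graph state's summary and collected sources.
function renderReport(summary: string, sources: string[]): string {
  const cited = sources.map((src, i) => `${i + 1}. ${src}`).join("\n");
  return `## Summary\n\n${summary}\n\n### Sources\n\n${cited}\n`;
}

const report = renderReport("Local LLMs can drive iterative research loops.", [
  "https://example.com/a",
  "https://example.com/b",
]);
console.log(report);
```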

All sources gathered during research are saved to the graph state.

You can inspect them in the graph state, which is visible in LangGraph Studio.

The final summary is saved to the graph state as well.
