<div align="center" id="top">
<img src="https://github.com/assafelovic/gpt-researcher/assets/13554167/20af8286-b386-44a5-9a83-3be1365139c3" alt="Logo" width="80">
[Website](https://gptr.dev) | [Documentation](https://docs.gptr.dev) | [Discord](https://discord.gg/QgZXvJAccX) | [PyPI](https://badge.fury.io/py/gpt-researcher)

[Colab](https://colab.research.google.com/github/assafelovic/gpt-researcher/blob/master/docs/docs/examples/pip-run.ipynb) | [Docker Hub](https://hub.docker.com/r/gptresearcher/gpt-researcher) | [Claude Skill](https://skills.sh/assafelovic/gpt-researcher/gpt-researcher) | [Twitter](https://twitter.com/assaf_elovic)
[English](README.md) | [中文](README-zh_CN.md) | [日本語](README-ja_JP.md) | [한국어](README-ko_KR.md)
</div>
# 🔎 GPT Researcher
**GPT Researcher is the first open deep research agent, designed for both web and local research on any given task.**
The agent produces detailed, factual, and unbiased research reports with citations. GPT Researcher provides a full suite of customization options for creating tailor-made, domain-specific research agents. Inspired by the recent [Plan-and-Solve](https://arxiv.org/abs/2305.04091) and [RAG](https://arxiv.org/abs/2005.11401) papers, GPT Researcher addresses misinformation, speed, determinism, and reliability by offering stable performance and increased speed through parallelized agent work.
**Our mission is to empower individuals and organizations with accurate, unbiased, and factual information through AI.**
## Why GPT Researcher?
- Forming objective conclusions through manual research can take weeks, requiring vast resources and time.
- LLMs trained on outdated information can hallucinate, becoming irrelevant for current research tasks.
- Current LLMs have token limitations, insufficient for generating long research reports.
- Limited web sources in existing services lead to misinformation and shallow results.
- Selective web sources can introduce bias into research tasks.
## Demo
<a href="https://www.youtube.com/watch?v=f60rlc_QCxE" target="_blank" rel="noopener">
<img src="https://github.com/user-attachments/assets/ac2ec55f-b487-4b3f-ae6f-b8743ad296e4" alt="Demo video" width="800" />
</a>
## Install as Claude Skill
Extend Claude's deep research capabilities by installing GPT Researcher as a [Claude Skill](https://skills.sh/assafelovic/gpt-researcher/gpt-researcher):
```bash
npx skills add assafelovic/gpt-researcher
```
Once installed, Claude can leverage GPT Researcher's deep research capabilities directly within your conversations.
## Architecture
The core idea is to utilize 'planner' and 'execution' agents. The planner generates research questions, while the execution agents gather relevant information. The publisher then aggregates all findings into a comprehensive report.
<div align="center">
<img align="center" height="600" src="https://github.com/assafelovic/gpt-researcher/assets/13554167/4ac896fd-63ab-4b77-9688-ff62aafcc527">
</div>
Steps:
* Create a task-specific agent based on a research query.
* Generate questions that collectively form an objective opinion on the task.
* Use a crawler agent for gathering information for each question.
* Summarize and source-track each resource.
* Filter and aggregate summaries into a final research report.
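The planner/executor/publisher flow above can be sketched in a few lines of Python. This is an illustrative outline only; the helper names (`run_research`, `crawl`) are ours, not the library's internals:

```python
import asyncio

async def run_research(query: str) -> str:
    # Planner: generate research questions for the query (placeholder logic)
    questions = [f"{query} - aspect {i}" for i in range(3)]

    # Execution agents: gather information for each question in parallel
    async def crawl(question: str) -> str:
        return f"summary for: {question}"

    summaries = await asyncio.gather(*(crawl(q) for q in questions))

    # Publisher: filter and aggregate the summaries into one report
    return "\n".join(summaries)

# asyncio.run(run_research("why is Nvidia stock going up?"))
```

The key design point is the `asyncio.gather` step: each research question is crawled concurrently, which is where the parallelized speedup comes from.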
## Tutorials
- [How it Works](https://docs.gptr.dev/blog/building-gpt-researcher)
- [How to Install](https://www.loom.com/share/04ebffb6ed2a4520a27c3e3addcdde20?sid=da1848e8-b1f1-42d1-93c3-5b0b9c3b24ea)
- [Live Demo](https://www.loom.com/share/6a3385db4e8747a1913dd85a7834846f?sid=a740fd5b-2aa3-457e-8fb7-86976f59f9b8)
## Features
- 📝 Generate detailed research reports using web and local documents.
- 🖼️ Smart image scraping and filtering for reports.
- 🍌 **AI-generated inline images** using Google Gemini (Nano Banana) for visual illustrations.
- 📜 Generate detailed reports exceeding 2,000 words.
- 🌐 Aggregate over 20 sources for objective conclusions.
- 🖥️ Frontend available in lightweight (HTML/CSS/JS) and production-ready (NextJS + Tailwind) versions.
- 🔍 JavaScript-enabled web scraping.
- 📂 Maintains memory and context throughout research.
- 📄 Export reports to PDF, Word, and other formats.
## 📖 Documentation
See the [Documentation](https://docs.gptr.dev/docs/gpt-researcher/getting-started) for:
- Installation and setup guides
- Configuration and customization options
- How-To examples
- Full API references
## ⚙️ Getting Started
### Installation
1. Install Python 3.11 or later. [Guide](https://www.tutorialsteacher.com/python/install-python).
2. Clone the project and navigate to the directory:
```bash
git clone https://github.com/assafelovic/gpt-researcher.git
cd gpt-researcher
```
3. Set up API keys by exporting them or storing them in a `.env` file.
```bash
export OPENAI_API_KEY={Your OpenAI API Key here}
export TAVILY_API_KEY={Your Tavily API Key here}
```
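Equivalently, the same keys can live in a `.env` file at the project root (the values below are placeholders):

```shell
# .env
OPENAI_API_KEY=your_openai_api_key
TAVILY_API_KEY=your_tavily_api_key
```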
(Optional) For enhanced tracing and observability, you can also set:
```bash
# export LANGCHAIN_TRACING_V2=true
# export LANGCHAIN_API_KEY={Your LangChain API Key here}
```
For custom OpenAI-compatible APIs (e.g., local models, other providers), you can also set:
```bash
export OPENAI_BASE_URL={Your custom API base URL here}
```
4. Install dependencies and start the server:
```bash
pip install -r requirements.txt
python -m uvicorn main:app --reload
```
Visit [http://localhost:8000](http://localhost:8000) to start.
For other setups (e.g., Poetry or virtual environments), check the [Getting Started page](https://docs.gptr.dev/docs/gpt-researcher/getting-started).
## Run as PIP package
```bash
pip install gpt-researcher
```
### Example Usage:
```python
import asyncio

from gpt_researcher import GPTResearcher

async def main() -> str:
    query = "why is Nvidia stock going up?"
    researcher = GPTResearcher(query=query)
    # Conduct research on the given query
    research_result = await researcher.conduct_research()
    # Write the report
    report = await researcher.write_report()
    return report

report = asyncio.run(main())
```
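Since the generated report is a plain markdown string, persisting it is straightforward. A minimal sketch; the `save_report` helper below is ours, not part of the package:

```python
from pathlib import Path

def save_report(report: str, path: str = "report.md") -> Path:
    # Write the generated markdown report to disk
    out = Path(path)
    out.write_text(report, encoding="utf-8")
    return out
```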
**For more examples and configurations, please refer to the [PIP documentation](https://docs.gptr.dev/docs/gpt-researcher/gptr/pip-package) page.**
### 🔧 MCP Client
GPT Researcher supports MCP integration to connect with specialized data sources like GitHub repositories, databases, and custom APIs. This enables hybrid research that combines these data sources with web search.
```bash
export RETRIEVER=tavily,mcp # Enable hybrid web + MCP research
```
```python
from gpt_researcher import GPTResearcher
import asyncio
import os

async def mcp_research_example():
    # Enable MCP with web search
    os.environ["RETRIEVER"] = "tavily,mcp"
    researcher = GPTResearcher(
        query="What are the top open source web research agents?",
        mcp_configs=[
            {
                "name": "github",
                "command": "npx",
                "args": ["-y", "@modelcontextprotocol/server-github"],
                "env": {"GITHUB_TOKEN": os.getenv("GITHUB_TOKEN")}
            }
        ]
    )
    research_result = await researcher.conduct_research()
    report = await researcher.write_report()
    return report

report = asyncio.run(mcp_research_example())
```
> For comprehensive MCP documentation and advanced examples, visit the [MCP Integration Guide](https://docs.gptr.dev/docs/gpt-researcher/retrievers/mcp-configs).
## 🍌 Inline Image Generation
GPT Researcher can automatically generate and embed AI-created illustrations in your research reports using Google's Gemini models (Nano Banana).
```bash
# Enable in your .env file
IMAGE_GENERATION_ENABLED=true
GOOGLE_API_KEY=your_google_api_key
IMAGE_GENERATION_MODEL=models/gemini-2.5-flash-image
```
When enabled, the system will:
1. Analyze your research context to identify visualization opportunities
2. Pre-generate 2-3 relevant images during the research phase
3. Embed them inline as the report is written
Images are generated with dark-mode styling that matches the GPT Researcher UI, featuring professional infographic aesthetics with teal accents.
[Learn more about Image Generation](https://docs.gptr.dev/docs/gpt-researcher/gptr/image_generation) in our documentation.
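The flags above are ordinary environment variables; a minimal sketch of how an application might read them (the `image_generation_config` helper name is ours, not the library's):

```python
import os

def image_generation_config() -> dict:
    # Read the image-generation flags described above
    return {
        "enabled": os.getenv("IMAGE_GENERATION_ENABLED", "false").lower() == "true",
        "api_key": os.getenv("GOOGLE_API_KEY"),
        "model": os.getenv("IMAGE_GENERATION_MODEL", "models/gemini-2.5-flash-image"),
    }
```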
## ✨ Deep Research
GPT Researcher now includes Deep Research - an advanced recursive research workflow that explores topics with agentic depth and breadth. This feature employs a tree-like exploration pattern, diving deeper into subtopics while maintaining a comprehensive view of the research subject.
- 🌳 Tree-like exploration with configurable depth and breadth
- ⚡️ Concurrent processing for faster results
- 🤝 Smart context management across research branches
- ⏱️ Takes ~5 minutes per deep research
- 💰 Costs ~$0.4 per research (using `o3-mini` on "high" reasoning effort)
[Learn more about Deep Research](https://docs.gptr.dev/docs/gpt-researcher/gptr/deep_research) in our documentation.
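The tree-like exploration pattern above can be sketched as a recursive function. This is illustrative only, under the assumption of a recursive depth/breadth split; the actual library API differs:

```python
import asyncio

async def explore(topic: str, depth: int, breadth: int) -> list[str]:
    # Placeholder research step: replace with a real agent call
    findings = [f"finding about {topic}"]
    if depth == 0:
        return findings
    # Placeholder subtopic generation: one branch per unit of breadth
    subtopics = [f"{topic} / subtopic {i}" for i in range(breadth)]
    # Concurrent processing of research branches
    results = await asyncio.gather(
        *(explore(s, depth - 1, breadth) for s in subtopics)
    )
    for branch_findings in results:
        findings.extend(branch_findings)
    return findings

# asyncio.run(explore("open source research agents", depth=2, breadth=2))
```

Each level fans out into `breadth` branches explored concurrently, while the recursion carries context down to `depth` levels, mirroring the depth/breadth configuration described above.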
## Run with Docker
> **Step 1** - Install Docker (see the [documentation](https://docs.gptr.dev) for the full Docker guide).