DR-Tulu is AI2's open-source research agent—an 8B model that autonomously plans research strategies, searches the web, reads pages, and synthesizes comprehensive answers with citations. Unlike typical LLMs with tools bolted on, DR-Tulu was trained end-to-end with its MCP tools, making it a compelling open alternative to proprietary research APIs. This guide walks through deploying DR-Tulu on Vast.ai, where you can run the full agent stack on affordable GPU instances and integrate it into your own agentic applications.
Key features include:
- Autonomous research planning: the agent decides its own search strategy rather than following a fixed pipeline
- Native tool use: trained end-to-end with its MCP tools for web search and page reading
- Cited synthesis: answers are composed from retrieved sources, with citations
- Fully open: an 8B model you can self-host, instead of a proprietary research API
Our latest documentation walks you through deploying DR-Tulu on Vast.ai with a split architecture: vLLM handles GPU inference on a Vast.ai instance, while the MCP backend runs locally to execute tools.
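As a rough sketch of that split (the Hugging Face model id, instance address, and environment variable name below are assumptions for illustration — check the guide for the exact values):

```shell
# On the Vast.ai GPU instance: serve the model with vLLM's
# OpenAI-compatible server (model id is an assumption, verify it).
vllm serve allenai/DR-Tulu-8B --host 0.0.0.0 --port 8000

# On your local machine: run the MCP tool backend and point it at the
# remote inference endpoint (variable name is hypothetical).
export INFERENCE_URL=http://YOUR_INSTANCE_IP:8000/v1
```

Once the server is up, any OpenAI-compatible client can talk to the instance at that URL.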
Complete Deployment Instructions
Three Ways to Use DR-Tulu
The split deployment keeps GPU-intensive inference on Vast.ai while running tool orchestration locally. This means:
- You pay for GPU time only where it matters: model inference
- Tool calls (web search, page reading) run on your own machine, using your own network and credentials
- You can resize or swap the GPU instance without touching your tool setup
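On the local side, integrating the remote instance into your own application comes down to sending OpenAI-compatible requests to vLLM's `/v1/chat/completions` endpoint. A minimal stdlib-only sketch — the endpoint address and served model name are assumptions (they must match what you passed to `vllm serve`):

```python
import json
import urllib.request

# Hypothetical endpoint; replace with your Vast.ai instance's IP and port.
VAST_ENDPOINT = "http://YOUR_INSTANCE_IP:8000/v1/chat/completions"

def build_request(question: str, model: str = "dr-tulu-8b") -> dict:
    """Build an OpenAI-compatible chat completion payload.

    The model name must match the one vLLM is serving (assumed here).
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": question}],
        "temperature": 0.7,
    }

def ask(question: str) -> str:
    """POST the payload to the remote vLLM server and return the answer text."""
    data = json.dumps(build_request(question)).encode()
    req = urllib.request.Request(
        VAST_ENDPOINT,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

In practice you would wrap `ask()` behind your MCP tool loop, but the request shape stays the same for any OpenAI-compatible client.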
This deployment guide is perfect for:
- Developers integrating a research agent into their own agentic applications
- Teams looking for an open, self-hosted alternative to proprietary research APIs
- Anyone exploring agentic AI architectures on affordable GPU instances
The complete guide is now available in our documentation:
Read: Running DR-Tulu on Vast.ai →
Whether you're exploring agentic AI architectures or ready to deploy your own research assistant, this guide provides everything you need to get DR-Tulu running on Vast.ai infrastructure.
Ready to try it? Sign up for Vast.ai and follow the guide to deploy your first instance.