Bringing MCPs to the Cloud: How We Won the E2B Hackathon

AI

At the E2B Hackathon in Prague, we built a fully browser-based AI assistant powered by Anthropic’s Model Context Protocol (MCP). Our project lets you query live data from tools like GitHub—no local setup required.

The Challenge: Real-Time AI Tool Integration Without Local Setup

At the recent E2B Hackathon in Prague, our team set out to address an important challenge in the AI ecosystem: giving AI assistants the ability to retrieve and interact with live data from dynamic sources such as internal databases and business systems. Our project focused on using Anthropic’s Model Context Protocol (MCP) in a fully browser-based environment—making it easier to integrate AI agents with real-world tools without requiring any local setup.

In classic hackathon fashion, we pulled it off just in time—our project started working a minute before the submission deadline. The demo wasn’t flawless, but we successfully showcased a browser-based agentic flow that used MCP to query a database in natural language, all without any local setup. While we focused on a database example for simplicity, MCP servers are quickly emerging for tools like GitHub, Sentry, Zapier, and many more—making it clear the standard is gaining traction fast. The project resonated with the judges, and we were proud to walk away with first place.

Curious how we built it? Keep reading—we’re breaking it all down below.

What Is MCP?

The Model Context Protocol (MCP) is an open standard developed by Anthropic that enables two-way connections between AI assistants and data sources—whether that’s internal databases, Gmail, dev environments, or business platforms.

Instead of building and maintaining custom connectors for every tool, MCP offers a unified protocol that simplifies and scales the integration process.

MCP servers expose access to specific sources, while MCP clients—like AI applications or assistants—can query and interact with those servers to fetch relevant context. This allows AI models to understand the data landscape they’re operating in, choose the right tools for the task, and respond in a more informed and practical way.

In short: MCP connects your AI to the rest of your stack—consistently.
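To make the client–server split concrete, here is a minimal, self-contained sketch of the interaction MCP standardizes: a server advertises tools, and a client discovers and invokes them. This is a toy stand-in for illustration, not the real MCP SDK—`ToyMcpServer` and the `queryDatabase` tool are invented names, and the real protocol speaks JSON-RPC over a transport rather than direct method calls.

```typescript
// Toy stand-in for an MCP server: exposes the equivalent of
// MCP's tools/list and tools/call operations. Names here are
// illustrative, not the real SDK's API.
type ToolDef = { name: string; description: string };

class ToyMcpServer {
  private tools: Record<string, { def: ToolDef; run: (args: any) => string }> = {};

  register(def: ToolDef, run: (args: any) => string) {
    this.tools[def.name] = { def, run };
  }

  // tools/list: tell the client what this server can do
  listTools(): ToolDef[] {
    return Object.values(this.tools).map((t) => t.def);
  }

  // tools/call: execute one tool with the given arguments
  callTool(name: string, args: any): string {
    const tool = this.tools[name];
    if (!tool) throw new Error(`Unknown tool: ${name}`);
    return tool.run(args);
  }
}

const server = new ToyMcpServer();
server.register(
  { name: "queryDatabase", description: "Run a read-only SQL query" },
  (args) => `rows for: ${args.sql}`
);

// An MCP client (e.g. an AI assistant) discovers tools, then calls one.
console.log(server.listTools().map((t) => t.name));
console.log(server.callTool("queryDatabase", { sql: "SELECT 1" }));
```

The key point is the separation: the client never needs service-specific code, only the generic list/call protocol.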

Our Hackathon Challenge: Running MCP in the Browser

While MCP is powerful, it's typically run in local environments using hosts like Claude Desktop. We wanted to take that one step further: what if you could use an MCP-powered workflow right from the browser—no installs, no setup, just connect and go?

Here’s what we built:

  • A browser-based chat interface that can connect to any MCP server

  • Integration with E2B’s sandbox to run MCP servers securely

  • Support for live queries to data sources like GitHub using existing MCP servers

  • All of it runs fully in the browser—no local setup or config needed

This meant a user could open a web app and ask the AI,

“What are the open pull requests on our GitHub repo?”

…and get live results—powered by MCP, running securely in a hosted environment.

Zero Local Setup AI Chat with MCP Servers

Here's a live version you can try directly in your browser:

🌎 https://netglade.github.io/mcp-chat/

Everything runs client-side, meaning your API keys and data are never sent to our servers.

We’ve refactored our hackathon demo to showcase how you can integrate MCP support directly in the browser. It acts as a minimal, real-world example of how to:

  • Run MCP servers in the E2B cloud

  • Integrate with Vercel’s AI SDK and its newly added MCP support

  • Connect tools like GitHub via MCP without any local dependencies
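Conceptually, the glue between an MCP server and the LLM is small: fetch the server's tool definitions, wrap each as a callable the model can invoke, and hand the set to the chat call. The sketch below shows that shape in plain TypeScript; the Vercel AI SDK's actual MCP client does this for you, and `callMcpTool`, the wrapper types, and the URLs here are illustrative assumptions, not its real API.

```typescript
// Illustrative MCP-to-LLM tool wiring; not the AI SDK's real API.
type McpToolDef = { name: string; description: string };
type LlmTool = { description: string; execute: (args: unknown) => Promise<string> };

// Hypothetical caller for an MCP server's tool-call endpoint. In the
// real app this goes over the network to the sandboxed server; here
// we just echo so the sketch stays self-contained.
async function callMcpTool(serverUrl: string, name: string, args: unknown): Promise<string> {
  return `result of ${name}@${serverUrl} with ${JSON.stringify(args)}`;
}

// Wrap every tool a server advertises into something the LLM can call.
function wrapMcpTools(serverUrl: string, defs: McpToolDef[]): Record<string, LlmTool> {
  const tools: Record<string, LlmTool> = {};
  for (const def of defs) {
    tools[def.name] = {
      description: def.description,
      execute: (args) => callMcpTool(serverUrl, def.name, args),
    };
  }
  return tools;
}

const defs: McpToolDef[] = [
  { name: "listPullRequests", description: "List open PRs in a repo" },
];
const tools = wrapMcpTools("https://sandbox.example/mcp", defs);
tools["listPullRequests"].execute({ repo: "netglade/mcp-chat" }).then(console.log);
```

Because the wrapping is generic, adding a new MCP server adds its tools to the model with no service-specific code.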

Open Source Tools to Help You Get Started

We've open-sourced the demo on GitHub, so you can dive in, fork it, or use it as a boilerplate for your own MCP projects.

To make things easier, we turned the core of our implementation into an npm package, @netglade/mcp-sandbox, that you can build on. It makes it dead simple to spin up any MCP server inside an E2B sandbox environment.

npm install @netglade/mcp-sandbox

import { startMcpSandbox } from '@netglade/mcp-sandbox';

// Start an MCP server inside an E2B sandbox
const mcpSandbox = await startMcpSandbox({
  command: 'npx -y @modelcontextprotocol/server-brave-search',
  apiKey: 'e2b_****',
});

// Get the MCP server URL to connect your AI assistant
const mcpUrl = mcpSandbox.getUrl();
console.log("MCP server URL:", mcpUrl);

Features:

🧪 Uses supergateway to convert stdio-based MCP servers to SSE

🚀 Runs in a secure, isolated cloud sandbox

🔌 Supports any existing MCP server
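supergateway's job is conceptually simple: a stdio-based MCP server writes one JSON-RPC message per line, and the gateway re-emits each message as a Server-Sent Event a browser client can consume. Here is a minimal sketch of just that framing step, under the assumption of line-delimited JSON-RPC on stdout; real SSE streams also carry event ids, retry hints, and a return channel for client requests.

```typescript
// Sketch of stdio-to-SSE framing, the idea behind supergateway.
// A stdio MCP server emits one JSON-RPC message per line; an SSE
// stream wraps each message as a `data:` block ended by a blank line.
function jsonRpcLineToSseEvent(line: string): string {
  const msg = JSON.parse(line); // validate it's JSON before forwarding
  return `event: message\ndata: ${JSON.stringify(msg)}\n\n`;
}

const stdioLine = '{"jsonrpc":"2.0","id":1,"result":{"tools":[]}}';
console.log(jsonRpcLineToSseEvent(stdioLine));
```

This bridging is what lets a browser-only client talk to MCP servers that were written assuming a local process.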

Technical Breakdown: How It Works

  • MCP Server Integration
    You can use existing, open-source MCP servers for services like GitHub and Postgres. These act as standardized connectors to real-time data, exposing each service through a consistent protocol.

  • E2B Sandbox Runtime
    We run these servers inside E2B’s hosted sandbox environment. It provides an isolated, secure runtime with just enough access to execute tools and stream results—no infrastructure or manual setup required.

  • Browser-Based MCP Clients via Vercel AI SDK
    To connect to the MCP servers, we use the Vercel AI SDK’s newly added MCP support. This allows us to create MCP clients directly in our browser chat client, fetch available tools from remote servers, and pass them seamlessly into the LLM—all with minimal code.

  • Custom Web Interface
    We modified E2B’s Fragments chat interface demo, enabling users to ask questions in natural language. Behind the scenes, the app collects all available tools from connected MCP servers and passes them, along with the user’s query, to the LLM. The model then decides which tools (if any) to call and with what parameters.

  • No Local Setup Required
    The entire system runs fully in the browser—no installs, no local setup. Just open the app, connect any MCP server you need, and start querying live data with natural language.
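The steps above boil down to a small agentic loop: gather the available tools, let the model pick one, execute it, and return the result. The sketch below shows that loop end to end, with a hard-coded `decide` function and canned PR data standing in for the real LLM call and the real GitHub MCP server.

```typescript
// Minimal agentic dispatch loop. `decide` is a stand-in for the LLM,
// which would normally receive the query plus the tool list and pick
// a tool and its parameters. The tool result here is toy data.
type Tool = { description: string; run: (args: any) => string };

const tools: Record<string, Tool> = {
  listPullRequests: {
    description: "List open PRs in a GitHub repo",
    run: (args) => `open PRs in ${args.repo}: #12, #15`,
  },
};

type Decision = { tool: string; args: any } | { answer: string };

// Stand-in for the model's tool choice.
function decide(query: string): Decision {
  if (query.toLowerCase().includes("pull request")) {
    return { tool: "listPullRequests", args: { repo: "netglade/mcp-chat" } };
  }
  return { answer: "No tool needed." };
}

function answer(query: string): string {
  const decision = decide(query);
  if ("tool" in decision) {
    // Execute the chosen tool and return its result to the user.
    return tools[decision.tool].run(decision.args);
  }
  return decision.answer;
}

console.log(answer("What are the open pull requests on our GitHub repo?"));
```

In the real app the loop can iterate: the tool result is fed back to the model, which may call further tools before producing its final answer.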

Final Shoutouts & Thanks

Big thanks to the E2B team and Rohlik Group for organizing a smooth, well-run event that gave us the room to experiment and actually build something that works. It was also great to see the variety of creative projects other teams pulled together in just a single day—plenty of inspiration all around.

Sources:
https://lu.ma/b9ekgwqf?tk=oVk5oe

https://vercel.com/blog/ai-sdk-4-2

https://www.anthropic.com/news/model-context-protocol

https://modelcontextprotocol.io/introduction