The idea

"Postman meets Datadog for MCP servers." Continuous compliance testing, tool-call monitoring, latency tracking, scored health reports. A thin, affordable SaaS — $29 / $99 / $249 — for indie devs and small teams running MCP servers in production.

I'd been using Anthropic's MCP Inspector locally and was frustrated that it didn't run continuously, didn't alert, didn't keep history. That gap looked real. Model Context Protocol is exploding; surely someone needed monitoring.

It turned out several people had already done it, better.

What I built in week one

The work itself was fine. The engineering was clean, the compliance library was genuinely useful, and the architecture would have scaled. That's not the problem.

What I found in week two when I actually looked

Here's who was already in the space on the day I started. I just hadn't searched hard enough.

destilabs/mcp-doctor
Open-source Python CLI that scores MCP servers 0–100, on mcpdoctor.dev. Same name. Same scoring concept. Free. The name I'd been congratulating myself for picking was already someone else's open-source project with the better TLD.
Apify MCP Validator
Hosted, $10/month. The "affordable SaaS" wedge I was planning to own at $29? There's a $10 one already on a marketplace with an existing audience.
MintMCP
Enterprise governance, backed by Andrej Karpathy, Jeff Dean, and Coatue. Not competing with that cap table. But they define the ceiling of the market and squeeze everyone beneath them.
Speakeasy MCP Platform
Full MCP control plane for enterprises. Covers the "serious production use" segment.
Janix-ai/mcp-validator
Compliance test scripts from an ex-SXSW CTO. Open source, developer credibility built in.
MCPJam Inspector
Fork of Anthropic's inspector with an OAuth debugger and LLM playground. Free. What most developers would actually use.

Put together: a free open-source CLI at the low end, a $10/month hosted validator on a discovery-optimized marketplace, a funded enterprise platform, a funded governance player, open-source compliance test scripts, a free inspector fork that most developers would actually use, and Anthropic's own inspector as the baseline every developer sees first. My $29–$249 SaaS sat in the middle of a six-way squeeze with no defensible wedge.

Why I killed it instead of pushing through

I had three honest choices:

  1. Push forward. Build the MVP, launch the waitlist, try to find 50 people who want to pay $29/month for compliance testing despite the above. Realistic outcome: a long grind against better-funded, better-known competitors with me as a solo founder.
  2. Pivot the positioning. Reframe as "MCP uptime monitoring for Claude Code users" or something narrower. The narrow version has even fewer paying customers, and competitors can add that feature in a sprint.
  3. Kill it. Refund my own attention. Redirect the two domains to this post. Ship something the market is actually asking for.

Decision rule I wish I'd applied on day one: if there are already three funded or popular competitors covering the same price band, you need a real moat or a specific buyer who hates them, not just "a better version." I had neither.

Killing is cheap when the numbers are small. It gets exponentially harder after you hire, take funding, or attract users. Nine days in was the easiest it would ever be to walk away. So I did.

The real cost wasn't the money

The dollar cost was negligible. The real cost was the confidence trap.

I had built things that worked before, so I assumed this market was as open as it looked from the inside. I skipped searching because I wanted to feel productive building instead of slow validating. That's the whole mistake in one sentence: confidence compounded into momentum, and momentum delayed the search that would have stopped me on day one.

The antidote is a habit, not a resolution. Before I buy a domain, write a landing page, or pick a name, I now do this:

  1. Google the name in quotes. Check the first two pages.
  2. Search NPM, PyPI, and GitHub for it.
  3. Check whether the .dev and .ai TLDs are taken by something related.
  4. Find the three closest competitors and read their pricing pages.
  5. Write down what my moat is in one sentence. If I can't, stop.

Four minutes. Every time. No exceptions for "I already know this space."
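
Steps 2 and 3 are the easiest to automate if you'd rather not click around. Below is a minimal Python sketch of just that part, assuming the requests library is installed; it queries the public npm, PyPI, and GitHub APIs and does a plain DNS lookup on the .dev and .ai domains. The function name and output format are mine, not part of any tool mentioned above.

```python
# Hypothetical sketch of steps 2-3 of the pre-commit name search.
# Assumes `requests` is installed; uses only public, unauthenticated APIs.
import socket

import requests


def check_name(name: str) -> None:
    # Step 2a: package registries. A 200 response means the name is taken.
    npm = requests.get(f"https://registry.npmjs.org/{name}", timeout=10)
    pypi = requests.get(f"https://pypi.org/pypi/{name}/json", timeout=10)
    print(f"npm taken:  {npm.status_code == 200}")
    print(f"PyPI taken: {pypi.status_code == 200}")

    # Step 2b: GitHub. How many repos already answer to this name?
    gh = requests.get(
        "https://api.github.com/search/repositories",
        params={"q": name, "per_page": 5},
        timeout=10,
    )
    if gh.ok:
        data = gh.json()
        print(f"GitHub repos matching '{name}': {data.get('total_count', 0)}")
        for repo in data.get("items", []):
            print(f"  {repo['full_name']} ({repo['stargazers_count']} stars)")

    # Step 3: TLDs. If the domain resolves, someone already owns and uses it.
    for tld in ("dev", "ai"):
        domain = f"{name}.{tld}"
        try:
            socket.gethostbyname(domain)
            print(f"{domain}: resolves (taken)")
        except socket.gaierror:
            print(f"{domain}: no DNS record")


if __name__ == "__main__":
    check_name("mcpdoctor")  # would have flagged mcpdoctor.dev on day one
```

It won't tell you whether a competitor is funded or what they charge; steps 4 and 5 still need a human.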

What mcpdoctor.ai does now

Both domains 301 redirect to this post. If you came looking for an MCP monitoring SaaS:

  1. destilabs/mcp-doctor, if you want a free open-source CLI that scores your server.
  2. Apify's MCP Validator, if you want a hosted check at $10/month.
  3. MCPJam Inspector, if you want a free, richer debugger than the stock Anthropic inspector.

One of those is the right answer for you.

What I'm building instead

Smara — a memory API for AI applications, priced at $19/month for developers and $99/month for Pro. That wedge is defensible because the comparable tools (Mem0 at $249/mo, Zep at $475/mo Flex Plus, Letta at $200/mo Max) are charging 2.5–25× more and targeting mid-market enterprise buyers, not the indie dev who wants to ship a memory-aware bot this weekend.

This time I validated the pricing, the buyer, and the moat before writing a line of product code. I'll write about that process separately.

The takeaway

Shipping fast is a virtue. Shipping the wrong thing fast is just velocity aimed at a wall. The cheapest tool I now have in my stack isn't a framework — it's a four-minute search I run before I commit.

If you're about to build something, run that four-minute search now, for your name, your category, your wedge. You'll either validate faster or save yourself the detour.