The Problem We Were Solving: Why Does a FinOps Engineer Need Wiv MCP?

If you use an AI assistant like Cursor, Claude, or ChatGPT, you’ve probably found yourself doing something like this: asking the AI to analyze a cost anomaly or draft a rightsizing recommendation, then switching to another tab to manually update the case, ping the resource owner on Slack, and open a ticket in Jira – then switching back to your chat. It’s tedious – and it breaks your focus right when you need it most.

Wiv is a FinOps automation platform. We help cloud cost teams move from insight to action – automating the repetitive operational work that sits between a recommendation and actual savings. When the MCP (Model Context Protocol) standard emerged, we saw an opportunity: what if your AI assistant could handle that entire follow-up loop directly – opening a case, notifying the right team, and tracking progress – without you ever leaving the chat?

That’s what we built. This post is the story of how we did it – and a few things we learned along the way.

What Is MCP, in Plain English?

MCP stands for Model Context Protocol. Think of it as a plug-and-socket standard: AI assistants are the plugs, and tools like Wiv are the sockets. Once you connect them, the AI can call your tools directly – search cases, trigger workflows, check execution status – without you doing the copy-paste in between.

Before MCP, every AI tool integration was custom-built and fragile. MCP gives us one standard that works across Cursor, Claude, ChatGPT, VS Code, and more.
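Under the hood, MCP is JSON-RPC over a transport. A minimal sketch, using only the standard library, of the exchange where a client asks a server which tools it offers – the tool name shown here is illustrative, not Wiv’s actual tool list:

```python
import json

# An MCP client discovers a server's capabilities with a JSON-RPC
# "tools/list" request.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# The server replies with tool names, descriptions, and JSON Schemas
# for their inputs -- this is what lets the AI pick the right tool
# and fill in its arguments.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "search_cases",  # illustrative name
                "description": "Search and filter Wiv cases",
                "inputSchema": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                },
            }
        ]
    },
}

print(json.dumps(response["result"]["tools"][0]["name"]))
```

The input schema is the key piece: it tells the assistant exactly what arguments each tool accepts, so no custom glue code is needed per client.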

What Can Your AI Assistant Actually Do?

Once connected, your AI assistant has access to Wiv tools, organized into five areas:

  • Workflows – list, view, and update your workflows
  • Executions – start, stop, and check the status of workflow runs
  • Spaces – browse and explore your Wiv workspaces
  • Cases – search, filter, and summarize cases, including across multiple tenants
  • AI Generation – generate new workflows from a plain-language description

In practice, this means you can say things like:

  • “List all my workflows in the Operations space”
  • “Run the Monthly Report workflow for ACME Corp”
  • “Show me all open cases from last week”
  • “Create a new workflow that monitors our AWS costs and alerts on anomalies”

The AI assistant figures out which Wiv tool to call and handles all the API details. You just describe what you want.
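The dispatch step can be sketched as a small routing table: the server receives an MCP `tools/call` request and translates it into a Wiv API call. The tool names and API paths below are illustrative assumptions, and the HTTP client is replaced by a stub:

```python
import json

# Illustrative registry mapping MCP tool names to Wiv API endpoints.
# The real tool names and routes may differ.
TOOL_ROUTES = {
    "search_cases": ("GET", "/api/cases/search"),
    "start_execution": ("POST", "/api/executions"),
}

def call_wiv_api(method: str, path: str, params: dict) -> dict:
    """Stub for the real HTTP client; returns a canned payload."""
    return {"method": method, "path": path, "params": params}

def handle_tool_call(rpc_request: dict) -> dict:
    """Translate an MCP tools/call request into a Wiv API request."""
    name = rpc_request["params"]["name"]
    arguments = rpc_request["params"].get("arguments", {})
    method, path = TOOL_ROUTES[name]
    result = call_wiv_api(method, path, arguments)
    # MCP tool results are returned as typed content blocks.
    return {
        "jsonrpc": "2.0",
        "id": rpc_request["id"],
        "result": {"content": [{"type": "text", "text": json.dumps(result)}]},
    }

reply = handle_tool_call({
    "jsonrpc": "2.0",
    "id": 7,
    "method": "tools/call",
    "params": {"name": "search_cases", "arguments": {"query": "open last week"}},
})
```

The assistant never sees the routing table – it only sees tool names and schemas, and the server owns the mapping to real API endpoints.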

The Infrastructure: Simple by Design

We chose AWS App Runner to host the server. The pitch is simple: give it a container image, and it handles everything else – scaling, HTTPS, health checks, zero-downtime deploys. We don’t manage servers.

The server itself is a Python application that speaks MCP. When an AI assistant connects, it gets a list of Wiv tools it can call. When it calls one, the server translates that into a Wiv API request and returns the result.

One interesting infrastructure detail: we support two connection styles. The newer “Streamable HTTP” style makes short, independent requests each time a tool is called – which means the server can scale down to zero when nobody’s using it. The older “SSE” style holds a persistent connection open, which we still support for backward compatibility.

Sign In Once, Use Everywhere

The part of this project that took the most thought wasn’t the tools themselves – it was authentication. How do you securely connect an AI assistant (running on someone else’s server) to a user’s Wiv account?

We use a standard called OAuth 2.1 – the same protocol used when you click “Sign in with Google” on a website. Here’s the flow from a user’s perspective:

  1. Visit mcp.wiv.ai/connect – you see a branded page with a “Sign in with Wiv” button.
  2. Sign in – you’re taken to Wiv’s normal login page. No new account needed.
  3. One-click install – after signing in, you see ready-made install buttons for Cursor, VS Code, Claude Desktop, and more. Click one, and the MCP server is added to your AI assistant automatically.
  4. Start using it – open your AI assistant and ask it to list your workflows, run a workflow, or check an execution status. It just works.

Behind the scenes, we do some extra work to make this secure: we generate a short-lived credential for your specific Wiv organization, and that credential is what the AI assistant uses to make requests on your behalf. It never touches your actual password or long-lived account tokens.
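Part of that behind-the-scenes work is PKCE, which OAuth 2.1 makes mandatory. A minimal sketch of the client side using only the standard library: generate a random verifier, derive a challenge from it, and send only the challenge with the initial authorization request:

```python
import base64
import hashlib
import secrets

# PKCE (RFC 7636): the client keeps a secret "verifier" and sends only
# its SHA-256 hash (the "challenge") in the authorization request.
code_verifier = (
    base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
)

digest = hashlib.sha256(code_verifier.encode("ascii")).digest()
code_challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

# The challenge travels with the /authorize redirect; the verifier is
# revealed only later, during the token exchange -- so an intercepted
# authorization code is useless on its own.
authorize_params = {
    "response_type": "code",
    "code_challenge": code_challenge,
    "code_challenge_method": "S256",
}
```

The authorization server stores the challenge, and when the client later presents the verifier, it re-hashes it and checks for a match before issuing a token.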

Setting It Up: Seven Clients, One Page

Different AI assistants handle authentication differently. Cursor and Claude have native OAuth support – they’ll pop up a sign-in window automatically. Others like VS Code and Windsurf use an API key that you paste into a config file. We support both flows.

The /connect page generates the right setup instructions for each client – no guesswork. Here’s a quick overview:

  • Cursor – one-click OAuth, no config needed
  • Claude (Desktop or Web) – one-click OAuth or paste a URL
  • ChatGPT – one-click OAuth
  • VS Code – paste a short JSON config with your API key
  • Windsurf – paste a short JSON config with your API key
  • Gemini CLI – paste a one-liner command

For OAuth-supported clients, the config is as simple as:

{ "mcpServers": { "wiv": { "url": "https://mcp.wiv.ai/mcp" } } }

That’s it. The AI assistant handles the rest.

What We Learned

A few things surprised us during the build that are worth sharing:

The sign-in flow is the hardest part

The actual tool definitions were straightforward. Getting OAuth to work correctly across multiple AI clients – each with slightly different expectations – took the most iteration. Modern clients expect the MCP server itself to act as an OAuth server, not just redirect to a third-party identity provider.
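Acting as an OAuth server concretely means serving discovery metadata that clients fetch before the first sign-in. A hedged sketch of what such an RFC 8414 `/.well-known/oauth-authorization-server` document might contain – the endpoint paths here are assumptions for illustration, not Wiv’s actual routes:

```python
import json

# RFC 8414 authorization-server metadata, as an MCP client would fetch
# it before starting the sign-in flow. Paths are illustrative.
metadata = {
    "issuer": "https://mcp.wiv.ai",
    "authorization_endpoint": "https://mcp.wiv.ai/authorize",
    "token_endpoint": "https://mcp.wiv.ai/token",
    "registration_endpoint": "https://mcp.wiv.ai/register",
    "code_challenge_methods_supported": ["S256"],
    "grant_types_supported": ["authorization_code", "refresh_token"],
}

body = json.dumps(metadata, indent=2)
```

Once a client can discover these endpoints, it can register itself, run the authorization-code flow, and refresh tokens without any client-specific configuration – which is what makes "works across Cursor, Claude, and ChatGPT" possible.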

Stateless connections scale much better

Our newer “Streamable HTTP” connection style makes individual requests only when needed. This means the server can spin down when idle and scale up instantly when someone starts using it. The older persistent connection style (SSE) keeps a connection open the entire time, which is fine but less efficient for cloud infrastructure.

Small details matter for the user experience

One-click install links, pre-formatted config snippets, and clear labels like “OAuth” vs “API Key” make a big difference. The faster someone gets from “I want to try this” to “it’s working in my AI assistant,” the better. We aim for under 30 seconds.

Security defaults should be invisible

Things like PKCE, token introspection, and URL fragment-based key delivery are important for security – but users shouldn’t need to know about them. We made sure the secure path is also the easy path.

Try It Yourself

If you’re a Wiv customer, you can connect your AI assistant at https://mcp.wiv.ai. It takes about 30 seconds.

If you’re an AI assistant user curious about what’s possible: open Cursor, Claude, or ChatGPT, connect the Wiv MCP server, and try asking it to list your workflows. It’s a good demo of what this new generation of AI-tool integrations can do.

If you’re a developer interested in the technical details – the dual PKCE chains, the ASGI middleware approach, the contextvars pattern – we’re happy to go deeper. Reach out to the Wiv team.