
Feature Request: Add MCP Client Sampling Support to Codex #4929

@yaonyan

Description


What feature would you like to see?

Add MCP Client Sampling support to Codex, following the MCP Specification.

What this enables:
Sampling lets an MCP server request a completion from the AI model the client (Codex) already has access to, turning simple MCP tools into intelligent agents that can reason and make decisions.

Practical Example - Background Code Style Checker:

While coding, you start an MCP style checker tool that returns immediately, then works silently in the background.

It monitors file saves and uses AI (via sampling) to analyze violations in context, understanding your project's conventions and complexity patterns.

The analysis runs asynchronously without blocking your workflow, using your existing OpenAI subscription and requiring no separate API keys.
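Concretely, the style checker would issue a sampling request back to Codex over the existing MCP connection. The sketch below shows the shape of such a request as plain Python dicts; the field names follow the MCP sampling specification, while the prompt text, model hint, and priority values are illustrative placeholders.

```python
# Sketch of the sampling/createMessage request an MCP server sends back
# to the client (here, Codex). Field names follow the MCP sampling spec;
# the prompt, model hint, and priorities are illustrative placeholders.
import json

request = {
    "jsonrpc": "2.0",
    "id": 42,
    "method": "sampling/createMessage",
    "params": {
        "messages": [
            {
                "role": "user",
                "content": {
                    "type": "text",
                    "text": "Review this saved file for style violations: ...",
                },
            }
        ],
        "modelPreferences": {
            "hints": [{"name": "gpt-4o-mini"}],  # advisory name substrings
            "costPriority": 0.8,          # background task: prefer cheap
            "speedPriority": 0.5,
            "intelligencePriority": 0.3,
        },
        "systemPrompt": "You are a code style checker.",
        "maxTokens": 500,
    },
}

print(json.dumps(request, indent=2))
```

The client reviews the request (optionally showing it to the user), forwards it to a model, and returns the completion in the JSON-RPC response, so the server never needs its own API key.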

Technical Requirements:

  • Declare sampling: {} capability when connecting to MCP servers
  • Handle sampling/createMessage requests from MCP servers
  • Support model preferences (hints, cost/speed/intelligence priorities)
  • Forward requests to appropriate OpenAI models
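The first and last requirements can be sketched as follows. This is a minimal illustration, not Codex's actual implementation: the protocol version, candidate model names, and tie-breaking thresholds are assumptions, and per the MCP spec the hints and priorities are advisory, so the client is free to choose differently.

```python
# Minimal sketch of the client side of MCP sampling. Protocol version,
# model names, and the priority tie-break are hypothetical choices;
# Codex would substitute its own configuration.

def initialize_params() -> dict:
    # Client capabilities sent in the MCP `initialize` request.
    # Declaring `sampling: {}` tells the server it may issue
    # sampling/createMessage requests back to this client.
    return {
        "protocolVersion": "2025-03-26",
        "capabilities": {"sampling": {}},
        "clientInfo": {"name": "codex-cli", "version": "0.0.0"},
    }

def choose_model(prefs: dict) -> str:
    # Map MCP model preferences to a concrete OpenAI model.
    # Hints are advisory name substrings; priorities are floats in 0..1.
    candidates = ["gpt-4o", "gpt-4o-mini"]  # illustrative candidates
    for hint in prefs.get("hints", []):
        for model in candidates:
            if hint.get("name", "") in model:
                return model
    # No usable hint: trade intelligence off against cost.
    if prefs.get("intelligencePriority", 0) >= prefs.get("costPriority", 0):
        return "gpt-4o"
    return "gpt-4o-mini"
```

For example, a server that hints `"mini"` would be routed to `gpt-4o-mini`, while one that only sets a high `intelligencePriority` would get the stronger model.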

Benefits:

  • Users don't need separate API keys for each MCP server
  • Users control which model is used (balancing cost vs intelligence)
  • Enables context-aware MCP tools that can perform complex reasoning
  • Provides feature parity with VS Code (which already supports this)

Additional information

This feature is part of the MCP specification and would unlock a new ecosystem of intelligent MCP tools for Codex users.

For reference, VS Code already supports this feature with GitHub Copilot, allowing MCP servers to leverage the user's existing AI subscription through sampling requests:

[Screenshot: MCP sampling with GitHub Copilot in VS Code]
