Codex CLI
Commit: 85034b1

Core Metadata
License: Apache-2.0
This repository is licensed under the [Apache-2.0 License](LICENSE).
apply_patch
The apply_patch tool uses context-anchored diff hunks (`@@` headers plus context/old lines) matched with a seek-sequence search; the call errors if the context or expected lines cannot be found. Evidence: `builder.register_handler("apply_patch", apply_patch_handler);`, `name: "apply_patch"`.
The freeform grammar defines `change_context: ("@@" | "@@ " /(.+)/) LF`; if a chunk has a `change_context`, seek_sequence is used to locate it, and a miss surfaces as `Failed to find context '{}' in {}` or `Failed to find expected lines in {}`. Behavior is reject-on-mismatch: patch input is re-parsed and verified before apply, parse/context errors fail the call, and there is no fuzzy merge path.
There is no automatic apply_patch retry loop in the handler; a single verified parse/apply attempt returns success or error. Evidence: `Re-parse and verify the patch`, `apply_patch verification failed: {parse_error}`, `apply_patch handler received invalid patch input`.
Failure modes (dispatch runs through `match codex_apply_patch::maybe_parse_apply_patch_verified(&command, &cwd) {`):
- implicit_invocation_rejected: raw patch bodies without an explicit apply_patch invocation are rejected (`MaybeApplyPatchVerified::CorrectnessError`, `CorrectnessError(ApplyPatchError::ImplicitInvocation)`, `apply_patch handler received non-apply_patch input`).
- parse_or_payload_error: unsupported payloads and invalid/non-apply_patch input produce model-visible errors (`apply_patch handler received unsupported payload`).
- context_or_expected_lines_missing: update hunks fail when context/old lines cannot be located (`Failed to find expected lines in {}`).
- io_failures: read/write/delete failures bubble up as apply errors (`Failed to read file to update for hunk in hunks`).
- partial_apply_possible: hunks are applied sequentially with no rollback, so early changes can persist if a later hunk fails; see the sketch after this list.
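To make the partial-apply hazard concrete, here is a minimal Rust sketch of context-anchored, sequential hunk application. `Hunk`, `find_seq`, and `apply_hunks` are hypothetical stand-ins for illustration, not the actual `codex_apply_patch` API:

```rust
// Minimal sketch (not the real codex_apply_patch API): context-anchored
// hunk matching with sequential, no-rollback application.
struct Hunk {
    context: Vec<String>,     // lines that must already exist at the anchor
    replacement: Vec<String>, // lines that replace them
}

/// Find the first index where `needle` appears as a contiguous slice.
fn find_seq(haystack: &[String], needle: &[String]) -> Option<usize> {
    if needle.is_empty() {
        return Some(0);
    }
    haystack.windows(needle.len()).position(|w| w == needle)
}

fn apply_hunks(lines: &mut Vec<String>, hunks: &[Hunk]) -> Result<(), String> {
    for (i, hunk) in hunks.iter().enumerate() {
        // Reject-on-mismatch: a missing context sequence aborts the call,
        // but hunks applied in earlier iterations persist.
        let start = find_seq(lines, &hunk.context)
            .ok_or_else(|| format!("Failed to find context for hunk {i}"))?;
        lines.splice(start..start + hunk.context.len(), hunk.replacement.iter().cloned());
    }
    Ok(())
}
```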
Tools
exec_command, write_stdin, shell, apply_patch, read_file, list_dir, grep_files, list_mcp_resources, list_mcp_resource_templates, read_mcp_resource, update_plan, request_user_input, js_repl, js_repl_reset, search_tool_bm25, view_image, web_search, spawn_agent, send_input, resume_agent, wait, close_agent, dynamic_tools, mcp_tools
Tool registration is mixed: JSON-schema function tools, freeform grammar tools (Lark), a native web_search tool type, plus MCP-to-tool conversion and dynamic tools. Evidence: `create_exec_command_tool`, `create_list_mcp_resources_tool`, `builder.push_spec(create_apply_patch_freeform_tool())`, `builder.push_spec_with_parallel_support(create_read_file_tool(), true)`, `builder.push_spec(ToolSpec::WebSearch`, `builder.push_spec(create_spawn_agent_tool())`, `builder.push_spec(ToolSpec::Function(converted_tool))`, `builder.register_handler(tool.name.clone(), dynamic_tool_handler.clone())`.
Spec variants include `ToolSpec::Function(ResponsesApiTool`, `ToolSpec::Freeform(FreeformTool`, and `ToolSpec::WebSearch`, with MCP tools converted via `match mcp_tool_to_openai_tool`. Command execution is sandbox-policy driven (none, macOS Seatbelt, Linux seccomp, or a Windows restricted token) with approval gating and execpolicy allow/prompt/forbid decisions. Evidence: `match sandbox`, `SandboxType::LinuxSeccomp`, `SandboxType::WindowsRestrictedToken`, `default_exec_approval_requirement`, `Decision::Forbidden`.
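A minimal sketch of how an execpolicy decision and an approval policy can combine to gate a command, assuming simplified enums; `Decision::Forbidden` appears in the source, but `ApprovalPolicy`, `Gate`, and `gate_command` are hypothetical illustrations:

```rust
// Hypothetical gating sketch; not Codex's actual types or control flow.
#[derive(Clone, Copy)]
enum Decision { Allow, Prompt, Forbidden }

#[derive(Clone, Copy)]
enum ApprovalPolicy { Never, OnRequest, Always }

enum Gate { Run, AskUser, Reject(&'static str) }

fn gate_command(decision: Decision, approval: ApprovalPolicy) -> Gate {
    match decision {
        // A forbidden command never runs, regardless of approval policy.
        Decision::Forbidden => Gate::Reject("command forbidden by execpolicy"),
        // A prompt decision defers to the approval policy.
        Decision::Prompt => match approval {
            ApprovalPolicy::Never => Gate::Reject("approval required but policy is never"),
            _ => Gate::AskUser,
        },
        Decision::Allow => match approval {
            ApprovalPolicy::Always => Gate::AskUser,
            _ => Gate::Run,
        },
    }
}
```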
Config
Configuration is built up from multiple layers, applied in the following order:
- admin managed preferences (macOS)
- system `/etc/codex/config.toml` (Unix) or `%ProgramData%\OpenAI\Codex\config.toml` (Windows)
- user `${CODEX_HOME}/config.toml`
- cwd `${PWD}/config.toml` (trust-gated)
- tree parent directories up to the root, looking for `./.codex/config.toml` (trust-gated)
- repo `$(git rev-parse --show-toplevel)/.codex/config.toml` (trust-gated)
- runtime overrides (e.g., `--config` flags, the UI model selector)

Environment variables:
- name: CODEX_HOME meaning: Overrides Codex state/config home directory (default `~/.codex`).
- name: RUST_LOG meaning: Controls Codex logging verbosity.
- name: OPENAI_BASE_URL meaning: Overrides default OpenAI provider base URL.
- name: OPENAI_ORGANIZATION meaning: Supplies OpenAI-Organization request header.
- name: OPENAI_PROJECT meaning: Supplies OpenAI-Project request header.
- name: CODEX_OSS_PORT meaning: Sets default local OSS provider port used to build base URL.
- name: CODEX_OSS_BASE_URL meaning: Directly overrides local OSS provider base URL.
- name: CODEX_JS_REPL_NODE_PATH meaning: Highest-precedence Node runtime path for js_repl.
Evidence: `specified by the CODEX_HOME environment variable`, `honors the RUST_LOG environment variable`, `exporting OPENAI_BASE_URL`, `("OpenAI-Organization".to_string(), "OPENAI_ORGANIZATION".to_string()),`, `("OpenAI-Project".to_string(), "OPENAI_PROJECT".to_string()),`, `std::env::var("CODEX_OSS_PORT")`, `std::env::var("CODEX_OSS_BASE_URL")`, `CODEX_JS_REPL_NODE_PATH environment variable`.
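A minimal sketch of the OSS base-URL resolution implied by the two variables above, assuming `CODEX_OSS_BASE_URL` wins outright and `CODEX_OSS_PORT` only feeds a localhost default; the `/v1` suffix and the 11434 fallback port are assumptions, not taken from the source:

```rust
// Hypothetical resolution order: explicit base URL, then port-derived URL.
fn oss_base_url() -> String {
    if let Ok(url) = std::env::var("CODEX_OSS_BASE_URL") {
        return url; // direct override wins
    }
    // Fall back to a localhost URL built from CODEX_OSS_PORT; the default
    // port and path suffix here are illustrative assumptions.
    let port = std::env::var("CODEX_OSS_PORT")
        .ok()
        .and_then(|p| p.parse::<u16>().ok())
        .unwrap_or(11434);
    format!("http://localhost:{port}/v1")
}
```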
Extensions
Skill discovery scans several locations: project config folders (`<layer-config-folder>/skills`), the deprecated user location (`$CODEX_HOME/skills`), user-installed skills (`$HOME/.agents/skills`), the embedded system cache (`$CODEX_HOME/skills/.system`), and repo-local directories between the project root and the cwd at `./.agents/skills`. Evidence: `pub use loader::load_skills`, `const SKILLS_FILENAME: &str = "SKILL.md";`.
Evidence: `path: config_folder.as_path().join(SKILLS_DIR_NAME)`, `Deprecated user skills location ($CODEX_HOME/skills)`, `$HOME/.agents/skills (user-installed skills).`, `$CODEX_HOME/skills/.system`, `let agents_skills = dir.join(AGENTS_DIR_NAME).join(SKILLS_DIR_NAME);`. The skill body lives in `SKILL.md` with required YAML frontmatter; optional metadata lives in `.agents/agents/openai.yaml`. Evidence: `const SKILLS_FILENAME: &str = "SKILL.md";`, `const SKILLS_METADATA_DIR: &str = "agents";`, `const SKILLS_METADATA_FILENAME: &str = "openai.yaml";`, `missing YAML frontmatter delimited by ---`.
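A minimal sketch of loading one skill, assuming the frontmatter must open the file and is delimited by `---` lines; `Skill`, `load_skill`, and the error strings are illustrative, not the loader's real API:

```rust
// Hypothetical loader sketch: read SKILL.md and require `---` frontmatter.
use std::fs;
use std::path::Path;

struct Skill {
    frontmatter: String, // raw YAML between the `---` delimiters
    body: String,
}

fn load_skill(dir: &Path) -> Result<Skill, String> {
    let text = fs::read_to_string(dir.join("SKILL.md"))
        .map_err(|e| format!("failed to read SKILL.md: {e}"))?;
    // Frontmatter must open the file: `---\n<yaml>\n---\n<body>`.
    let rest = text
        .strip_prefix("---\n")
        .ok_or("missing YAML frontmatter delimited by ---")?;
    let (frontmatter, body) = rest
        .split_once("\n---\n")
        .ok_or("missing YAML frontmatter delimited by ---")?;
    Ok(Skill {
        frontmatter: frontmatter.to_string(),
        body: body.to_string(),
    })
}
```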
The plugin surface is dynamic tool injection plus app connector tools; no separate package registry was found in this scan.
Evidence: `if !dynamic_tools.is_empty()`, `builder.register_handler("search_tool_bm25", search_tool_handler)`, `Use $ in the composer to insert a ChatGPT connector`. Codex supports MCP client integration (configured servers/tools/resources) and an experimental MCP server mode (`codex mcp-server`).
Evidence: `Codex CLI functions as an MCP client`, `Codex can be launched as an MCP _server_ by running codex mcp-server`, `Manage external MCP servers for Codex`, `pub enum McpServerTransportConfig`.

Providers
openai, ollama, lmstudio, user_defined_via_model_providers
The built-in default provider list includes `("openai", P::create_openai_provider())`, `OLLAMA_OSS_PROVIDER_ID`, and `LMSTUDIO_OSS_PROVIDER_ID`; user-defined entries live inside `~/.codex/config.toml` under the `model_providers` table. Provider auth supports the OpenAI login/auth token flow (`requires_openai_auth`) and environment-based API keys via `env_key`/`api_key()` lookup.
Evidence: `Does this provider require an OpenAI API Key or ChatGPT login token?`, `Environment variable that stores the user's API key`, `pub fn api_key(&self)`. Model and provider can be selected by CLI flags (`--model`, `--oss`, `--local-provider`) and by config (`model_provider`, profiles, fallback default `openai`). Evidence: `#[arg(long, short = 'm', global = true)]`, `#[arg(long = "oss", default_value_t = false)]`, `#[arg(long = "local-provider")]`, `let model_provider_id = model_provider.unwrap_or_else(|| "openai".to_string())`.
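A minimal sketch of the selection order implied above, assuming flags beat config and config beats the `"openai"` fallback; the struct, the function, and the `--oss` to `"ollama"` mapping are illustrative assumptions:

```rust
// Hypothetical resolution sketch, not Codex's actual CLI plumbing.
struct SelectionInputs {
    oss_flag: bool,                        // --oss
    local_provider_flag: Option<String>,   // --local-provider
    config_model_provider: Option<String>, // model_provider in config.toml
}

fn resolve_provider_id(inputs: &SelectionInputs) -> String {
    if let Some(p) = &inputs.local_provider_flag {
        return p.clone(); // explicit flag wins
    }
    if inputs.oss_flag {
        // Assumption for illustration: --oss maps to the Ollama OSS provider.
        return "ollama".to_string();
    }
    inputs
        .config_model_provider
        .clone()
        .unwrap_or_else(|| "openai".to_string()) // built-in fallback
}
```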
UX
CLI multitool with an interactive TUI as the default and a non-interactive `exec` mode.
Evidence: `If no subcommand is specified, options will be forwarded to the interactive CLI.`, `Run Codex non-interactively.`, `To run Codex non-interactively, run codex exec PROMPT`. Streaming is supported: it is exposed in app-server event notifications and in the SDK's `runStreamed()` event generator.
Evidence: `The app-server streams JSON-RPC notifications while a turn is running.`, `use runStreamed() instead, which returns an async generator`. Approval controls include `--ask-for-approval` policies, sandbox policy selection, and a dangerous bypass mode; at runtime, execpolicy can allow, prompt for, or forbid commands.
Evidence: `Configure when the model requires human approval`, `Select the sandbox policy`, `Skip all confirmation prompts and execute commands without sandboxing`, `match evaluation.decision`. Session rollouts persist as JSONL under `$CODEX_HOME/sessions`, with archived sessions and an append-only `session_index.jsonl`; `--ephemeral` disables persistence. Evidence: `Rollouts are recorded as JSONL`, `pub const SESSIONS_SUBDIR: &str = "sessions";`, `pub const ARCHIVED_SESSIONS_SUBDIR: &str = "archived_sessions";`, `const SESSION_INDEX_FILE: &str = "session_index.jsonl";`, `Run without persisting session files to disk.`
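A minimal sketch of JSONL rollout persistence under the constants above, assuming one JSON object per line; the per-session file name and the index record shape are illustrative, not Codex's schema:

```rust
// Hypothetical persistence sketch: append-only JSONL files.
use std::fs::OpenOptions;
use std::io::Write;
use std::path::Path;

fn append_jsonl(path: &Path, json_line: &str) -> std::io::Result<()> {
    let mut file = OpenOptions::new().create(true).append(true).open(path)?;
    // JSONL framing: exactly one JSON value per line.
    writeln!(file, "{json_line}")
}

fn record_event(codex_home: &Path, session_id: &str, event_json: &str) -> std::io::Result<()> {
    let sessions = codex_home.join("sessions");
    std::fs::create_dir_all(&sessions)?;
    append_jsonl(&sessions.join(format!("{session_id}.jsonl")), event_json)?;
    // The index is append-only; readers take the latest entry per session.
    append_jsonl(
        &sessions.join("session_index.jsonl"),
        &format!(r#"{{"session_id":"{session_id}"}}"#),
    )
}
```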
Context
Auto-compaction is token-threshold driven (`model_auto_compact_token_limit`) and runs both pre-sampling and mid-turn; the implementation chooses between an inline local compaction and a remote compaction task.
Evidence: `Token usage threshold triggering auto-compaction`, `let token_limit_reached = total_usage_tokens >= auto_compact_limit`, `if total_usage_tokens >= auto_compact_limit`, `async fn run_auto_compact`. On normal sampling overflow, the turn errors with `ContextWindowExceeded`; during compaction, overflow instead triggers trimming the oldest history items and retrying until the prompt fits or the failure is terminal. Evidence: `Err(CodexErr::ContextWindowExceeded)`, `Err(e @ CodexErr::ContextWindowExceeded)`, `history.remove_first_item(); continue;`.
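A minimal sketch of the compaction trim-and-retry loop, assuming simplified types; `History`, `sample`, and this `CodexErr` definition are stand-ins, with only `remove_first_item` and the `ContextWindowExceeded` arm mirroring the quoted source:

```rust
// Hypothetical trim-and-retry sketch for the compaction path.
enum CodexErr { ContextWindowExceeded, Other(String) }

struct History { items: Vec<String> }

impl History {
    fn remove_first_item(&mut self) -> bool {
        if self.items.is_empty() { return false; }
        self.items.remove(0);
        true
    }
}

fn compact_with_trimming(
    history: &mut History,
    mut sample: impl FnMut(&History) -> Result<String, CodexErr>,
) -> Result<String, CodexErr> {
    loop {
        match sample(history) {
            Ok(summary) => return Ok(summary),
            // Overflow during compaction: drop the oldest item and retry.
            Err(CodexErr::ContextWindowExceeded) => {
                if !history.remove_first_item() {
                    // Nothing left to trim: terminal failure.
                    return Err(CodexErr::ContextWindowExceeded);
                }
            }
            Err(e) => return Err(e),
        }
    }
}
```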
Reliability
Automatic retries exist for retryable stream failures (with backoff and provider-configured retry budget) and HTTP transport failures (5xx/network/timeout) via request retry policy.
Evidence: `if !err.is_retryable() {`, `let max_retries = turn_context.provider.stream_max_retries();`, `requested_delay.unwrap_or_else(|| backoff(retries))`, `max_attempts: self.request_max_retries()`, `retry_5xx: true`, `TransportError::Timeout | TransportError::Network(_) => self.retry_transport`. Recovery loops include stream transport fallback (WebSocket to HTTPS), invalid-image sanitization followed by continuation, and compaction retry-by-trimming when compact prompts overflow the context. Evidence: `try_switch_fallback_transport`, `Falling back from WebSockets to HTTPS transport`, `if state.history.replace_last_turn_images("Invalid image") {`, `Trim from the beginning to preserve cache (prefix-based) and keep recent messages intact.`
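A minimal sketch of a retry loop with exponential backoff and a retryability check, in the shape the quotes above suggest; the backoff constants and the synchronous form are assumptions (the real path is async and honors a server-requested delay):

```rust
// Hypothetical retry sketch: exponential backoff under a retry budget.
use std::time::Duration;

fn backoff(retry: u32) -> Duration {
    // Exponential with a cap: 200ms, 400ms, 800ms, ... up to 10s (assumed).
    let ms = 200u64.saturating_mul(1 << retry.min(16));
    Duration::from_millis(ms.min(10_000))
}

fn retry_stream<T, E>(
    max_retries: u32,
    is_retryable: impl Fn(&E) -> bool,
    mut attempt: impl FnMut() -> Result<T, E>,
) -> Result<T, E> {
    let mut retries = 0;
    loop {
        match attempt() {
            Ok(v) => return Ok(v),
            // Retry only retryable errors, and only within the budget.
            Err(e) if is_retryable(&e) && retries < max_retries => {
                std::thread::sleep(backoff(retries));
                retries += 1;
            }
            Err(e) => return Err(e),
        }
    }
}
```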
Integration
Supported TypeScript SDK package: `@openai/codex-sdk`, which wraps the CLI.
Evidence: `The TypeScript SDK wraps the codex CLI`, `npm install @openai/codex-sdk`. Operating modes: interactive_tui_default, non_interactive_exec, jsonl_event_output_mode, mcp_server_mode, app_server_mode.
Evidence: `If no subcommand is specified, options will be forwarded to the interactive CLI.`, `Run Codex non-interactively.`, `Print events to stdout as JSONL.`, `Start Codex as an MCP server (stdio).`, `Run the app server`. Exec JSON mode uses JSONL event framing; the app-server uses bidirectional JSON-RPC 2.0 over stdio JSONL or websocket text frames. Evidence: `In --json mode, stdout must be valid JSONL, one event per line.`, `supports bidirectional communication using JSON-RPC 2.0 messages`, `stdio (--listen stdio://, default): newline-delimited JSON (JSONL)`, `websocket (--listen ws://IP:PORT): one JSON-RPC message per websocket text frame`.
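A minimal sketch of the stdio JSONL framing, assuming one JSON-RPC message per line in each direction; message handling is stubbed out so the sketch stays dependency-free, and the emitted response is illustrative only:

```rust
// Hypothetical framing sketch: each stdin line is one complete JSON-RPC 2.0
// message; each response goes out as one line. Real code would parse `message`
// with a JSON library and dispatch by method name.
use std::io::{self, BufRead, Write};

fn main() -> io::Result<()> {
    let stdin = io::stdin();
    let mut stdout = io::stdout().lock();
    for line in stdin.lock().lines() {
        let message = line?;
        if message.trim().is_empty() {
            continue; // JSONL: a blank line carries no message
        }
        // Placeholder dispatch: answer every message with a fixed JSON-RPC
        // error so the framing round-trip is visible end to end.
        writeln!(
            stdout,
            "{}",
            r#"{"jsonrpc":"2.0","id":null,"error":{"code":-32601,"message":"demo dispatcher"}}"#
        )?;
        stdout.flush()?; // one message per line, flushed per message
    }
    Ok(())
}
```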
Customization
System/developer instructions are customizable via config (`instructions`, `developer_instructions`, `model_instructions_file`), global AGENTS files under CODEX_HOME, and project AGENTS hierarchy (root->cwd).
Evidence: `/// System instructions.`, `Developer instructions`, `Optional path to a file containing model instructions that will override`, `let user_instructions = Self::load_instructions(Some(&codex_home));`, `for candidate in [LOCAL_PROJECT_DOC_FILENAME, DEFAULT_PROJECT_DOC_FILENAME]`, `Collect every AGENTS.md found from the repository root down to the`. Built-in instruction templates are Markdown files embedded via `include_str!`; user custom prompts load from `$CODEX_HOME/prompts` as `.md` files with optional frontmatter keys (`description`, `argument-hint`). Evidence: `include_str!("prompts/base_instructions/default.md")`, `include_str!("prompts/permissions/approval_policy/never.md")`, `default prompts directory: $CODEX_HOME/prompts`, `Only include Markdown files with a .md extension`, `- description: short description shown in the slash popup`.
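A minimal sketch of collecting the project AGENTS hierarchy from the repository root down to the cwd, assuming outermost-first ordering; the function name and its ordering choice are illustrative:

```rust
// Hypothetical collection sketch for the root->cwd AGENTS.md hierarchy.
use std::path::{Path, PathBuf};

fn collect_agents_docs(root: &Path, cwd: &Path) -> Vec<PathBuf> {
    // Build the chain of directories from `cwd` up to `root`.
    let mut dirs = vec![];
    let mut cur = Some(cwd);
    while let Some(d) = cur {
        dirs.push(d.to_path_buf());
        if d == root { break; }
        cur = d.parent();
    }
    dirs.reverse(); // outermost (root) first, cwd last

    // Keep only directories that actually contain an AGENTS.md.
    dirs.into_iter()
        .map(|d| d.join("AGENTS.md"))
        .filter(|p| p.is_file())
        .collect()
}
```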
No keybinding config file or setting location was found; shortcuts appear to be implemented directly in TUI code. Evidence: `fn shortcut_overlay_lines(state: ShortcutsState)`, `for descriptor in SHORTCUTS`, `tui.alternate_screen`.
No explicit theme configuration file or setting was found in this scan; TUI color handling references terminal palette behavior. Evidence: `The first 16 colors vary based on terminal theme`, `Enable ASCII animations and shimmer effects in the TUI.`
Distribution
- npm_global_package: @openai/codex
- homebrew_cask: codex
Evidence: `npm install -g @openai/codex`, `brew install --cask codex`, `download the appropriate binary for your platform`.

Packages
global_npm_install, global_homebrew_cask_install
Evidence: `Install globally with your preferred package manager`, `npm install -g @openai/codex`, `brew install --cask codex`.

Extras
App-server protocol outputs and generated TypeScript bindings are specific to the Codex version used.
Evidence: `Each output is specific to the version of Codex you used to run the command`, `codex app-server generate-ts --out DIR`, `/// [experimental] Generate TypeScript bindings for the app server protocol.` Codex can switch from WebSockets to HTTPS as a fallback transport during retry handling.
Evidence: `.try_switch_fallback_transport(&turn_context.otel_manager, &turn_context.model_info)`, `message: format!("Falling back from WebSockets to HTTPS transport. {err:#}"),`. Embedded system skills are installed into `CODEX_HOME/skills/.system`.
Evidence: `/// This is typically located at CODEX_HOME/skills/.system.`, `/// Installs embedded system skills into CODEX_HOME/skills/.system.` App-server clients can suppress selected notifications per connection by method name.
Clients can suppress specific notifications per connection by sending exact method names Applies to both legacy (`codex/event/*`) and v2 (`thread/*`, `turn/*`, `item/*`, etc.) notifications.