You just watched a colleague refactor an entire authentication module from their terminal in under three minutes. They typed a single sentence, hit Enter, and an AI agent read the codebase, proposed patches, ran the test suite, and committed clean code — all without leaving the command line. The tool was OpenAI's Codex CLI, and now you want it on your Windows machine.
There's just one problem: Codex CLI was built for Unix-first environments. While macOS and Linux users enjoy native sandbox security and straightforward npm installs, Windows developers face a different path — one that involves choosing between an experimental native sandbox, a WSL2-based Linux environment, or a standalone binary download. Each approach carries trade-offs in performance, security, and workflow integration that aren't immediately obvious from the official docs.
This guide walks through every viable installation method for Windows in 2026, explains the security implications of each approach, and gets you from zero to a working Codex CLI session with the right configuration for your development workflow.
✓ Key Takeaways
- Three installation paths exist for Windows: Native via npm/PowerShell (experimental sandbox), WSL2-based install (mature Linux sandbox), or standalone binary download from GitHub Releases.
- WSL2 remains the recommended approach for production use — it provides the same Landlock/seccomp sandbox security as Linux, matching the environment where Codex models were primarily trained.
- Native Windows support has improved significantly in early 2026, with an AppContainer-based sandbox that restricts filesystem writes and blocks network access by default.
- Codex CLI is included with ChatGPT Plus ($20/mo), Pro ($200/mo), Business, Edu, and Enterprise plans — no separate license required. API key authentication is available for CI/CD workflows.
- Node.js 22+ is required for npm installation. Alternatively, download the pre-built Rust binary directly from GitHub Releases for a zero-dependency install.
What Is Codex CLI and Why It Matters for Windows Developers
Codex CLI is OpenAI's open-source terminal-based coding agent, rebuilt from the ground up in Rust for speed and efficiency. Unlike browser-based AI assistants that work through a chat window, Codex CLI operates directly in your development environment — reading your repository structure, proposing file edits, executing shell commands, and running your test suite. It's the same agent that powers Codex in ChatGPT, but running locally on your machine with full access to your project files [OpenAI Developer Documentation].
The model lineup has expanded considerably since the original launch in April 2025. As of early 2026, Codex CLI defaults to GPT-5.3-Codex — a model that's approximately 25% faster than its predecessor with stronger reasoning capabilities and mid-turn steering. Pro subscribers also get access to GPT-5.3-Codex-Spark, an ultra-low-latency variant running on Cerebras hardware that enables near-instant interactive editing [OpenAI Changelog].
For Windows developers, Codex CLI fills a gap that IDE-integrated tools like GitHub Copilot don't fully cover. Where Copilot focuses on inline code suggestions, Codex CLI is an autonomous agent that can orchestrate multi-step development tasks — analyzing a bug report, tracing the issue through your codebase, writing a fix, updating tests, and preparing a commit message. It supports MCP (Model Context Protocol) for connecting external tools, multi-agent collaboration for parallelizing complex tasks, and a skills/plugin system for reusable automation [OpenAI Developer Documentation].
Prerequisites: What You Need Before Installing
Before choosing an installation method, verify that your system meets these baseline requirements. All three installation paths share common prerequisites, with some method-specific additions.
System Requirements
- Operating System: Windows 10 (build 19041+) or Windows 11
- RAM: 2GB minimum, 4GB recommended — Codex CLI itself is lightweight since all AI inference runs in OpenAI's cloud
- Internet: Stable outbound HTTPS access to api.openai.com
- Authentication: A ChatGPT Plus, Pro, Business, Edu, or Enterprise subscription — or an OpenAI API key with credits
- Node.js 22+: Required for npm installation method only (binary download has zero dependencies)
- Git 2.23+: Recommended for repository-aware operations and version control integration
One detail that trips up many Windows users: Codex CLI is included at no additional cost with paid ChatGPT subscriptions. There is no separate "Codex license" to purchase. ChatGPT Plus at $20/month provides access to the Codex agent with usage-based limits that vary by task complexity — generally 30 to 150 local messages per five-hour window. Pro at $200/month significantly expands those limits for full-time development use [OpenAI Codex Pricing].
Method 1: Native Windows Installation via npm (Experimental)
The fastest path to a working Codex CLI is installing natively on Windows through npm and running it directly in PowerShell or Windows Terminal. This approach has improved substantially in early 2026, but OpenAI still labels Windows support as experimental. The primary reason: the Windows sandbox implementation uses an AppContainer-based restricted token approach that cannot prevent file writes in directories where the Everyone SID already has write permissions.
If you're comfortable with that limitation and want the simplest setup, here's the process.
Step 1: Install Node.js 22+
If you don't already have Node.js installed, download the LTS installer from nodejs.org or use winget:
```
winget install OpenJS.NodeJS.LTS
```

Verify the installation by checking both Node and npm versions:

```
node --version
npm --version
```

You need Node.js 22 or higher. If you manage multiple Node versions across projects, nvm-windows lets you switch without breaking existing project dependencies:

```
nvm install 22
nvm use 22
```

Step 2: Install Codex CLI Globally
```
npm install -g @openai/codex
```

Important: The package name is `@openai/codex` — not `codex`. The unscoped `codex` package on npm is an entirely different tool from 2012. Installing the wrong package is a common mistake that produces confusing errors.
Verify the installation:
```
codex --version
```

Step 3: Authenticate
Launch Codex for the first time:
```
codex
```

You'll be prompted to sign in. Two authentication paths are available. ChatGPT account authentication opens a browser window for OAuth — this is the recommended path for individual developers because it ties directly to your subscription and requires no key management. Alternatively, you can authenticate with an OpenAI API key for headless environments or CI/CD pipelines:
```
$env:OPENAI_API_KEY="sk-your-key-here"
codex
```

For persistent API key storage in PowerShell, add the environment variable to your profile:

```
[System.Environment]::SetEnvironmentVariable("OPENAI_API_KEY", "sk-your-key-here", "User")
```

Understanding the Windows Native Sandbox
When running natively on Windows, Codex uses an experimental sandbox that works differently from the mature Linux implementation. The Windows sandbox launches commands inside a restricted token derived from an AppContainer profile, grants only specifically requested filesystem capabilities via capability SIDs, and disables outbound network access by overriding proxy-related environment variables and inserting stub executables for common network tools [OpenAI Security Documentation].
You can configure the native sandbox behavior in your config.toml file. The default workspace-write mode restricts writes to your working directory and temporary folders while blocking network access. When Codex first runs in a new directory, it will scan for world-writable folders and recommend you restrict their permissions.
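As a minimal sketch, those defaults correspond to the following settings (the key names match the sample configuration shown later in this guide):

```toml
# %USERPROFILE%\.codex\config.toml: minimal native-sandbox defaults
sandbox_mode = "workspace-write"

[sandbox_workspace_write]
# Outbound network stays blocked unless you enable it per-session
network_access = false
```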
Method 2: WSL2 Installation (Recommended for Production)
For developers who need reliable sandbox isolation and maximum compatibility with Codex's Linux-trained models, WSL2 is the recommended approach. The Linux sandbox implementation is mature, using Landlock and seccomp for filesystem and system call restriction — the same security boundary used in production Linux deployments.
Step 1: Install WSL2
Open PowerShell as Administrator and install WSL with the default Ubuntu distribution:
```
wsl --install
```

Restart your computer after installation completes. On the next boot, Ubuntu will finish setting up and prompt you to create a Linux username and password.
Step 2: Install Node.js Inside WSL
Enter your WSL shell and install Node.js via nvm (Node Version Manager):
```
wsl

# Install nvm
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/master/install.sh | bash

# Close and reopen your terminal, then install Node.js
nvm install 22
```

After running the nvm install script, you'll need to either open a new terminal window or run `source ~/.bashrc` before the `nvm` command becomes available.
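If you'd rather not reopen the terminal, you can load nvm into the current shell directly. These are the standard lines the nvm installer appends to `~/.bashrc`:

```shell
# Load nvm into the current shell session
export NVM_DIR="$HOME/.nvm"
if [ -s "$NVM_DIR/nvm.sh" ]; then
  . "$NVM_DIR/nvm.sh"   # makes the `nvm` shell function available immediately
fi
```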
Step 3: Install and Run Codex CLI
```
npm i -g @openai/codex
codex
```

Authentication works the same way as native Windows — browser-based OAuth for ChatGPT subscriptions or API key export for programmatic access.
Performance Considerations for WSL2
One critical performance detail that catches many developers: working on files stored in Windows-mounted paths (like /mnt/c/Users/...) through WSL is significantly slower than working with files stored in the Linux filesystem. The cross-filesystem boundary introduces I/O overhead that compounds during code analysis and file operations [OpenAI Windows Documentation].
Keep your repositories under your Linux home directory for optimal performance:
```
mkdir -p ~/code && cd ~/code
git clone https://github.com/your/repo.git
cd repo
```

If you need to access these files from Windows Explorer, they're available at `\\wsl$\Ubuntu\home\<username>`.
VS Code Integration with WSL
For developers using Visual Studio Code, the WSL approach pairs naturally with VS Code's Remote - WSL extension. Install the WSL extension, then open your project from the WSL terminal:
```
cd ~/code/your-project
code .
```

This opens a WSL remote window where integrated terminals run in Linux. Verify the connection by checking for the green "WSL: Ubuntu" indicator in the VS Code status bar. If you don't see it, press Ctrl+Shift+P and select "WSL: Reopen Folder in WSL."
Method 3: Standalone Binary Download
If you don't want to install Node.js or npm at all, you can download a pre-built Codex binary directly from GitHub Releases. Each release includes compiled executables for multiple platforms, including Windows x86-64 and ARM64 [OpenAI Codex GitHub].
Navigate to https://github.com/openai/codex/releases/latest and download the appropriate binary:
- x86-64 Windows: `codex-x86_64-pc-windows-msvc.exe`
- ARM64 Windows: `codex-aarch64-pc-windows-msvc.exe`
Rename the downloaded file to codex.exe, move it to a directory in your system PATH (or add its location to PATH), and run it from PowerShell or Command Prompt. This approach requires zero dependencies — no Node.js, no npm, no build tools.
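As a rough PowerShell sketch of that sequence (the destination folder here is an arbitrary choice, and the filename should match the release you actually downloaded):

```powershell
# Put the downloaded binary on PATH; the folder choice is illustrative
$dest = "$env:LOCALAPPDATA\Programs\codex"
New-Item -ItemType Directory -Force -Path $dest | Out-Null
Move-Item ".\codex-x86_64-pc-windows-msvc.exe" "$dest\codex.exe" -Force

# Append the folder to the user PATH (takes effect in new terminals)
$userPath = [Environment]::GetEnvironmentVariable("PATH", "User")
[Environment]::SetEnvironmentVariable("PATH", "$userPath;$dest", "User")
```

Open a new PowerShell window afterwards and confirm with `codex --version`.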
Configuring Codex CLI for Your Workflow
Once Codex is installed and authenticated, the next step is configuring it for how you actually work. Codex reads configuration from ~/.codex/config.toml (or %USERPROFILE%\.codex\config.toml on native Windows). The CLI and IDE extension share the same configuration file [OpenAI Configuration Documentation].
Understanding Approval Modes
Approval modes are the most important configuration decision you'll make. They determine how much autonomy Codex has when interacting with your files and executing commands.
| Mode | File Edits | Shell Commands | Network Access | Best For |
|---|---|---|---|---|
| Suggest | Requires approval | Requires approval | Requires approval | Code review, planning, learning a new codebase |
| Auto (Default) | Automatic | Automatic in workspace | Requires approval | Everyday development |
| Full Access | Automatic | Automatic everywhere | Automatic | Disposable sandboxes, CI/CD, throwaway branches |
Set your preferred mode via command-line flags or in config.toml:
```
# Command-line flags
codex --approval-mode suggest "analyze this codebase"
codex --approval-mode auto-edit "add error handling"
codex --full-auto "run tests and fix failures"

# Or set defaults in ~/.codex/config.toml
# model = "gpt-5.3-codex"
# approval_policy = "on-request"
# sandbox_mode = "workspace-write"
```

You can also switch modes mid-session using slash commands inside the TUI: `/mode suggest`, `/mode auto-edit`, or `/mode full-auto`. This is useful when you want to start conservatively in suggest mode to review Codex's approach, then switch to auto once you trust the direction.
Sample Configuration for Windows Developers
Here's a starting config.toml configuration optimized for Windows development:
```toml
# ~/.codex/config.toml

# Core model selection
model = "gpt-5.3-codex"

# Approval and sandbox settings
approval_policy = "on-request"
sandbox_mode = "workspace-write"

# Network access for sandboxed commands (npm install, etc.) is off by default
[sandbox_workspace_write]
network_access = false

# TUI preferences
[tui]
notifications = true
animations = true
show_tooltips = true
```

When you need network access for commands like `npm install` or `git push`, you can enable it per-session without changing your default configuration:
```
codex -c 'sandbox_workspace_write.network_access=true' "install dependencies and run tests"
```

Customizing Codex with AGENTS.md
One of Codex CLI's most powerful features is the AGENTS.md instruction system. Codex reads these files before starting any work, building a layered instruction chain that provides context about your preferences, project conventions, and team standards [OpenAI AGENTS.md Documentation].
Create a global AGENTS.md in your Codex home directory to set defaults across all projects:
```markdown
# ~/.codex/AGENTS.md

## Working agreements
- Always run tests after modifying source files.
- Prefer TypeScript over JavaScript for new files.
- Use ESLint and Prettier formatting conventions.
- Ask for confirmation before adding new production dependencies.
- Write commit messages following Conventional Commits format.
```

Then add project-specific guidance at the repository root:
```markdown
# /your-project/AGENTS.md

## Repository expectations
- This is a Next.js 15 project with App Router.
- Run `npm run lint` before opening a pull request.
- API routes live in `src/app/api/`.
- Use Tailwind CSS for all styling — no CSS modules.
```

Codex concatenates these files from root to current directory, with closer files taking precedence. This means your team's shared project AGENTS.md establishes baseline conventions, while subdirectory overrides can specify rules for specific services or modules.
Essential Commands and Daily Workflows
With Codex installed and configured, here are the commands and workflows that will define your daily use.
Interactive TUI Mode
Running codex with no arguments launches the full-screen terminal UI. This is the primary interface for conversational development — you type natural language prompts, watch Codex explain its plan, approve or reject actions inline, and iterate in real time.
```
# Launch the TUI
codex

# Launch with a starting prompt
codex "explain the authentication flow in this project"

# Launch with an image attachment (screenshot, design spec)
codex --image screenshot.png "implement this design"
```

Inside the TUI, several slash commands enhance your workflow. Use `/model` to switch between available models mid-session, `/status` to check remaining usage quota, `/clear` to start a fresh conversation, and `/copy` to copy Codex's latest output to your clipboard.
Non-Interactive Exec Mode
For scripted or CI/CD workflows, codex exec runs without human interaction and pipes results to stdout:
```
# Run a task non-interactively
codex exec --full-auto "update the CHANGELOG for the next release"

# Combine with shell scripting
codex exec --full-auto "find and fix all TypeScript type errors" | tee fix-report.txt
```

Useful Everyday Patterns
```
# Analyze a codebase you're new to
codex "explain the architecture of this project"

# Fix failing tests
codex --full-auto "run all tests, identify failures, and fix them"

# Code review
codex "review src/auth/ for security vulnerabilities"

# Refactor with context
codex "refactor the database queries in src/models/ to use connection pooling"

# Resume a previous session
codex resume --last
```

Upgrade Your Development Workflow with AI-Powered IT Consulting
ITECS helps organizations integrate AI coding tools like Codex CLI, Claude Code, and GitHub Copilot into secure, managed development environments — with proper access controls, compliance guardrails, and infrastructure that keeps your team productive.
Troubleshooting Common Windows Issues
Windows installations encounter a specific set of issues that don't affect macOS or Linux users. Here are the most common problems and their solutions.
▶ "codex: command not found" after npm install
Your PATH doesn't include npm's global bin directory. Find it with `npm prefix -g` — on Windows, global executables install directly into this directory, typically `%APPDATA%\npm`. (The older `npm bin -g` command was removed in npm 9, so it won't work with the npm bundled with Node.js 22.) Then add the directory to your user PATH. In PowerShell, you can make the change permanent:

```
$npmBin = npm prefix -g
$userPath = [Environment]::GetEnvironmentVariable("PATH", "User")
[Environment]::SetEnvironmentVariable("PATH", "$userPath;$npmBin", "User")
```

Restart your terminal after making PATH changes.
▶ Sandbox errors on native Windows
If you encounter sandbox-related errors when running natively, verify that you have the Visual Studio Build Tools with the C++ workload installed. Some native dependencies require these tools:
```
winget install --id Microsoft.VisualStudio.2022.BuildTools -e
```

If sandbox errors persist, you can bypass the sandbox for testing (not recommended for production):

```
$env:CODEX_UNSAFE_ALLOW_NO_SANDBOX=1
codex "your prompt here"
```

▶ Slow performance in WSL2
If Codex feels sluggish, check whether your repository is stored under /mnt/c/. Move it to your Linux home directory (~/code/...) for dramatically faster I/O. Also ensure WSL is up to date:
```
wsl --update
wsl --shutdown
```

Then restart WSL and try again. You can also allocate more memory and CPU to WSL via the `.wslconfig` file in your Windows user directory.
▶ VS Code in WSL can't find the codex command
Verify the binary exists and is on PATH inside WSL:
```
which codex || echo "codex not found"
```

If not found, reinstall within WSL: `npm i -g @openai/codex`. Also ensure you're launching VS Code from within the WSL terminal (`code .`) rather than from the Windows Start menu — launching from Windows may not inherit your WSL PATH.
▶ Approval prompts persist even with --full-auto
A reconnect or profile mismatch can reset your approval policy. Check your current settings with /status inside the TUI, then restart with explicit flags:
```
codex -a never -s workspace-write "your task"
```

For persistent settings, define a profile in your config.toml and launch with `codex --profile <name>`.
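A hypothetical profile section might look like the following (the key names are an assumption modeled on the settings shown earlier in this guide; verify them against the configuration docs):

```toml
# ~/.codex/config.toml: assumed profile layout
[profiles.full]
model = "gpt-5.3-codex"
approval_policy = "never"
sandbox_mode = "workspace-write"
```

You would then launch with `codex --profile full`.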
Security Considerations for Enterprise Deployments
For organizations deploying Codex CLI across development teams, the security model deserves careful evaluation — particularly on Windows where the sandbox is less mature than on Linux.
The Windows native sandbox uses a Restricted Token approach with filesystem ACLs, running commands as a dedicated Windows Sandbox User with firewall rules that limit network access. However, its primary limitation is that it cannot prevent writes to directories where the Everyone SID already has write permissions. In enterprise environments with standardized workstation images, this is often manageable — but it requires auditing directory permissions as part of your deployment process [OpenAI Security Documentation].
For organizations that need stronger isolation guarantees, consider standardizing on the WSL2 approach and managing it through your existing endpoint security tools. Alternatively, running Codex inside Docker containers with workspace-write sandbox mode provides an additional layer of containment.
Enterprise and Business ChatGPT plans include important security controls: data is not used to train OpenAI's models by default, RBAC limits which team members can access Codex, and the Compliance API provides audit logs for monitoring usage. AGENTS.md files at the repository level can enforce team-wide coding standards that Codex follows automatically — ensuring consistency without manual oversight.
Organizations that need to route Codex through internal proxies or use Azure-hosted models can configure custom model providers in config.toml. Microsoft's Foundry platform supports running Codex entirely on Azure infrastructure, keeping data within your compliance boundary while maintaining the same CLI experience [Microsoft Foundry Documentation].
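As a sketch, a custom provider entry in config.toml might look like this — the field names are assumptions based on Codex's provider configuration, and the URL and key name are placeholders to replace with your own:

```toml
# ~/.codex/config.toml: hypothetical Azure provider entry (URL and key name are placeholders)
model_provider = "azure"

[model_providers.azure]
name = "Azure OpenAI"
base_url = "https://your-resource.openai.azure.com/openai"
env_key = "AZURE_OPENAI_API_KEY"
```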
Keeping Codex CLI Updated
Codex CLI receives frequent updates — the changelog shows multiple releases per month throughout 2025 and into 2026, with improvements to sandbox security, model support, and Windows compatibility landing regularly. To upgrade via npm:
```
npm update -g @openai/codex
```

If you installed via binary download, download the latest release from GitHub Releases and replace your existing executable. For WSL installations, run the npm update command from within your WSL shell.
You can check your current version at any time with codex --version, and the /status command inside the TUI shows both your CLI version and remaining usage quota.
| Installation Method | Sandbox Maturity | Setup Complexity | Performance | Best For |
|---|---|---|---|---|
| Native npm (PowerShell) | Experimental | Low | Best (native I/O) | Quick setup, personal projects |
| WSL2 + npm | Mature (Landlock/seccomp) | Medium | Fast (Linux filesystem) | Production, enterprise, teams |
| Binary download | Varies by environment | Lowest | Best (native, zero deps) | No Node.js requirement, air-gapped setups |
What Comes Next: The Codex Ecosystem Beyond CLI
Codex CLI is one surface in a broader ecosystem that now includes the Codex desktop app (available for Windows and macOS), the Codex IDE extension for VS Code and compatible editors, Codex Cloud for delegating tasks from the web, and the Codex SDK for programmatic integration. All surfaces share your ChatGPT account and usage limits within the same five-hour rolling window [OpenAI Codex Documentation].
The desktop app, launched in February 2026, provides a native Windows experience with multi-threaded task management, inline diff review, and Git worktree support for isolating changes across parallel Codex sessions. For teams already invested in Codex CLI, the app adds a visual layer on top of the same agent capabilities — you can even switch between CLI and app mid-workflow since they share session state.
For organizations evaluating how AI coding tools fit into their broader IT strategy — including cybersecurity controls, managed IT services, and compliance requirements — the key consideration is governance. Tools like Codex CLI need to be deployed with clear policies around data handling, approval modes, and access controls that align with your existing security posture.
Sources
- OpenAI. "Codex CLI." Developer Documentation, 2026. developers.openai.com/codex/cli
- OpenAI. "Windows Setup Guide." Developer Documentation, 2026. developers.openai.com/codex/windows
- OpenAI. "Codex Security." Developer Documentation, 2026. developers.openai.com/codex/security
- OpenAI. "Codex Changelog." Developer Documentation, 2026. developers.openai.com/codex/changelog
- Microsoft. "Codex with Azure OpenAI in Microsoft Foundry." Microsoft Learn, 2026. learn.microsoft.com
Related Resources
- How To Install Codex CLI on Ubuntu Linux (2026 Guide) — Companion guide covering Linux installation, MCP server integration, and enterprise deployment
- AI Consulting & Strategy — ITECS helps organizations integrate AI coding tools into secure, compliant development workflows
- Cybersecurity Consulting — Evaluate the security implications of AI-powered development tools in your environment
- IT Consulting Services — Strategic technology guidance for teams adopting AI-assisted development
- The Future of MSPs — How AI tools are reshaping managed IT services and developer productivity
Need Help Integrating AI Development Tools into Your Organization?
ITECS provides AI consulting, cybersecurity assessments, and managed IT services that help businesses deploy tools like Codex CLI with the right security controls, access policies, and infrastructure support. Whether you're a solo developer or an enterprise team, we can help you get the most out of AI-powered development.
