---
title: "A pelican for GPT-5.5 via the semi-official Codex backdoor API"
source_name: "Simon Willison's Weblog"
original_url: "https://simonwillison.net/2026/Apr/23/gpt-5-5/#atom-everything"
canonical_url: "https://www.traeai.com/articles/5c73a819-6acb-4d64-8131-22d4fdc425b3"
content_type: "article"
language: "English"
score: 5
tags: []
published_at: "2026-04-23T19:59:47+00:00"
created_at: "2026-04-23T22:52:57.278612+00:00"
---

# A pelican for GPT-5.5 via the semi-official Codex backdoor API

Canonical URL: https://www.traeai.com/articles/5c73a819-6acb-4d64-8131-22d4fdc425b3
Original source: https://simonwillison.net/2026/Apr/23/gpt-5-5/#atom-everything

## Summary

GPT-5.5 has shipped in Codex and for paid ChatGPT subscribers, but not yet in the API. Simon Willison built a new plugin, llm-openai-via-codex, by reverse-engineering the open source Codex CLI, letting him run his pelican-riding-a-bicycle benchmark against GPT-5.5 via the semi-official Codex backend endpoint and an existing ChatGPT subscription. He also notes the announced API pricing: twice that of GPT-5.4.

## Key Takeaways

- GPT-5.5 is available in Codex and is rolling out to paid ChatGPT subscribers, but API access is delayed while OpenAI works with partners on safeguards for serving it at scale.
- OpenAI has said third-party tools are welcome to use the Codex subscription mechanism, and the new llm-openai-via-codex plugin for LLM uses it to run prompts against GPT-5.5.
- The default pelican benchmark run used just 39 reasoning tokens and produced a mangled image; `-o reasoning_effort xhigh` used 9,322 tokens, took almost four minutes, and produced a much better, CSS-heavy SVG.

## Content




23rd April 2026

[GPT-5.5 is out](https://openai.com/index/introducing-gpt-5-5/). It’s available in OpenAI Codex and is rolling out to paid ChatGPT subscribers. I’ve had some preview access and found it to be a fast, effective and highly capable model. As is usually the case these days, it’s hard to put into words what’s good about it—I ask it to build things and it builds exactly what I ask for!

There’s one notable omission from today’s release—the API:

> API deployments require different safeguards and we are working closely with partners and customers on the safety and security requirements for serving it at scale. We’ll bring GPT‑5.5 and GPT‑5.5 Pro to the API very soon.

When I run my [pelican benchmark](https://simonwillison.net/tags/pelican-riding-a-bicycle/) I always prefer to use an API, to prevent hidden system prompts in ChatGPT or other agent harnesses from influencing the results.

#### The OpenClaw backdoor

One of the ongoing tension points in the AI world over the past few months has concerned how agent harnesses like OpenClaw and Pi interact with the APIs provided by the big providers.

Both OpenAI and Anthropic offer popular monthly subscriptions which provide access to their models at a significant discount to their raw API.

OpenClaw integrated directly with this mechanism, and was then [blocked from doing so](https://www.theverge.com/ai-artificial-intelligence/907074/anthropic-openclaw-claude-subscription-ban) by Anthropic. This kicked off a whole thing. OpenAI—who recently hired OpenClaw creator Peter Steinberger—saw an opportunity for an easy karma win and announced that OpenClaw was welcome to continue integrating with OpenAI’s subscriptions via the same mechanism used by their (open source) Codex CLI tool.

Does this mean _anyone_ can write code that integrates with OpenAI’s Codex-specific APIs to hook into those existing subscriptions?

The other day [Jeremy Howard asked](https://twitter.com/jeremyphoward/status/2046537816834965714):

> Anyone know whether OpenAI officially supports the use of the `/backend-api/codex/responses` endpoint that Pi and Opencode (IIUC) uses?

It turned out that on March 30th OpenAI’s Romain Huet [had tweeted](https://twitter.com/romainhuet/status/2038699202834841962):

> We want people to be able to use Codex, and their ChatGPT subscription, wherever they like! That means in the app, in the terminal, but also in JetBrains, Xcode, OpenCode, Pi, and now Claude Code.
>
> That’s why Codex CLI and Codex app server are open source too! 🙂

And Peter Steinberger [replied to Jeremy](https://twitter.com/steipete/status/2046775849769148838) that:

> OpenAI sub is officially supported.

#### llm-openai-via-codex

So... I had Claude Code reverse-engineer the [openai/codex](https://github.com/openai/codex) repo, figure out how authentication tokens were stored and build me [llm-openai-via-codex](https://github.com/simonw/llm-openai-via-codex), a new plugin for [LLM](https://llm.datasette.io/) which picks up your existing Codex subscription and uses it to run prompts!
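Out of curiosity about how a plugin like this works, here's a minimal sketch of the approach: read the token the Codex CLI has already stored on disk, then build an authenticated request against the Codex backend endpoint. The auth file path (`~/.codex/auth.json`), the JSON field names, and the endpoint host are all assumptions for illustration here, not documented facts; check the actual plugin source for the real details.

```python
import json
import urllib.request
from pathlib import Path

# Assumed location and layout of the Codex CLI credential file:
# a JSON document with an OAuth access token under tokens.access_token.
AUTH_PATH = Path.home() / ".codex" / "auth.json"
# Assumed host for the semi-official endpoint mentioned in Jeremy's tweet.
ENDPOINT = "https://chatgpt.com/backend-api/codex/responses"


def load_access_token(auth_path: Path = AUTH_PATH) -> str:
    """Read the Codex CLI credential file and return the access token."""
    data = json.loads(auth_path.read_text())
    return data["tokens"]["access_token"]


def build_request(prompt: str, token: str) -> urllib.request.Request:
    """Construct (but do not send) a Responses-style API request."""
    body = json.dumps({
        "model": "gpt-5.5",
        "input": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

The interesting part is that there is no separate login step at all: the plugin simply piggybacks on whatever credentials the Codex CLI last wrote to disk, which is why step one of the instructions below is to log in with Codex itself.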

(With hindsight I wish I’d used GPT-5.4 or the GPT-5.5 preview, it would have been funnier. I genuinely considered rewriting the project from scratch using Codex and GPT-5.5 for the sake of the joke, but decided not to spend any more time on this!)

Here’s how to use it:

1.   Install Codex CLI, buy an OpenAI plan, and log in to Codex
2.   Install LLM: `uv tool install llm`
3.   Install the new plugin: `llm install llm-openai-via-codex`
4.   Start prompting: `llm -m openai-codex/gpt-5.5 'Your prompt goes here'`

All existing LLM features should also work—use `-a filepath.jpg/URL` to attach an image, `llm chat -m openai-codex/gpt-5.5` to start an ongoing chat, `llm logs` to view logged conversations and `llm --tool ...` to [try it out with tool support](https://llm.datasette.io/en/stable/tools.html).

#### And some pelicans

Let’s generate a pelican!

```shell
llm install llm-openai-via-codex
llm -m openai-codex/gpt-5.5 'Generate an SVG of a pelican riding a bicycle'
```

Here’s [what I got back](https://gist.github.com/simonw/edda1d98f7ba07fd95eeff473cb16634):

![Image 1: It is a bit mangled to be honest - good beak, pelican body shapes are slightly weird, legs do at least extend to the pedals, bicycle frame is not quite right.](https://static.simonwillison.net/static/2026/gpt-5.5-pelican.png)

I’ve seen better [from GPT-5.4](https://simonwillison.net/2026/Mar/17/mini-and-nano/#pelicans), so I tacked on `-o reasoning_effort xhigh` and [tried again](https://gist.github.com/simonw/a6168e4165a258e4d664aeae8e602cc5):

That one took almost four minutes to generate, but I think it’s a much better effort.

![Image 2: Pelican has gradients now, body is much better put together, bicycle is nearly the right shape albeit with one extra bar between pedals and front wheel, clearly a better image overall.](https://static.simonwillison.net/static/2026/gpt-5.5-pelican-xhigh.png)

If you compare the SVG code ([default](https://gist.github.com/simonw/edda1d98f7ba07fd95eeff473cb16634#response), [xhigh](https://gist.github.com/simonw/a6168e4165a258e4d664aeae8e602cc5#response)) the `xhigh` one took a very different approach, which is much more CSS-heavy—as demonstrated by those gradients. `xhigh` used 9,322 reasoning tokens where the default used just 39.

#### A few more notes on GPT-5.5

One of the most notable things about GPT-5.5 is the pricing. Once it goes live in the API it’s [going to be priced](https://openai.com/index/introducing-gpt-5-5/#availability-and-pricing) at _twice_ the cost of GPT-5.4: $5 per 1M input tokens and $30 per 1M output tokens, compared to $2.50 and $15 for 5.4.

GPT-5.5 Pro will be even more: $30 per 1M input tokens and $180 per 1M output tokens.
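To put those rates in perspective, here's a quick back-of-the-envelope calculation using the token counts from the pelican runs above (treating reasoning tokens as billed output tokens, which is how OpenAI bills reasoning models):

```python
# Announced GPT-5.5 API pricing, in dollars per 1M tokens.
INPUT_RATE = 5.00
OUTPUT_RATE = 30.00


def cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of a single request at GPT-5.5 rates."""
    return (input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE) / 1_000_000


# The xhigh pelican run burned 9,322 reasoning tokens, versus 39 for the
# default run: roughly 28 cents of output tokens against about a tenth
# of a cent, before counting the SVG itself or the input prompt.
xhigh_reasoning = cost(0, 9_322)   # ≈ $0.2797
default_reasoning = cost(0, 39)    # ≈ $0.0012
```

So even at the doubled prices, individual prompts stay cheap; it's sustained agentic work at high reasoning effort where the difference from 5.4 would add up.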

GPT-5.4 will remain available at half the price of 5.5, which makes the 5.4/5.5 split feel a lot like Claude Sonnet versus Claude Opus.

Ethan Mollick has a [detailed review of GPT-5.5](https://www.oneusefulthing.org/p/sign-of-the-future-gpt-55) where he put it (and GPT-5.5 Pro) through an array of interesting challenges. His verdict: the jagged frontier continues to hold, with GPT-5.5 excellent at some things and challenged by others in a way that remains difficult to predict.



This is **A pelican for GPT-5.5 via the semi-official Codex backdoor API** by Simon Willison, posted on [23rd April 2026](http://simonwillison.net/2026/Apr/23/).

