openai_tokenize() and is
useful for debugging tokenization or reconstructing text from tokens.
Samples
Detokenize tokens
Convert token IDs back into text:
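A minimal sketch, assuming the function is exposed in SQL as openai_detokenize(model, tokens) per the Arguments table below; the token IDs used here are arbitrary illustrative values, not real tokens for this model:

```sql
-- Reconstruct text from an array of token IDs.
-- The token IDs below are placeholders for illustration only.
SELECT openai_detokenize(
    'text-embedding-ada-002',
    array[1820, 25977, 46840, 23874, 389]
) AS reconstructed_text;
```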
Round-trip tokenization
Verify tokenization is reversible:
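A sketch of a round trip, assuming openai_tokenize(model, text) returns an INT[] of token IDs and openai_detokenize(model, tokens) returns TEXT; the input string is arbitrary:

```sql
-- Tokenize a string, then detokenize the result;
-- the output should match the original input text.
SELECT openai_detokenize(
    'gpt-4o',
    openai_tokenize('gpt-4o', 'the quick brown fox')
) AS round_trip;
```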
Arguments
| Name | Type | Default | Required | Description |
|---|---|---|---|---|
| model | TEXT | - | ✔ | The OpenAI model to detokenize for (e.g., text-embedding-ada-002, gpt-4o) |
| tokens | INT[] | - | ✔ | Array of token IDs to convert back into text |
Returns
TEXT: The reconstructed text from the token IDs.
Related functions
openai_tokenize(): convert text into tokens
openai_embed(): generate embeddings from tokens