1. Why it’s important to take the locale from the platform instead of asking the user every time
If you approach localization the “old‑fashioned” way, the usual logic is: show a “Choose language” modal and store the result in localStorage. In ChatGPT Apps the approach is different: we already have a smart platform that generously provides signals about the user’s language and region. You need to learn to use them and avoid bothering the user with unnecessary questions.
ChatGPT adds to your App context on every request:
- the user’s preferred locale (language + region) — in the openai/locale field / _meta["openai/locale"];
- the user’s geolocation/region — in the _meta["openai/userLocation"] field.
On the widget side (frontend) you get the locale via window.openai or an SDK hook; on the MCP/backend side — via _meta in the MCP request.
As a result, a normal scenario looks like this: a user writes “Pick a gift for mom within €50.” ChatGPT already knows their locale and userLocation, the platform passes these signals to your App, and you:
- render the UI in a language they understand,
- load the correct catalog language,
- format prices in the right currency and format.
Without a separate dialogue like “By the way, what’s your language?”
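As a tiny illustration of the last point, the same amount renders very differently depending on the locale the platform hands you. The built-in Intl API (available in Node and every modern browser) does all the work:

```typescript
// Format the same price for two different platform locales.
// No i18n libraries needed — Intl ships with the runtime.
function formatPrice(amount: number, locale: string, currency: string): string {
  return new Intl.NumberFormat(locale, { style: "currency", currency }).format(amount);
}

console.log(formatPrice(50, "en-US", "EUR")); // e.g. "€50.00"
console.log(formatPrice(50, "de-DE", "EUR")); // e.g. "50,00 €" — comma and trailing symbol
```

The function names here are our own; the point is that a single BCP-47 string is enough to drive correct number formatting.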
2. Signal #1: openai/locale — the user’s language and region
What this field is and how it looks
openai/locale is a BCP‑47 string you’ve likely seen: "en", "en-US", "ru", "ru-RU", "uk-UA", etc.
Important points about the platform:
- it may send just a language ("en", "ru"),
- or a language + region ("en-US", "en-GB", "fr-CA").
BCP‑47 is a standard that works great with the browser’s Intl API and most i18n libraries. This means you can pass openai/locale almost directly into Intl.NumberFormat, your translation engine, and inside your tools.
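For instance, the built-in Intl.Locale class can split a BCP-47 tag into its language and region subtags, which is handy for catalog lookups later on (a sketch with our own helper name, not SDK code):

```typescript
// Split a BCP-47 tag into language and (optional) region subtags.
function parseBcp47(tag: string): { language: string; region?: string } {
  const loc = new Intl.Locale(tag);
  return { language: loc.language, region: loc.region };
}

console.log(parseBcp47("en-GB")); // { language: "en", region: "GB" }
console.log(parseBcp47("ru"));    // { language: "ru", region: undefined }
```

This also quietly normalizes casing for you, so `en-gb` and `en-GB` yield the same result.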
Where the locale is available in the widget
In a custom UI rendered inside ChatGPT, the Apps SDK provides a global window.openai object that includes locale.
Typically it looks like this (TypeScript, Next.js 16, our GiftGenius widget):
// src/app/widgets/gift-widget.tsx
declare global {
  interface Window {
    openai?: { locale?: string };
  }
}

function getOpenAiLocale(): string {
  // During SSR there is no window, so fall back to a default
  if (typeof window === "undefined") return "en";
  return window.openai?.locale || "en";
}
In a real app it’s easier to create a hook that works both in the ChatGPT sandbox and in Storybook:
// src/app/hooks/useOpenAiLocale.ts
import { useEffect, useState } from "react";

export function useOpenAiLocale(defaultLocale: string = "en") {
  const [locale, setLocale] = useState(defaultLocale);

  useEffect(() => {
    if (typeof window === "undefined") return;
    const next = window.openai?.locale || defaultLocale;
    setLocale(next);
  }, [defaultLocale]);

  return locale;
}
Now in any component:
import { useOpenAiLocale } from "../hooks/useOpenAiLocale";

export function GiftHeader() {
  const locale = useOpenAiLocale();
  return (
    <h2>
      {/* later we’ll put t('titles.gift_search') here */}
      {locale.startsWith("ru") ? "Подбор подарков" : "Gift search"}
    </h2>
  );
}
In lecture 4 we’ll neatly move all strings into dictionaries, but even now we’ve bound the UI to a real signal from the platform, not to a random navigator.language. This hook is narrowly specialized; in a real project it’s convenient to build it on top of a more general mechanism for accessing ChatGPT globals — we’ll return to it in a separate section below.
Where the locale is available in MCP/backend
When ChatGPT calls an MCP tool, the SDK passes _meta["openai/locale"] in the JSON‑RPC request. On a TypeScript server (our GiftGenius MCP) this is usually available in the tool handler’s second argument.
Example:
// src/mcp/server.ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "gift-genius", version: "1.0.0" });

server.registerTool(
  "suggest_gifts",
  {
    title: "Gift selection",
    description: "Suggests a list of gifts based on preferences",
    inputSchema: {
      recipient: z.string(),
      budget: z.number()
    }
  },
  async (_input, extra) => {
    const locale = (extra?._meta?.["openai/locale"] as string) || "en";
    // then you can load the correct catalog
    const gifts = await loadGiftCatalog(locale);
    // ...
    return {
      content: [
        {
          type: "text",
          text: `Found ${gifts.length} gifts for locale ${locale}`
        }
      ],
      structuredContent: { gifts }
    };
  }
);
Thus the locale flows through the entire stack: ChatGPT → Apps SDK → your MCP server.
Insight
Every MCP tool handler on the server receives an extra parameter, where the MCP server places all the data that didn’t fit into the inputSchema. Here’s an example of such an object:
{
  sessionId: undefined, // always undefined, use `openai/subject` below
  _meta: {
    'openai/userAgent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/143.0.0.0 Safari/537.36',
    'openai/locale': 'en-US', // user’s computer locale, may not match the chat language
    'openai/userLocation': { // fairly accurate user location
      city: 'London',
      region: 'London City',
      country: 'GB',
      timezone: 'Europe/London',
      latitude: '5.45466',
      longitude: '-0.52380'
    },
    timezone_offset_minutes: -240, // timezone offset
    'openai/subject': 'v1/sEtRuS92UEOPNdwzEUZORfeOKf7XSk2KZoIUGfAsb68BzZ8h5FAOgrH' // this is the sessionId
  },
  authInfo: undefined,
  requestId: 1,
  requestInfo: {
    headers: {
      accept: 'application/json, text/event-stream',
      'accept-encoding': 'gzip, deflate, br, zstd',
      'access-control-allow-headers': '*',
      'access-control-allow-methods': 'GET,POST,PUT,DELETE,OPTIONS',
      'access-control-allow-origin': '*',
      'content-length': '542',
      'content-type': 'application/json',
      host: 'test.ngrok.app', // origin domain of the app
      'mcp-protocol-version': '2025-11-25',
      traceparent: '00-69399d3a000000004fb8cc13dc3a2203-8748a8698107eb34-00',
      tracestate: 'dd=s:-1;p:01514e334c1ccef5;t.dm:-3',
      'user-agent': 'openai-mcp/1.0.0',
      'x-datadog-parent-id': '6089244476286233754',
      'x-datadog-sampling-priority': '-1',
      'x-datadog-tags': '_dd.p.tid=69399c3a00000000,_dd.p.dm=-3',
      'x-datadog-trace-id': '5744565710382309891',
      'x-forwarded-for': '199.210.139.232',
      'x-forwarded-host': 'test.ngrok.app',
      'x-forwarded-port': '3001',
      'x-forwarded-proto': 'https'
    }
  },
}
It’s possible some headers here were set by ngrok, but there’s still a lot of interesting data.
3. Signal #2: _meta["openai/userLocation"] — the user’s geography
Structure and meaning
_meta["openai/userLocation"] is an object with geo information: country, region, city, time zone, and even coordinates. Roughly like this:
{
  "city": "London",
  "region": "England",
  "country": "GB",
  "timezone": "Europe/London",
  "latitude": 51.5074,
  "longitude": -0.1278
}
The main fields you’ll actually use in GiftGenius:
- country — a two-letter ISO country code, critical for assortment and currency;
- timezone — useful for date/time formats and reminders.
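The timezone field plugs straight into Intl as well — for example, to show a delivery estimate in the user's local time. A sketch with our own function name, where the timeZone argument would come from userLocation:

```typescript
// Format an absolute timestamp in the user's reported time zone.
function formatDeliveryDate(isoDate: string, locale: string, timeZone: string): string {
  return new Intl.DateTimeFormat(locale, {
    dateStyle: "medium",
    timeStyle: "short",
    timeZone // e.g. "Europe/London" from userLocation
  }).format(new Date(isoDate));
}

console.log(formatDeliveryDate("2025-01-15T12:00:00Z", "en-GB", "Europe/London"));
// e.g. "15 Jan 2025, 12:00"
```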
Insight
Experimentally verified: userLocation detection works very well. The data arrives with every MCP tool call in the extra._meta["openai/userLocation"] parameter, and you can rely on it when building your apps.
How to use userLocation in MCP tools
On the MCP server, userLocation lives in _meta["openai/userLocation"] next to _meta["openai/locale"].
Let’s expand our tool example:
server.registerTool(
  "suggest_gifts",
  { /* schema as above */ },
  async (_input, extra) => {
    const meta = extra?._meta ?? {};
    const locale = (meta["openai/locale"] as string) || "en";
    const userLocation = meta["openai/userLocation"] as
      | { country?: string; city?: string }
      | undefined;
    const country = userLocation?.country || "US";

    const gifts = await loadGiftCatalog(locale, country);

    return {
      content: [
        {
          type: "text",
          text: `Found ${gifts.length} gifts for locale=${locale}, country=${country}`
        }
      ],
      structuredContent: { gifts }
    };
  }
);
The loadGiftCatalog(locale, country) function can already:
- pick the right JSON file: gift_catalog.en-US.json, gift_catalog.ru-RU.json,
- filter out items that can’t be shipped to this country,
- select a base currency.
A bit later in the commerce modules you’ll choose tax rules based on country and map to the right SKUs, but architecturally you still rely on the same signal — country.
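The file-picking step above is a small fallback chain worth making explicit: try the full tag, then the bare language, then the default. A hypothetical sketch (the catalog file names and helper are our own, not part of any SDK):

```typescript
// Hypothetical set of catalog files we ship, e.g. gift_catalog.ru-RU.json
const AVAILABLE_CATALOGS = ["en-US", "en", "ru-RU", "ru"];

// Resolve the best catalog key for a locale:
// exact match → bare language → global default "en".
function resolveCatalogLocale(
  locale: string,
  available: string[] = AVAILABLE_CATALOGS
): string {
  if (available.includes(locale)) return locale;
  const language = locale.split("-")[0];
  if (available.includes(language)) return language;
  return "en";
}

console.log(resolveCatalogLocale("ru-RU")); // "ru-RU"
console.log(resolveCatalogLocale("ru-BY")); // "ru" — language fallback
console.log(resolveCatalogLocale("fr-FR")); // "en" — global default
```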
How userLocation complements locale
A classic example:
locale = "en", userLocation.country = "DE".
Logic might be:
- UI and prompts — in English (respect the locale);
- currency and price format — euros, because the user is physically in Germany;
- gift list — only items that ship to DE.
In GiftGenius this can be expressed with a small helper function:
export function deriveCurrency(locale: string, country?: string): string {
  // country is the stronger signal for currency; locale only helps when geo is missing
  if (country === "DE") return "EUR";
  if (country === "JP") return "JPY";
  if (locale === "zh-CN") return "CNY"; // BCP-47 uses a hyphen, not an underscore
  return "USD";
}
And used on the backend/frontend to format prices:
const currency = deriveCurrency(locale, country);
const formatted = new Intl.NumberFormat(locale, {
  style: "currency",
  currency
}).format(price);
On the backend we’ve already learned to use locale and country to pick the catalog and currency. Next it’s important to carefully pass the same signals to the UI in the widget so the user sees texts and prices in the expected format.
4. How to get locale and userLocation in the GiftGenius widget
We’ve already seen how locale and userLocation live on the MCP side and influence catalogs and currency. Now let’s figure out how to pull locale into the GiftGenius widget and use it directly in the React UI.
Important: in the widget we only have direct access to locale (via window.openai and SDK hooks). userLocation lives in _meta and is used on the MCP/backend side — we’ve already worked with it above.
In addition to the “raw” window.openai, the Apps SDK offers utilities in the form of React hooks. Documentation describes hooks like useOpenAiGlobal("locale") that pull ChatGPT’s global context values into React components.
Let’s model such a hook ourselves to understand what happens under the hood.
Basic hook useOpenAiGlobal
Earlier we made a specialized useOpenAiLocale. In practice it’s more convenient to have one universal hook for ChatGPT globals — on top of it you can easily build useOpenAiLocale and other wrappers. Let’s imagine such a hook:
// src/app/hooks/useOpenAiGlobal.ts
import { useEffect, useState } from "react";

type OpenAiGlobals = {
  locale?: string;
  // later we can add theme, userAgent, etc.
};

export function useOpenAiGlobal<K extends keyof OpenAiGlobals>(
  key: K,
  fallback?: NonNullable<OpenAiGlobals[K]>
): NonNullable<OpenAiGlobals[K]> {
  const [value, setValue] = useState<NonNullable<OpenAiGlobals[K]>>(
    (fallback ?? "") as NonNullable<OpenAiGlobals[K]>
  );

  useEffect(() => {
    if (typeof window === "undefined") return;
    const globals = (window.openai || {}) as OpenAiGlobals;
    const next = globals[key] ?? fallback;
    if (next !== undefined) {
      setValue(next as NonNullable<OpenAiGlobals[K]>);
    }
  }, [key, fallback]);

  return value;
}
Now useOpenAiGlobal("locale", "en") gives us the current locale with the default value "en".
Using it in the GiftGenius widget
Let’s make a small component that shows a localized greeting and the current locale for debugging:
// src/app/widgets/GiftWelcome.tsx
"use client";

import React from "react";
import { useOpenAiGlobal } from "../hooks/useOpenAiGlobal";

export function GiftWelcome() {
  const locale = useOpenAiGlobal("locale", "en");

  const greeting =
    locale.startsWith("ru") || locale.startsWith("uk")
      ? "Привет! Помогу выбрать подарок." // "Hi! I’ll help you choose a gift."
      : "Hi! I’ll help you find a great gift.";

  return (
    <div>
      <p>{greeting}</p>
      <small style={{ opacity: 0.6 }}>Debug locale: {locale}</small>
    </div>
  );
}
No dictionaries or i18n libraries yet — that’s coming later. What matters now is that we already know how to honestly take the language from ChatGPT, not from random guesses.
5. When you should explicitly ask the user about the language
If openai/locale and userLocation are so good, can we avoid ever asking the user which language they want? Unfortunately, sometimes you still need to.
When signals aren’t enough
There are a few typical situations:
- The ChatGPT account is English (locale = "en"), but the user writes in Russian. The model replies in Russian, but your UI is in English.
- The user is in Germany (userLocation.country = "DE"), locale = "en", and you can offer both German and English interfaces.
- The app is sensitive to the language of communication: psychotherapy, legal advice, education. There, the precision of understanding outweighs the comfort of autodetection.
In such cases it’s appropriate to ask a short, polite question once at the start of the flow and then remember the choice.
How to ask about language unobtrusively
Usually the copy is as simple and visual as possible, for example:
- “Which language is better for you: English or Russian?”
- “We detected your language as English. Do you want to switch to another?”
In a ChatGPT App you can do it in two ways:
- Through the widget UI: render a small language switcher at the top.
- Through a follow‑up chat message from the App: send a textual follow‑up question and then process the reply.
Code: a simple language choice in GiftGenius
Let’s build a switcher component that:
- takes the starting language from locale,
- lets the user pick ru or en,
- stores the choice in widget state (for now just React state).
// src/app/widgets/LanguageSwitcher.tsx
"use client";

import React, { useState, useEffect } from "react";
import { useOpenAiGlobal } from "../hooks/useOpenAiGlobal";

type SupportedLocale = "en" | "ru";

export function LanguageSwitcher({
  onChange
}: {
  onChange?: (locale: SupportedLocale) => void;
}) {
  const initialLocale = useOpenAiGlobal("locale", "en");
  const [locale, setLocale] = useState<SupportedLocale>("en");

  useEffect(() => {
    // Normalize the platform locale to one of the supported options.
    // Depending on onChange (not the whole props object) avoids re-running
    // the effect on every render.
    const normalized: SupportedLocale = initialLocale.startsWith("ru")
      ? "ru"
      : "en";
    setLocale(normalized);
    onChange?.(normalized);
  }, [initialLocale, onChange]);

  const handleChange = (next: SupportedLocale) => {
    setLocale(next);
    onChange?.(next);
  };

  return (
    <div style={{ marginBottom: 8 }}>
      <span style={{ marginRight: 8 }}>
        {locale === "ru" ? "Язык:" : "Language:"}
      </span>
      <button
        type="button"
        onClick={() => handleChange("en")}
        style={{ fontWeight: locale === "en" ? "bold" : "normal" }}
      >
        EN
      </button>
      <button
        type="button"
        onClick={() => handleChange("ru")}
        style={{ fontWeight: locale === "ru" ? "bold" : "normal", marginLeft: 4 }}
      >
        RU
      </button>
    </div>
  );
}
And in the main GiftGenius widget you can already choose texts/dictionary by selectedLocale, not by the “raw” data from ChatGPT.
In future lectures you’ll replace local state with more durable storage (for example, pass the selected language to MCP / Gateway via _meta["openai/subject"]), but the pattern remains the same.
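If the widget needs to tell the backend about the change right away, one option is to pass the selected language as an explicit tool argument on the next call. The sketch below injects the call function instead of reaching for window.openai directly, so it also runs in Storybook and tests; the tool name and argument shape are our own example, not SDK API:

```typescript
// A function with the shape of the Apps SDK tool-call bridge.
type CallTool = (name: string, args: Record<string, unknown>) => Promise<unknown>;

// Push the newly selected language to the backend as a tool argument.
// Inside ChatGPT you would pass window.openai's call function here;
// in Storybook or tests, pass a stub or undefined.
async function notifyLocaleChange(
  callTool: CallTool | undefined,
  locale: "en" | "ru"
): Promise<"sent" | "skipped"> {
  if (!callTool) return "skipped"; // e.g. outside the ChatGPT sandbox
  await callTool("suggest_gifts", { recipient: "mom", budget: 50, locale });
  return "sent";
}
```

The injection also makes the fallback path ("skipped" outside ChatGPT) trivially testable.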
6. How to pass locale and userLocation to the backend and persist them
Signals come “from above” via ChatGPT, but life doesn’t end there. You need to pass this data to your tools and services, not lose it along the way, and not force the model to guess the language again.
An explicit locale field in the tools’ arguments
The most reliable approach is to add locale (and optionally country) as separate fields in the tool’s inputSchema. This gives the model an explicit signal: “fill this field.”
server.registerTool(
  "suggest_gifts",
  {
    title: "Gift suggestions",
    description: "Suggest gifts based on recipient and budget",
    inputSchema: {
      recipient: z.string(),
      budget: z.number(),
      locale: z
        .string()
        .optional()
        .describe("Current user UI locale, BCP-47 (e.g. en-US, fr-FR)"),
      country: z
        .string()
        .optional()
        .describe("ISO country code (e.g. US, DE)")
    }
  },
  async (input, extra) => {
    // If the model didn’t fill locale/country, backstop from _meta:
    const meta = extra?._meta ?? {};
    const locale = input.locale || (meta["openai/locale"] as string) || "en";
    const country =
      input.country ||
      (meta["openai/userLocation"] as { country?: string } | undefined)?.country ||
      "US";
    // ...
  }
);
This reduces “magic” inside the server: it clearly sees the arguments the model intends to use.
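The backstop logic itself is just a pair of pure functions you can unit-test outside the MCP server entirely (the names here are our own):

```typescript
type Meta = Record<string, unknown>;

// Prefer an explicit tool argument, then the platform signal, then a default.
function resolveLocale(argLocale: string | undefined, meta: Meta): string {
  return argLocale || (meta["openai/locale"] as string) || "en";
}

function resolveCountry(argCountry: string | undefined, meta: Meta): string {
  const loc = meta["openai/userLocation"] as { country?: string } | undefined;
  return argCountry || loc?.country || "US";
}

console.log(resolveLocale(undefined, { "openai/locale": "de-DE" })); // "de-DE"
console.log(resolveCountry(undefined, { "openai/userLocation": { country: "DE" } })); // "DE"
```

Keeping the precedence rules out of the handler makes it obvious which source wins when the model and the platform disagree.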
Persisting locale at the session/user level
In an MCP Gateway architecture (future modules) it’s common to store “client state”: locale, currency, preferences. For now, just grasp the idea: read ChatGPT signals once — then use them as part of session state, instead of recalculating them every time.
Pseudocode:
// gateway.ts
const sessionState = new Map<string, { locale: string; country?: string }>();

function onMcpRequest(request: any) {
  const subject = request._meta?.["openai/subject"]; // anonymous user id
  const locale = request._meta?.["openai/locale"] || "en";
  const country = request._meta?.["openai/userLocation"]?.country;

  if (subject) {
    sessionState.set(subject, { locale, country });
  }

  // then pass locale/country to the specific MCP server
}
Within this lecture you don’t need to implement a Gateway; it’s enough to understand that locale and userLocation are good candidates for such “session state.”
Insight
Experimental data: request._meta?.["openai/locale"] shows the user’s currently set locale. The language of communication can be obtained as a tool parameter via inputSchema.
I set EN locale on my computer and chatted with ChatGPT in German (DE). As a result:
- request._meta?.["openai/locale"] was EN
- locale obtained as a tool parameter via inputSchema was DE
7. Locale vs auto‑detecting language from text
Sometimes developers are tempted to say: “Let’s just detect the language from the user’s text — the LLM can do everything.” In practice that’s almost always worse than relying on openai/locale.
Reasons are fairly down to earth:
- the user may write a mix of languages;
- subtle distinctions (uk-UA vs ru-RU) are hard to detect from a single message;
- ChatGPT has already done this work for you and sent a locale.
Auto‑detect is useful as a fallback if openai/locale arrives in a strange form or is missing (which is rare now), but it shouldn’t be the core logic. The rough rule:
- first treat openai/locale as the source of truth;
- then factor in userLocation (currency, assortment);
- and only in very ambiguous cases additionally look at the language of the last message.
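The priority order above fits in a few lines (a sketch; the text-detection input would come from whatever detector you choose):

```typescript
// Combine the signals in the priority order described above:
// platform locale first, text detection only as a fallback, then a default.
function pickUiLanguage(
  openaiLocale: string | undefined,
  detectedFromText: string | undefined
): string {
  if (openaiLocale) return openaiLocale;         // 1. source of truth
  if (detectedFromText) return detectedFromText; // 2. fallback signal
  return "en";                                   // 3. last resort
}

console.log(pickUiLanguage("en-US", "ru")); // "en-US" — platform wins
```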
8. Different combinations of locale and userLocation: scenario table
To reinforce the concept, let’s see how GiftGenius should behave in different scenarios.
| Scenario | locale | userLocation.country | UI language | Currency | Catalog |
|---|---|---|---|---|---|
| 1 | en-US | US | EN | USD | US goods |
| 2 | uk-UA / ru-RU | UA | UKR/RU | UAH | UA goods |
| 3 | en | DE | EN | EUR | DE goods |
| 4 | ru-RU | DE | RU | EUR | DE goods |
| 5 | en | (no data) | EN | USD | Global default |
This perspective will be useful later when we discuss commerce, but even now it’s clear how easily behavior changes by simply supplying different locale and country.
9. A small diagram of the locale signal flow
To collect our thoughts, here’s a simplified scheme:
flowchart TD
  U[User<br/>writes a message] --> C[ChatGPT]
  C -->|determines| L[openai/locale<br/>+ userLocation]
  L -->|passes| W["Widget (Next.js)"]
  L -->|passes via _meta| S[MCP Server]
  W -->|locale| UI[GiftGenius UI<br/>texts + number formatting]
  S -->|locale + country| DATA[Catalogs, prices, filters]
  style L fill:#e0f7ff,stroke:#00a
  style W fill:#f7fff0,stroke:#4b4
  style S fill:#fdf0ff,stroke:#b4
Note: there’s no “Choose language” modal in this scheme. You need it only as an extra layer when signals contradict the user’s expectations.
10. Practice: what you can do right now in your App
So this lecture doesn’t stay theoretical, here’s a short practical checklist for GiftGenius:
- In the widget: add a useOpenAiGlobal("locale") hook (or equivalent) and branch RU/EN for text in at least one place.
- In the MCP server: in one of the existing tools (suggest_gifts) read _meta["openai/locale"] and _meta["openai/userLocation"], log them, and use them to select a catalog.
- Write a simple deriveCurrency(locale, country) function and use it in one place when formatting a price.
No need to build a full i18n engine and 15 languages right away — our goal now is to learn to properly use the platform’s signals.
11. Common mistakes when working with locale and userLocation
Mistake #1: completely ignoring openai/locale and relying only on navigator.language.
This is what people used to regular web apps do. Inside ChatGPT the widget runs in a sandboxed iframe, so navigator.language may reflect the sandbox environment rather than the user, and during server-side rendering navigator doesn’t exist at all. As a result the UI is mysteriously always in English, even though ChatGPT consistently sends you ru-RU.
Mistake #2: asking “what language do you prefer?” every time.
If the first widget reply in every chat is a language survey, users start feeling like they’re at an airport where they’re asked five times in a row whether they forgot their luggage. The platform already knows language and region — just respect openai/locale and ask only in case of clear conflict (for example, a Russian request while locale = "en").
Mistake #3: storing the chosen language only in the UI and not passing it to MCP tools.
The widget can be in Russian, while the server keeps returning the English catalog because it doesn’t know about the language change. Always think end‑to‑end: if the UI has a switcher, you need to pass its result to the backend — either as tool arguments or via a Gateway session.
Mistake #4: trying to “guess” the language only from message text while ignoring openai/locale.
Auto‑detect by text can work fine… as long as the user writes clean English. Once mixed languages or similar phrases appear, the result will fluctuate. openai/locale is an already prepared, fairly reliable assessment provided by the platform. Treat it as the primary source of truth, and text detection as just an additional signal.
Mistake #5: mixing business logic and localization like if (locale === 'ru') { ... } all over the code.
In this lecture we still do a little of that for simplicity, but it’s important to plan early that strings, formats, and catalogs must be separated from business logic. Otherwise in a couple of months you’ll end up with code where every function starts with if (locale.startsWith("ru")), and adding one more language will be painful. In lecture 44 we’ll fix exactly this problem, keeping in mind that we already have a source of locale and know how to use it.