Expo + AI Chat Stack
The AI chat app stack: Expo, OpenAI via NestJS proxy, streaming responses, per-user quotas. Ships a real AI chat app without burning your API credit.
Starter Kit
Expo + OpenAI + NestJS proxy ships an AI chat app with streaming and quota controls that keep your OpenAI bill from exploding.
AI chat assistants, coach apps, tutor apps, anything where the value is a conversation with an LLM and a bit of state around it. Native iOS and Android, streaming responses, real quota handling so you do not get destroyed by one viral share.
Use the OpenAI Responses API (or chat.completions for older setups) with streaming enabled. Never put the OpenAI key in the mobile bundle. Every call goes through your backend. This is not optional: otherwise your credit will be burned within hours of a public launch.
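The proxy idea above can be sketched in a few lines. This is an illustrative sketch, not the kit's actual code: the function name `buildUpstreamRequest` and the model choice are assumptions. The point it demonstrates is that the API key only ever appears server-side, read from backend env vars, while the client sends messages plus its own session token.

```typescript
// Sketch: the backend builds the OpenAI request; the mobile app never
// holds OPENAI_API_KEY. Names here are illustrative, not from the kit.

type ChatMessage = { role: "user" | "assistant" | "system"; content: string };

// The server reads the key from its own environment (e.g. process.env)
// and attaches it here. The client only ever talks to /ai/chat.
function buildUpstreamRequest(messages: ChatMessage[], apiKey: string) {
  return {
    url: "https://api.openai.com/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`, // server-side secret only
      },
      body: JSON.stringify({
        model: "gpt-4o-mini", // illustrative model choice
        stream: true, // ask OpenAI for SSE token chunks
        messages,
      }),
    },
  };
}
```

Inside a NestJS controller you would call this from the `/ai/chat` handler after verifying the user's own auth token, then pipe the upstream response body back to the client.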
The /ai/chat endpoint with streaming: ~6 hours. About 26 hours total.
React Native's fetch supports streaming via ReadableStream on recent versions; older apps can use react-native-sse. AI App Factory ships a useAiStream hook that handles both and exposes { tokens, isStreaming, error } as state you can render directly.
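Whichever transport the hook picks, the core work is the same: parse SSE frames into token deltas. The sketch below shows that parsing step under the assumption that the backend relays OpenAI-style frames (`data: {"choices":[{"delta":{"content":"…"}}]}`); the function name `extractToken` is illustrative, not the hook's real internals.

```typescript
// Illustrative SSE-frame parser, the kind of logic a streaming hook
// wraps. Assumes OpenAI chat.completions-style frames are relayed as-is.

function extractToken(sseLine: string): string | null {
  if (!sseLine.startsWith("data: ")) return null; // ignore comments/blank lines
  const payload = sseLine.slice("data: ".length).trim();
  if (payload === "[DONE]") return null; // stream terminator frame
  try {
    const parsed = JSON.parse(payload);
    // Each frame carries at most one token delta.
    return parsed.choices?.[0]?.delta?.content ?? null;
  } catch {
    return null; // partial frame split across chunks; wait for more bytes
  }
}
```

A hook would feed each decoded line through this, append non-null results to `tokens`, and flip `isStreaming` off when the `[DONE]` frame arrives.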
Without these you are one Hacker News front page away from a five-figure OpenAI bill.
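The quota guard the warning above refers to can be as small as a check-and-consume function run before the proxy forwards a request. This is a hedged sketch, not the kit's implementation: the `Quota` shape, the daily window, and the function name are all illustrative assumptions.

```typescript
// Sketch of a per-user quota guard with a rolling daily window.
// All names and limits are illustrative.

type Quota = { used: number; limit: number; resetAt: number };

function checkAndConsume(
  q: Quota,
  tokens: number,
  now: number
): { allowed: boolean; quota: Quota } {
  // Start a fresh window if the previous one has elapsed (daily here).
  const fresh: Quota =
    now >= q.resetAt ? { used: 0, limit: q.limit, resetAt: now + 86_400_000 } : q;
  if (fresh.used + tokens > fresh.limit) {
    return { allowed: false, quota: fresh }; // reject: over budget, reply 429
  }
  return { allowed: true, quota: { ...fresh, used: fresh.used + tokens } };
}
```

In practice the quota record lives in your database keyed by user id, and the proxy rejects with a 429 when `allowed` comes back false, so one viral share caps out at each user's limit instead of your whole OpenAI budget.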
Two pieces that move the needle:
The /ai/chat streaming endpoint.
The useAiStream hook.
One-time purchase. Full-stack boilerplate + 11 AI agents. No subscription.