Rails + OpenAI API — Build a Streaming Chat Interface with Turbo

Source: DEV Community
You've got Rails down. You've got Hotwire, Stimulus, and background jobs wired up. Now let's wire in actual AI. This post builds a working chat interface that streams OpenAI responses in real time using Turbo Streams. No JavaScript frameworks. No React. Just Rails doing what Rails does best.

What We're Building

A simple chat UI where users type messages and get AI responses streamed word by word. The kind of thing you'd see in ChatGPT, but built with Rails in under 100 lines of code.

Setup

Add the OpenAI gem to your Gemfile:

```ruby
gem 'ruby-openai'
```

```shell
bundle install
```

Set your API key:

```shell
export OPENAI_API_KEY=sk-...
```

Or use Rails credentials:

```shell
bin/rails credentials:edit
```

Add to the file:

```yaml
openai:
  api_key: sk-...
```

The Chat Model

We need to store messages. Keep it simple:

```shell
bin/rails g model Chat message:text response:text
bin/rails db:migrate
```

The Controller

Here's where the magic happens. We're using ruby-openai with streaming enabled, piping each chunk through Turbo Streams:

```ruby
class ChatsController < ApplicationController
  def
```
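The controller snippet above is truncated, so here is a minimal sketch of the streaming logic it describes, not the original author's code. It assumes ruby-openai's `stream:` option, which yields each server-sent chunk to a proc. The `broadcast` lambda and `build_stream_handler` name are placeholders for whatever Turbo Stream broadcast you use in the real controller (for example `Turbo::StreamsChannel.broadcast_update_to`):

```ruby
# Sketch of a handler you could pass as ruby-openai's `stream:` option.
# `broadcast` stands in for a Turbo Stream broadcast call; it and the
# helper name are illustrative, not from the original post.
def build_stream_handler(broadcast)
  buffer = +""                 # accumulates the full response so far
  proc do |chunk, _bytesize|   # ruby-openai yields each parsed SSE chunk
    delta = chunk.dig("choices", 0, "delta", "content")
    next if delta.nil?         # skip role/finish chunks that carry no text
    buffer << delta
    broadcast.call(buffer.dup) # push the updated text to the page
  end
end

# Usage with a stand-in broadcaster:
updates = []
handler = build_stream_handler(->(text) { updates << text })
handler.call({ "choices" => [{ "delta" => { "content" => "Hello" } }] }, 5)
handler.call({ "choices" => [{ "delta" => { "content" => " world" } }] }, 6)
# updates now holds ["Hello", "Hello world"]
```

In the actual action you would pass this proc as the `stream:` parameter inside `client.chat(parameters: { ... })`, so every chunk OpenAI sends re-renders the growing response in the user's browser.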