
Integrating AI into Legacy Rails Applications: A Practical Guide

Chileap Chhin
8 min read
#Ruby on Rails #AI Integration #Legacy Systems #Modernization


You've got a Rails app that's been running for years. It works. It makes money. But now everyone's talking about AI, and you're wondering: "Can I add AI to my legacy Rails app without rebuilding everything?"

The short answer: Yes, absolutely. I've done this at three different companies over the past two years, and I'm going to show you exactly how.

Why Legacy Rails Apps Are Perfect for AI Integration

Here's something most people don't talk about: your legacy Rails application already has the most valuable asset for AI—real data. Years of user behavior, business logic, and edge cases that new apps spend years trying to collect.

The beauty of Rails is its flexibility. You can gradually introduce AI features without disrupting your existing codebase. Here's how:

Strategy 1: Start with Service Objects

The easiest entry point is creating dedicated service objects for AI operations. This keeps your AI logic isolated and testable.

# app/services/ai/content_summarizer.rb
# Note: Zeitwerk autoloads app/services/ai as `Ai` by default; add an
# acronym inflection ("AI") in config/initializers/inflections.rb
module AI
  class ContentSummarizer
    def initialize(openai_client: OpenAI::Client.new)
      @client = openai_client
    end

    def summarize(text, max_words: 100)
      response = @client.chat(
        parameters: {
          model: "gpt-4",
          messages: [
            { role: "system", content: "You are a helpful assistant that summarizes content." },
            { role: "user", content: "Summarize this in #{max_words} words: #{text}" }
          ],
          temperature: 0.7
        }
      )

      response.dig("choices", 0, "message", "content")
    rescue StandardError => e
      Rails.logger.error("AI Summarization failed: #{e.message}")
      nil # Graceful degradation
    end
  end
end

Why this works: Service objects are a Rails pattern you already know. They're easy to test, cache, and replace if needed. Plus, they fail gracefully—if the AI service is down, your app keeps working.

Strategy 2: Background Jobs for Expensive Operations

AI operations can be slow. Don't block your users. Use Sidekiq or whatever job processor you're already using:

# app/jobs/ai_analysis_job.rb
class AiAnalysisJob < ApplicationJob
  queue_as :default

  def perform(document_id)
    document = Document.find(document_id)

    # Call your AI service
    analysis = AI::DocumentAnalyzer.new.analyze(document.content)

    # Store results (update! raises on failure so the job can retry)
    document.update!(
      ai_summary: analysis[:summary],
      ai_keywords: analysis[:keywords],
      analyzed_at: Time.current
    )

    # Notify user if needed
    DocumentAnalysisMailer.complete(document).deliver_later
  end
end

# In your controller
def create
  @document = current_user.documents.new(document_params)

  if @document.save
    # Only enqueue once the record is actually persisted
    AiAnalysisJob.perform_later(@document.id)
    redirect_to @document, notice: "Document uploaded. AI analysis in progress..."
  else
    render :new, status: :unprocessable_entity
  end
end

Strategy 3: Feature Flags for Gradual Rollout

Don't flip the AI switch for everyone at once. Use feature flags to test with small groups:

# With Flipper gem
if Flipper.enabled?(:ai_suggestions, current_user)
  @suggestions = AI::SuggestionEngine.new.generate_for(current_user)
end

This lets you measure impact, catch issues early, and roll back instantly if something goes wrong.
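Building on the `:ai_suggestions` gate above, a gradual rollout with Flipper might look like this (the widening schedule is just an illustration):

```ruby
# Enable for specific internal users first
Flipper.enable_actor(:ai_suggestions, internal_user)

# Then for a percentage of all users (deterministic per actor,
# so the same user consistently sees or doesn't see the feature)
Flipper.enable_percentage_of_actors(:ai_suggestions, 10)

# Widen as confidence grows
Flipper.enable_percentage_of_actors(:ai_suggestions, 50)

# Roll back instantly if something goes wrong
Flipper.disable(:ai_suggestions)
```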

Real-World Example: Adding AI Writing Assistant

Real result from production: a 7-year-old Rails CMS gained an AI-powered writing assistant in 4 weeks, with 30% more content created and zero downtime.

At one of my previous roles, we added an AI writing assistant to a 7-year-old Rails content management system. Here's the approach that worked:

Phase 1: API Integration (Week 1)

  • Created AI namespace for all AI-related code
  • Built service objects for OpenAI API calls
  • Added comprehensive error handling and logging
  • Deployed behind feature flag to internal team (~5 users)

Phase 2: UI/UX Changes (Weeks 2-3)

  • Added a simple "Enhance with AI" button to text editors
  • Used Stimulus.js for progressive enhancement
  • Showed loading states and handled timeouts gracefully
  • Beta tested with 50 power users

Phase 3: Optimization (Week 4)

  • Added Redis caching for similar prompts
  • Implemented rate limiting per user
  • Set up monitoring in Datadog
  • Full rollout to all users
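
The "rate limiting per user" step above can be sketched as a fixed-window counter. This is a simplified, hypothetical class using an in-memory Hash; in production the counter would live in Redis (e.g. via Rails.cache.increment) so it's shared across processes:

```ruby
# Allow at most `limit` AI calls per user within each `window` seconds.
class RateLimiter
  def initialize(limit: 20, window: 3600)
    @limit = limit
    @window = window
    @counts = Hash.new { |h, k| h[k] = [0, 0] } # user_id => [window_start, count]
  end

  def allow?(user_id, now: Time.now.to_i)
    start, count = @counts[user_id]
    if now - start >= @window
      @counts[user_id] = [now, 1] # window expired: start a fresh one
      true
    elsif count < @limit
      @counts[user_id] = [start, count + 1]
      true
    else
      false # over the limit: reject until the window rolls over
    end
  end
end
```

Wrap your AI service call in an `allow?` check and show the user a friendly "try again later" message when it returns false.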

Result: 30% increase in content creation, 0 downtime, and the legacy app kept running exactly as before for users who didn't use the feature.

Common Pitfalls to Avoid

1. Don't Make AI a Single Point of Failure

Always have a fallback. If the AI service is down, your app should still function. Use circuit breakers and graceful degradation.
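
The circuit-breaker idea fits in a few lines of plain Ruby. This `Breaker` class is a simplified sketch for illustration, not a production library; gems like circuitbox or stoplight handle the real-world details:

```ruby
# A minimal circuit breaker: after `threshold` consecutive failures,
# short-circuit calls for `cooldown` seconds and return the fallback.
class Breaker
  def initialize(threshold: 3, cooldown: 30)
    @threshold = threshold
    @cooldown = cooldown
    @failures = 0
    @opened_at = nil
  end

  def call(fallback:)
    # While the circuit is open, skip the AI call entirely
    return fallback if open?

    result = yield
    @failures = 0 # a success closes the circuit
    result
  rescue StandardError
    @failures += 1
    @opened_at = Time.now if @failures >= @threshold
    fallback
  end

  private

  def open?
    return false unless @opened_at

    if Time.now - @opened_at < @cooldown
      true
    else
      @opened_at = nil # cooldown elapsed: allow a trial call through
      @failures = 0
      false
    end
  end
end
```

Usage would look like `breaker.call(fallback: nil) { summarizer.summarize(text) }`: if the AI service keeps failing, users get the fallback instantly instead of waiting on a dead API.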

2. Watch Your Costs

AI API calls add up fast. Implement caching, rate limiting, and monitoring from day one. I learned this the hard way with a $3,000 surprise bill.
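
Caching identical prompts is the cheapest win. Here's the idea sketched with an in-memory Hash and a hypothetical client; in a real Rails app you'd use Rails.cache.fetch with a TTL instead:

```ruby
require "digest"

# Cache AI responses keyed by a hash of the prompt, so repeated
# identical requests never hit the paid API twice.
class CachedSummarizer
  def initialize(client)
    @client = client
    @cache = {} # stand-in for Rails.cache
  end

  def summarize(text)
    key = "ai:summary:#{Digest::SHA256.hexdigest(text)}"
    # In Rails: Rails.cache.fetch(key, expires_in: 1.day) { @client.call(text) }
    @cache[key] ||= @client.call(text)
  end
end
```

Pair this with usage alerts in your provider's dashboard and you'll never get the surprise bill.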

3. Handle Timeouts Properly

Set aggressive timeouts (5-10 seconds max for user-facing features). Users won't wait 30 seconds for an AI response.

@client = OpenAI::Client.new(
  request_timeout: 10 # seconds
)

4. Don't Over-Engineer

You don't need a microservices architecture or Kubernetes to add AI to Rails. Start simple. A service object and a background job will get you 90% of the way there.

5. "Won't AI Make My App Slow?"

This is the #1 concern I hear from teams. The good news: you can keep your app lightning-fast while using AI. The key is combining background jobs with real-time streaming using ActionCable or Hotwire Turbo Streams. Users get instant feedback as AI generates responses—no blocking, no waiting, no performance hit.

I'll show you exactly how to build this in my next post: Rails + AI Performance: Building Non-Blocking AI Features with Streaming

Testing AI Features

Testing AI is tricky because responses are non-deterministic. Here's what works:

# spec/services/ai/content_summarizer_spec.rb
RSpec.describe AI::ContentSummarizer do
  describe '#summarize' do
    let(:mock_client) { instance_double(OpenAI::Client) }
    let(:service) { described_class.new(openai_client: mock_client) }

    it 'returns summarized content' do
      allow(mock_client).to receive(:chat).and_return(
        { "choices" => [{ "message" => { "content" => "Test summary" } }] }
      )

      result = service.summarize("Long text here")
      expect(result).to eq("Test summary")
    end

    it 'handles API failures gracefully' do
      allow(mock_client).to receive(:chat).and_raise(StandardError)

      result = service.summarize("Long text here")
      expect(result).to be_nil
    end
  end
end

Use dependency injection to mock the AI client. Test the integration points, not the AI itself.

Cost-Benefit Analysis: Is It Worth It?

Adding AI to a legacy Rails app typically costs:

  • Development time: 2-4 weeks for basic integration
  • API costs: $100-$1,000/month depending on usage
  • Maintenance: Minimal if architected correctly

But the benefits can be substantial:

  • 30-50% improvement in user productivity (based on my experience)
  • Competitive differentiation in your market
  • New revenue opportunities from premium AI features
  • Employee satisfaction from working with modern tech

Next Steps: Your Implementation Checklist

Ready to add AI to your Rails app? Here's your action plan:

  1. Start small: Pick one feature that would benefit from AI
  2. Create a service object: Isolate AI logic from business logic
  3. Add proper error handling: Graceful degradation is key
  4. Use background jobs: Don't block users
  5. Implement feature flags: Test with small groups first
  6. Monitor costs: Set up alerts in your AI provider dashboard
  7. Measure impact: Track usage and user feedback

Conclusion

You don't need to rewrite your legacy Rails application to leverage AI. With the right architecture—service objects, background jobs, and feature flags—you can gradually introduce AI capabilities while maintaining stability.

I've successfully integrated AI into three legacy Rails applications over the past two years. Each time, the core strategy remained the same: start small, isolate concerns, and fail gracefully.

Your legacy Rails app is full of valuable data and battle-tested business logic. Adding AI is about enhancing what already works, not replacing it.



Ready to Modernize Your Rails Application?

I've helped multiple companies integrate AI into their legacy Rails systems. Let's discuss your specific situation and create a practical roadmap that fits your timeline and budget.

About Chileap Chhin

Senior Software Engineer with 9+ years of experience specializing in Ruby on Rails, React/Next.js, and AI integration. Working remotely with teams across Asia, North America, and Europe.

Work with me