Documentation Index

Fetch the complete documentation index at: https://docs.swiftaiboilerplate.com/llms.txt

Use this file to discover all available pages before exploring further.

Production chat system with 2 UI styles, streaming AI responses, cursor pagination, and comprehensive error handling.
Full technical details in your project at /docs/modules/Feature.Chat.md

What You Get

  • 2 professional UIs - Bubble (WhatsApp) and Centered (ChatGPT)
  • Streaming responses - Real-time token-by-token rendering
  • Cursor pagination - Infinite scroll with smart prefetch
  • Conversation management - Create, rename, delete, search
  • Message limits - Configurable free-tier restrictions
  • Error handling - Retry, cancellation, offline support
Time saved: 40-60 hours of UI implementation, streaming logic, pagination, state management, and testing.

What’s new in v2.0

  • ChatView and ChatGPTStyleView now host the input bar via .safeAreaInset(edge: .bottom). The scroll view is the full body, not a VStack stacking input below messages. This is what lets Liquid Glass sample the content behind the input bar on iOS 26.
  • Scroll-edge effect is wired through saiScrollEdgeGlass(_:) so the top and bottom of the chat transcript blur into the nav and input chrome.
  • ChatViewModel has a new sibling extension file, ChatViewModel+Memory.swift, which keeps memory and history concerns out of the main view model.
  • ChatView no longer adds an explicit .safeAreaInset gutter under a TabView. The tab bar’s own inset is respected.
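The safe-area-inset layout described above can be sketched as follows. This is a minimal, hypothetical screen (the type and field names are placeholders, not the boilerplate's real views): the scroll view is the whole body, and the input bar is hosted in the bottom safe-area inset so system materials can sample the transcript behind it.

```swift
import SwiftUI

// Hypothetical sketch of the v2.0 layout: the scroll view IS the body,
// and the input bar lives in the bottom safe-area inset rather than in a
// VStack below the messages.
struct ChatScreenSketch: View {
    @State private var draft = ""
    let messages: [String]

    var body: some View {
        ScrollView {
            LazyVStack(alignment: .leading, spacing: 12) {
                ForEach(messages, id: \.self) { Text($0) }
            }
            .padding()
        }
        // Hosting the bar here lets content scroll underneath it, which is
        // what allows translucent chrome to sample the transcript.
        .safeAreaInset(edge: .bottom) {
            HStack {
                TextField("Message", text: $draft)
                    .textFieldStyle(.roundedBorder)
                Button("Send") { draft = "" }
            }
            .padding()
            .background(.ultraThinMaterial)
        }
    }
}
```

Because the inset is part of the safe area, the scroll view's content automatically avoids the bar without any manual bottom padding.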

UI Styles

Bubble style - a WhatsApp/iMessage-like interface:
  • Messages in colored bubbles
  • User on right (blue), AI on left (gray)
  • Timestamps and avatars
  • Smooth animations
Best for: Casual, conversational apps
Style switcher included - Users can toggle between styles!

Key Components

Production Factory

The boilerplate uses a factory pattern in CompositionRoot:
// Factory method with dependency injection
public func makeChatViewModel(conversationID: UUID) -> ChatViewModel {
    ChatViewModel(
        conversationID: conversationID,
        messageRepository: messageRepository,  // SwiftData repositories
        llmClient: llmClient                   // Proxy or Echo client
    )
}

// Create dual-style chat view (both UIs with switcher)
public func makeDualStyleChatView(
    conversationID: UUID,
    onRequireSubscription: (() -> Void)? = nil
) -> DualStyleChatView {
    let viewModel = makeChatViewModel(conversationID: conversationID)
    
    return DualStyleChatView(
        viewModel: viewModel,
        onRequireSubscription: onRequireSubscription
    )
}
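The value of keeping these factories in CompositionRoot can be seen in a reduced, hypothetical version of the wiring (the protocol and type names below are placeholders, not the boilerplate's real declarations): the root is the only object that knows the concrete repository and LLM client, and callers only ever pass a conversation ID.

```swift
import Foundation

// Hypothetical stand-ins for the boilerplate's real protocols.
protocol MessageRepositoryProtocol { var name: String { get } }
protocol LLMClientProtocol { var name: String { get } }

struct SwiftDataMessageRepository: MessageRepositoryProtocol { let name = "SwiftData" }
struct EchoLLMClient: LLMClientProtocol { let name = "Echo" }

// Simplified view model that receives its dependencies via init injection.
final class ChatViewModelSketch {
    let conversationID: UUID
    let messageRepository: MessageRepositoryProtocol
    let llmClient: LLMClientProtocol

    init(conversationID: UUID,
         messageRepository: MessageRepositoryProtocol,
         llmClient: LLMClientProtocol) {
        self.conversationID = conversationID
        self.messageRepository = messageRepository
        self.llmClient = llmClient
    }
}

// CompositionRoot owns the concrete dependencies; call sites stay decoupled.
final class CompositionRootSketch {
    private let messageRepository: MessageRepositoryProtocol = SwiftDataMessageRepository()
    private let llmClient: LLMClientProtocol = EchoLLMClient()

    func makeChatViewModel(conversationID: UUID) -> ChatViewModelSketch {
        ChatViewModelSketch(conversationID: conversationID,
                            messageRepository: messageRepository,
                            llmClient: llmClient)
    }
}
```

Swapping the Echo client for the proxy client (or an in-memory repository for SwiftData in tests) then touches only the root, never the views or view models.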

ChatViewModel

What it handles:
  • Message persistence via repository
  • LLM streaming with cancellation
  • Cursor-based pagination (InfinitePaginator)
  • Error handling and retry
  • Loading states
  • Message limits (free vs pro)

ChatHistoryViewModel

Manages conversation list:
@MainActor
@Observable
class ChatHistoryViewModel {
    var conversations: [ConversationDTO]
    var searchText: String
    var filteredConversations: [ConversationDTO]
    
    func createConversation() async -> UUID
    func renameConversation(id: UUID, title: String) async
    func deleteConversation(id: UUID) async
}
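One plausible derivation of filteredConversations from searchText is a case-insensitive title match, with an empty query returning everything. The DTO fields below are assumptions for illustration, not the real ConversationDTO:

```swift
import Foundation

// Hypothetical DTO; field names are assumptions, not the real ConversationDTO.
struct ConversationDTOSketch {
    let id: UUID
    let title: String
}

// Case-insensitive title filter; whitespace-only queries count as empty.
func filterConversations(_ conversations: [ConversationDTOSketch],
                         searchText: String) -> [ConversationDTOSketch] {
    let query = searchText.trimmingCharacters(in: .whitespaces).lowercased()
    guard !query.isEmpty else { return conversations }
    return conversations.filter { $0.title.lowercased().contains(query) }
}
```

In an @Observable view model this logic would typically live in a computed property so the list updates whenever searchText changes.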

Streaming Pattern

Real-time AI response rendering:
// In ChatViewModel
for try await chunk in llmClient.streamResponse(messages: messages) {
    currentResponse += chunk
    // UI updates automatically via @Observable
}
Benefits:
  • ✅ Low latency (first token fast)
  • ✅ Better UX (gradual appearance)
  • ✅ Cancellable generation
  • ✅ Memory efficient
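The cancellation benefit can be sketched in isolation. Here an AsyncStream stands in for llmClient.streamResponse(messages:) (an assumption, not the real client API); the loop mirrors the pattern above, appending each chunk, and Task.checkCancellation() is what lets a "Stop" button end generation early by cancelling the enclosing Task:

```swift
import Foundation

// Stand-in for the LLM client's token stream.
func makeTokenStream(_ tokens: [String]) -> AsyncStream<String> {
    AsyncStream { continuation in
        for token in tokens {
            continuation.yield(token)
        }
        continuation.finish()
    }
}

// Accumulates chunks token by token; throws CancellationError if the
// surrounding Task is cancelled mid-stream.
func collectResponse(from stream: AsyncStream<String>) async throws -> String {
    var response = ""
    for await chunk in stream {
        try Task.checkCancellation()
        response += chunk
    }
    return response
}
```

In the view model, the same loop would append to an @Observable property so each token renders as it arrives.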

Pagination

Efficient infinite scroll for chat history:
// InfinitePaginator handles:
// - Cursor-based pagination
// - Reverse chronological order
// - No duplicate messages
// - Smooth scrolling

class InfinitePaginator<Item, Cursor> {
    func loadNextPage() async throws -> [Item]
    var hasMore: Bool
    var isLoading: Bool
}
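A minimal, hypothetical paginator over an in-memory source illustrates the contract above. Real cursors are usually opaque keys from the store; an integer index stands in here, and this is a sketch of the idea, not the boilerplate's actual InfinitePaginator:

```swift
import Foundation

// Toy cursor paginator: an Int index stands in for an opaque cursor.
final class CursorPaginatorSketch<Item> {
    private let items: [Item]     // stands in for the repository
    private let pageSize: Int
    private var cursor: Int = 0   // advances past everything already returned

    private(set) var isLoading = false
    var hasMore: Bool { cursor < items.count }

    init(items: [Item], pageSize: Int) {
        self.items = items
        self.pageSize = pageSize
    }

    // Returns the next page and moves the cursor, so no item is ever
    // returned twice; concurrent calls are guarded by isLoading.
    func loadNextPage() async throws -> [Item] {
        guard hasMore, !isLoading else { return [] }
        isLoading = true
        defer { isLoading = false }
        let end = min(cursor + pageSize, items.count)
        let page = Array(items[cursor..<end])
        cursor = end
        return page
    }
}
```

A scroll view would call loadNextPage() when the user nears the top of the transcript, prepending older messages while hasMore is true.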

Customization Examples

Change Chat Colors

// In DesignSystemColors.xcassets:
// - BubbleUser.colorset (user message color)
// - BubbleAssistant.colorset (AI message color)

// All chat styles update automatically!

Add Message Actions

// Add to ChatView (Button actions are synchronous closures,
// so async calls must be wrapped in a Task):
.contextMenu {
    Button("Copy", systemImage: "doc.on.doc") {
        UIPasteboard.general.string = message.text
    }
    Button("Regenerate", systemImage: "arrow.clockwise") {
        Task { await viewModel.regenerateResponse(messageID: message.id) }
    }
    Button("Delete", systemImage: "trash", role: .destructive) {
        Task { await viewModel.deleteMessage(id: message.id) }
    }
}

Add Custom AI Persona

// In ChatViewModel
let systemPrompt = LLMMessage(
    role: .system,
    content: """
    You are a Swift expert.
    Provide clear answers with code examples.
    """
)

let messages = [systemPrompt] + history + [userMessage]

Search Messages

// In ChatHistoryViewModel
func searchMessages(query: String) async throws -> [MessageDTO] {
    try await messageRepository.search(
        query: query,
        limit: 50
    )
}

UI Components

All components extracted and reusable:
  • ChatView - Bubble-style chat interface
  • ChatGPTStyleView - Centered-style interface
  • DualStyleChatView - Both styles with switcher
  • SAIInputBar - Message composition bar
  • ChatBubble - Message bubble component
  • ChatHistoryView - Conversation list
  • ChatRowCard - Conversation row
  • ChatSearchBar - Search input
  • EmptyStateView - No conversations state

Key Files

  • ViewModels - Packages/FeatureChat/Sources/FeatureChat/ViewModels/
  • Views - Packages/FeatureChat/Sources/FeatureChat/Views/
  • Components - Packages/FeatureChat/Sources/FeatureChat/Views/Components/
  • Paginator - Packages/FeatureChat/Sources/FeatureChat/Paginator/

Dependencies

  • Core - Error handling, logging
  • AI - LLM client for responses
  • Storage - Conversation and message persistence
  • DesignSystem - UI components and tokens

Used By

  • Main App - Primary feature
  • CompositionRoot - Factory methods for ViewModels

Best Practices

  • @MainActor for UI updates
  • @Observable for state
  • Async/await for operations
  • Handle all error cases
  • Update UI incrementally
  • Show typing indicator
  • Support cancellation
  • Handle interruptions
  • Paginate messages
  • Lazy load conversations
  • Compress images
  • Cancel tasks on dismiss
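"Cancel tasks on dismiss" can be sketched without any UI: the view model keeps a handle to the in-flight generation Task, and the view calls the cancel method from .onDisappear (or the view model's deinit). The names below are placeholders for illustration, not real boilerplate types:

```swift
import Foundation

// Hypothetical owner of an in-flight streaming Task.
final class StreamingTaskOwnerSketch {
    private var streamTask: Task<Void, Never>?
    private(set) var finishedAfterCancel = false

    func startStreaming() {
        streamTask = Task { [weak self] in
            // Simulated token work; a real loop would consume the LLM stream
            // and exit as soon as cancellation is observed.
            while !Task.isCancelled {
                try? await Task.sleep(nanoseconds: 10_000_000)
            }
            self?.finishedAfterCancel = true
        }
    }

    // Call this from the view's .onDisappear so a dismissed chat screen
    // does not keep generating in the background.
    func cancelStreaming() {
        streamTask?.cancel()
        streamTask = nil
    }
}
```

Cancellation in Swift concurrency is cooperative: the loop must check Task.isCancelled (or call Task.checkCancellation()) for the cancel to take effect.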

Learn More

Full Documentation

Find the complete FeatureChat guide in your project

UI Styles Guide

Find the complete Chat UI patterns guide in your project

AI Module

LLM integration

Building Guide

Customize chat

Build with AI (fast)

You can customize this module in minutes using our ready-to-paste LLM prompts.

Example Prompt

Context: Packages/FeatureChat/**
Prompt: “Refactor chat UI to centered layout using DS spacing/tokens. Add toggle in Settings. Update tests.”

Example Output

struct TypingIndicatorView: View {
  @State private var isVisible = false
  var body: some View { /* minimal dots animation */ }
}
// TODO: wire to ChatViewModel.isTyping (throttle: 600ms)
See all prompts → /docs/prompts/Feature.Chat.prompts.md
See in project: docs/modules/Feature.Chat.md

Test Coverage

78%+ coverage - Comprehensive chat testing. Tests include:
  • Message sending
  • Streaming responses
  • Pagination
  • Conversation management
  • UI rendering
  • Error scenarios