Architecture
Technical architecture and design of TalkCody
Architecture Overview
Understanding TalkCody's architecture, technology stack, and design decisions.
High-Level Architecture
┌─────────────────────────────────────────────────┐
│ TalkCody │
├─────────────────────────────────────────────────┤
│ │
│ ┌───────────────┐ ┌────────────────┐ │
│ │ Frontend │ │ Backend │ │
│ │ React + TS │◄───────►│ Tauri/Rust │ │
│ │ Vite 7 │ IPC │ │ │
│ └───────────────┘ └────────────────┘ │
│ │ │ │
│ │ │ │
│ ▼ ▼ │
│ ┌───────────────┐ ┌────────────────┐ │
│ │ Services │ │ File System │ │
│ │ AI/Database │ │ Operations │ │
│ └───────────────┘ └────────────────┘ │
│ │ │ │
└─────────┼───────────────────────────┼──────────┘
│ │
▼ ▼
┌─────────┐ ┌──────────────┐
│ AI APIs │ │ Local Files │
│(OpenAI, │ │ & SQLite │
│Claude..)│ │ │
└─────────┘              └──────────────┘
Technology Stack
Frontend Stack
Core Framework
- React 19: UI framework
- TypeScript: Type-safe JavaScript
- Vite 7: Build tool and dev server
Styling
- Tailwind CSS 4: Utility-first CSS
- Radix UI: Headless component primitives
- Lucide Icons: Icon library
State Management
- React Context: Global state
- React Hooks: Local state
- Custom hooks: Reusable logic
Editor
- Monaco Editor: VS Code's editor
- Syntax highlighting: 100+ languages
- IntelliSense: Code completion
AI Integration
- Vercel AI SDK 5.0: Unified AI interface
- Provider SDKs: OpenAI, Anthropic, Google, etc.
- Streaming: Real-time responses
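A minimal sketch of how a streaming request might look with the Vercel AI SDK; the provider setup, model id, and function name are illustrative, not TalkCody's actual configuration.
// Hypothetical streaming helper built on the Vercel AI SDK
import { streamText } from 'ai';
import { createOpenAI } from '@ai-sdk/openai';

const openai = createOpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function streamChat(prompt: string, onChunk: (text: string) => void): Promise<void> {
  // streamText returns immediately; chunks arrive through the textStream async iterable
  const result = streamText({
    model: openai('gpt-4o'),
    messages: [{ role: 'user', content: prompt }],
  });
  for await (const chunk of result.textStream) {
    onChunk(chunk);
  }
}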
Backend Stack
Framework
- Tauri 2: Desktop app framework
- Rust: Systems programming language
Core Libraries
- tokio: Async runtime
- serde: Serialization
- sqlx: Database toolkit
Tauri Plugins
- fs: File system operations
- shell: Process execution
- dialog: Native dialogs
- sql: SQLite integration
- log: Logging
- opener: Open URLs/files
Additional Tools
- notify: File watching
- ignore: Gitignore parsing
- walkdir: Directory traversal
- ripgrep: Fast text search
Database
SQLite
- Embedded database
- Zero configuration
- ACID transactions
- Full-text search support
Schema
- Projects
- Conversations
- Messages
- Agents
- Settings
- MCP Servers
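The record shapes behind these tables might look roughly like the following; the field names are assumptions for illustration, not the exact schema.
// Rough sketch of the core records; the real columns may differ
interface Project { id: string; name: string; rootPath: string; createdAt: number; }
interface Conversation { id: string; projectId: string; title: string; createdAt: number; }
interface Message {
  id: string;
  conversationId: string;
  role: 'user' | 'assistant' | 'tool';
  content: string;
  createdAt: number;
}
interface Agent { id: string; name: string; systemPrompt: string; model: string; }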
Component Architecture
Frontend Components
Component Hierarchy
App
├── Layout
│ ├── Sidebar
│ │ ├── ProjectList
│ │ ├── ConversationList
│ │ └── Navigation
│ └── MainContent
│ ├── ChatView
│ │ ├── MessageList
│ │ │ └── Message
│ │ │ ├── TextContent
│ │ │ ├── CodeBlock
│ │ │ └── ToolResult
│ │ └── ChatInput
│ ├── ExplorerView
│ │ ├── FileTree
│ │ └── Editor
│ ├── AgentsView
│ └── SettingsView
└── Providers
├── ThemeProvider
├── I18nProvider
└── ToastProvider
Component Patterns
- Composition: Build complex UIs from simple parts
- Render Props: Share logic between components
- Custom Hooks: Reusable stateful logic
- Context: Global state without prop drilling
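The custom hooks pattern in particular keeps stateful logic out of components; a small illustrative hook (not from the TalkCody codebase):
// Reusable stateful logic that any component can share without prop drilling
import { useEffect, useState } from 'react';

export function useDebouncedValue<T>(value: T, delayMs = 300): T {
  const [debounced, setDebounced] = useState(value);
  useEffect(() => {
    const id = setTimeout(() => setDebounced(value), delayMs);
    return () => clearTimeout(id); // cancel the pending update on change/unmount
  }, [value, delayMs]);
  return debounced;
}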
Service Layer
Service Organization
services/
├── ai/
│ ├── providers/ # AI provider integrations
│ ├── agent-service.ts # Agent management
│ ├── chat-service.ts # Chat functionality
│ └── completion-service.ts
├── database/
│ ├── db-service.ts # Database operations
│ ├── models/ # Data models
│ └── migrations/ # Schema migrations
├── file/
│ ├── file-service.ts # File operations
│ ├── search-service.ts # Code search
│ └── watcher-service.ts # File watching
└── mcp/
├── mcp-service.ts # MCP client
├── server-manager.ts # Server management
└── tool-registry.ts   # Tool registration
Service Principles
- Single Responsibility: Each service has one purpose
- Dependency Injection: Services receive dependencies
- Async Operations: All I/O is asynchronous
- Error Handling: Consistent error patterns
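As an illustration of the dependency-injection principle above, a service receives its collaborators instead of constructing them; the interfaces and names below are hypothetical, not TalkCody's real service API.
// Hypothetical service wiring
interface MessageStore { saveMessage(msg: { id: string; content: string }): Promise<void>; }
interface CompletionClient { complete(prompt: string): Promise<string>; }

class ChatService {
  constructor(
    private readonly db: MessageStore,
    private readonly ai: CompletionClient,
  ) {}

  async ask(prompt: string): Promise<string> {
    const answer = await this.ai.complete(prompt); // all I/O stays asynchronous
    await this.db.saveMessage({ id: crypto.randomUUID(), content: answer });
    return answer;
  }
}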
Communication Architecture
IPC (Inter-Process Communication)
Frontend → Backend
// Frontend invokes Tauri command
import { invoke } from '@tauri-apps/api/core';
const result = await invoke('read_file', {
path: '/path/to/file.ts'
});
Backend → Frontend
// Rust command handler
#[tauri::command]
async fn read_file(path: String) -> Result<String, String> {
tokio::fs::read_to_string(&path)
.await
.map_err(|e| e.to_string())
}
Events
// Frontend listens to backend events
import { listen } from '@tauri-apps/api/event';
await listen('file-changed', (event) => {
console.log('File changed:', event.payload);
});
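In React components, the unlisten function returned by listen is usually kept for cleanup; a minimal sketch (the hook name and payload type are assumptions):
// Illustrative hook wrapping a Tauri event subscription with cleanup
import { useEffect } from 'react';
import { listen } from '@tauri-apps/api/event';

export function useFileChanged(onChange: (path: string) => void): void {
  useEffect(() => {
    const unlistenPromise = listen<string>('file-changed', (event) => {
      onChange(event.payload);
    });
    return () => {
      unlistenPromise.then((unlisten) => unlisten()); // unsubscribe on unmount
    };
  }, [onChange]);
}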
// Backend emits events
app.emit("file-changed", payload)?;
AI Provider Communication
Request Flow
User Input
↓
Chat Service
↓
AI Provider Service (Vercel AI SDK)
↓
Provider-Specific SDK
↓
HTTP Request
↓
AI API (OpenAI/Anthropic/etc.)
↓
Streaming Response
↓
UI Update (real-time)
Provider Abstraction
interface AIProvider {
  chat(messages: Message[]): AsyncIterable<string>;
  complete(prompt: string): Promise<string>;
  embed(text: string): Promise<number[]>;
}
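Because every provider exposes the same streaming shape, the UI can consume any of them with a single helper; a minimal sketch:
// Illustrative consumer: render tokens incrementally from any provider's chat stream
export async function renderStream(
  stream: AsyncIterable<string>,
  onToken: (token: string) => void,
): Promise<string> {
  let full = '';
  for await (const token of stream) {
    full += token;   // accumulate the complete response for persistence
    onToken(token);  // push each chunk to the UI as it arrives
  }
  return full;
}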
Data Flow
Conversation Flow
1. User types message
↓
2. ChatInput component captures input
↓
3. ChatService.sendMessage()
↓
4. Save message to DB
↓
5. Call AI provider with context
↓
6. Stream response chunks
↓
7. Update UI in real-time
↓
8. Save AI response to DB
↓
9. Tool execution (if needed)
↓
10. Update conversation state
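Put together, the flow above might look roughly like this in code; the store and provider interfaces are assumptions, not TalkCody's actual services.
// Hypothetical outline of the conversation flow
interface ChatMessage { id: string; conversationId: string; role: 'user' | 'assistant'; content: string; }
interface ConversationStore {
  saveMessage(msg: ChatMessage): Promise<void>;
  getMessages(conversationId: string): Promise<ChatMessage[]>;
}
interface Provider { chat(history: ChatMessage[]): AsyncIterable<string>; }

export async function sendMessage(
  deps: { db: ConversationStore; provider: Provider; onChunk: (text: string) => void },
  conversationId: string,
  text: string,
): Promise<void> {
  const { db, provider, onChunk } = deps;
  // Steps 3-4: capture the input and persist the user message
  await db.saveMessage({ id: crypto.randomUUID(), conversationId, role: 'user', content: text });
  // Step 5: call the AI provider with the conversation as context
  const history = await db.getMessages(conversationId);
  let assistantText = '';
  // Steps 6-7: stream response chunks and update the UI in real time
  for await (const chunk of provider.chat(history)) {
    assistantText += chunk;
    onChunk(chunk);
  }
  // Step 8: persist the assistant response
  await db.saveMessage({ id: crypto.randomUUID(), conversationId, role: 'assistant', content: assistantText });
}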
File Operation Flow
1. User requests file operation
↓
2. FileService validates request
↓
3. Check permissions
↓
4. Invoke Tauri command
↓
5. Rust executes file operation
↓
6. Return result via IPC
↓
7. Update file tree UI
↓
8. Emit file-changed event
↓
9. Update dependent components
State Management
State Architecture
Global State (React Context; see the sketch below)
- Theme settings
- User preferences
- Current project
- Active conversation
- Agent configuration
Local State (useState/useReducer)
- Form inputs
- UI toggles
- Component-specific data
Server State (React Query pattern)
- Database queries
- API responses
- Cached data
Persistent State (SQLite)
- Conversations
- Messages
- Projects
- Settings
- Agents
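The global layer above is typically a thin React Context; a minimal sketch with an assumed state shape, not TalkCody's actual provider:
// Illustrative React Context for global state
import { createContext, useContext, useState, type ReactNode } from 'react';

interface AppState { theme: 'light' | 'dark'; setTheme: (t: 'light' | 'dark') => void; }

const AppStateContext = createContext<AppState | null>(null);

export function AppStateProvider({ children }: { children: ReactNode }) {
  const [theme, setTheme] = useState<'light' | 'dark'>('dark');
  return (
    <AppStateContext.Provider value={{ theme, setTheme }}>
      {children}
    </AppStateContext.Provider>
  );
}

export function useAppState(): AppState {
  const state = useContext(AppStateContext);
  if (!state) throw new Error('useAppState must be used inside AppStateProvider');
  return state;
}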
State Patterns
Optimistic Updates
// Update UI immediately
setMessages([...messages, newMessage]);
try {
  // Then persist to database
  await db.saveMessage(newMessage);
} catch (error) {
  // Roll back the optimistic update on failure
  setMessages(messages);
}
Lazy Loading
// Load data as needed
const conversations = useLazyQuery(
() => db.getConversations(projectId),
[projectId]
);
Security Architecture
Security Layers
1. File System Sandboxing
// Restrict file access to allowed directories
fn validate_path(path: &Path, allowed: &Path) -> Result<(), String> {
    if !path.starts_with(allowed) {
        return Err("Access denied".to_string());
    }
    Ok(())
}
2. API Key Encryption
// Keys encrypted before storage
const encrypted = await encrypt(apiKey, systemKey);
await db.saveEncryptedKey(encrypted);
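A possible approach uses the Web Crypto API; the function name and key-management scheme below are assumptions, not TalkCody's actual implementation.
// Illustrative AES-GCM encryption of an API key with the Web Crypto API
async function encryptApiKey(apiKey: string, key: CryptoKey): Promise<{ iv: Uint8Array; ciphertext: ArrayBuffer }> {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh nonce for every encryption
  const ciphertext = await crypto.subtle.encrypt(
    { name: 'AES-GCM', iv },
    key,
    new TextEncoder().encode(apiKey),
  );
  return { iv, ciphertext };
}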
3. Command Permissions
// Tauri command permissions
#[cfg(debug_assertions)]
#[tauri::command]
async fn dangerous_operation() {
    // Only available in debug builds
}
4. Content Security Policy
{
"csp": {
"default-src": "'self'",
"connect-src": [
"'self'",
"https://api.openai.com",
"https://api.anthropic.com"
]
}
}
Performance Architecture
Optimization Strategies
1. Code Splitting
// Lazy load routes
const ChatView = lazy(() => import('./pages/ChatView'));
const SettingsView = lazy(() => import('./pages/SettingsView'));
2. Virtual Scrolling
// Render only visible messages
<VirtualList
items={messages}
itemHeight={100}
windowSize={10}
/>
3. Memoization
// Prevent unnecessary re-renders
const processedMessages = useMemo(
() => messages.map(process),
[messages]
);
4. Database Indexing
-- Index frequently queried columns
CREATE INDEX idx_messages_conversation
ON messages(conversation_id);
CREATE INDEX idx_messages_timestamp
ON messages(created_at DESC);
5. Caching
// Cache expensive computations
const cache = new Map();
function getProcessedContent(id: string) {
if (cache.has(id)) {
return cache.get(id);
}
const result = expensiveProcess(id);
cache.set(id, result);
return result;
}
Resource Management
Memory Management
- Clear old conversation history
- Limit in-memory message count
- Dispose of unused resources
- Implement weak references
Connection Pooling
- Reuse HTTP connections
- Pool database connections
- Manage WebSocket connections
- Implement connection limits
Extension Architecture
MCP Integration
MCP Client Architecture
TalkCody MCP Client
├── Server Discovery
│ ├── stdio servers
│ ├── SSE servers
│ └── HTTP servers
├── Protocol Handler
│ ├── Request/Response
│ ├── Streaming
│ └── Error handling
├── Tool Registry
│ ├── Tool discovery
│ ├── Schema validation
│ └── Execution
└── Resource Manager
├── Connection pooling
├── Timeout handling
└── Retry logic
Tool Execution Flow
1. AI decides to use tool
↓
2. Tool registry validates request
↓
3. MCP client sends request to server
↓
4. Server executes tool
↓
5. Result returned to client
↓
6. Result sent to AI
↓
7. AI incorporates result in response
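Sketched in code, the validation and dispatch steps might look like this; the interfaces are hypothetical, not the actual tool-registry or MCP client API.
// Illustrative tool-execution path
interface ToolCall { name: string; arguments: Record<string, unknown>; }
interface McpClient { callTool(call: ToolCall): Promise<string>; }
interface ToolRegistry { has(name: string): boolean; validate(call: ToolCall): void; }

export async function executeToolCall(
  registry: ToolRegistry,
  client: McpClient,
  call: ToolCall,
): Promise<string> {
  // Step 2: validate the request against the registered tool's schema
  if (!registry.has(call.name)) throw new Error(`Unknown tool: ${call.name}`);
  registry.validate(call);
  // Steps 3-5: forward the call to the MCP server and wait for its result
  const result = await client.callTool(call);
  // Step 6: the result is handed back to the model on the next turn
  return result;
}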
Plugin System (Future)
Planned Architecture
Plugin API
├── Hooks
│ ├── beforeSendMessage
│ ├── afterReceiveResponse
│ ├── onFileOpen
│ └── onToolExecute
├── UI Extensions
│ ├── Custom panels
│ ├── Context menus
│ └── Status bar items
└── Custom Tools
├── Tool registration
├── Execution hooks
└── Result formatting
Build Architecture
Build Process
Development Build
1. Vite starts dev server (port 5173)
2. TypeScript compiler watches files
3. Tauri watches Rust source
4. Hot reload on changes
5. DevTools enabled
Production Build
1. TypeScript → JavaScript (tsc)
2. Vite bundles frontend → dist/
3. Rust compiles with optimizations
4. Tauri bundles application
5. Code signing (release)
6. Create installers (.dmg, .msi, etc.)
Build Optimizations
Frontend
- Tree shaking
- Minification
- Code splitting
- Asset optimization
- Compression
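These frontend optimizations are typically driven by the Vite build configuration; a possible vite.config.ts sketch, where the chunking choices are illustrative rather than TalkCody's actual setup:
// Hypothetical vite.config.ts showing minification and manual chunking
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';

export default defineConfig({
  plugins: [react()],
  build: {
    minify: 'esbuild',
    rollupOptions: {
      output: {
        // split the Monaco editor into its own chunk so the main bundle stays small
        manualChunks: { monaco: ['monaco-editor'] },
      },
    },
  },
});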
Backend
- Release profile optimizations
- Link-time optimization (LTO)
- Strip debug symbols
- Minimal binary size
Testing Architecture
Testing Strategy
Unit Tests
- Services in isolation
- Pure functions
- Utility functions
- Data transformations
Integration Tests
- Service interactions
- Database operations
- IPC communication
- MCP integration
E2E Tests
- User workflows
- Critical paths
- Cross-platform behavior
Manual Tests
- UI/UX validation
- Performance testing
- Accessibility checks
Deployment Architecture
Release Process
1. Version bump
2. Update changelog
3. Run full test suite
4. Build for all platforms
5. Code signing
6. Create GitHub release
7. Upload artifacts
8. Update documentation
9. Notify users
Distribution Channels
- GitHub Releases: Primary distribution
- Homebrew (macOS): brew install talkcody
- Chocolatey (Windows): choco install talkcody
- Snap (Linux): snap install talkcody
Future Architecture
Planned Improvements
Cloud Sync (Optional)
- Sync conversations across devices
- Cloud backup
- Team collaboration
Plugin Marketplace
- Discover community plugins
- One-click installation
- Automatic updates
Mobile Apps
- iOS companion app
- Android companion app
- Sync with desktop
Web Version
- Browser-based TalkCody
- Limited functionality
- Easy onboarding
Next Steps
- Development Setup: Build TalkCody
- Contributing
- GitHub Repository
Have questions about the architecture? Join our Discord community or open a GitHub Discussion.