Frequently Asked Questions

Find answers to common questions about TalkCody.

General Questions

What is TalkCody?

TalkCody is a desktop AI coding assistant built specifically for developers. It provides an integrated environment for AI-assisted coding, featuring multi-model support, custom AI agents, and extensible tool integration through the Model Context Protocol (MCP).

Is TalkCody free?

TalkCody is an open-source application and is free to download and use. However, you will need API keys from AI providers (OpenAI, Anthropic, etc.) which may have their own pricing. Some providers offer free tiers or credits for new users.

Which operating systems are supported?

Currently, TalkCody is available for:

  • macOS: Fully supported (macOS 10.15+)
  • Windows: Coming soon
  • Linux: Coming soon

Do I need an internet connection?

Yes, an internet connection is required to access AI models through provider APIs. However, you can use local models through Ollama for offline AI capabilities.

Setup and Configuration

How do I get API keys?

API keys are issued by each AI provider through its own website or developer console (for example, OpenAI, Anthropic, Google, and DeepSeek).

Once you have a key, add it in TalkCody Settings → API Keys.

Can I use multiple AI providers?

Yes! TalkCody supports multiple AI providers simultaneously. You can:

  • Configure multiple API keys
  • Switch between models from different providers
  • Assign different models to different AI agents
  • Use the best model for each specific task

Which AI model should I use?

The best model depends on your task:

  • General coding: GPT-4.1, Claude 4.5 Sonnet
  • Code completion: Qwen 3 Coder, Codestral
  • Fast responses: Claude Haiku, GPT-4.1 Turbo
  • Complex reasoning: Claude 4.5 Opus, GPT-5
  • Budget-friendly: DeepSeek Chat, GLM 4.5 Air
  • Vision/images: GPT-4.1 Vision, Gemini Pro Vision

Start with Claude 4.5 Sonnet or GPT-4.1 Turbo for a good balance of quality and speed.

How do I configure custom agents?

  1. Navigate to the Agents view
  2. Click "New Agent"
  3. Configure the agent settings:
    • Name: Descriptive name for the agent
    • System Prompt: Instructions that define behavior
    • Model: Which AI model to use
    • Tools: What actions the agent can perform

Learn more in the AI Agents documentation.

Features and Functionality

What is an AI Agent?

An AI Agent is a customized AI assistant with a specific role and capabilities. Agents have:

  • System Prompt: Defines their behavior and expertise
  • Assigned Tools: What actions they can perform
  • Default Model: Which AI model they use

For example, you might have:

  • A "Code Reviewer" agent that focuses on code quality
  • A "Documentation Writer" agent specialized in technical writing
  • A "Test Generator" agent that creates unit tests

What are MCP Servers?

MCP (Model Context Protocol) servers are external services that extend TalkCody's capabilities. They can provide:

  • Access to external APIs
  • Custom tools and functions
  • Integration with third-party services
  • Specialized data sources

Learn more in the MCP Servers documentation.

Can AI agents edit files in my project?

Yes, when you grant an AI agent access to file operation tools, it can:

  • Read files from your project
  • Create new files
  • Edit existing files
  • Delete files (with confirmation)

All file edits are presented for review before being applied, giving you full control.

Does TalkCody support voice input?

Yes! TalkCody includes voice-to-text functionality for hands-free input. Click the microphone icon in the chat interface to start voice input.

Can I use TalkCody offline?

You can use TalkCody offline with local models via Ollama. Configure Ollama in Settings and select a local model. Note that offline models may have reduced capabilities compared to cloud-based models.

Privacy and Security

Where is my data stored?

  • Conversations: Stored locally in a SQLite database on your machine
  • API Keys: Stored securely in your local settings
  • Project Files: Remain in their original locations on your disk
  • AI Requests: Sent to the respective AI provider APIs

TalkCody does not send your data to any servers except the AI provider APIs you configure.

Are my API keys secure?

Yes, API keys are stored locally on your machine using secure storage mechanisms. They are never transmitted except when making API calls to the respective providers.

Can I delete my conversation history?

Yes, you can delete individual conversations or clear all history from the application. This removes the data from your local database permanently.

Do AI providers see my code?

When you use cloud-based AI models, the code you share in conversations is sent to the provider's API for processing. Each provider has its own data retention policy; review it on the provider's website before sharing sensitive code.

For maximum privacy, use local models via Ollama.

Troubleshooting

TalkCody won't start on macOS

If macOS blocks TalkCody:

  1. Go to System Preferences → Security & Privacy
  2. Look for a message about TalkCody being blocked
  3. Click "Open Anyway"
  4. Launch TalkCody again

API requests are failing

Check the following:

  1. API Key: Verify your API key is correct in Settings
  2. Internet Connection: Ensure you're connected to the internet
  3. Provider Status: Check if the AI provider's service is operational
  4. Credits: Verify your API account has sufficient credits
  5. Rate Limits: You may have hit rate limits; wait and try again

Responses are slow

To improve response speed:

  • Use faster models (Haiku, GPT-4.1 Turbo)
  • Reduce max tokens setting
  • Check your internet connection
  • Try a different AI provider

The application is using too much memory

If TalkCody is using excessive memory:

  • Close unused conversations
  • Reduce the number of open files in the editor
  • Clear conversation history
  • Restart the application

File changes aren't being saved

Verify that:

  • You have write permissions for the file
  • The file isn't open in another application
  • Auto-save is enabled in Settings → Editor
  • Your disk has sufficient free space

Billing and Costs

How much does it cost to use TalkCody?

TalkCody itself is free. However, AI API usage costs vary by provider:

Example Pricing (approximate):

  • OpenAI GPT-4.1: $0.30 per 1M input tokens, $1.20 per 1M output tokens
  • Anthropic Claude 4.5: $3.00 per 1M input tokens, $15.00 per 1M output tokens
  • Google Gemini: $0.35 per 1M tokens
  • DeepSeek: $0.14 per 1M input tokens, $0.28 per 1M output tokens

Pricing is approximate and subject to change. Check provider websites for current rates.
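
For a rough sense of scale, here is how the per-token rates above translate into a per-request cost, using the approximate Claude 4.5 figures (the 10,000 input / 2,000 output token counts are just an assumed example):

```typescript
// Back-of-the-envelope cost estimate using the approximate rates listed above.
const inputTokens = 10_000;   // assumed size of a single request
const outputTokens = 2_000;   // assumed size of the model's reply
const inputRatePerM = 3.0;    // USD per 1M input tokens (Claude 4.5, approximate)
const outputRatePerM = 15.0;  // USD per 1M output tokens (Claude 4.5, approximate)

const cost =
  (inputTokens / 1_000_000) * inputRatePerM +
  (outputTokens / 1_000_000) * outputRatePerM;

console.log(`$${cost.toFixed(2)}`); // ≈ $0.06 for this single request
```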

Are there any free AI models?

Yes! Several providers offer free tiers:

  • DeepSeek Chat: Free tier available
  • GLM 4.5 Air: Free via Zhipu AI
  • Ollama: Completely free for local models
  • OpenRouter: Offers some free models

How can I track my API usage?

Most providers offer usage dashboards:

  • Check your provider's website for usage statistics
  • Monitor costs in your provider account
  • TalkCody shows token counts in conversations (Settings → General → Show Token Counts)

Contributing and Development

Is TalkCody open source?

Yes! TalkCody is open source. You can:

  • View the source code on GitHub
  • Report issues and bugs
  • Submit feature requests
  • Contribute code improvements

How can I contribute?

Check out the Development Setup guide to:

  • Set up your development environment
  • Build TalkCody from source
  • Submit pull requests
  • Report bugs effectively

Can I create custom themes?

Currently, TalkCody supports light and dark modes. Custom theming support is planned for a future release.

Can I create my own MCP servers?

Yes! MCP servers can be created using the MCP SDK. Learn more in the MCP Configuration documentation.
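
As a minimal sketch of what a custom server can look like, here is a tiny MCP server written against the official TypeScript SDK (@modelcontextprotocol/sdk). Treat it as an illustration of the general shape rather than the exact API surface, and see the MCP Configuration documentation for how to register a server with TalkCody:

```typescript
// Minimal MCP server sketch exposing a single "add" tool over stdio.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "example-server", version: "1.0.0" });

// Register a tool that an AI agent can call.
server.tool(
  "add",
  { a: z.number(), b: z.number() },
  async ({ a, b }) => ({
    content: [{ type: "text", text: String(a + b) }],
  })
);

// Talk to the MCP client (e.g. TalkCody) over stdin/stdout.
const transport = new StdioServerTransport();
await server.connect(transport);
```

Once the server is registered in TalkCody, its tools become available to any agent that is granted access to them.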

Still Have Questions?

If you didn't find your answer here, browse the rest of the documentation, report an issue on GitHub, or ask the community. We're here to help!