
# Frequently Asked Questions (FAQ)

## Models and Setup

### What are the best models for CodinIT.dev?

For the best experience with CodinIT.dev, we recommend using the following models from our 19+ supported providers:

**Top Recommended Models:**
- **Claude 3.5 Sonnet** (Anthropic): Best overall coder, excellent for complex applications
- **GPT-4o** (OpenAI): Strong alternative with great performance across all use cases
- **Claude 4 Opus** (Anthropic): Latest flagship model with enhanced capabilities
- **Gemini 2.0 Flash** (Google): Exceptional speed for rapid development
- **DeepSeekCoder V3** (DeepSeek): Best open-source model for coding tasks

**Self-Hosting Options:**
- **DeepSeekCoder V2 236b**: Powerful self-hosted option
- **Qwen 2.5 Coder 32b**: Best for moderate hardware requirements
- **Ollama models**: Local inference with various model sizes

**Latest Specialized Models:**
- **Moonshot AI (Kimi)**: Kimi K2 models with advanced reasoning capabilities
- **xAI Grok 4**: Latest Grok model with 256K context window
- **Anthropic Claude 4 Opus**: Latest flagship model from Anthropic

!!! tip "Model Selection Tips"
    - Use larger models (7B+ parameters) for complex applications
    - Claude models excel at structured code generation
    - GPT-4o provides excellent general-purpose coding assistance
    - Gemini models offer the fastest response times

### How do I configure API keys for different providers?

You can configure API keys in two ways:

**Option 1: Environment Variables (Recommended for production)**
Create a `.env.local` file in your project root:
```bash
ANTHROPIC_API_KEY=your_anthropic_key_here
OPENAI_API_KEY=your_openai_key_here
GOOGLE_GENERATIVE_AI_API_KEY=your_google_key_here
MOONSHOT_API_KEY=your_moonshot_key_here
XAI_API_KEY=your_xai_key_here
```
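
If you want to sanity-check which provider keys your environment actually exposes before starting the app, a small helper like the following can flag missing or empty variables. The key names match the `.env.local` example above; the helper itself is an illustrative sketch, not part of CodinIT.dev:

```typescript
// Provider keys from the .env.local example above
const REQUIRED_KEYS = [
  'ANTHROPIC_API_KEY',
  'OPENAI_API_KEY',
  'GOOGLE_GENERATIVE_AI_API_KEY',
  'MOONSHOT_API_KEY',
  'XAI_API_KEY',
];

// Return the keys that are missing or blank in the given environment
function missingKeys(env: Record<string, string | undefined>): string[] {
  return REQUIRED_KEYS.filter((key) => !env[key] || env[key]!.trim() === '');
}

const exampleEnv: Record<string, string | undefined> = {
  ANTHROPIC_API_KEY: 'sk-ant-xxxx',
  OPENAI_API_KEY: '',
};
console.log(missingKeys(exampleEnv));
// → ["OPENAI_API_KEY", "GOOGLE_GENERATIVE_AI_API_KEY", "MOONSHOT_API_KEY", "XAI_API_KEY"]
```

You only need keys for the providers you plan to use, so treat any output as informational rather than an error.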

**Option 2: In-App Configuration**
- Click the settings icon (⚙️) in the sidebar
- Navigate to the "Providers" tab
- Switch between "Cloud Providers" and "Local Providers" tabs
- Click on a provider card to expand its configuration
- Click on the "API Key" field to enter edit mode
- Paste your API key and press Enter to save
- Look for the green checkmark to confirm proper configuration

!!! note "Security Note"
    Never commit API keys to version control. The `.env.local` file is already in `.gitignore`.

### How do I add a new LLM provider?

CodinIT.dev uses a modular provider architecture. To add a new provider:

1. **Create a Provider Class** in `app/lib/modules/llm/providers/your-provider.ts`
2. **Implement the BaseProvider interface** with your provider's specific logic
3. **Register the provider** in `app/lib/modules/llm/registry.ts`
4. **The system automatically detects** and registers your new provider

See the [Adding New LLMs](../#adding-new-llms) section for complete implementation details.
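
As a rough illustration of the shape involved, here is a minimal sketch. Note that the `BaseProvider` and `ModelInfo` definitions below are simplified stand-ins invented for this example; consult the actual classes under `app/lib/modules/llm/` for the real contract:

```typescript
// Hypothetical, simplified stand-ins for the real types
interface ModelInfo {
  name: string;
  label: string;
  maxTokenAllowed: number;
}

abstract class BaseProvider {
  abstract name: string;
  abstract staticModels: ModelInfo[];
}

// app/lib/modules/llm/providers/your-provider.ts (sketch)
class YourProvider extends BaseProvider {
  name = 'YourProvider';
  staticModels: ModelInfo[] = [
    { name: 'your-model-v1', label: 'Your Model v1', maxTokenAllowed: 8000 },
  ];
}

const provider = new YourProvider();
console.log(provider.name); // "YourProvider"
```

Once the class is registered in `registry.ts`, the provider should appear automatically in the provider selection.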

### How do I set up Moonshot AI (Kimi) provider?

Moonshot AI provides access to advanced Kimi models with excellent reasoning capabilities:

**Setup Steps:**
1. Visit [Moonshot AI Platform](https://platform.moonshot.ai/console/api-keys)
2. Create an account and generate an API key
3. Add `MOONSHOT_API_KEY=your_key_here` to your `.env.local` file
4. Or configure it directly in Settings → Providers → Cloud Providers → Moonshot

**Available Models:**
- **Kimi K2 Preview**: Latest Kimi model with 128K context
- **Kimi K2 Turbo**: Fast inference optimized version
- **Kimi Thinking**: Specialized for complex reasoning tasks
- **Moonshot v1 series**: Legacy models with vision capabilities

!!! tip "Moonshot AI Features"
    - Excellent for Chinese language tasks
    - Strong reasoning capabilities
    - Vision-enabled models available
    - Competitive pricing

### What are the latest xAI Grok models?

xAI has released several new Grok models with enhanced capabilities:

**Latest Models:**
- **Grok 4**: Most advanced model with 256K context window
- **Grok 4 (07-09)**: Specialized variant for specific tasks
- **Grok 3 Beta**: Previous generation with 131K context
- **Grok 3 Mini variants**: Optimized for speed and efficiency

**Setup:**
1. Get your API key from [xAI Platform](https://docs.x.ai/docs/quickstart#creating-an-api-key)
2. Add `XAI_API_KEY=your_key_here` to your `.env.local` file
3. Models will be available in the provider selection

## Best Practices

### How do I access help and documentation?

CodinIT.dev provides multiple ways to access help and documentation:

**Help Icon in Sidebar:**
- Look for the question mark (?) icon in the sidebar
- Click it to open the full documentation in a new tab
- Provides instant access to guides, troubleshooting, and FAQs

**Documentation Resources:**
- **Main Documentation**: Complete setup and feature guides
- **FAQ Section**: Answers to common questions
- **Troubleshooting**: Solutions for common issues
- **Best Practices**: Tips for optimal usage

**Community Support:**
- **GitHub Issues**: Report bugs and request features

### How do I get the best results with CodinIT.dev?

Follow these proven strategies for optimal results:

**Project Setup:**
- **Be specific about your stack**: Mention frameworks/libraries (Astro, Tailwind, ShadCN, Next.js) in your initial prompt
- **Choose appropriate templates**: Use our 15+ project templates for quick starts
- **Configure providers properly**: Set up your preferred LLM providers before starting

**Development Workflow:**
- **Use the enhance prompt icon**: Click the enhance icon to let AI refine your prompts before submitting
- **Scaffold basics first**: Build foundational structure before adding advanced features
- **Batch simple instructions**: Combine tasks like *"Change colors, add mobile responsiveness, restart dev server"*

**Advanced Features:**
- **Leverage MCP tools**: Use Model Context Protocol for enhanced AI capabilities
- **Connect databases**: Integrate Supabase for backend functionality
- **Use Git integration**: Version control your projects with GitHub
- **Deploy easily**: Use built-in Vercel, Netlify, or GitHub Pages deployment

### What is MCP and why should I use it?

MCP (Model Context Protocol) is an open protocol that extends CodinIT.dev's AI capabilities by allowing it to interact with external tools and data sources:

**What MCP Enables:**
- **Database Access**: Query SQL databases, MongoDB, Redis, and more
- **File Operations**: Read/write files with proper permissions
- **API Integrations**: Connect to REST APIs, GraphQL endpoints
- **Custom Tools**: Build domain-specific tools for your workflow
- **Real-time Data**: Access live data during AI conversations

**Why Use MCP:**
- Makes the AI aware of your specific data and context
- Automates complex workflows with multiple tool calls
- Securely connects to enterprise systems
- Standardized protocol supported by multiple AI platforms

### How do I set up MCP servers in CodinIT.dev?

Setting up MCP servers is straightforward:

**Step-by-Step Setup:**
1. **Open Settings**: Click the settings icon (⚙️) in the sidebar
2. **Navigate to MCP Tab**: Select "MCP" from the settings menu
3. **Add Server**: Click "Add Server" or "Configure Server"
4. **Choose Server Type**:
   - **STDIO**: For local command-line tools
   - **SSE**: For Server-Sent Events servers
   - **HTTP**: For HTTP-based MCP servers
5. **Configure Connection**:
   - Enter server name and description
   - Set command/URL based on server type
   - Add required environment variables or headers
6. **Save and Enable**: Save configuration and enable the server

**Example STDIO Configuration (PostgreSQL)**:
```json
{
  "name": "postgres-db",
  "type": "stdio",
  "command": "npx",
  "args": ["-y", "@modelcontextprotocol/server-postgres"],
  "env": {
    "DATABASE_URL": "postgresql://user:pass@localhost/mydb"
  }
}
```

**Example SSE Configuration (Remote API)**:
```json
{
  "name": "my-api",
  "type": "sse",
  "url": "https://api.example.com/mcp",
  "headers": {
    "Authorization": "Bearer YOUR_API_TOKEN"
  }
}
```
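
When generating or hand-editing these configurations, a simple structural check mirroring the two shapes above can catch mistakes early. This checker is an illustrative sketch, not part of CodinIT.dev:

```typescript
type McpServerConfig =
  | { name: string; type: 'stdio'; command: string; args?: string[]; env?: Record<string, string> }
  | { name: string; type: 'sse' | 'http'; url: string; headers?: Record<string, string> };

// Minimal check: stdio servers need a command, remote servers need a URL
function isValidConfig(cfg: any): cfg is McpServerConfig {
  if (typeof cfg?.name !== 'string') return false;
  if (cfg.type === 'stdio') return typeof cfg.command === 'string';
  if (cfg.type === 'sse' || cfg.type === 'http') return typeof cfg.url === 'string';
  return false;
}

console.log(isValidConfig({ name: 'postgres-db', type: 'stdio', command: 'npx' })); // true
console.log(isValidConfig({ name: 'my-api', type: 'sse' })); // false: missing url
```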

### What MCP server types are supported?

CodinIT.dev supports three types of MCP server connections:

**1. STDIO Servers (Local Tools)**
- Run as local command-line processes
- Communicate via standard input/output
- Best for: Local databases, file systems, CLI tools
- Examples: PostgreSQL server, filesystem server, git tools

**2. SSE Servers (Server-Sent Events)**
- Connect to remote servers via HTTP
- Real-time streaming with Server-Sent Events
- Best for: Remote APIs, cloud services
- Examples: Cloud database services, third-party APIs

**3. Streamable HTTP Servers**
- HTTP-based protocol with streaming support
- Flexible connection options
- Best for: Custom services, enterprise systems
- Examples: Internal APIs, custom business logic

Each type has different setup requirements and use cases. Choose based on your needs.

### How do MCP tools work during conversations?

When MCP servers are configured, their tools become available to the AI:

**Tool Discovery:**
- CodinIT.dev automatically detects all tools from connected servers
- Tools appear in the AI's available tool list
- Tool descriptions help the AI understand when to use them

**Tool Execution Flow:**
1. **AI Decides**: Based on your prompt, AI determines which tool to use
2. **User Approval**: You review and approve the tool execution for security
3. **Tool Runs**: The MCP server executes the tool with provided parameters
4. **Results Return**: Tool output is sent back to the AI
5. **AI Responds**: AI incorporates tool results into its response

**Approval States:**
- **APPROVE**: Allow the tool to execute
- **REJECT**: Deny the tool execution
- **ERROR**: Tool execution failed

**Security Features:**
- All tool executions require explicit user approval
- Tool parameters are shown before execution
- Failed executions are logged with error details
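
The execution flow above can be sketched as a small gate that only lets approved calls through. This is an illustrative simplification, not CodinIT.dev's actual implementation:

```typescript
type ApprovalState = 'APPROVE' | 'REJECT' | 'ERROR';

interface ToolCall {
  tool: string;
  params: Record<string, unknown>;
}

// Run a tool call only if the user approved it; surface rejections and failures
function gatedExecute(
  call: ToolCall,
  approved: boolean,
  run: (call: ToolCall) => string,
): { state: ApprovalState; result?: string } {
  if (!approved) return { state: 'REJECT' };
  try {
    return { state: 'APPROVE', result: run(call) };
  } catch {
    return { state: 'ERROR' };
  }
}

console.log(gatedExecute({ tool: 'query', params: {} }, false, () => 'rows'));
// → { state: "REJECT" }
```

The important property is that the tool body never runs without an explicit approval, and failures are captured rather than silently dropped.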

### What are common MCP use cases?

MCP enables many powerful workflows:

**Development & DevOps:**
- Query database schemas and generate migrations
- Read/write code files for automated refactoring
- Execute git commands for version control
- Deploy applications to cloud platforms
- Run test suites and analyze results

**Data Analysis:**
- Query SQL databases for business intelligence
- Process CSV/JSON files for data transformation
- Generate charts and visualizations
- Fetch real-time data from APIs
- Aggregate data from multiple sources

**Business Operations:**
- Integrate with CRM systems (Salesforce, HubSpot)
- Manage customer support tickets
- Access inventory and order management systems
- Generate reports from business data
- Automate routine administrative tasks

**Content Management:**
- Read/write documentation files
- Manage blog posts and articles
- Update website content
- Process and optimize images
- Version control content changes

### How do I troubleshoot MCP connection issues?

Common MCP issues and solutions:

**Server Won't Connect:**
- Verify server endpoint/command is correct
- Check authentication credentials (API keys, tokens)
- Ensure network connectivity for remote servers
- Review server logs for specific error messages
- Confirm MCP protocol version compatibility

**Tools Not Appearing:**
- Restart the MCP server to refresh tool list
- Check server configuration in Settings → MCP
- Verify server is enabled (toggle switch on)
- Look for errors in browser console (F12)
- Confirm server implements tool discovery correctly

**Tool Execution Failures:**
- Check tool parameters are valid
- Ensure required permissions are granted
- Verify environment variables are set correctly
- Review tool-specific error messages
- Test the tool outside CodinIT.dev first

**Performance Issues:**
- Limit number of concurrent tool calls
- Use smaller result sets when querying databases
- Implement caching in your MCP server
- Monitor server response times
- Consider using local servers for better performance

**Conflict Resolution:**
- If multiple servers provide the same tool name, CodinIT.dev will detect the conflict
- Rename tools in server configuration to avoid conflicts
- Disable unused servers to reduce tool namespace pollution
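
Name conflicts are easy to spot programmatically. A detector in the spirit of what CodinIT.dev does might look like this (illustrative sketch only):

```typescript
// Given a map of server name → tool names, find tools exposed by more than one server
function findToolConflicts(servers: Record<string, string[]>): string[] {
  const seen = new Map<string, number>();
  for (const tools of Object.values(servers)) {
    for (const tool of tools) seen.set(tool, (seen.get(tool) ?? 0) + 1);
  }
  return Array.from(seen.entries())
    .filter(([, count]) => count > 1)
    .map(([name]) => name);
}

console.log(findToolConflicts({
  'postgres-db': ['query', 'list_tables'],
  'sqlite-db': ['query'], // clashes with postgres-db's "query"
})); // → ["query"]
```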

### Can I build my own MCP servers?

Yes! You can create custom MCP servers for your specific needs:

**Building Custom Servers:**
- Use the official MCP SDK: `@modelcontextprotocol/sdk`
- Implement server in TypeScript, Python, or other languages
- Define custom tools with input schemas and handlers
- Deploy as STDIO, SSE, or HTTP server

**Basic Server Example (TypeScript)**:
```typescript
import { Server } from '@modelcontextprotocol/sdk/server/index.js';
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from '@modelcontextprotocol/sdk/types.js';

// Declare the server and the capabilities it exposes
const server = new Server(
  { name: 'my-custom-server', version: '1.0.0' },
  { capabilities: { tools: {} } },
);

// Advertise the available tools
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [{
    name: 'get_user_data',
    description: 'Fetch user data from database',
    inputSchema: {
      type: 'object',
      properties: {
        userId: { type: 'string' }
      },
      required: ['userId']
    }
  }]
}));

// Handle tool invocations
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { userId } = request.params.arguments as { userId: string };
  // Replace this stub with a real database lookup
  return { content: [{ type: 'text', text: `User data for ${userId}` }] };
});

// Start the server over stdio
const transport = new StdioServerTransport();
await server.connect(transport);
```

**Resources:**
- MCP Documentation: [modelcontextprotocol.io](https://modelcontextprotocol.io)
- Example Servers: GitHub MCP organization
- SDK Reference: @modelcontextprotocol/sdk package

Custom servers allow you to integrate any system or data source with CodinIT.dev!

### How do I deploy my CodinIT.dev projects?

CodinIT.dev supports one-click deployment to multiple platforms:

**Supported Platforms:**
- **Vercel**: Go to Settings → Connections → Vercel, then deploy with one click
- **Netlify**: Connect your Netlify account and deploy instantly
- **GitHub Pages**: Push to GitHub and enable Pages in repository settings

**Deployment Features:**
- Automatic build configuration for popular frameworks
- Environment variable management
- Custom domain support
- Preview deployments for testing

### How do I use Git integration features?

CodinIT.dev provides comprehensive Git integration with GitHub and GitLab:

**Basic Git Operations:**
- Import existing repositories by URL
- Create new repositories on GitHub or GitLab
- Automatic commits for major changes
- Push/pull changes seamlessly

**GitHub Integration:**
- Connect GitHub account in Settings → Connections → GitHub
- Import from your connected repositories
- Create and manage branches
- View repository statistics

**GitLab Integration:**
- Connect GitLab account in Settings → Connections → GitLab
- Browse and import GitLab projects
- Manage GitLab branches
- Access project metadata

**Advanced Features:**
- Version control with diff visualization
- Collaborative development support
- Bug report generation with automatic system information

## Project Information

### How do I contribute to CodinIT.dev?

Check out our Contribution Guide for more details on how to get involved!

### Why are there so many open issues/pull requests?

We're forming a team of maintainers to manage demand and streamline issue resolution, and we're exploring partnerships to help the project thrive.

## New Features & Technologies

### What's new in CodinIT.dev?

Recent major additions to CodinIT.dev include:

**Advanced AI Capabilities:**
- **19 LLM Providers**: Support for Anthropic, OpenAI, Google, DeepSeek, Cohere, and more
- **MCP Integration**: Model Context Protocol for enhanced AI tool calling
- **Dynamic Model Loading**: Automatic model discovery from provider APIs

**Development Tools:**
- **WebContainer**: Secure sandboxed development environment
- **Live Preview**: Real-time application previews without leaving the editor
- **Project Templates**: 15+ starter templates for popular frameworks

**Version Control & Collaboration:**
- **Git Integration**: Import/export projects with GitHub and GitLab
- **Automatic Commits**: Smart version control for project changes
- **Diff Visualization**: See code changes clearly
- **Bug Reporting**: Built-in bug report generation and tracking

**Backend & Database:**
- **Supabase Integration**: Built-in database and authentication
- **API Integration**: Connect to external services and databases

**Deployment & Production:**
- **One-Click Deployment**: Vercel, Netlify, and GitHub Pages support
- **Environment Management**: Production-ready configuration
- **Build Optimization**: Automatic configuration for popular frameworks

### How do I use the new project templates?

CodinIT.dev offers templates for popular frameworks and technologies:

**Getting Started:**
1. Start a new project in CodinIT.dev
2. Browse available templates in the starter selection
3. Choose your preferred technology stack
4. The AI will scaffold your project with best practices

**Available Templates:**
- **Frontend**: React, Vue, Angular, Svelte, SolidJS
- **Full-Stack**: Next.js, Astro, Qwik, Remix, Nuxt
- **Mobile**: Expo, React Native
- **Content**: Slidev presentations, Astro blogs
- **Vanilla**: Vite with TypeScript/JavaScript

Templates include pre-configured tooling, linting, and build processes.

### How does WebContainer work?

WebContainer provides a secure development environment:

**Features:**
- **Isolated Environment**: Secure sandbox for running code
- **Full Node.js Support**: Run npm, build tools, and dev servers
- **Live File System**: Direct manipulation of project files
- **Terminal Integration**: Execute commands with real-time output

**Supported Technologies:**
- All major JavaScript frameworks (React, Vue, Angular, etc.)
- Build tools (Vite, Webpack, Parcel)
- Package managers (npm, pnpm, yarn)

### How do I connect external databases?

Use Supabase for backend database functionality:

**Setup Process:**
1. Create a Supabase project at supabase.com
2. Get your project URL and API keys
3. Configure the connection in your CodinIT.dev project
4. Use Supabase tools to interact with your database

**Available Features:**
- Real-time subscriptions
- Built-in authentication
- Row Level Security (RLS)
- Automatic API generation
- Database migrations

## Model Comparisons

### How do local LLMs compare to larger models like Claude 3.5 Sonnet for CodinIT.dev?

While local LLMs are improving rapidly, larger models still offer the best results for complex applications. Here's the current landscape:

**Recommended for Production:**
- **Claude 4 Opus**: Latest flagship model with enhanced reasoning (200K context)
- **Claude 3.5 Sonnet**: Proven excellent performance across all tasks
- **GPT-4o**: Strong general-purpose coding with great reliability
- **xAI Grok 4**: Latest Grok with 256K context window

**Fast & Efficient:**
- **Gemini 2.0 Flash**: Exceptional speed for rapid development
- **Claude 3 Haiku**: Cost-effective for simpler tasks
- **xAI Grok 3 Mini Fast**: Optimized for speed and efficiency

**Advanced Reasoning:**
- **Moonshot AI Kimi K2**: Advanced reasoning with 128K context
- **Moonshot AI Kimi Thinking**: Specialized for complex reasoning tasks

**Open Source & Self-Hosting:**
- **DeepSeekCoder V3**: Best open-source model available
- **DeepSeekCoder V2 236b**: Powerful self-hosted option
- **Qwen 2.5 Coder 32b**: Good balance of performance and resource usage

**Local Models (Ollama):**
- Best for privacy and offline development
- Use 7B+ parameter models for reasonable performance
- Still experimental for complex, large-scale applications

!!! tip "Model Selection Guide"
    - Use Claude/GPT-4o for complex applications
    - Use Gemini for fast prototyping
    - Use local models for privacy/offline development
    - Always test with your specific use case

## Troubleshooting

### "There was an error processing this request"

This generic error message means something went wrong. Check these locations:

- **Terminal output**: If you started with Docker or `pnpm`
- **Browser developer console**: Press `F12` → Console tab
- **Server logs**: Check for any backend errors
- **Network tab**: Verify API calls are working

### `x-api-key` header missing

This authentication error can be resolved by:

- **Restarting the container**: `docker compose restart` (if using Docker)
- **Switching run methods**: Try `pnpm` if using Docker, or vice versa
- **Checking API keys**: Verify your API keys are properly configured
- **Clearing browser cache**: Sometimes cached authentication causes issues

### Blank preview when running the app

Blank previews usually indicate code generation issues:

- **Check developer console** for JavaScript errors
- **Verify WebContainer is running** properly
- **Try refreshing** the preview pane
- **Check for hallucinated code** in the generated files
- **Restart the development server** if issues persist

### MCP server connection failed

If you're having trouble with MCP integrations:

- **Verify server configuration** in Settings → MCP
- **Check server endpoints** and authentication credentials
- **Test server connectivity** outside of CodinIT.dev
- **Review MCP server logs** for specific error messages
- **Ensure server supports** the MCP protocol version

### Git integration not working

Common Git-related issues and solutions:

- **GitHub connection failed**: Verify your GitHub token has correct permissions
- **Repository not found**: Check repository URL and access permissions
- **Push/pull failed**: Ensure you have write access to the repository
- **Merge conflicts**: Resolve conflicts manually or use the diff viewer
- **Large files blocked**: Check GitHub's file size limits

### Deployment failed

Deployment issues can be resolved by:

- **Checking build logs** for specific error messages
- **Verifying environment variables** are set correctly
- **Testing locally** before deploying
- **Checking platform-specific requirements** (Node version, build commands)
- **Reviewing deployment configuration** in platform settings

### Everything works, but the results are bad

For suboptimal AI responses, try these solutions:

- **Switch to a more capable model**: Use Claude 3.5 Sonnet, GPT-4o, or Claude 4 Opus
- **Be more specific** in your prompts about requirements and technologies
- **Use the enhance prompt feature** to refine your requests
- **Break complex tasks** into smaller, focused prompts
- **Provide context** about your project structure and goals

### WebContainer preview not loading

If the live preview isn't working:

- **Check WebContainer status** in the terminal
- **Verify Node.js compatibility** with your project
- **Restart the development environment**
- **Clear browser cache** and reload
- **Check for conflicting ports** (default is 5173)

### Received structured exception #0xc0000005: access violation

This is a Windows-specific issue. Install or update the latest Microsoft Visual C++ Redistributable.

### Miniflare or Wrangler errors in Windows

This is a Windows development environment issue. Install the Visual Studio C++ build tools (version 14.40.33816 or later). See the project's GitHub Issues for more details.

### Provider not showing up after adding it

If your custom LLM provider isn't appearing:

- **Restart the development server** to reload providers
- **Check the provider registry** in `app/lib/modules/llm/registry.ts`
- **Verify the provider class** extends `BaseProvider` correctly
- **Check browser console** for provider loading errors
- **Ensure proper TypeScript compilation** without errors

## Get Help & Support

### Report Issues

Open an Issue in our GitHub Repository