
How to Use Custom Models in Cursor

This guide explains how to integrate custom AI models into Cursor and use them in your development workflow.

Supported Model Types

Cursor supports various AI model integrations:

  1. OpenAI Compatible Models

    • Anthropic Claude
    • DeepSeek
    • Mistral
    • Local LLMs
  2. Custom API Endpoints

    • Self-hosted models
    • Cloud API services
    • Custom implementations

Basic Configuration

Setting Up Custom Models

  1. Open Cursor Settings
  2. Navigate to the AI Models section
  3. Add a new model configuration:
{
  "models": {
    "custom-model": {
      "name": "Your Model Name",
      "apiKey": "your-api-key",
      "baseUrl": "https://api.your-model-endpoint.com/v1",
      "contextLength": 8192,
      "temperature": 0.7
    }
  }
}

Configuration Parameters

Parameter       Description                   Default
---------       -----------                   -------
name            Display name for the model    Required
apiKey          API authentication key        Required
baseUrl         API endpoint URL              Required
contextLength   Maximum context window        4096
temperature     Response randomness           0.7
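The table above can be mirrored by a small validation helper that rejects incomplete configurations and fills in the documented defaults. This is a hypothetical sketch for checking configs before use; the field names follow this guide's examples, and Cursor performs its own validation internally.

```python
# Sketch of a validator for the model-config shape described above.
# Field names (name, apiKey, baseUrl, contextLength, temperature)
# follow this guide; they are not an official Cursor schema.

REQUIRED = ("name", "apiKey", "baseUrl")
DEFAULTS = {"contextLength": 4096, "temperature": 0.7}

def validate_model_config(cfg: dict) -> dict:
    """Check required fields and apply the documented defaults."""
    missing = [k for k in REQUIRED if k not in cfg]
    if missing:
        raise ValueError(f"missing required fields: {missing}")
    return {**DEFAULTS, **cfg}
```

Running the validator on a minimal config returns it with `contextLength` and `temperature` filled in; omitting a required field raises an error.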

Model Integration

OpenAI Compatible Models

{
  "models": {
    "custom-gpt": {
      "name": "Custom GPT",
      "apiKey": "${OPENAI_API_KEY}",
      "baseUrl": "https://api.openai.com/v1",
      "model": "gpt-4",
      "contextLength": 8192
    }
  }
}
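The `"${OPENAI_API_KEY}"` value uses shell-style environment-variable placeholders, which keeps secrets out of the config file. A minimal sketch of how such substitution can work; the `expand_env` helper is hypothetical, not part of Cursor, which does its own substitution:

```python
import os
import re

# Expand "${VAR}" placeholders in a config value from the environment.
# Hypothetical helper for illustration; unset variables become "".
_PLACEHOLDER = re.compile(r"\$\{([A-Z0-9_]+)\}")

def expand_env(value: str) -> str:
    return _PLACEHOLDER.sub(lambda m: os.environ.get(m.group(1), ""), value)
```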

Anthropic Claude Setup

{
  "models": {
    "claude": {
      "name": "Claude",
      "apiKey": "${ANTHROPIC_API_KEY}",
      "baseUrl": "https://api.anthropic.com/v1",
      "model": "claude-2",
      "contextLength": 100000
    }
  }
}

Local Model Configuration

{
  "models": {
    "local-llm": {
      "name": "Local LLM",
      "baseUrl": "http://localhost:8000",
      "contextLength": 4096,
      "useDocker": true
    }
  }
}

Advanced Settings

Model Behavior

Configure model behavior:

{
  "models": {
    "custom-model": {
      "settings": {
        "temperature": 0.7,
        "topP": 0.9,
        "frequencyPenalty": 0.0,
        "presencePenalty": 0.0,
        "stopSequences": ["```", "###"]
      }
    }
  }
}
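Stop sequences tell the backend where to cut generation; the same truncation can be applied client-side as a safety net for backends that ignore the setting. A hypothetical helper that cuts at the earliest matching sequence:

```python
def apply_stop_sequences(text: str, stops: list[str]) -> str:
    """Truncate text at the earliest occurrence of any stop sequence.
    Client-side sketch mirroring the stopSequences setting above."""
    cut = len(text)
    for s in stops:
        i = text.find(s)
        if i != -1:
            cut = min(cut, i)
    return text[:cut]
```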

Response Formatting

{
  "models": {
    "custom-model": {
      "formatting": {
        "trimWhitespace": true,
        "removeNewlines": false,
        "maxTokens": 1000
      }
    }
  }
}
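The first two options are post-processing of the response text, while `maxTokens` caps generation length on the request side. A sketch of how the post-processing half could behave; the function name and defaults here are illustrative, not Cursor internals:

```python
def format_response(text: str, trim_whitespace: bool = True,
                    remove_newlines: bool = False) -> str:
    """Apply the formatting options above to a raw model response.
    maxTokens is a request-side cap and is not handled here."""
    if remove_newlines:
        text = text.replace("\n", " ")
    if trim_whitespace:
        text = text.strip()
    return text
```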

Model-Specific Features

Code Completion

{
  "models": {
    "code-model": {
      "features": {
        "codeCompletion": true,
        "contextAware": true,
        "multiFile": true
      }
    }
  }
}

Chat Capabilities

{
  "models": {
    "chat-model": {
      "features": {
        "chat": true,
        "systemPrompts": true,
        "streaming": true
      }
    }
  }
}
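When `streaming` is enabled, OpenAI-compatible endpoints typically deliver tokens as server-sent events. A minimal parser sketch, assuming the common `data: {...}` / `data: [DONE]` wire format; real clients should use the provider's SDK where available:

```python
import json

def parse_sse_chunks(lines):
    """Yield JSON payloads from SSE 'data:' lines; stop at [DONE].
    Assumes the OpenAI-style streaming format; other backends differ."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and comments
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            return
        yield json.loads(payload)
```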

Performance Optimization

Caching Settings

{
  "models": {
    "custom-model": {
      "cache": {
        "enabled": true,
        "maxSize": "1GB",
        "ttl": 3600
      }
    }
  }
}
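The `ttl` value is a time-to-live in seconds: cached responses older than that are discarded. A minimal cache sketch illustrating the idea (illustration only, not Cursor's cache, and without the `maxSize` eviction logic):

```python
import time

class TTLCache:
    """Minimal response cache with a time-to-live in seconds,
    mirroring the ttl setting above. Sketch; no size-based eviction."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}

    def put(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires = entry
        if time.monotonic() > expires:
            del self._store[key]  # stale: drop and report a miss
            return None
        return value
```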

Rate Limiting

{
  "models": {
    "custom-model": {
      "rateLimit": {
        "requestsPerMinute": 60,
        "tokensPerMinute": 90000,
        "concurrent": 5
      }
    }
  }
}

Troubleshooting

Common Issues

  1. Connection Problems

    • Verify API endpoint
    • Check network connectivity
    • Validate API key
  2. Response Errors

    • Check model compatibility
    • Verify request format
    • Review error messages
  3. Performance Issues

    • Optimize context length
    • Adjust cache settings
    • Monitor rate limits
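For the "verify API endpoint" step, catching a malformed `baseUrl` before making any network call narrows the search quickly. A hypothetical first-pass check using only URL parsing:

```python
from urllib.parse import urlparse

def diagnose_base_url(base_url: str) -> str:
    """First-pass check of a baseUrl before any network request.
    Hypothetical helper for the troubleshooting steps above."""
    parsed = urlparse(base_url)
    if parsed.scheme not in ("http", "https"):
        return "invalid_url: expected an http(s) scheme"
    if not parsed.netloc:
        return "invalid_url: missing host"
    return "ok"
```

If this returns `ok`, the next steps are network connectivity and the API key itself.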

Best Practices

Security

  1. API Key Management

    • Use environment variables
    • Rotate keys regularly
    • Implement access controls
  2. Request Validation

    • Sanitize inputs
    • Validate responses
    • Handle errors gracefully
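Reading keys from environment variables, rather than hard-coding them, is the first point above in practice. A small sketch that fails loudly when the variable is unset; the helper name is illustrative:

```python
import os

def load_api_key(var_name: str) -> str:
    """Read an API key from the environment instead of hard-coding it.
    Raises if the variable is unset so misconfiguration surfaces early."""
    key = os.environ.get(var_name)
    if not key:
        raise RuntimeError(
            f"{var_name} is not set; export it before launching Cursor"
        )
    return key
```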

Performance

  1. Context Optimization

    • Minimize context size
    • Clear unused context
    • Use efficient prompts
  2. Resource Management

    • Monitor usage
    • Implement caching
    • Optimize requests
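"Minimize context size" can be approximated client-side by keeping only the most recent messages that fit a budget. In this sketch character counts stand in for tokens, which is a rough proxy; the helper is hypothetical:

```python
def trim_context(messages: list[str], max_chars: int) -> list[str]:
    """Keep the most recent messages that fit within max_chars.
    Characters approximate tokens here; a real client would count tokens."""
    kept, total = [], 0
    for msg in reversed(messages):  # newest first
        if total + len(msg) > max_chars:
            break
        kept.append(msg)
        total += len(msg)
    return list(reversed(kept))  # restore chronological order
```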

Model Comparison

Feature Matrix

Feature           OpenAI    Claude    Local LLM
Code Completion   Yes       Yes       Yes
Chat              Yes       Yes       Varies
Context Length    8K-32K    100K      Varies
Response Speed    Fast      Medium    Varies

Conclusion

Custom model integration in Cursor gives you the flexibility to choose the AI capabilities that best suit your needs. Following these configuration guidelines helps keep performance and reliability consistent.
