Agent skill
provider-integration-templates
OpenRouter framework integration templates for the Vercel AI SDK, LangChain, and the OpenAI SDK. Use when integrating OpenRouter with frameworks, setting up AI providers, building chat applications, implementing streaming responses, or when the user mentions the Vercel AI SDK, LangChain, the OpenAI SDK, framework integration, or provider setup.
Install this agent skill to your Project
npx add-skill https://github.com/majiayu000/claude-skill-registry/tree/main/skills/development/provider-integration-templates-vanman2024-ai-dev-marketplace
SKILL.md
Provider Integration Templates
This skill provides complete integration templates, setup scripts, and working examples for integrating OpenRouter with popular AI frameworks: Vercel AI SDK, LangChain, and OpenAI SDK.
What This Skill Provides
- Setup Scripts: Automated installation and configuration for each framework
- Integration Templates: Drop-in code templates for provider configuration
- Working Examples: Complete implementation examples with best practices
- Validation Tools: Scripts to verify integrations are working correctly
Supported Frameworks
Vercel AI SDK (TypeScript)
- Provider configuration with `createOpenAI()`
- API route templates with `streamText()` and `generateText()`
- Chat UI components with the `useChat()` hook
- Tool calling with Zod schemas
- Streaming responses
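As a minimal sketch of the first bullet (assuming `@ai-sdk/openai` is installed and the key lives in `OPENROUTER_API_KEY`; the model ID is illustrative), the provider configuration might look like:

```typescript
// Sketch: OpenRouter as a Vercel AI SDK provider via its OpenAI-compatible endpoint.
// Assumes @ai-sdk/openai is installed and OPENROUTER_API_KEY is set.
import { createOpenAI } from "@ai-sdk/openai";

export const openrouter = createOpenAI({
  baseURL: "https://openrouter.ai/api/v1", // OpenRouter's OpenAI-compatible API
  apiKey: process.env.OPENROUTER_API_KEY,  // never hardcode the key
});

// Later, in an API route:
//   streamText({ model: openrouter("anthropic/claude-4.5-sonnet"), messages })
```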
LangChain (Python & TypeScript)
- ChatOpenAI configuration for OpenRouter
- LCEL chain templates
- Agent templates with tool support
- RAG (Retrieval Augmented Generation) implementations
- Memory and context management
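A sketch of the TypeScript `ChatOpenAI` configuration (assuming `@langchain/openai` is installed; the model ID is illustrative):

```typescript
// Sketch: pointing LangChain's ChatOpenAI at OpenRouter instead of api.openai.com.
// Assumes @langchain/openai is installed and OPENROUTER_API_KEY is set.
import { ChatOpenAI } from "@langchain/openai";

export const model = new ChatOpenAI({
  model: "anthropic/claude-4.5-sonnet",
  apiKey: process.env.OPENROUTER_API_KEY,
  configuration: {
    baseURL: "https://openrouter.ai/api/v1", // routes the underlying client to OpenRouter
  },
});
```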
OpenAI SDK (Python & TypeScript)
- Drop-in replacement configuration
- Chat completions with streaming
- Function calling support
- Embeddings integration
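The drop-in replacement boils down to a client configuration like the following sketch (assuming the official `openai` package; the header values are placeholders):

```typescript
// Sketch: OpenAI SDK drop-in configuration for OpenRouter. The optional
// HTTP-Referer / X-Title headers feed OpenRouter's app rankings.
import OpenAI from "openai";

export const client = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,
  defaultHeaders: {
    "HTTP-Referer": "https://yourapp.com", // optional: your site URL
    "X-Title": "YourApp",                  // optional: your site name
  },
});
```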
Available Templates
Vercel AI SDK Templates
- `templates/vercel-ai-sdk-config.ts` - OpenRouter provider setup
- `templates/vercel-api-route.ts` - API route with streaming
- `templates/vercel-chat-component.tsx` - Chat UI component
- `templates/vercel-tools-config.ts` - Tool calling setup
LangChain Templates
- `templates/langchain-config.py` - Python ChatOpenAI setup
- `templates/langchain-config.ts` - TypeScript ChatOpenAI setup
- `templates/langchain-chain.py` - LCEL chain template
- `templates/langchain-agent.py` - Agent with tools
- `templates/langchain-rag.py` - RAG implementation
OpenAI SDK Templates
- `templates/openai-sdk-config.ts` - TypeScript configuration
- `templates/openai-sdk-config.py` - Python configuration
- `templates/openai-streaming.ts` - Streaming example
- `templates/openai-functions.ts` - Function calling
Setup Scripts
Installation Scripts
# Vercel AI SDK setup
bash scripts/setup-vercel-integration.sh
# LangChain setup (Python)
bash scripts/setup-langchain-integration.sh --python
# LangChain setup (TypeScript)
bash scripts/setup-langchain-integration.sh --typescript
Validation Scripts
# Validate integration is working
bash scripts/validate-integration.sh --framework vercel
# Test streaming functionality
bash scripts/test-streaming.sh --provider openrouter
# Check version compatibility
bash scripts/check-compatibility.sh
How to Use This Skill
1. Setup Framework Integration
Read the setup script for your target framework:
Read: skills/provider-integration-templates/scripts/setup-vercel-integration.sh
Execute the setup script to install dependencies:
bash skills/provider-integration-templates/scripts/setup-vercel-integration.sh
2. Use Integration Templates
Read the template you need:
Read: skills/provider-integration-templates/templates/vercel-ai-sdk-config.ts
Copy template to project:
cp skills/provider-integration-templates/templates/vercel-ai-sdk-config.ts src/lib/ai.ts
Customize with project-specific values:
- Replace `YOUR_OPENROUTER_API_KEY` with an actual key or environment variable
- Update model selection
- Configure streaming options
3. Review Working Examples
Read complete examples:
Read: skills/provider-integration-templates/examples/vercel-streaming-example.md
Examples show:
- Complete file structure
- Environment variable setup
- API route implementation
- Frontend component integration
- Error handling patterns
4. Validate Integration
Run validation script:
bash skills/provider-integration-templates/scripts/validate-integration.sh --framework vercel
Test streaming:
bash scripts/test-streaming.sh --provider openrouter --model anthropic/claude-4.5-sonnet
Integration Patterns
Pattern 1: Vercel AI SDK Chat Application
- Read Vercel AI SDK config template
- Copy to `src/lib/ai.ts`
- Read API route template
- Copy to `app/api/chat/route.ts`
- Read chat component template
- Copy to `components/chat.tsx`
- Run validation script
Pattern 2: LangChain LCEL Chain
- Read LangChain config template (Python or TS)
- Copy to `src/config/langchain.py`
- Read LCEL chain template
- Copy to `src/chains/chat_chain.py`
- Customize prompts and models
- Test with validation script
Pattern 3: OpenAI SDK Drop-in Replacement
- Read OpenAI SDK config template
- Replace base URL with OpenRouter endpoint
- Add `HTTP-Referer` and `X-Title` headers
- Update API key to use OpenRouter key
- Test existing OpenAI code (should work unchanged)
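Independent of any SDK, the same drop-in idea can be expressed as a plain request against the OpenAI-compatible endpoint. A sketch (`buildChatRequest` is a hypothetical helper name, not part of any SDK):

```typescript
// Illustrative helper: assembles a fetch request for OpenRouter's
// OpenAI-compatible chat completions endpoint.
export function buildChatRequest(
  apiKey: string,
  model: string,
  userMessage: string,
  siteUrl?: string,
  siteName?: string,
) {
  const headers: Record<string, string> = {
    Authorization: `Bearer ${apiKey}`,
    "Content-Type": "application/json",
  };
  if (siteUrl) headers["HTTP-Referer"] = siteUrl; // optional: for rankings
  if (siteName) headers["X-Title"] = siteName;    // optional: for rankings
  return {
    url: "https://openrouter.ai/api/v1/chat/completions",
    init: {
      method: "POST",
      headers,
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}

// Usage: const { url, init } = buildChatRequest(key, "openai/gpt-4-turbo", "Hello");
// then: await fetch(url, init);
```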
Environment Variables
All templates use these standard environment variables:
OPENROUTER_API_KEY=sk-or-v1-...
OPENROUTER_MODEL=anthropic/claude-4.5-sonnet
OPENROUTER_SITE_URL=https://yourapp.com # Optional: for rankings
OPENROUTER_SITE_NAME=YourApp # Optional: for rankings
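A small loader can validate these variables up front so a missing or malformed key fails fast rather than surfacing as a 401 later. A sketch (`loadOpenRouterEnv` is a hypothetical helper, not part of the templates):

```typescript
// Illustrative helper: reads the standard OpenRouter variables, failing fast
// when the API key is missing or not in the expected sk-or-... format.
export function loadOpenRouterEnv(env: Record<string, string | undefined>) {
  const apiKey = env.OPENROUTER_API_KEY;
  if (!apiKey || !apiKey.startsWith("sk-or-")) {
    throw new Error("OPENROUTER_API_KEY missing or malformed (expected sk-or-v1-...)");
  }
  return {
    apiKey,
    model: env.OPENROUTER_MODEL ?? "anthropic/claude-4.5-sonnet", // fallback default
    siteUrl: env.OPENROUTER_SITE_URL,   // optional: for rankings
    siteName: env.OPENROUTER_SITE_NAME, // optional: for rankings
  };
}

// Usage: const cfg = loadOpenRouterEnv(process.env);
```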
Model Selection
Templates use configurable model selection. Common models:
- `anthropic/claude-4.5-sonnet` - Strong reasoning, long context
- `meta-llama/llama-3.1-70b-instruct` - Fast, cost-effective
- `openai/gpt-4-turbo` - Strong general purpose
- `google/gemini-pro-1.5` - Long context, multimodal
Update model selection in templates based on use case.
Best Practices
- Use Environment Variables: Never hardcode API keys
- Enable Streaming: Better UX for chat applications
- Add Error Handling: Handle rate limits and API errors
- Set HTTP Headers: Include site URL and name for rankings
- Test Before Deployment: Use validation scripts
- Monitor Usage: Track costs with OpenRouter dashboard
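For the error-handling bullet, rate limits (HTTP 429) are the most common case worth retrying. A backoff sketch (`fetchWithRetry` is a hypothetical name; the fetch function is injected so it can be tested without network access):

```typescript
// Illustrative retry wrapper: retries on 429 with exponential backoff,
// returning any other response (success or error) to the caller as-is.
type FetchLike = (url: string, init?: object) => Promise<{ status: number }>;

export async function fetchWithRetry(
  fetchFn: FetchLike,
  url: string,
  init: object,
  maxAttempts = 3,
  baseDelayMs = 500,
) {
  for (let attempt = 1; ; attempt++) {
    const res = await fetchFn(url, init);
    if (res.status !== 429 || attempt >= maxAttempts) return res;
    // back off: 500 ms, 1000 ms, 2000 ms, ... before the next attempt
    await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** (attempt - 1)));
  }
}
```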
Troubleshooting
Issue: API key not working
- Check key format: `sk-or-v1-...`
- Verify the key in the OpenRouter dashboard
- Check that the environment variable is loaded
Issue: Streaming not working
- Ensure `stream: true` is set in the request
- Check framework version compatibility
- Verify response handler supports streaming
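When debugging, it helps to know the raw stream's shape: the OpenAI-compatible endpoint emits Server-Sent Events, one `data:` line per chunk, terminated by `data: [DONE]`. A minimal parser sketch (`extractDeltas` is a hypothetical helper, not an SDK API):

```typescript
// Illustrative SSE parser: pulls the text deltas out of an OpenAI-compatible
// streaming response body that has been collected as text.
export function extractDeltas(sseText: string): string[] {
  const deltas: string[] = [];
  for (const line of sseText.split("\n")) {
    if (!line.startsWith("data: ")) continue; // skip blanks and comments
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break; // end-of-stream marker
    const content = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (typeof content === "string") deltas.push(content);
  }
  return deltas;
}
```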
Issue: Model not found
- Check model ID format: `provider/model-name`
- Verify the model is available on OpenRouter
- Check for typos in model name
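A pre-flight shape check can catch the most common mistake, a missing `provider/` prefix, before any request is sent. A sketch (`isValidModelId` is a hypothetical helper; the pattern covers only the IDs listed in this document, not every ID OpenRouter accepts):

```typescript
// Illustrative format check for provider/model-name IDs, with an optional
// :variant suffix. Matching the pattern does not guarantee the model exists.
export function isValidModelId(id: string): boolean {
  return /^[a-z0-9.-]+\/[a-z0-9.-]+(:[a-z0-9-]+)?$/i.test(id);
}
```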
Progressive Disclosure
For detailed implementation guides, load these files as needed:
- `examples/vercel-streaming-example.md` - Complete Vercel AI SDK setup
- `examples/langchain-rag-example.md` - RAG implementation guide
- `examples/openai-sdk-example.md` - OpenAI SDK migration guide
Template Version: 1.0.0
Framework Support: Vercel AI SDK 4.x, LangChain 0.3.x, OpenAI SDK 1.x
Last Updated: 2025-10-31