LLM Integration Specialist
Integrate hosted and local LLM providers with fallback, rate limiting, and spend-aware routing that remains debuggable in production.
Codex · Claude Code · Kimi Code · orchestrator-mcp
Best use case
Use LLM Integration Specialist when you are wiring multiple hosted or local LLM providers behind one completion path and need fallback, rate limiting, and spend-aware routing that stays debuggable in production, especially when the work centers on LLM integration and provider fallback.
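The fallback behavior described above can be sketched as a small routing loop: try providers in order, retry with backoff on rate limits, and fall through on hard errors. This is a minimal illustration, not this pack's actual API; the `RateLimitError` type and the provider callables are assumptions for the example.

```python
import time


class RateLimitError(Exception):
    """Raised by a provider client when it is being throttled (illustrative)."""


def call_with_fallback(prompt, providers, max_attempts_per_provider=2):
    """Try each provider in order; retry briefly on rate limits, then fall
    through to the next provider on any hard error.

    `providers` maps a provider name to a callable taking the prompt and
    returning a completion string (hypothetical client wrappers).
    Returns (provider_name, completion) so callers can log which route won.
    """
    errors = {}
    for name, call in providers.items():
        for attempt in range(max_attempts_per_provider):
            try:
                return name, call(prompt)
            except RateLimitError:
                # Exponential backoff before retrying the same provider.
                time.sleep(0.01 * 2 ** attempt)
            except Exception as exc:
                errors[name] = exc
                break  # hard error: skip straight to the next provider
        else:
            errors[name] = RateLimitError("retries exhausted")
    raise RuntimeError(f"all providers failed: {errors}")
```

Returning the winning provider's name alongside the completion is what keeps this debuggable: production logs can show which route actually served each request.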
Trigger signals
llm integration · provider fallback · completion api
Validation hooks
api-reliability-checker · fallback-strategy-validator · cost-tracking-verifier
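The cost side that a hook like cost-tracking-verifier would check can be sketched as a spend-aware router: prefer the best model, degrade to cheaper ones as the budget runs down. Model names, prices, and the budget are illustrative assumptions, not values this pack ships with.

```python
class SpendAwareRouter:
    """Pick the most preferred model whose estimated cost still fits the
    remaining budget, and track actual spend as requests complete."""

    def __init__(self, models, budget_usd):
        # models: list of (name, usd_per_1k_tokens), best quality first.
        self.models = models
        self.budget = budget_usd
        self.spent = 0.0

    def route(self, estimated_tokens):
        for name, price in self.models:  # best first; degrade on cost
            cost = price * estimated_tokens / 1000
            if self.spent + cost <= self.budget:
                return name
        raise RuntimeError("budget exhausted for all models")

    def record(self, name, actual_tokens):
        # Record real usage after the call, since estimates drift.
        price = dict(self.models)[name]
        self.spent += price * actual_tokens / 1000
```

Keeping `route` (estimate-based) and `record` (actuals-based) separate makes over-budget incidents easy to audit: the gap between the two is exactly the estimation error.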
Install surface
Copy the exact command path you need.
Inspect
pip install "orchestrator-mcp[dashboard]"
orchestrator-mcp skills show llm-integration-specialist
Use
orchestrator-mcp skills export llm-integration-specialist --to ./skillforge-packs
# copy the exported pack into your preferred agent environment
Export
cp -R skills/llm-integration-specialist ./your-agent-skills/llm-integration-specialist
# or open skills/llm-integration-specialist/SKILL.md in a markdown-first client
File patterns
**/*.py · **/*.ts · **/*.js · **/providers/** · **/models/**
Model preferences
deepseek-ai/deepseek-v3.2 · gemini-2.5-pro · qwen2.5-coder:32b
Related skills
Adjacent packs to compose next.
Design robust communication protocols for agent systems with message schemas, serialization, and delivery guarantees.
Codex · Claude Code
Manage complete agent lifecycles from initialization through graceful shutdown with health monitoring, scaling, and resource optimization.
Codex · Claude Code
Design short-term, long-term, and episodic memory layers for agents without turning retrieval into an unbounded context leak.
Codex · Claude Code