
AI/ML · Global library

LLM Integration Specialist

Integrate hosted and local LLM providers with fallback, rate limiting, and spend-aware routing that remains debuggable in production.

Codex · Claude Code · Kimi Code · orchestrator-mcp

Best use case

Use LLM Integration Specialist when your work centers on LLM integration and provider fallback: wiring hosted and local providers behind a single completion API, with fallback, rate limiting, and spend-aware routing that stays debuggable in production.
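Provider fallback, the pack's core trigger, can be sketched in a few lines. This is a hypothetical illustration, not code from orchestrator-mcp: `call_provider`, `ProviderError`, and the provider names are all made up for the example.

```python
# Hypothetical provider-fallback sketch; names and errors are illustrative,
# not part of any real SDK.

class ProviderError(Exception):
    """Raised when a provider call fails or times out."""

def call_provider(name: str, prompt: str) -> str:
    # Placeholder for a real hosted or local completion-API call.
    if name == "primary":
        raise ProviderError("simulated outage")
    return f"[{name}] completion for: {prompt}"

def complete_with_fallback(prompt: str,
                           providers=("primary", "secondary", "local")) -> str:
    last_err = None
    for name in providers:
        try:
            return call_provider(name, prompt)
        except ProviderError as err:
            last_err = err  # record the failure and try the next provider
    raise RuntimeError("all providers failed") from last_err

print(complete_with_fallback("hello"))
```

The ordered tuple encodes routing priority; a real integration would also log each failure so the fallback chain stays debuggable in production.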

Trigger signals

llm integration · provider fallback · completion api

Validation hooks

api-reliability-checker · fallback-strategy-validator · cost-tracking-verifier
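Rate limiting, another concern this pack targets, is often implemented as a token bucket. The sketch below is a generic illustration under that assumption, not the pack's own mechanism:

```python
# Minimal token-bucket rate limiter sketch; a generic illustration of the
# rate limiting this pack targets, not code from orchestrator-mcp itself.
import time

class TokenBucket:
    def __init__(self, rate_per_sec: float, capacity: int):
        self.rate = rate_per_sec          # refill rate in tokens per second
        self.capacity = capacity          # burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate_per_sec=0.5, capacity=2)
results = [bucket.allow() for _ in range(4)]
print(results)  # first two calls pass, the rest are throttled
```

A shared bucket per provider keeps burst traffic from tripping upstream quota errors before the fallback chain ever engages.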

Install surface

Copy the exact command path you need.

Inspect

pip install "orchestrator-mcp[dashboard]"
orchestrator-mcp skills show llm-integration-specialist

Use

orchestrator-mcp skills export llm-integration-specialist --to ./skillforge-packs
# copy the exported pack into your preferred agent environment

Export

cp -R skills/llm-integration-specialist ./your-agent-skills/llm-integration-specialist
# or open skills/llm-integration-specialist/SKILL.md in a markdown-first client

File patterns

**/*.py · **/*.ts · **/*.js · **/providers/** · **/models/**

Model preferences

deepseek-ai/deepseek-v3.2 · gemini-2.5-pro · qwen2.5-coder:32b
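Spend-aware routing across these preferred models can be sketched as picking the most capable model the remaining budget allows. The per-token prices below are made-up placeholders, and the routing rule is an assumption for illustration, not the pack's actual policy:

```python
# Hypothetical spend-aware router over the pack's preferred models.
# Prices are placeholders, not real vendor pricing.
MODELS = {
    "deepseek-ai/deepseek-v3.2": {"usd_per_1k_tokens": 0.0014},
    "gemini-2.5-pro": {"usd_per_1k_tokens": 0.0125},
    "qwen2.5-coder:32b": {"usd_per_1k_tokens": 0.0},  # local, no per-token cost
}

def route(estimated_tokens: int, budget_usd: float) -> str:
    """Pick the priciest model whose estimated cost fits the budget,
    degrading toward the free local model as spend runs out."""
    affordable = [
        (spec["usd_per_1k_tokens"], name)
        for name, spec in MODELS.items()
        if estimated_tokens / 1000 * spec["usd_per_1k_tokens"] <= budget_usd
    ]
    if not affordable:
        raise RuntimeError("no model fits the budget")
    return max(affordable)[1]

print(route(4000, 1.00))   # ample budget: priciest hosted model
print(route(4000, 0.001))  # tight budget: only the free local model fits
```

Tracking actual spend against the estimate after each call is what hooks like cost-tracking-verifier exist to validate.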

Related skills

Adjacent packs to compose next.


Agent Lifecycle Manager


Manage complete agent lifecycles from initialization through graceful shutdown, with health monitoring, scaling, and resource optimization.

Codex · Claude Code

Agent Memory Designer


Design short-term, long-term, and episodic memory layers for agents without turning retrieval into an unbounded context leak.

Codex · Claude Code