IoT · Global library
Deploy ML models on resource-constrained microcontrollers for on-device inference
Best use case
Use TinyML Development for Microcontrollers when you need to deploy ML models on resource-constrained microcontrollers for on-device inference, especially for work tagged with the tinyml and microcontroller keywords.
Trigger signals
Validation hooks
Install surface
pip install "orchestrator-mcp[dashboard]"
Inspect
orchestrator-mcp skills show tinyml-development-for-microcontrollers
Export
orchestrator-mcp skills export tinyml-development-for-microcontrollers --to ./skillforge-packs
# copy the exported pack into your preferred agent environment
Use
cp -R skills/tinyml-development-for-microcontrollers ./your-agent-skills/tinyml-development-for-microcontrollers
# or open skills/tinyml-development-for-microcontrollers/SKILL.md in a markdown-first client
File patterns
Model preferences
Related skills
Deploy and serve ML models at the edge with auto-scaling, A/B testing, and monitoring
Optimize ML models for edge deployment with quantization, pruning, and hardware acceleration
Train ML models collaboratively across edge devices without centralizing sensitive data