Motion Control Models

Control character poses, camera movements, and object trajectories in AI-generated images and videos with WaveSpeed

Our selection

kwaivgi/kling-v2.6-pro/motion-control

Kling 2.6 Pro Motion Control turns reference motion clips (dance, action, gesture) into smooth, realistic animations. Upload a character image (or source video) and a motion video; the model transfers the movement while preserving identity and temporal consistency. Ready-to-use REST API with fast responses, a native-audio option, no cold starts, and affordable pricing.
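As a minimal sketch of what such a motion-control request could look like: the base URL, endpoint path, and the `image`/`video` field names below are assumptions for illustration, not confirmed by this page; consult the WaveSpeed API documentation for the real schema.

```python
import json

# Assumed base URL for illustration only; check the official API docs.
API_BASE = "https://api.wavespeed.ai/api/v3"

def build_motion_control_request(model_id, character_image_url, motion_video_url):
    """Assemble the URL and JSON payload for a motion-transfer job."""
    url = f"{API_BASE}/{model_id}"
    payload = {
        "image": character_image_url,  # still image of the character to animate (assumed field name)
        "video": motion_video_url,     # reference clip whose motion is transferred (assumed field name)
    }
    return url, payload

url, payload = build_motion_control_request(
    "kwaivgi/kling-v2.6-pro/motion-control",
    "https://example.com/character.png",
    "https://example.com/dance.mp4",
)
print(url)
print(json.dumps(payload))
```

The actual call would be an authenticated POST (typically with a `Authorization: Bearer <API key>` header) carrying this JSON body.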

All Models

8 models

kwaivgi/kling-v2.6-pro/motion-control

Kling 2.6 Pro Motion Control turns reference motion clips (dance, action, gesture) into smooth, realistic animations. Upload a character image (or source video) and a motion video; the model transfers the movement while preserving identity and temporal consistency. Ready-to-use REST API with fast responses, a native-audio option, no cold starts, and affordable pricing.

kwaivgi/kling-v2.6-std/motion-control

Kling 2.6 Standard Motion Control transfers motion from reference videos to animate still images. Upload a character image and a motion clip (dance, action, gesture), and the model extracts the movement to generate smooth, realistic video. Ready-to-use REST inference API, best performance, no cold starts, affordable pricing.

wavespeed-ai/wan-2.2/fun-control

Wan2.2-Fun-Control uses Control Codes and multi-modal inputs to generate preset-controlled videos up to 120s at 720p; it is released under Apache 2.0 for commercial use. Ready-to-use REST API, no cold starts, affordable pricing.

wavespeed-ai/wan-2.2/animate

Wan2.2-Animate is a unified character animation and replacement model that replicates movement and expression, generating 720p videos up to 120s. Ready-to-use REST inference API, best performance, no cold starts, affordable pricing.

wavespeed-ai/steady-dancer

SteadyDancer is a 14B-parameter human image animation framework that transforms static images into coherent dance videos. Features first-frame preservation, robust identity consistency, and temporal coherence for realistic motion generation. Ready-to-use REST inference API, best performance, no cold starts, affordable pricing.

wavespeed-ai/scail

SCAIL enables high-fidelity character animation using reference images. It handles large motion variations, stylized characters, and multi-character interactions without explicit per-frame structural guidance. Ready-to-use REST inference API, no cold starts, affordable pricing.

wavespeed-ai/ltx-2-19b/control

LTX-2 19B ControlNet generates synchronized audio-video (up to 20s) from video input with pose, depth, or canny edge guidance. Supports audio preservation, generation, or removal for flexible video transformation. Ready-to-use REST inference API, best performance, no cold starts, affordable pricing.

bytedance/dreamactor-v2

DreamActor V2 transfers motion from a driving video to characters in an image, with strong performance on non-human characters and multi-character scenes. Ready-to-use REST inference API, best performance, no cold starts, affordable pricing.
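Video-generation jobs like the ones above typically run asynchronously: the client submits a request, receives a job ID, and polls for the result. The sketch below shows that client-side pattern only; the status values and the stubbed fetch function are hypothetical and not taken from the WaveSpeed API.

```python
import time

def poll_until_done(fetch_status, request_id, interval=0.0, max_tries=100):
    """Call fetch_status(request_id) until it reports a terminal state.

    fetch_status stands in for an HTTP GET against a results endpoint;
    the "completed"/"failed" status names are assumptions for illustration.
    """
    for _ in range(max_tries):
        status = fetch_status(request_id)
        if status["status"] in ("completed", "failed"):
            return status
        time.sleep(interval)
    raise TimeoutError(f"job {request_id} did not finish")

# Stub simulating a job that finishes on the third poll.
_states = iter(["created", "processing", "completed"])
def fake_fetch(request_id):
    return {"id": request_id, "status": next(_states)}

result = poll_until_done(fake_fetch, "job-123")
print(result["status"])  # completed
```

In a real client, `fake_fetch` would be replaced by an authenticated HTTP call, and a completed result would carry the URL of the generated video.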