
AI4AnimationPy

A Python framework for AI-driven character animation using neural networks.

Developed by Paul Starke and Sebastian Starke

License: CC BY-NC 4.0 · Python 3.12 · Documentation

AI4AnimationPy Demo Video

AI4AnimationPy enables character animation through neural networks and provides useful tools for motion capture processing, training & inference, and animation engineering. The framework brings AI4Animation to Python — removing the Unity dependency for data-processing, feature-extraction, inference, and post-processing while keeping similar game-engine-style architecture (ECS, update loops, rendering pipeline). Everything runs on NumPy or PyTorch, so training, inference, and visualization happen in one unified environment.

Getting Started

Please see the full documentation for installation instructions and other practical tips for working with AI4AnimationPy:

Full Documentation

Architecture

The framework can be executed in three modes: 1) with the built-in rendering pipeline ("Standalone"), 2) without rendering ("Headless"), or 3) via manual execution ("Manual"), which enables running code locally or remotely on a server. While Standalone and Headless modes invoke automatic update callbacks, Manual mode lets you control how often and at which time intervals the update loop is invoked.
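
The difference between the modes can be illustrated with a minimal update-loop sketch. All class and method names here are illustrative, not the framework's actual API:

```python
class UpdateLoop:
    """Minimal sketch of automatic vs. manual execution of an update loop."""

    def __init__(self):
        self.systems = []  # callables invoked once per frame

    def register(self, system):
        self.systems.append(system)

    def step(self, dt):
        """Manual mode: the caller decides when and how often to update."""
        for system in self.systems:
            system(dt)

    def run(self, fps=60, frames=None):
        """Standalone/Headless-style mode: automatic callbacks at a fixed rate."""
        dt = 1.0 / fps
        n = 0
        while frames is None or n < frames:
            self.step(dt)
            n += 1


loop = UpdateLoop()
ticks = []
loop.register(lambda dt: ticks.append(dt))

# Manual mode: invoke the update loop at arbitrary intervals.
loop.step(1 / 30)
loop.step(1 / 120)

# Headless-style automatic loop for a fixed number of frames.
loop.run(fps=60, frames=3)
print(len(ticks))  # 5 callbacks total
```

The same registered systems run in every mode; only the caller of `step` changes.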

Interactive Demos

  • Locomotion Demo — Stylized Biped Locomotion Controller trained on 100STYLE
  • Quadruped Demo — Quadruped Locomotion Controller: interactive dog locomotion with gait transitions and action poses
  • Training Demo — Future Motion Anticipation with interactive model training visualization
  • ECS Demo — Entity hierarchy and component system
  • IK Demo — Inverse Kinematics: real-time IK solving
  • MocapImport Demo — Motion Capture Import: GLB/FBX/BVH/NPZ loading
  • MotionEditor Demo — Motion Editor: animation browsing and feature visualization

Why AI4AnimationPy?

Research on AI-driven character animation has required juggling multiple disconnected tools — model research happens in Python while visualization requires specialized software, and bridging the two involves custom communication pipelines. This creates friction that slows iteration and makes it difficult to validate results on-the-fly.

The training pipeline in AI4Animation has been heavily dependent on Unity. While useful for visualization and runtime inference, communication with PyTorch had to go through ONNX or data streaming, creating a disconnect in the overall workflow. AI4AnimationPy solves this by fusing everything into one unified framework running only on NumPy/PyTorch:

  • Train neural networks on motion capture data
  • Visualize instantly without switching tools — training, inference, and rendering share the same backend
  • Run headless for server-side training with optional standalone mode
  • Extend easily with new features like geometry, audio, vision, or physics via the modular ECS design
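
The modular ECS design mentioned above can be sketched in a few lines. The class names (`Entity`, `Component`, `Mover`) are illustrative, not the framework's actual API:

```python
class Component:
    """Base class: a per-frame lifecycle hook, game-engine style."""

    def update(self, dt):
        pass


class Transform(Component):
    def __init__(self, x=0.0, y=0.0, z=0.0):
        self.position = [x, y, z]


class Mover(Component):
    """Example component: integrates a constant velocity into a transform."""

    def __init__(self, transform, velocity):
        self.transform, self.velocity = transform, velocity

    def update(self, dt):
        for i in range(3):
            self.transform.position[i] += self.velocity[i] * dt


class Entity:
    """Holds components and child entities; updates propagate down the hierarchy."""

    def __init__(self, name):
        self.name = name
        self.components = {}
        self.children = []

    def add(self, component):
        self.components[type(component)] = component
        return component

    def update(self, dt):
        for c in self.components.values():
            c.update(dt)
        for child in self.children:
            child.update(dt)


root = Entity("root")
t = root.add(Transform())
root.add(Mover(t, [1.0, 0.0, 0.0]))
root.update(0.5)
print(t.position[0])  # 0.5
```

New capabilities (geometry, audio, physics, ...) slot in as additional `Component` subclasses without touching the update loop.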
Framework Workflow

| | AI4AnimationPy | AI4Animation (Unity) |
| --- | --- | --- |
| Training data generation (20h mocap) | < 5 min | > 4 hours |
| Setup time for new experiment | ~10 min | > 4 hours |
| Visualize inputs/outputs during training | Built-in | Requires streaming |
| Backprop through inference | ✅ Supported | ❌ Not possible |
| Quantization | Full PyTorch support | Limited to ONNX |
| Visualization | Optional built-in renderer | Built-in |

Features

| Feature | Status |
| --- | --- |
| 🧩 Entity-Component-System — modular architecture with lifecycle management | ✅ |
| 🔄 Update Loop — game-engine-style callbacks (Update / Draw / GUI) | ✅ |
| 📐 Math Library — vectorized FK, quaternions, axis-angle, matrices, mirroring | ✅ |
| 🧠 Neural Networks — MLP, Autoencoder, Codebook Matching with training utilities | ✅ |
| 🖥️ Real-time Renderer — deferred shading, shadow mapping, SSAO, bloom, FXAA | ✅ |
| 💀 Skinned Mesh Rendering — GPU-accelerated skeletal mesh rendering | ✅ |
| 🦴 Inverse Kinematics — FABRIK solver for real-time IK | ✅ |
| 🎬 Animation Modules — joint contacts, root & joint trajectories | ✅ |
| 🎥 Camera System — Free, Fixed, Third-person, Orbit mode with smooth blending | ✅ |
| 📦 Motion Import — GLB, FBX, BVH | ✅ |
| Execution Modes — Standalone, Headless, Manual | ✅ |
| 🏗️ Physics simulation (rigid bodies / collision) | 🔜 |
| 🛤️ Path planning and spline tooling | 🔜 |
| 🔊 Audio support | 🔜 |
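
As one concrete example from the feature list, a minimal FABRIK chain solver can be sketched in NumPy. This is an illustrative sketch of the classic algorithm (Aristidou & Lasenby), not the framework's actual solver:

```python
import numpy as np


def fabrik(joints, target, iterations=20, tol=1e-4):
    """Solve a single kinematic chain toward a target position with FABRIK.

    joints: (N, 3) array of joint positions, root first.
    Returns new joint positions; segment lengths are preserved.
    """
    joints = np.asarray(joints, dtype=float).copy()
    target = np.asarray(target, dtype=float)
    lengths = np.linalg.norm(np.diff(joints, axis=0), axis=1)
    root = joints[0].copy()

    # Target unreachable: stretch the chain straight toward it.
    if np.linalg.norm(target - root) > lengths.sum():
        for i in range(len(lengths)):
            d = target - joints[i]
            joints[i + 1] = joints[i] + d / np.linalg.norm(d) * lengths[i]
        return joints

    for _ in range(iterations):
        # Backward pass: place the end effector on the target, walk to the root.
        joints[-1] = target
        for i in range(len(joints) - 2, -1, -1):
            d = joints[i] - joints[i + 1]
            joints[i] = joints[i + 1] + d / np.linalg.norm(d) * lengths[i]
        # Forward pass: re-anchor the root, walk back to the end effector.
        joints[0] = root
        for i in range(len(joints) - 1):
            d = joints[i + 1] - joints[i]
            joints[i + 1] = joints[i] + d / np.linalg.norm(d) * lengths[i]
        if np.linalg.norm(joints[-1] - target) < tol:
            break
    return joints


chain = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
solved = fabrik(chain, target=[1.0, 1.0, 0.0])
```

Each iteration alternates a backward and a forward pass, which keeps segment lengths fixed while pulling the end effector onto the target.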

Motion Capture Import

The framework supports importing mesh, skin, and animation data from GLB, FBX, and BVH files. The internal motion format is .npz, storing 3D positions and 4D quaternions for each skeleton joint per frame.

```python
from ai4animation import Motion

motion = Motion.LoadFromGLB("character.glb")
motion = Motion.LoadFromFBX("character.fbx")
motion = Motion.LoadFromBVH("character.bvh", scale=0.01)
motion.SaveToNPZ("character")
```
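
For reference, the .npz layout described above can be sketched in plain NumPy. The array key names (`positions`, `rotations`) and the (x, y, z, w) quaternion convention are assumptions for illustration, not the framework's documented schema:

```python
import os
import tempfile

import numpy as np

frames, joints = 120, 24
positions = np.zeros((frames, joints, 3), dtype=np.float32)  # per-frame 3D joint positions
rotations = np.zeros((frames, joints, 4), dtype=np.float32)  # per-frame joint quaternions
rotations[..., 3] = 1.0  # identity quaternions, (x, y, z, w) convention assumed

# Save and reload the motion as a single .npz archive.
path = os.path.join(tempfile.gettempdir(), "character.npz")
np.savez(path, positions=positions, rotations=rotations)

data = np.load(path)
print(data["positions"].shape, data["rotations"].shape)  # (120, 24, 3) (120, 24, 4)
```

Keeping motion data in a plain array archive means training code can consume it directly, without an engine-specific loader.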

Batch convert entire directories using the built-in CLI:

```
convert --input_dir path/to/motions --output_dir path/to/output
```

See the Loading Motion Data guide for setup details.

Public Datasets

Several public motion capture datasets are compatible with the framework:

| Dataset | Character | Download |
| --- | --- | --- |
| Cranberry | Cranberry | FBX & GLB |
| 100Style retargeted | Geno | BVH / FBX |
| LaFan | Ubisoft LaFan | BVH |
| LaFan resolved | Geno | BVH / FBX |
| ZeroEggs retargeted | Geno | BVH / FBX |
| Motorica retargeted | Geno | BVH / FBX |
| NSM | Anubis | BVH |
| MANN | Dog | BVH |

License

AI4AnimationPy is licensed under CC BY-NC 4.0.
