Game Animation May Be Taken to New Heights with AI-Based Procedural Generation

A research team from the University of Edinburgh in the UK and Method Studios in the US has jointly announced the results of a project called Phase-Functioned Neural Networks for Character Control.

The project uses a neural network to procedurally generate animations of a character running across different terrains. The system learns movements by examining a large amount of motion capture data paired with varying terrain heights. As a result, the generated animations adapt to almost any type of environment while achieving high quality in a short amount of time. The video below shows a character driven by the system running naturally across a range of scenarios.
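To make the core idea more concrete, here is a minimal, illustrative sketch of a "phase-functioned" network in Python with NumPy: the layer weights are not fixed but are blended, via cyclic Catmull-Rom interpolation, from a small set of control weights according to the character's gait phase, and the blended network maps the current character state plus sampled terrain heights to the next pose. This is not the authors' code; the class, dimensions, and activation choice here are assumptions made purely for illustration.

```python
import numpy as np

def elu(x):
    # Smooth activation; the choice here is illustrative, not taken from the paper's code.
    return np.where(x > 0, x, np.exp(x) - 1)

def catmull_rom(w0, w1, w2, w3, t):
    # Cubic Catmull-Rom interpolation between four weight control points, t in [0, 1).
    return (
        w1
        + t * (0.5 * w2 - 0.5 * w0)
        + t * t * (w0 - 2.5 * w1 + 2.0 * w2 - 0.5 * w3)
        + t * t * t * (1.5 * w1 - 0.5 * w0 - 1.5 * w2 + 0.5 * w3)
    )

class PhaseFunctionedNetwork:
    """Toy phase-functioned network: layer weights are a smooth, cyclic
    function of the motion phase rather than a single fixed matrix."""

    def __init__(self, in_dim, hidden_dim, out_dim, seed=0):
        rng = np.random.default_rng(seed)
        # Four control points per parameter, blended by the phase.
        self.W0 = rng.standard_normal((4, hidden_dim, in_dim)) * 0.1
        self.b0 = np.zeros((4, hidden_dim))
        self.W1 = rng.standard_normal((4, out_dim, hidden_dim)) * 0.1
        self.b1 = np.zeros((4, out_dim))

    def _blend(self, params, phase):
        # Map the phase (0..2*pi over one gait cycle) onto the four control points cyclically.
        p = 4.0 * phase / (2.0 * np.pi)
        k = int(p) % 4
        t = p - int(p)
        return catmull_rom(
            params[(k - 1) % 4], params[k], params[(k + 1) % 4], params[(k + 2) % 4], t
        )

    def forward(self, x, phase):
        # x: character state plus sampled terrain heights; phase: position in the gait cycle.
        W0, b0 = self._blend(self.W0, phase), self._blend(self.b0, phase)
        W1, b1 = self._blend(self.W1, phase), self._blend(self.b1, phase)
        h = elu(W0 @ x + b0)
        return W1 @ h + b1  # predicted next pose / trajectory features

# Usage: query the network once per frame with the current state and phase.
# The dimensions below are placeholders, not the paper's actual input/output sizes.
net = PhaseFunctionedNetwork(in_dim=64, hidden_dim=128, out_dim=48)
state = np.zeros(64)
next_pose = net.forward(state, phase=1.3)
```

The appeal of this structure, as described in the project, is that a single compact network handles every phase of the motion cycle and every terrain shape it was trained on, so the animation can be generated on the fly rather than hand-authored per situation.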

If this system makes its way into the industry, it could change the face of video game development. Traditionally, animators spend hundreds of hours implementing life-like body and facial movements for game characters. Even with current technologies such as motion capture, applying the captured data to models accurately still requires sophisticated manual work. This AI-based animation system has shown great potential to reduce development costs, since movements can be generated automatically, quickly, and with broad compatibility. The system is still experimental, however, and it may be some time before it reaches the public.
