[en] Lifelike visualizations in design, cinematography, and gaming rely on precise physics simulations, typically requiring extensive computational resources and detailed physical input. This paper presents a method that infers a system's physical properties from a short video, eliminating the need for explicit parameter input, provided the observed system is close to the training conditions. The learned representation is then used within a Graph Network-based Simulator to emulate the trajectories of physical systems. We demonstrate that the video-derived encodings effectively capture the physical properties of the system and show a linear dependence between some of the encodings and the system's motion.
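To make the idea concrete, the following is a minimal sketch (not the authors' code) of how a video-derived latent could condition one message-passing step of a Graph Network-based Simulator: a small encoder maps a clip to a latent vector, which is concatenated into the node-update function. All module names, layer sizes, and the conditioning-by-concatenation choice are illustrative assumptions.

```python
import torch
import torch.nn as nn


def mlp(in_dim, out_dim, hidden=128):
    # Two-layer MLP used for edge and node update functions.
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                         nn.Linear(hidden, out_dim))


class VideoEncoder(nn.Module):
    """Maps a short clip (B, T, C, H, W) to a latent vector z (assumed architecture)."""
    def __init__(self, latent_dim=16):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.head = nn.Linear(32, latent_dim)

    def forward(self, video):                        # (B, T, C, H, W)
        x = self.conv(video.permute(0, 2, 1, 3, 4))  # Conv3d expects (B, C, T, H, W)
        return self.head(x.flatten(1))               # (B, latent_dim)


class ConditionedGNSStep(nn.Module):
    """One GN message-passing step whose node update is conditioned on z."""
    def __init__(self, node_dim, edge_dim, latent_dim, hidden=128):
        super().__init__()
        self.edge_fn = mlp(2 * node_dim + edge_dim, hidden)
        self.node_fn = mlp(node_dim + hidden + latent_dim, node_dim)

    def forward(self, nodes, edges, senders, receivers, z):
        # nodes: (N, node_dim); edges: (E, edge_dim); z: (latent_dim,)
        msg_in = torch.cat([nodes[senders], nodes[receivers], edges], dim=-1)
        messages = self.edge_fn(msg_in)                  # (E, hidden)
        agg = torch.zeros(nodes.size(0), messages.size(-1))
        agg.index_add_(0, receivers, messages)           # sum incoming messages per node
        z_rep = z.expand(nodes.size(0), -1)              # broadcast the video latent to all nodes
        return self.node_fn(torch.cat([nodes, agg, z_rep], dim=-1))


# Toy usage: 8 particles, a random 20-edge graph, one 12-frame clip.
enc = VideoEncoder(latent_dim=16)
step = ConditionedGNSStep(node_dim=32, edge_dim=4, latent_dim=16)
z = enc(torch.randn(1, 12, 3, 64, 64)).squeeze(0)
nodes = torch.randn(8, 32)
senders = torch.randint(0, 8, (20,))
receivers = torch.randint(0, 8, (20,))
edges = torch.randn(20, 4)
updated_nodes = step(nodes, edges, senders, receivers, z)
```

In this sketch the latent plays the role of the physical parameters the simulator would otherwise need as explicit input; how it is injected into the simulator in the actual method may differ.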
Disciplines :
Computer science
Author, co-author :
Szewczyk, Franciszek
Louppe, Gilles ; Université de Liège - ULiège > Département d'électricité, électronique et informatique (Institut Montefiore) > Big Data
Sabatelli, Matthia
Language :
English
Title :
Video-Driven Graph Network-Based Simulators
Publication date :
2024
Event name :
Machine Learning and the Physical Sciences Workshop (NeurIPS 2024)