Near-exhaustive Precomputation of Secondary Cloth Effects
Doyub Kim¹, Woojong Koh², Rahul Narain², Kayvon Fatahalian¹, Adrien Treuille¹, and James F. O’Brien²
¹Carnegie Mellon University
²University of California, Berkeley
ACM Transactions on Graphics (Proc. SIGGRAPH 2013) Vol. 32, No. 4, 87.
The central argument against data-driven methods in computer graphics rests on the curse of dimensionality: it is intractable to precompute “everything” about a complex space. In this paper, we challenge that assumption by using several thousand CPU-hours to perform a massive exploration of the space of secondary clothing effects on a character animated through a large motion graph. Our system continually explores the phase space of cloth dynamics, incrementally constructing a secondary cloth motion graph that captures the dynamics of the system. We find that it is possible to sample the dynamical space to a low visual error tolerance and that secondary motion graphs containing tens of gigabytes of raw mesh data can be compressed down to only tens of megabytes. These results allow us to capture the effect of high-resolution, off-line cloth simulation for a rich space of character motion and deliver it efficiently as part of an interactive application.
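The exploration strategy described above can be sketched as a worklist loop over (motion-graph node, cloth state) pairs, where newly simulated cloth states are merged with existing representatives whenever they fall within a visual error tolerance. The sketch below is illustrative only, assuming a toy simulator and an RMS vertex-distance error metric; all names and thresholds are ours, not the paper's.

```python
import numpy as np
from collections import deque

ERROR_TOL = 0.05  # illustrative visual-error tolerance, not the paper's value


def cloth_error(a, b):
    # Placeholder metric: RMS vertex distance between two cloth states.
    return float(np.sqrt(np.mean((a - b) ** 2)))


def simulate(node, cloth):
    # Stand-in for the off-line cloth simulator: a toy update driven by
    # the motion-graph node id. A real system would run a full simulation
    # of the clip associated with this node.
    return cloth + 0.01 * node


def explore(motion_graph, initial_cloth):
    """Breadth-first exploration of the (motion, cloth) phase space.

    motion_graph maps a node id to the node ids reachable from it.
    Returns a map from node id to its representative cloth states.
    """
    start = next(iter(motion_graph))
    nodes = {start: [initial_cloth]}  # node id -> representative states
    frontier = deque([(start, initial_cloth)])
    while frontier:
        node, cloth = frontier.popleft()
        for nxt in motion_graph[node]:
            new_cloth = simulate(nxt, cloth)
            reps = nodes.setdefault(nxt, [])
            # Keep the new state only if no existing representative is
            # visually close; otherwise merge (i.e., discard it).
            if all(cloth_error(new_cloth, r) > ERROR_TOL for r in reps):
                reps.append(new_cloth)
                frontier.append((nxt, new_cloth))
    return nodes
```

On a tiny two-node cycle, exploration converges quickly because successive cloth states fall within the tolerance and are merged rather than re-queued; the same merging is what keeps the full-scale precomputation tractable.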
TechCrunch – Researchers Create “Near-Exhaustive,” Ultra-Realistic Cloth Simulation
CNET – Computers sweat for 4,554 hours to simulate cloth movement