As humans fatigue, they become unable to sustain performance at the level of their normal capabilities. Certain variables cue our perception of this exhaustion: the depth and rate of breathing change in response to the body's demands, sweating increases, and blood flow flushes the face while balance degrades. We first create a database of exercise motions that demonstrate these exhaustion variables by recording motion capture data together with biosignal sensor readings. From this multi-modal training data we derive a set of exhaustion parameters. We then apply these exhaustion parameters to novel input motions, modifying the input motion to display a desired level of exhaustion. (The desired level can come either from what actually happens to the user as they perform the motion, or from a slider, so that an animator can dial in any level of exhaustion at any point.)
We demonstrate the practicality of this method by proposing a data-driven exhaustion filter that models the level of exhaustion in human motion across a variety of characters and motions.
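To make the slider idea concrete, here is a minimal sketch of how learned exhaustion parameters might be blended into a pose by a slider value. All names, parameters, and numbers here are illustrative assumptions, not the actual method or its code.

```python
# Hypothetical exhaustion slider: parameters (posture sag, breathing-driven
# chest motion, balance sway) would be derived offline from the mocap +
# biosignal training data; at runtime a neutral pose is warped toward its
# exhausted counterpart by a slider t in [0, 1]. Everything is illustrative.
from dataclasses import dataclass

@dataclass
class ExhaustionParams:
    chest_expansion_scale: float  # deeper breathing when tired
    posture_sag: float            # radians of forward slump added to the spine
    sway_amplitude: float         # lateral sway from impaired balance

def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t

def apply_exhaustion(pose: dict, params: ExhaustionParams, t: float) -> dict:
    """Blend a neutral pose toward its exhausted version by slider t in [0, 1]."""
    t = max(0.0, min(1.0, t))
    out = dict(pose)
    # Slump the spine proportionally to the slider value.
    out["spine_pitch"] = pose["spine_pitch"] + lerp(0.0, params.posture_sag, t)
    # Scale the breathing-driven chest motion.
    out["chest_scale"] = pose["chest_scale"] * lerp(1.0, params.chest_expansion_scale, t)
    # Add lateral root sway to suggest impaired balance.
    out["root_sway"] = pose["root_sway"] + lerp(0.0, params.sway_amplitude, t)
    return out

params = ExhaustionParams(chest_expansion_scale=1.3, posture_sag=0.25, sway_amplitude=0.05)
neutral = {"spine_pitch": 0.0, "chest_scale": 1.0, "root_sway": 0.0}
half_tired = apply_exhaustion(neutral, params, 0.5)  # slider at 50% exhaustion
```

In a full system the per-frame blend would also modulate motion timing and breathing cycles rather than just static pose offsets, but the slider mechanics are the same.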
Some notes:
We want this method to work on novel input motions.
We want this method to work on a variety of characters. (Does it matter if they are human?)