Title: Perception-influenced Animation
Authors: Schiffner, Daniel; Krömker, Detlef; Mühl, Gero; Richling, Jan; Herkersdorf, Andreas
Date issued: 2012
Date available: 2019-10-30
ISBN: 978-3-88579-294-9
ISSN: 1617-5468
URI: https://dl.gi.de/handle/20.500.12116/29486
Language: en
Type: Text/Conference Paper

Abstract: The heterogeneity of future living environments will increase the necessity to create applications that can run on any device. In the context of graphics applications, some kind of simplification must be included to enable rendering on devices with less computational power. Using perception to guide such a simplification is a common approach. However, existing methods generate levels of detail in advance, and only a selection is performed during run-time. In simulations, this is not sufficient because an object will change over time. We present a framework that adapts a simulation using perceptual measures. We use a visual salience model to extract regions where detail can be modified. This information is calculated during run-time, and by using a dynamic data structure, the representation is adapted without a definition of levels of detail in advance. We included the system in a physics library, thereby creating an interactive and continuous simulation level of detail.
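The abstract describes adapting detail at run-time from per-region salience rather than selecting among precomputed LOD levels. The following is a minimal illustrative sketch of that idea, not the authors' implementation: all names (`Region`, `adapt_detail`), the salience values, and the gradual stepping policy are assumptions introduced here for illustration.

```python
# Illustrative sketch only (not the paper's code): run-time level-of-detail
# adaptation driven by a per-region salience score, with no precomputed levels.
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    salience: float  # hypothetical perceptual importance in [0, 1]
    detail: int = 8  # current detail budget (e.g. subdivision count)

def adapt_detail(regions, min_detail=1, max_detail=16):
    """Raise detail where salience is high, reduce it where it is low."""
    for r in regions:
        target = min_detail + round(r.salience * (max_detail - min_detail))
        # Step toward the target one unit per frame so the simulation
        # stays continuous instead of jumping between discrete levels.
        if r.detail < target:
            r.detail += 1
        elif r.detail > target:
            r.detail -= 1
    return regions

regions = [Region("face", 0.9, detail=4), Region("background", 0.1, detail=12)]
for _ in range(20):  # simulate 20 frames
    adapt_detail(regions)
print([(r.name, r.detail) for r in regions])
```

The gradual per-frame stepping mirrors the paper's goal of a continuous simulation level of detail: detail converges toward a salience-derived target instead of switching abruptly.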