# S07 - Visualisation of Large and Unstructured Data Sets


Hans Hagen, Martin Hering-Bertram, Christoph Garth (Eds.)

GI-Edition - Lecture Notes in Informatics (LNI), S-7

Bonner Köllen Verlag 2007

ISSN 1614-3213

ISBN 3-88579-441-7

## Latest Publications

- Deriving global material properties of a microscopically heterogeneous medium - computational homogenisation and opportunities in visualisation (Visualization of large and unstructured data sets, 2008). Hirschberger, C. B.; Ricker, S.; Steinmann, P.; Sukumar, N. In order to derive the overall mechanical response of a microscopically heterogeneous material body, both the theoretical and the numerical framework of a multiscale approach coined computational homogenisation are presented. Instead of resolving the actual heterogeneous microstructure in all detail for the simulation, representative micro elements are considered which provide the material properties for the coarse, or rather macro, scale. This procedure allows for a smaller and less expensive computation. However, both the chance and the challenge of visualising the decisive features arise on two scales.
- GPU accelerated gesture detection for real time interaction (Visualization of large and unstructured data sets, 2008). Bierz, T.; Ebert, A.; Meyer, J. Over the past years, human-computer interaction (HCI) has evolved into one of the most important research topics in computer science. Finding an intuitive, easy, and affordable means of interaction is therefore the main challenge. Optical markerless tracking using consumer hardware can address these requirements. However, in order to interact efficiently, both the tracking and the subsequent gesture recognition must be handled in real time. This motivates the efficient use of current GPUs to speed up the tracking and, furthermore, the gesture recognition. Addressing these issues, an approach for an implementation and system is presented in this paper.
- A Survey of Implicit Surface Rendering Methods, and a Proposal for a Common Sampling Framework (Visualization of large and unstructured data sets, 2008). Knoll, Alois. We consider several applications of implicit surfaces in visualization, and methods for rendering them. In particular we focus on geometry processing techniques for mesh extraction, and on ray casting methods for direct rendering of implicits. Given that both methods rely on sampling the implicit function in question, we design a software framework that can accommodate both algorithms. We conclude by evaluating the time complexity and performance of existing systems, and discuss the long-term potential of both methods for rendering and computational goals.
- Frontmatter (Visualization of large and unstructured data sets, 2008)
- On the modification of phonon tracing (Visualization of large and unstructured data sets, 2008). Deines, E.; Michel, F. Phonon tracing is a geometric approach for estimating acoustics in closed rooms. This work discusses possible refinements and extensions of the algorithm. In computer graphics, numerous level-of-detail approaches exist that decrease the representation detail of objects in order to speed up computation and rendering of virtual scenes. Different ideas for realizing a level-of-detail approach in acoustics are presented; for this purpose the phonon tracing algorithm has to be modified. With these modifications, the room impulse response can be calculated with respect to the user's requirements on calculation time and accuracy.
- A framework for the visualization of brain structures (Visualization of large and unstructured data sets, 2008). Thelen, S.; Bierz, T.; Müller, B.; Hagen, Hans; Ebert, A.; Friauf, E.; Meyer, J. Biologists today investigate different causes of deafness. One cause is damage to a particular region of the auditory brain stem. Anatomical differences were discovered when investigating brain slices of different laboratory mice. However, these slices are only a two-dimensional representation of a part of the brain, which raises the question of how these structural differences affect the three-dimensional representation of this region. Therefore, an interdisciplinary framework was developed that allows even inexperienced users to investigate and compare these regions.
- Comparative tensor visualisation within the framework of consistent time-stepping schemes (Visualization of large and unstructured data sets, 2008). Mohr, R.; Bobach, T.; Hijazi, Y.; Reis, G.; Steinmann, P.; Hagen, H. The design of so-called consistent time-stepping schemes, which essentially provide a physically correct time integration, is still a state-of-the-art topic in numerical mechanics. Within the proposed framework for finite elastoplasto-dynamics, both the spatial and the time discretisation rely on a Finite Element approach, and the resulting algorithmic conservation properties have been shown to be closely related to the quadrature formulas required for the calculation of time integrals. Consistent integration schemes, which allow for superior numerical performance, have been developed based on the introduction of an enhanced algorithmic stress tensor; compare [MMS06]-[MMS07c]. In this contribution, the influence of this consistent stress enhancement, representing a modified time quadrature rule, is analysed for the first time based on the spatial distribution of the tensor-valued difference between the standard quadrature rule, which relies on a specific evaluation of the well-known continuum stresses, and the favoured nonstandard quadrature rule, which involves the mentioned enhanced algorithmic stresses. This comparative analysis is carried out using several visualisation tools tailored to set apart spatial and temporal patterns, allowing one to deduce the influence of both step size and material constants on the stress enhancement. The resulting visualisations indeed confirm the physical intuition by pointing out locations where interesting changes happen in the data.
- Survey of Techniques for Data-dependent Triangulations (Visualization of large and unstructured data sets, 2008). Lehner, B.; Umlauf, G.; Hamann, B. We present a survey of different techniques for approximating a color image using a piecewise linear interpolation induced by a triangulation of the image domain. We also include a detailed description of a method we designed, and give a short overview of possible applications and extensions.
- Why interval arithmetic is so useful (Visualization of large and unstructured data sets, 2008). Hijazi, Y.; Hagen, H.; Hansen, C. D.; Joy, K. I. Interval arithmetic was introduced by Ramon Moore [Moo66] in the 1960s as an approach to bounding rounding errors in mathematical computation. The theory of interval analysis emerged from considering the computation of both the exact solution and the error term as a single entity, i.e. the interval. Though a simple idea, it is a very powerful technique with numerous applications in mathematics, computer science, and engineering. In this survey we discuss the basic concepts of interval arithmetic and some of its extensions, and review successful applications of this theory, in particular in computer science.
- Geomodeling and Geovisualizations in Urban Planning and Real Estate Industry: The Example of Office Market Research (Visualization of large and unstructured data sets, 2008). von Malottki, C. Modeling, quantitative analysis, and forecasting in urban planning have a tradition dating back to the sixties, when very complex models of the whole "system of the city" were developed. After a phase of criticism of these complex black-box programs in the eighties, the topic came back into research focus because GIS made it much easier to visualize the results. Subsequently, geomodeling has also become interesting for more specific questions. The example shown in this paper is office market modeling, with a case study in Stuttgart. Due to higher vacancy rates and the degradation of buildings, especially those from the sixties and seventies, the subject is relevant for investors and real estate brokers, but also for city administrations trying to avoid the degradation of whole areas. Classical time-series based office market models from urban economics describe the movement of the entire market, but they do not consider local heterogeneity. Cross-sectional models such as hedonic price modeling, and the adaptation of the hedonic model for vacancy rates shown in the paper, are difficult to couple with forecasting results. The microsimulation approach is the best way to integrate forecasting with a detailed spatial resolution: it consists of simulating movements, location choices, and vacancy at the building level. The paper presents the equations and exemplary results of the different simulation steps.
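The survey on data-dependent triangulations above rests on one building block: piecewise linear interpolation of vertex values induced by a triangulation of the image domain. The following is a minimal sketch of that idea for a single triangle, using barycentric coordinates; it is not taken from the paper, and the function names are illustrative.

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates (l1, l2, l3) of point p w.r.t. triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    l1 = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    l2 = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return l1, l2, 1.0 - l1 - l2

def interpolate(p, tri, values):
    """Linearly interpolate the three vertex values at point p inside triangle tri."""
    l1, l2, l3 = barycentric(p, *tri)
    return l1 * values[0] + l2 * values[1] + l3 * values[2]
```

A data-dependent triangulation then chooses the triangles so that this interpolant, evaluated per pixel, approximates the image colors as closely as possible.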
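The interval arithmetic survey above describes treating a value and its error bound as a single entity, the interval. As a concrete illustration of that idea (a minimal sketch, not code from the paper; the `Interval` class and its operators are illustrative assumptions), the elementary operations combine endpoints so that the result encloses every possible outcome:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """A closed interval [lo, hi] representing a value together with its bounds."""
    lo: float
    hi: float

    def __add__(self, other):
        # Sum of any x in self and y in other lies in [lo+lo, hi+hi].
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Subtraction swaps the other interval's endpoints.
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        # Signs may flip the ordering, so take min/max over all endpoint products.
        products = (self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi)
        return Interval(min(products), max(products))

    def contains(self, x):
        return self.lo <= x <= self.hi
```

In a real implementation the endpoint computations would additionally use directed (outward) rounding, which is what makes the enclosure rigorous in floating-point arithmetic.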