vizNET Showcase 2008 Winners

This year's first prize winners are Chris Goodyer and Ken Brodlie from Leeds University, with their entry on 3D surface visualization using high-resolution display walls.

After careful deliberation and a great deal of agonising, the judging panel is proud to announce the winners of the 2008 vizNET Showcase. As in previous years, this was a difficult process, and the entries were judged according to how well they met this year's criteria: the judges took into account the usefulness, the novelty and the impact that the visualization had in the application domain. The prizes were awarded as follows:

First Prize

  - 3D Surface Visualization on High Resolution Display Walls (Leeds University)

Runners-up

  - Molecular Modelling with Low-Cost Haptic Feedback (Cardiff University)
  - Using Augmented Reality to Promote the Understanding of Materials Engineering to School Children (The University of Manchester and Cardiff University)
  - Parallel Hardware Volume Rendering in Commercial Visualization Software (The University of Manchester)

First Prize

3D Surface Visualization on High Resolution Display Walls

Chris Goodyer and Ken Brodlie. Leeds University

This entry was selected to receive the 1st prize in the 2008 Visualization Showcase competition. The practical achievement and the demonstration are impressive. The accompanying video, with carefully crafted annotations, was nicely produced and gives a good impression of the effectiveness of the pre-processing stages. The detail in the visualization is impressive. The resulting viewer is an appealing and usable system, providing easy migration from desktop to gigapixel display walls. Careful attention has also been paid to user interface issues, for example through use of an intuitive wireless controller and inclusion of features in the visualization to aid the user’s cognition. The judges felt that this work holds great potential for application in other problem domains.

Annotated Rabbit's Heart

Video included by kind permission of The Cardiac Mechano-Electric Feedback Group, Department of Physiology, Anatomy & Genetics, University of Oxford, UK

The oView system

As part of the Integrative Biology project, high-resolution MRI data was provided by Dr Peter Kohl at the University of Oxford. A total of 1576 slices at 1024×1024 gave the full volume of a rabbit heart at 25μm voxel resolution. The main aim of the work was to develop a viewer for such a data volume that could be deployed on both desktop displays and display walls.

The surfaces are generated by isosurfacing the MRI volume. To improve the surfaces, smoothing and decimation are then undertaken. These steps are computationally expensive and are therefore pre-processing stages, performed only once per dataset.
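
As a rough illustration of such a pre-processing stage, the sketch below uses VTK in Python; the choice of toolkit, the file names and the parameter values are all assumptions on our part, as the original implementation is not described in that detail.

    import vtk

    # Load the MRI volume (file name and format are hypothetical).
    reader = vtk.vtkMetaImageReader()
    reader.SetFileName("rabbit_heart.mha")

    # Extract an isosurface at a chosen intensity threshold.
    iso = vtk.vtkMarchingCubes()
    iso.SetInputConnection(reader.GetOutputPort())
    iso.SetValue(0, 500)  # threshold value is an assumption

    # Smooth the mesh to reduce staircase artefacts from the voxel grid.
    smoother = vtk.vtkWindowedSincPolyDataFilter()
    smoother.SetInputConnection(iso.GetOutputPort())
    smoother.SetNumberOfIterations(20)

    # Decimate to cut the triangle count for interactive rendering.
    decimator = vtk.vtkDecimatePro()
    decimator.SetInputConnection(smoother.GetOutputPort())
    decimator.SetTargetReduction(0.9)  # keep ~10% of the triangles
    decimator.PreserveTopologyOn()

    # Write the result once; the viewer then loads this file directly.
    writer = vtk.vtkPolyDataWriter()
    writer.SetInputConnection(decimator.GetOutputPort())
    writer.SetFileName("rabbit_heart_surface.vtk")
    writer.Write()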

The generated surfaces are then loaded into a custom-built VRJuggler application. Using this environment means that the control of the display-wall or desktop setup is handled independently of the application itself.

"...with oView we were able to explore the data at real-time speed, using natural controls."
Brock M. Tice
The Johns Hopkins University

Fast rendering is achieved by combining display lists, level-of-detail rendering (low resolution while moving) and sub-volumes that can easily be tested for presence in the viewing frustum. To assist users, labels can be added within the 3D volume, and a portion of the screen is dedicated to a thumbnail image to aid navigation when fully zoomed in. Users are also provided with an intuitive wireless "Playstation"-style controller giving full control of navigation, rotation and clip planes. History place-markers and animation facilities are also available, and periodic MRI slices, with voids rendered transparent, are added back in to aid the user's cognition.
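
The sub-volume test is a standard frustum-culling idea: partition the geometry into axis-aligned boxes and skip any box that lies entirely outside a frustum plane. A minimal sketch of that test follows; it is illustrative only and not taken from the oView source.

    # Frustum culling of sub-volumes (illustrative; not the oView code).
    # A plane is (a, b, c, d); points inside satisfy a*x + b*y + c*z + d >= 0.

    def box_outside_plane(box_min, box_max, plane):
        """True if the whole axis-aligned box lies outside the plane."""
        a, b, c, d = plane
        # Take the box corner furthest along the plane normal; if even
        # that corner is outside, the entire box is outside.
        x = box_max[0] if a >= 0 else box_min[0]
        y = box_max[1] if b >= 0 else box_min[1]
        z = box_max[2] if c >= 0 else box_min[2]
        return a * x + b * y + c * z + d < 0

    def visible_subvolumes(subvolumes, frustum_planes):
        """Yield only the sub-volumes whose boxes may intersect the frustum."""
        for box_min, box_max, display_list in subvolumes:
            if not any(box_outside_plane(box_min, box_max, p)
                       for p in frustum_planes):
                yield display_list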

oView in use

The viewer, oView, has since been applied to a variety of scientific datasets. Brock M. Tice from The Johns Hopkins University said that "oView has made the visualization of extremely large 3D data sets both more feasible and more intuitive as compared to other pieces of visualization software. We have had difficulty loading these data sets in other viewers and achieving usable performance, but with oView we were able to explore the data at real-time speed, using natural controls. This has been a great help in developing our latest models."

Others have praised the use of the high-resolution display wall for visualisation. Dr Peter Kohl from the University of Oxford said that "You have to see 'Powerwall' to understand the vast potential it has for interactive exploration of sometimes huge biological data sets. Visualisation of the complex three-dimensional architecture of the heart, for example, becomes intuitive, interactive, informative - and fun! Tools like this are THE quintessential pre-requirements for the realisation of the vast potential that the combination of modern imaging techniques and computer modelling bear for biomedical research and development."

Runner-up Prizes

Molecular Modelling with Low-Cost Haptic Feedback

N. Zonta, I. J. Grimstead, A. Brancale and N. J. Avis. Cardiff University

This entry was a runner-up in the 2008 Visualization Showcase competition. The work explores the application of low-cost haptic devices to molecular modelling, with potential applications in the teaching of molecular interactions. Whilst molecular modelling is a long-standing application area for haptic technology, it is important to understand the design space for such applications, as exemplified by the trade-offs involved in using relatively low-cost haptic devices compared with high-end devices. Although these trade-offs are not quantified in this showcase entry, the entry illustrates the potential of low-cost devices. The judges felt that the accompanying video was very well produced and uses visualization very effectively to portray the force field.

Molecular Modelling

Molecular modelling based on molecular mechanics covers the computational techniques that model the behaviour of molecules, where the lowest level of information is individual atoms (rather than quantum chemistry where electrons are explicitly considered). Molecular mechanics reduces the complexity of the system so it can be computed far faster than quantum techniques.

Zodiac is a molecular modelling package developed at the Welsh School of Pharmacy that takes advantage of this increase in speed to support real-time (>30fps) interaction with molecules. Molecular mechanics force fields are used to simulate the underlying physics (such as van der Waals forces and electrostatic forces from Coulomb's law).
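
As a rough illustration of what such a force field evaluates, the sketch below computes the pairwise Lennard-Jones (van der Waals) plus Coulomb force between two atoms. It is not Zodiac's actual code, and the unit convention and parameter values are assumptions.

    # Pairwise molecular-mechanics force: Lennard-Jones + Coulomb.
    # (Illustrative only; units follow a common kcal/mol, Angstrom, e convention.)

    COULOMB_K = 332.06  # kcal*A/(mol*e^2)

    def pair_force(r, epsilon, sigma, q1, q2):
        """Force magnitude along the inter-atomic axis at distance r (Angstrom).
        Positive values are repulsive."""
        sr6 = (sigma / r) ** 6
        # Negative derivative of the LJ potential 4*eps*((s/r)^12 - (s/r)^6)
        lj = 24.0 * epsilon * (2.0 * sr6 * sr6 - sr6) / r
        # Coulomb's law between partial charges q1 and q2
        coulomb = COULOMB_K * q1 * q2 / (r * r)
        return lj + coulomb

    # Example: two uncharged atoms just inside their equilibrium separation
    print(pair_force(3.3, epsilon=0.15, sigma=3.4, q1=0.0, q2=0.0))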

Zodiac - Low-Cost Haptic Feedback

The video shows real-time haptic feedback: multiple asynchronous threads enable a very high refresh rate for the haptic device, working alongside a slower (>30Hz) graphics interface.

Haptic Feedback

One issue with molecular modelling is the end user's comprehension of the forces involved (attraction, repulsion), such as determining whether a particular molecule will “dock” at a specific site on another molecule. The speed of Zodiac has enabled a collaboration between the Welsh School of Pharmacy and the Cardiff School of Computer Science to support haptic feedback in a real-time environment. Two points are of note:

  1. Haptic devices are notoriously expensive, costing tens of thousands of dollars for a system with six degrees of freedom.
  2. Haptic devices require far higher update rates than graphics (such as 200Hz compared to 30Hz for graphics).
The system we use is a mid-range workstation with a sub-£300 haptic device, recently introduced to the entertainment market, rather than a specialised high-end haptic. The haptic and graphics loops run in separate asynchronous threads so that each can meet its own update rate (see the sketch below). Future work will look at stereo viewing and head tracking to further enhance the user's experience and thus understanding.
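
A minimal sketch of that decoupling follows; the rates shown and the device call are assumptions, purely to illustrate the threading pattern.

    import threading
    import time

    class SharedForce:
        """Latest force vector, shared between haptic and graphics threads."""
        def __init__(self):
            self._lock = threading.Lock()
            self._force = (0.0, 0.0, 0.0)

        def set(self, force):
            with self._lock:
                self._force = force

        def get(self):
            with self._lock:
                return self._force

    def haptic_loop(shared, rate_hz=1000):
        # Runs far faster than rendering: always sends the most recently
        # computed force, so the device never waits on the graphics loop.
        period = 1.0 / rate_hz
        while True:
            force = shared.get()
            # device.send(force)  # hypothetical device API
            time.sleep(period)

    def graphics_loop(shared, rate_hz=30):
        period = 1.0 / rate_hz
        while True:
            # shared.set(compute_forces())  # e.g. a force-field evaluation
            # redraw()
            time.sleep(period)

    shared = SharedForce()
    threading.Thread(target=haptic_loop, args=(shared,), daemon=True).start()
    graphics_loop(shared)  # graphics runs on the main thread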

Overall, we have a real-time haptic molecular modelling package costing under £1000 for a complete system. Such a system is of direct interest for teaching molecular interactions, with the low cost making the teaching of large groups possible.


Using Augmented Reality to Promote the Understanding of Materials Engineering to School Children

K.T.W. Tan, E.M. Lewis, N.J. Avis, P.J. Withers, The University of Manchester and Cardiff University

This entry was a runner-up in the 2008 Visualization Showcase competition. The entry demonstrates the application of Augmented Reality technology in systems designed for use in schools to promote an awareness and understanding of the properties of materials and their use in engineering applications such as aircraft engines. The accompanying documents give a good overview of the work. The judges were particularly impressed by the application area for the technology – to systems designed for use in schools – and by the steps being taken to provide a development environment for teachers who are not expert in this technology. The judges hope that the promise shown by the initial evaluation will be reflected in more thorough evaluations as the system, deservedly, becomes more widely deployed. The judges hope too that this work may serve as a source of inspiration to other scientists and engineers seeking ideas to promote their disciplines to the public at large and to the next generation of potential recruits to their fields.

Augmented Reality

The first part of this video shows the ability of the system to animate a virtual object and, at the same time, interact with it in real time (in this case, turning the rock stage by hand). The video goes on to demonstrate the 'top trumps'-style game for promoting the understanding of materials science.

Developing Innovative learning tools

Innovative learning tools exploiting Augmented Reality (AR) technology are now feasible for use in schools because of advances in software and a reduction in the cost of the associated hardware (a cheap webcam coupled to a reasonably fast computer). AR is a relatively mature technology, but as yet it remains largely undiscovered by schools as a means of enhancing traditional lesson delivery. This showcase describes our educational AR software, which can help familiarise physics, chemistry and technology students and teachers with the development and exploitation of advanced materials through the task of constructing a jet engine.

The advantage of AR is the ability to overlay information on real physical objects as viewed on an LCD projector or interactive whiteboard. In our case we have constructed a materials database in a generic XML format, integrated into our specialised 3D AR software. This system allows pupils to explore and learn about materials by linking real materials to visual information by means of identifying symbols for AR recognition.
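
As a rough illustration, a materials entry in such a database might be loaded as follows; the schema, element names and example values here are hypothetical, since the actual XML format is not published.

    # Loading a hypothetical materials entry and mapping an AR marker
    # symbol to its overlay content. The schema is illustrative only.
    import xml.etree.ElementTree as ET

    SAMPLE = """
    <materials>
      <material marker="marker_07" name="Titanium alloy" category="Metal">
        <attribute key="density" value="4.5" unit="g/cm3"/>
        <attribute key="melting_point" value="1660" unit="C"/>
        <model file="turbine_blade.obj"/>
      </material>
    </materials>
    """

    def load_materials(xml_text):
        """Return a dict mapping AR marker IDs to material records."""
        by_marker = {}
        for m in ET.fromstring(xml_text).iter("material"):
            by_marker[m.get("marker")] = {
                "name": m.get("name"),
                "category": m.get("category"),
                "attributes": {a.get("key"): (a.get("value"), a.get("unit"))
                               for a in m.iter("attribute")},
                "model": m.find("model").get("file"),
            }
        return by_marker

    print(load_materials(SAMPLE)["marker_07"]["name"])  # Titanium alloy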

Representative visuals

The materials can be overlaid with representative 3D visuals highlighting their applications, their key attributes, or associated microstructures. These linkages between the hands-on materials and their properties and applications are explored through a series of puzzles, games and tasks, with the AR providing guidance. For example, pupils can try to identify the materials needed to build a jet engine, or play a 'top trumps'-style game against the computer, aiming to choose a material attribute that trumps those selected by the computer (sketched below). The AR system also acts as a virtual microscope, displaying the microstructure of a given material as it is placed under the webcam. For younger pupils, simpler AR tools guide the pupil through the categorisation of materials (Metal, Ceramic, Polymer and Natural), with the AR recognition software rewarding correct designations and helping students to identify and understand their mistakes.
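
A minimal sketch of the round logic in such a game follows; the attribute names and ratings are entirely hypothetical, chosen only to illustrate the comparison.

    # 'Top trumps'-style round: each side reveals the chosen attribute and
    # the higher value wins. Materials and ratings below are hypothetical.
    import random

    MATERIALS = {
        "Titanium alloy": {"strength": 9, "lightness": 6, "cost": 3},
        "Carbon fibre":   {"strength": 8, "lightness": 9, "cost": 2},
        "Aluminium":      {"strength": 5, "lightness": 7, "cost": 8},
    }

    def play_round(player_material, attribute):
        """The player picks a material and an attribute; the computer draws
        a random material, and the higher attribute value wins the round."""
        computer_material = random.choice(list(MATERIALS))
        p = MATERIALS[player_material][attribute]
        c = MATERIALS[computer_material][attribute]
        outcome = "win" if p > c else ("lose" if p < c else "draw")
        return computer_material, outcome

    print(play_round("Titanium alloy", "strength"))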

Our toolkit includes a set of textual descriptions, images and pre-defined computer-generated 3D objects from our XML-based database. In combination with our special-effects database (fire, smoke, explosion), the resulting system can combine these effects and display the results on screen, making the task much more fun and enjoyable for a wide ability range.


Parallel Hardware Volume Rendering in Commercial Visualization Software

George Leaver, James S. Perrin, Louise Lever, Martin Turner. The University of Manchester

This entry was a runner-up in the 2008 Visualization Showcase competition. The work describes the extension of the AVS/Express visualization system to support parallel volume rendering. The work is technically demanding, makes an important contribution to visualization infrastructure, and demonstrates the potential for commercial application of parallel volume rendering. The submission includes useful insight into the source of bottlenecks in the hardware configuration used. The judges hope that future work in this area will also be able to explore the benefits of this approach for users, and that it will lead to fresh results in the application areas to which it is applied.

Parallel Rendering

Although parallel volume rendering has been discussed in the literature for many years, few available systems implement it in a commercially usable state. One of the reasons has been the lack of a robust image compositing layer. The Parallel Compositing API is a specification for such a system, an implementation of which has been developed by Hewlett Packard.
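
As a rough illustration of what such a compositing layer does, the sketch below blends per-node RGBA images front to back with the standard 'over' operator. It is not the HP implementation, and the premultiplied-alpha convention is our assumption.

    # Sort-last compositing sketch: each render node produces an RGBA image
    # of its sub-volume; images are blended front to back with "over".
    import numpy as np

    def composite_over(front, back):
        """front, back: float arrays (H, W, 4) with premultiplied alpha."""
        alpha = front[..., 3:4]
        return front + (1.0 - alpha) * back

    def composite_depth_sorted(images_front_to_back):
        """Fold a depth-sorted list of node images into the final frame."""
        result = images_front_to_back[0]
        for image in images_front_to_back[1:]:
            result = composite_over(result, image)
        return result

    # Example: four nodes' images at 512x512 (random data, shapes only)
    frames = [np.random.rand(512, 512, 4).astype(np.float32) for _ in range(4)]
    final = composite_depth_sorted(frames)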

AVS/Express is a widely used scientific visualization system that supports both hardware volume rendering (Back to Front Textures) and serial raycasting. The University of Manchester is developing a version of AVS/Express for multi-CPU and multi-GPU systems. We have also worked with HP on the development of the Parallel Compositing specification and its implementation, which is now used in the high-end versions of AVS/Express. The latest step has been to add parallel volume rendering support.

Parallel Volume Rendering

The Armadillo dataset rendered in parallel using five nodes

Rendering System

This example uses a scan of an armadillo (see inset), ~5GB of data. The system is an HP SVA with five nodes, each with dual 3.4GHz Xeons, 2GB RAM and an nVidia FX3400 card. Four nodes act as compute/render nodes and the fifth as the master/display node. The data is loaded onto the compute nodes using the reader in AVS/Express Parallel Edition, then reduced to 512³ so as to fit the available texture memory; each render node renders a volume of 512×512×128, which in ARGB space uses all the available texture memory (512³ texels in total across the four nodes). The effective volume we could render was therefore four times larger than in serial.
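
The texture-memory arithmetic behind this can be checked quickly; the sketch below assumes 4 bytes per ARGB texel (8 bits per channel), which is our assumption rather than a stated detail.

    # Quick check of the per-node and total texture sizes (assumes 8-bit ARGB,
    # i.e. 4 bytes per texel; the actual texture format is not stated).
    bytes_per_texel = 4
    per_node = 512 * 512 * 128 * bytes_per_texel   # one render node's sub-volume
    total = 512 ** 3 * bytes_per_texel             # full reduced volume, 4 nodes
    print(per_node // 2**20, "MiB per render node")   # 128 MiB
    print(total // 2**20, "MiB across the cluster")   # 512 MiB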

The compositing layer runs at a baseline of 17.66 fps for a 512² image, so we can see that rendering requires little work and that the available bandwidth for compositing, along with the amount of texture memory and the fill rate, were the limiting factors. The current version uses full-frame compositing, so the frame rate depends on image size; bounding-box compositing is to be added shortly. Future work will also add a parallel raycast volume rendering implementation.

The volume rendering allows us to see the internal microstructure of the armadillo in context. Using hardware volume rendering accelerates the time to visualization, while parallel rendering and compositing overcome the hardware limitations.