Volume 169, 2014

A GPU-accelerated immersive audio-visual framework for interaction with molecular dynamics using consumer depth sensors


Abstract

With advances in computational power, the rapidly growing role of computational/simulation methodologies in the physical sciences, and the development of new human–computer interaction technologies, the field of interactive molecular dynamics seems destined to expand. In this paper, we describe and benchmark the software algorithms and hardware setup for carrying out interactive molecular dynamics utilizing an array of consumer depth sensors. The system works by interpreting the human form as an energy landscape and superimposing this landscape on a molecular dynamics simulation to chaperone the motion of the simulated atoms, with effects on both the real-time graphics and the sonification of simulation data. GPU acceleration has been key to achieving our target of 60 frames per second (FPS), giving an extremely fluid interactive experience. GPU acceleration has also allowed us to scale the system for use in immersive 360° spaces with an array of up to ten depth sensors, allowing several users to simultaneously chaperone the dynamics. The flexibility of our platform for carrying out molecular dynamics simulations has been considerably enhanced by wrappers that facilitate fast communication with a portable selection of GPU-accelerated molecular force evaluation routines. In this paper, we describe a 360° atmospheric molecular dynamics simulation we have run in a chemistry/physics education context. We also describe initial tests in which users have been able to chaperone the dynamics of a 10-alanine peptide embedded in an explicit water solvent. Using this system, both expert and novice users have been able to accelerate peptide rare event dynamics by 3–4 orders of magnitude.
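The core idea of the abstract, treating a depth-sensor frame as an energy landscape whose gradient exerts forces on the simulated atoms, can be illustrated with a minimal 2-D sketch. This is not the authors' GPU implementation; the function name `landscape_forces`, the coupling constant `k`, and the pixel-space mapping of atom positions are all illustrative assumptions, and a single NumPy finite-difference gradient stands in for the smoothed, GPU-accelerated field described in the paper.

```python
import numpy as np

def landscape_forces(depth_map, positions, k=1.0):
    """Sketch: turn one depth-sensor frame into a repulsive energy
    landscape and evaluate forces on simulated atoms.

    depth_map : (H, W) array of depths; smaller values mean the user
                is closer to the sensor.
    positions : (N, 2) array of atom positions already mapped into
                pixel coordinates (row, col) -- a hypothetical mapping.
    k         : coupling strength between user and simulation
                (hypothetical parameter).

    Energy is high where the user's body is close, so atoms are
    "chaperoned" away from (or along) the human form. Forces are the
    negative gradient of the landscape, sampled at each atom.
    """
    # High energy where depth is small (user close to the sensor).
    energy = k * (depth_map.max() - depth_map)
    # Finite-difference gradient of the landscape: d(energy)/d(row), d(energy)/d(col).
    dE_drow, dE_dcol = np.gradient(energy)
    # Nearest-pixel sampling of the gradient at each atom position.
    rows = np.clip(positions[:, 0].astype(int), 0, depth_map.shape[0] - 1)
    cols = np.clip(positions[:, 1].astype(int), 0, depth_map.shape[1] - 1)
    # Force = -grad(E), so atoms are pushed down the energy landscape.
    forces = -np.stack([dE_drow[rows, cols], dE_dcol[rows, cols]], axis=1)
    return energy, forces
```

In the real system this evaluation would run per frame at 60 FPS, with the landscape field added to the molecular force routine alongside the ordinary interatomic forces; bilinear interpolation and Gaussian smoothing would replace the nearest-pixel lookup used here for brevity.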


Supplementary files

Article information


Submitted: 04 Feb 2014
Accepted: 19 Mar 2014
First published: 19 Mar 2014

This article is Open Access

Faraday Discuss., 2014, 169, 63-87
Article type: Paper
Author version available


D. R. Glowacki, M. O'Connor, G. Calabrò, J. Price, P. Tew, T. Mitchell, J. Hyde, D. P. Tew, D. J. Coughtrie and S. McIntosh-Smith, Faraday Discuss., 2014, 169, 63
DOI: 10.1039/C4FD00008K

This article is licensed under a Creative Commons Attribution 3.0 Unported Licence. Material from this article can be used in other publications provided that the correct acknowledgement is given with the reproduced material.

Reproduced material should be attributed as follows:

  • For reproduction of material from NJC:
    [Original citation] - Published by The Royal Society of Chemistry (RSC) on behalf of the Centre National de la Recherche Scientifique (CNRS) and the RSC.
  • For reproduction of material from PCCP:
    [Original citation] - Published by the PCCP Owner Societies.
  • For reproduction of material from PPS:
    [Original citation] - Published by The Royal Society of Chemistry (RSC) on behalf of the European Society for Photobiology, the European Photochemistry Association, and RSC.
  • For reproduction of material from all other RSC journals:
    [Original citation] - Published by The Royal Society of Chemistry.

Information about reproducing material from RSC articles with different licences is available on our Permission Requests page.

