
University of Delaware

[Figure: ELF-α and ELF-β electron localization function isosurfaces of the reaction intermediate (Figure 5, Step 1), with labels for the atoms (C1, C2, O1, O2, OH groups, P), the glycosidic bond, the isosurface of the unpaired electron of carbon, and the toroidal isosurface of the unpaired electron of carbon.]

The STAKEHOLDERS

Animal & Food Sciences: Behnam Abasht
Center for Applied Demography & Survey Research: Ed Ratledge
Center for Bioinformatics and Computational Biology: Cathy Wu
Chemistry & Biochemistry: Douglas Doren
Chemical Engineering: Stanley Sandler, Dion Vlachos
Civil & Environmental Engineering: Rachel Davidson, Dominic Di Toro, Tian-Jian Hsu, Paul Imhoff, James Kirby, Jennifer McConnell, Christopher Meehan
Delaware Biotechnology Institute: Kelvin Lee
Electrical & Computer Engineering: Stephan Bohacek, Abhyudai Singh, Daniel Weile, Ryan Zurakowski
Finance: Jayesh Khanapure
Geography: Luc Claessens, Tracy DeLiberty
Geological Sciences: Holly Michael
Materials Science & Engineering: Juejun Hu
Mathematical Sciences: Richard Braun, Peter Monk, Yvonne Ou, Petr Plechac, Lou Rossi, Peter Schwenk
Mechanical Engineering: Suresh Advani, Kausik Sarkar, Lian-Ping Wang
Oceanography: Matthew Oliver, Xiao-Hai Yan
Physical Ocean Science & Engineering: Cristina Archer, Tobias Kukulka, Dana Veron, Fabrice Veron
Physics & Astronomy: Daniel De Marco, Paul Evenson, Edward Lyman, William Matthaeus, Branislav Nikolic, Stanley Owocki, Marianna Safronova, Qaisar Shafi, Michael Shay
Plant & Soil Sciences: Randall Wisser
Psychology: Gregory Miller

RESEARCH SAMPLER

Ongoing projects that will be accelerated by the Community Cluster

Study Morphodynamics in Fluvial, Estuarine and Coastal Environments

3D simulation of fine sediment transport in the bottom boundary layer driven by an oscillatory motion (top panel), for a medium sediment concentration case (middle panels, near-bed concentration <C> of 10 g/l) and a high concentration case (bottom panels, <C> of 50 g/l).

Reference: C.E. Ozdemir, T.-J. Hsu, and S. Balachandar (2011) A numerical investigation of lutocline dynamics and saturation of fine sediment in the oscillatory boundary layer. Journal of Geophysical Research, 116, C09012.

Study Environmental Fluid Mechanics of Cloud Physics and Warm Rain

Building a multiscale simulation framework linking cloud microphysics and cloud dynamics to larger-scale weather and climate modeling. Snapshot of a particle-resolved simulation of decaying particle-laden turbulent flow: 3D view and 2D slice of vorticity contours and particle distribution.

Reference: H. Gao, H. Li, and L.-P. Wang (2011) Lattice Boltzmann simulation of turbulent flow laden with finite-size particles. Computers & Mathematics with Applications. doi:10.1016/j.camwa.2011.06.028
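For context on the method named in the reference above, the following is a minimal single-phase D2Q9 lattice Boltzmann (BGK) sketch in Python, not the code used in the cited study: it shows only the basic stream-and-collide update on a periodic box seeded with a weak shear layer, and omits the finite-size particle coupling and turbulence setup of the cited work. Grid size, relaxation time, and initial velocity amplitude are illustrative assumptions.

```python
# Minimal D2Q9 lattice Boltzmann (BGK) sketch: stream-and-collide on a
# periodic box, initialized with a sinusoidal shear layer. Illustrative only;
# the cited work additionally resolves finite-size particles in turbulence.
import numpy as np

nx, ny, tau, nsteps = 128, 128, 0.6, 500            # assumed toy parameters
w = np.array([4/9] + [1/9]*4 + [1/36]*4)            # D2Q9 weights
e = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],
              [1,1],[-1,1],[-1,-1],[1,-1]])          # lattice velocities

def equilibrium(rho, ux, uy):
    """Standard second-order equilibrium distribution."""
    feq = np.empty((9, nx, ny))
    usq = ux*ux + uy*uy
    for i in range(9):
        eu = e[i,0]*ux + e[i,1]*uy
        feq[i] = w[i]*rho*(1 + 3*eu + 4.5*eu*eu - 1.5*usq)
    return feq

# initial condition: uniform density, small sinusoidal shear in ux
y = np.arange(ny)[None, :]
rho = np.ones((nx, ny))
ux  = 0.05*np.sin(2*np.pi*y/ny)*np.ones((nx, ny))
uy  = np.zeros((nx, ny))
f = equilibrium(rho, ux, uy)

for step in range(nsteps):
    # streaming: shift each population along its lattice velocity (periodic)
    for i in range(9):
        f[i] = np.roll(np.roll(f[i], e[i,0], axis=0), e[i,1], axis=1)
    # macroscopic moments
    rho = f.sum(axis=0)
    ux  = (f * e[:,0,None,None]).sum(axis=0)/rho
    uy  = (f * e[:,1,None,None]).sum(axis=0)/rho
    # BGK collision: relax toward local equilibrium
    f += (equilibrium(rho, ux, uy) - f)/tau

print("mean kinetic energy:", 0.5*np.mean(rho*(ux**2 + uy**2)))
```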

Electron Localization Function Analysis

Mapping the electronic structure during the biomass pyrolysis reaction. Developing a computational multi-scale engine to screen all explicit solvents to elucidate the pathways of sugar conversion. (The ELF isosurface figure at the top of the poster shows the reaction intermediate from Figure 5, Step 1.)

Reference: M. Mettler et al. (2011) Energy and Environmental Science. (In press)

Analysis of Human Magnetoencephalographic (MEG) Data

Analysis of the time course of brain activity while subjects make a decision (a), and the brain-surface distribution of neuromagnetic activity around 300 ms into the decision process (b).

Reference: A. Steffen, B. Rockstroh, C. Wienbruch, and G.A. Miller (2011) Distinct cognitive mechanisms in a gambling task share neural mechanisms. Psychophysiology, 48, 1037-1046.
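As a schematic of what a time-course analysis of evoked activity involves, and not the pipeline used in the cited study, the sketch below averages synthetic MEG epochs time-locked to a decision and reads off the field amplitude near 300 ms. Sensor count, sampling rate, and epoch window are illustrative assumptions.

```python
# Toy illustration of evoked-response (time-course) analysis: average MEG
# epochs time-locked to an event and inspect activity near 300 ms.
# Synthetic data only; sensor count, sampling rate, and window are assumptions.
import numpy as np

fs = 600                                   # sampling rate (Hz), assumed
t = np.arange(-0.2, 0.8, 1/fs)             # epoch window: -200 ms to 800 ms
n_trials, n_sensors = 120, 50

rng = np.random.default_rng(0)
# simulate an evoked component peaking near 300 ms plus sensor noise
evoked_shape = np.exp(-0.5*((t - 0.30)/0.05)**2)     # Gaussian bump at 300 ms
topography = rng.standard_normal(n_sensors)          # fixed spatial pattern
epochs = (evoked_shape[None, None, :] * topography[None, :, None]
          + 5*rng.standard_normal((n_trials, n_sensors, t.size)))

evoked = epochs.mean(axis=0)               # trial average: sensors x time
gfp = evoked.std(axis=0)                   # global field power across sensors

i300 = np.argmin(np.abs(t - 0.30))         # sample index closest to 300 ms
print(f"global field power at ~300 ms: {gfp[i300]:.2f}")
print(f"peak of evoked time course at {t[np.argmax(gfp)]*1000:.0f} ms")
```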

The VENDORS

Penguin Computing Inc.
AMD (Processors)
Qlogic (Infiniband)
Arista (Switch)
WhamCloud (Lustre)
APC (Racks)
DotHill (Lustre Enclosures)

The MISSION

Based on priorities expressed in the faculty's Research Computing Task Force report (Apr. 2011), UD's Information Technologies responded by rapidly developing and implementing its first HPC Community Cluster plan, targeting production status by Jan. 2012. Faculty interest quickly grew, resulting in a 5,136-core, 49.3 Tflops (peak), Infiniband- and Lustre-based system. Faculty purchased the compute nodes (24-48 cores, 64-256 GB RAM), and IT funded the storage, switch, maintenance, and physical and staff infrastructure.

IT plans to solicit interest for additional clusters every 12 months over the next five years to respond to increasing research needs at UD and to emerging HPC technologies.

The UD Community Cluster Program: A technical & financial collaboration between faculty and Information Technologies

Predict Protein-Protein Interaction Based on Domain Profiles

Prediction using feature selection and support vector machines provides highly accurate interaction-domain identification in genome-scale prediction.

Reference: A.J. Gonzalez and L. Liao (2010) Predicting domain-domain interaction based on domain profiles with feature selection and support vector machines. BMC Bioinformatics, 11, 537.
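To illustrate the kind of pipeline described above, here is a minimal sketch combining univariate feature selection with an SVM classifier on synthetic stand-in features; it is not the authors' feature encoding, data, or parameters.

```python
# Schematic of the approach described above: feature selection followed by an
# SVM classifier predicting whether two domains interact. The features, labels,
# and parameters here are synthetic stand-ins, not the authors' data.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# stand-in for domain-profile feature vectors of candidate domain pairs
X, y = make_classification(n_samples=500, n_features=200, n_informative=20,
                           random_state=0)

clf = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=50),   # keep the 50 most discriminative features
    SVC(kernel="rbf", C=1.0),       # support vector machine classifier
)

scores = cross_val_score(clf, X, y, cv=5)
print(f"5-fold CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```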

The TECHNICAL SPECS

• Number of nodes: 200 dual- & quad-socket nodes (12 cores/socket)

• Number of cores: 5,136 processor cores

• Processor: Dual Opteron 6234 12C, 2.4 GHz

• Memory: 64-256 GB/node (total 14.5 TB)

• Total local disk space on the nodes: 208 TB

• Fabric: QDR Infiniband network interconnect

• File system: Lustre RAID-6 with 200 TB usable space

• Operating system: CentOS

• Cluster peak performance: 49.3 Tflops (a back-of-the-envelope check appears below)
• Compute node: Altus 1804
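The quoted peak figure is consistent with the core count and clock rate, assuming 4 double-precision floating-point operations per core per cycle for this Opteron generation (an assumption, not stated on the poster):

```python
# Back-of-the-envelope check of the quoted peak performance, assuming
# 4 double-precision FLOPs per core per cycle for this Opteron generation.
cores = 5136
clock_hz = 2.4e9
flops_per_core_per_cycle = 4               # assumption for Opteron 6200 series
peak_tflops = cores * clock_hz * flops_per_core_per_cycle / 1e12
print(f"{peak_tflops:.1f} Tflops peak")    # -> 49.3 Tflops
```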

Study High-Resolution Simulation of Vortex Lattice Dynamics

Two weak vortices strip a stronger central vortex in a naturally adaptive, high-resolution simulation of vortex lattice dynamics. Adaptive, scalable, high-precision methods like these are essential tools for investigators exploring geophysical, aerodynamic and industrial flows.

Reference: L.A. Barba and L.F. Rossi (2010) Global field interpolation for particle methods. Journal of Computational Physics, 229, pp. 1292-1310.
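As a far simpler cousin of the adaptive particle methods cited above, and not the authors' scheme, the sketch below integrates the classical 2D point-vortex equations for a strong central vortex flanked by two weaker satellites; circulations, positions, and the time step are illustrative assumptions.

```python
# Toy 2D point-vortex integration: a strong central vortex with two weaker
# satellites, each advected by the velocity the others induce on it.
# A crude fixed-step scheme; the cited work uses adaptive, high-precision
# particle methods with regularized kernels. Parameters are assumptions.
import numpy as np

gamma = np.array([4.0, 1.0, 1.0])                        # circulations
pos = np.array([[0.0, 0.0], [1.5, 0.0], [-1.5, 0.0]])    # initial positions

def velocities(p):
    """Velocity induced on each vortex by all the others (singular kernel)."""
    v = np.zeros_like(p)
    for j in range(len(p)):
        for k in range(len(p)):
            if j == k:
                continue
            dx, dy = p[j] - p[k]
            r2 = dx*dx + dy*dy
            v[j] += gamma[k]/(2*np.pi*r2) * np.array([-dy, dx])
    return v

dt, nsteps = 0.01, 2000
for step in range(nsteps):
    # midpoint (RK2) step
    k1 = velocities(pos)
    k2 = velocities(pos + 0.5*dt*k1)
    pos = pos + dt*k2

print("final positions:\n", pos)
```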

The CLUSTER

QDR Infiniband network interconnect (Qlogic). Powered by Penguin Computing Inc.