Transcript of "Holland Computing Center," David R. Swanson, Ph.D., Director (26 slides).

Page 1:

Holland Computing Center

David R. Swanson, Ph.D., Director

Page 2:

Computational and Data-Sharing Core

• Store and share documents

• Store and share data and databases

• Computing resources

• Expertise

Page 3:

Who is HCC?

• HPC provider for the University of Nebraska

• System-wide entity that has evolved over the last 11 years

• Support from the President, Chancellor, CIO, and VCRED

• 10 FTE staff, 6 students

Page 4:

HCC Resources

• Lincoln:

  • Tier-2 machine Red (1500 cores, 400 TB)

  • Campus clusters PrairieFire and Sandhills (1500 cores, 25 TB)

• Omaha:

  • Large InfiniBand cluster Firefly (4000 cores, 150 TB)

  • 10 Gb/s connection to Internet2 (DCN)

Page 5:

Staff

• Dr. Adam Caprez, Dr. Ashu Guru, Dr. Jun Wang

• Tom Harvill, Josh Samuelson, John Thiltges

• Dr. Brian Bockleman (OSG development, grid computing)

• Dr. Carl Lundstedt, Garhan Attebury (CMS)

• Derek Weitzel, Chen He, Kartik Vedelaveni (GRAs)

• Carson Cartwright, Kirk Miller, Shashank Reddy (ugrads)

Page 6:

HCC -- Schorr Center

• 2200 sq. ft. machine room

• 10 full-time staff

• PrairieFire, Sandhills, Red, and Merritt

• 2100 TB storage

• 10 Gb/s network

Page 7:

Three Types of Machines

• ff.unl.edu ::: large capacity cluster ... more coming soon

• prairiefire.unl.edu // sandhills.unl.edu ::: special purpose cluster

• merritt.unl.edu ::: shared memory machine

• red.unl.edu ::: grid-enabled cluster for US CMS (OSG)

Page 8:

prairiefire

• 50 nodes from Sun

• 2-socket, quad-core Opterons (400 cores)

• 2 GB/core (800 GB total)

• Ethernet and SDR InfiniBand

• SGE or Condor submission
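
As a concrete illustration of the SGE submission route above, here is a minimal sketch in Python that writes a small SGE job script and hands it to qsub. The job name, time limit, and payload are placeholders, and qsub is assumed to be on the PATH; the Condor route would instead use a submit description file and condor_submit (see the Sandhills sketch below).

# sge_sketch.py -- write a tiny SGE job script and submit it with qsub.
# Hypothetical example: the job name, time limit, and payload are placeholders.
import subprocess

job_script = """#!/bin/bash
#$ -N hello_hcc
#$ -cwd
#$ -l h_rt=00:10:00
#$ -o hello_hcc.out
#$ -e hello_hcc.err
echo "Running on $(hostname)"
"""

with open("hello_hcc.sh", "w") as f:
    f.write(job_script)

# qsub prints a confirmation line with the new job's ID on success.
subprocess.run(["qsub", "hello_hcc.sh"], check=True)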

Page 9:

Sandhills

• 46 fat nodes

• 4-socket Opterons, 32 cores/node (128 GB/node)

• 1504 cores total

• QDR InfiniBand

• Maui/Torque or Condor submission
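
For the Condor route mentioned above, an equivalent sketch builds a submit description file and passes it to condor_submit. The executable, arguments, and file names are placeholders.

# condor_sketch.py -- build a Condor submit description and submit it.
# Hypothetical example: the executable, arguments, and file names are placeholders.
import subprocess

submit_description = """universe   = vanilla
executable = my_analysis.sh
arguments  = input.dat
output     = my_analysis.out
error      = my_analysis.err
log        = my_analysis.log
queue
"""

with open("my_analysis.sub", "w") as f:
    f.write(submit_description)

# condor_submit reports the cluster ID assigned to the new job.
subprocess.run(["condor_submit", "my_analysis.sub"], check=True)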

Page 10:

Merritt

• 64 Itanium processors

• 512 GB shared-memory RAM

• NFS storage (/home, /work)

• PBS only; interactive use for debugging only
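
Since Merritt is PBS-only, a batch run would look roughly like the sketch below: a PBS script requesting some of the shared-memory machine's CPUs, submitted with qsub. The resource requests and program name are placeholders; as the slide notes, interactive sessions are for debugging only.

# pbs_sketch.py -- write a PBS script for a shared-memory (OpenMP-style) run
# and submit it with qsub. Hypothetical example: requests and program are placeholders.
import subprocess

pbs_script = """#!/bin/bash
#PBS -N omp_test
#PBS -l ncpus=8
#PBS -l walltime=01:00:00
#PBS -j oe
cd $PBS_O_WORKDIR
export OMP_NUM_THREADS=8
./my_openmp_program
"""

with open("omp_test.pbs", "w") as f:
    f.write(pbs_script)

subprocess.run(["qsub", "omp_test.pbs"], check=True)

# For debugging only, an interactive shell can be requested instead, e.g.:
#   qsub -I -l ncpus=2 -l walltime=00:30:00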

Page 11:

Red

• Open Science Grid machine, part of the US CMS project

• 240 TB storage (dCache)

• Over 1100 compute cores

• Certificates required; no login accounts
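
Because Red has no login accounts, work reaches it over the grid with a certificate. Below is a hedged sketch of a Condor-G (grid universe) submission from a machine that already holds a valid grid proxy; the gatekeeper contact string and jobmanager name are assumptions for illustration, not necessarily Red's actual endpoint.

# grid_sketch.py -- sketch of a Condor-G (grid universe) submission.
# Assumes a valid grid proxy already exists (e.g., via voms-proxy-init) and that
# Condor-G is installed locally. The gatekeeper string below is hypothetical.
import subprocess

submit_description = """universe      = grid
grid_resource = gt2 red.unl.edu/jobmanager-condor
executable    = cms_task.sh
output        = cms_task.out
error         = cms_task.err
log           = cms_task.log
queue
"""

with open("cms_task.sub", "w") as f:
    f.write(submit_description)

subprocess.run(["condor_submit", "cms_task.sub"], check=True)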

Page 12:

HCC -- PKI

• 1800 sq. ft. machine room (500 kVA UPS + generator)

• 2 full-time staff

• Firefly

• 150 TB Panasas storage

• 10 Gb/s network

Page 13:

Firefly

• 4000+ Opteron cores

• 150 TB Panasas storage

• Login or grid submissions

• Maui (PBS)

• InfiniBand, Force10 GigE
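
On Firefly the same Maui/PBS path scales out to multi-node MPI work over the InfiniBand fabric. A sketch of such a submission follows; the node counts, walltime, and mpirun invocation are placeholders that depend on the local MPI stack.

# mpi_pbs_sketch.py -- write and submit a multi-node MPI job script for Maui/PBS.
# Hypothetical example: node counts, walltime, and the mpirun call are placeholders.
import subprocess

pbs_script = """#!/bin/bash
#PBS -N mpi_test
#PBS -l nodes=4:ppn=8
#PBS -l walltime=02:00:00
cd $PBS_O_WORKDIR
mpirun -np 32 ./my_mpi_program
"""

with open("mpi_test.pbs", "w") as f:
    f.write(pbs_script)

subprocess.run(["qsub", "mpi_test.pbs"], check=True)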

Page 14:

TBD

• 5800+ Opteron cores

• 400 TB Lustre storage

• Login or grid submissions

• Maui (PBS)

• QDR InfiniBand, GigE

Page 15:

First Delivery...

Page 16:

Last year’s Usage

Approaching 1 million CPU hours/week

Page 17:

Resources & Expertise

• Storage of large data sets (2100 TB)

• High-performance storage (Panasas)

• High-bandwidth transfers (9 Gb/s, ~50 TB/day; see the back-of-the-envelope check below)

• 20 Gb/s between sites, 10 Gb/s to Internet2

• High-performance computing: ~10,000 cores

• Grid computing and high-throughput computing
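
As a back-of-the-envelope check on the transfer figure above: a 9 Gb/s link moving data around the clock tops out near 97 TB/day, so ~50 TB/day corresponds to keeping the link roughly half busy on average. The short calculation below only restates that arithmetic.

# bandwidth_check.py -- sanity-check ~50 TB/day against a 9 Gb/s link.
link_gbps = 9                        # usable link rate in gigabits per second
seconds_per_day = 24 * 60 * 60

bytes_per_day = link_gbps * 1e9 / 8 * seconds_per_day
tb_per_day = bytes_per_day / 1e12    # decimal terabytes

print(f"wire-speed ceiling: {tb_per_day:.0f} TB/day")           # ~97 TB/day
print(f"50 TB/day uses {50 / tb_per_day:.0%} of that ceiling")  # ~51%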

Page 18:

Usage Options

• Shared Access

  • Free

  • Opportunistic

  • Storage limited

  • Shell or Grid deployment

• Priority Access

Page 19:

Usage Options

• Priority Access

  • Fee assessed

  • Reserved queue

  • Expandable storage

  • Shell or Grid deployment

Page 20:

Computational and Data-Sharing Core

• Will meet computational demands with a combination of Priority Access, Shared, and Grid resources

• Storage will include a similar mixture, but likely consist of more dedicated resources

• Often a trade-off between Hardware, Personnel and Software

• Commercial Software saves Personnel time

• Dedicated Hardware requires less development (grid protocols)

Page 21:

Computational and Data-Sharing Core

• Resource organization at HCC

• Per research group -- free to all NU faculty and staff

• Associate quotas, fairshare, or reserved portions of machines with these groups

• /home/swanson/acaprez/ ...

• Accounting is straightforward (see the sketch below)
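
Given the per-group layout suggested by the example path above (roughly /home/<group>/<user>/), here is a small sketch of why accounting is straightforward: usage can simply be tallied per top-level group directory. The layout and the walk-based tally are illustrative only; in practice filesystem quotas would report the same numbers directly.

# group_usage_sketch.py -- tally disk usage per research group under /home,
# assuming a /home/<group>/<user>/ layout like the slide's example path.
# Illustrative only; real accounting would normally rely on quota tools.
import os

HOME_ROOT = "/home"  # assumed mount point, taken from the example path

def tree_size(path):
    """Total size in bytes of the regular files below path."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(path):
        for name in filenames:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # skip files that vanish or are unreadable
    return total

for group in sorted(os.listdir(HOME_ROOT)):
    group_dir = os.path.join(HOME_ROOT, group)
    if os.path.isdir(group_dir):
        print(f"{group:20s} {tree_size(group_dir) / 1e9:10.1f} GB")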

Page 22:

Computational and Data-Sharing Core

• Start now -- facilities and staff already in place

• It’s free -- albeit shared

• Complaints currently encouraged (!)

• Iterations required

Page 23:

Page 24:

Page 25:

More information

• http://hcc.unl.edu

[email protected]

[email protected]

• David Swanson: (402) 472-5006

• 118K Schorr Center /// 158H PKI /// Your Office

• Tours /// Short Courses

Page 26:

Sample Deployments

• CPASS site (http://cpass.unl.edu)

• DaliLite, Rosetta, OMMSA

• LogicalDoc (https://hcc-ngndoc.unl.edu/logicaldoc/)