High Performance Computing: New Resources Let Faculty and Students Think Big in Computational Research

Posted on 11/26/2013 1:48:00 PM


Professor John Chrispell, students Ashleigh Craig and Theresa Scarnati, and professor Ed Donley run calculations on XSEDE. With the new high performance computing resources, faculty and students can solve complex computational and data-driven problems like never before.

New high performance computing resources enable faculty and students to perform, teach, and learn about computational research on a colossal scale, via high-throughput, parallel-processing computers that run trillions of calculations per second. 

IUP was recently awarded up to 200,000 free service hours in the Extreme Science and Engineering Discovery Environment (XSEDE), a global virtual system funded by the National Science Foundation, through mathematics professor John Chrispell's enrollment in the XSEDE Campus Champions program.

In addition, IUP has acquired a portable supercomputer via the LittleFe project, an Intel-funded multiuniversity initiative that provides select faculty the opportunity to build a supercomputer for computational science teaching. Math professor Ed Donley teamed up with Chrispell to build a LittleFe computer named PEPPER, or Portable Electronic Parallel Processing Educational Resource, for IUP classroom use.

The Penrose Cluster, created by chemistry professor Carl LeBlond, has a collection of software packages to help students get started with computational research in chemistry, materials science, bioinformatics, and biochemistry.

Together these resources make the sky the limit for faculty and students in any department who want to solve complex computational and data-driven problems or teach or learn how to use high performance computers.

Although HPC has been around for a while, with today's explosive growth in data generation, researchers are increasingly using supercomputers to perform innovative, data-intensive studies.

By breaking complex calculations into smaller ones and processing them concurrently at extremely high speeds, HPC machines can complete in a matter of hours calculations that a desktop computer either cannot perform or would need weeks to finish.
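As a small illustration of the divide-and-process-concurrently idea described above (a hypothetical sketch, not code from the article or from XSEDE), the snippet below splits one large sum into chunks and hands each chunk to a separate worker process:

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum the integers in [start, stop) -- one worker's share of the job."""
    start, stop = bounds
    return sum(range(start, stop))

if __name__ == "__main__":
    n = 10_000_000
    workers = 4
    step = n // workers
    # Break the full range into equal chunks, one per worker.
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs any remainder
    # Each worker sums its chunk concurrently; the partial results
    # are then combined into the final answer.
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    # The parallel result matches the closed-form serial answer.
    assert total == n * (n - 1) // 2
```

Real HPC codes apply the same pattern at far larger scale, distributing pieces of a simulation or data set across many nodes rather than a few local processes.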

"High performance computing frees researchers to solve complex problems with fewer simplifying assumptions, and more as the problems exist in the real world," says Chrispell, who models viscoelastic fluids such as multigrade oils, food products, and blood and other biofluids.

When researchers have to make assumptions and other compromises to get a collection of data points that is computationally feasible, they often end up paring down three-dimensional research problems to two dimensions, he explains.

"Say I want to track the velocity, stress, pressure, and temperature of a fluid on a fine enough computational grid to have all fluid motion significantly refined," says Chrispell. "Or maybe someone in a social science wants to track the use of a certain word on Twitter across the country.

"Where do we store all this data? How do we look through it and visualize it in a meaningful and timely way? What kind of questions can we ask if we can look through all that data quickly?"

A number of faculty and students, including math graduate student Ryan Grove, chemistry professor Justin Fair and student Teresa Dierks, and physics professor Majid Karimi, have begun using HPC resources.

Faculty and students in any department are invited to explore how HPC at IUP can help them reach their teaching and research goals. 

—Deborah Klenotic