
Introduction by Dell's Dr. Reza Rooholamini
I remember the day, in the mid-1990s, when David Lifka and Paul Redfern of Cornell University came to Austin and presented Dell executives with an idea to catapult Dell into the HPC server business by designing and deploying the company's first high-performance computing cluster. At the time, Dell was known for its market-leading PCs; servers were an emerging business for us, and supercomputing was new and exciting territory for Dell. Six months later, thanks to Cornell and Dell's technical engineering team, we began gaining visibility and credibility for the high-performance computing cluster solutions we were deploying for customers. Today, HPC is a core business at Dell, and our partnership with Cornell continues to grow with the recent deployment of "MATLAB on the TeraGrid," an experimental computing resource. As David explains in his blog post, MATLAB is an important research tool for U.S. scientists and engineers, and as a seamlessly accessible parallel resource it has the potential to broaden the high-performance computing user community.

-- Dr. Reza Rooholamini

Guest Blog: Dr. David A. Lifka, Cornell University Center for Advanced Computing (CAC)

According to HPCwire, over 1 million researchers and engineers use MATLAB to develop technical computing applications. It is a pervasive tool in the science and engineering community, essential to scientific discovery in a wide range of disciplines such as nanoscience, restoration ecology, microbiology and immunology, and computational fluid dynamics. MATLAB is also an important tool for manipulating data from scientific instruments, telescopes, and remote sensors in novel ways in order to gain new insights into basic research questions or to solve time-sensitive problems confronting society today.

The Cornell University Center for Advanced Computing (CAC) recently deployed a 512-core Dell PowerEdge system that provides seamless parallel MATLAB computational services, running on Windows HPC Server 2008, to remote desktop and Science Gateway users. The project is designed to advance the understanding of how best to deploy a software utility as a transparent and responsive user service.

Access to the 512-core experimental computing resource – called "MATLAB on the TeraGrid" – does not require knowledge of a specific operating system, batch scheduler, or message passing library. Interactive ease-of-use is facilitated by leveraging Parallel Computing Toolbox and MATLAB Distributed Computing Server to seamlessly access the distributed computing resource through remote desktops and TeraGrid Science Gateways.
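For readers curious what this pattern looks like from a researcher's desktop, the general workflow with Parallel Computing Toolbox is roughly the following. This is a minimal sketch, not the project's actual configuration: the cluster profile name, the analysis function, and its input are hypothetical placeholders, and the exact submission steps depend on how a site sets up its cluster profiles.

```matlab
% Select the remote cluster via a locally installed cluster profile.
% ('cornellTeraGrid' is a hypothetical profile name for illustration.)
c = parcluster('cornellTeraGrid');

% Submit a function for parallel execution on the remote resource.
% 'myAnalysis' and 'inputData' stand in for the user's own function
% and data; 'Pool', 16 requests 16 workers for parallel constructs
% such as parfor inside the function.
job = batch(c, @myAnalysis, 1, {inputData}, 'Pool', 16);

% Wait for completion and fetch the results back to the desktop.
wait(job);
results = fetchOutputs(job);
```

The point of this model is that the submission, scheduling, and message passing details stay hidden behind the toolbox interface, which is what allows users to work without knowing the operating system or batch scheduler on the remote end.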

Visualization of an HCV network (Campo et al).

This NSF-funded resource has generated significant speed-ups on a wide range of applications, including earth mantle dynamics, environmental remediation, space systems design, and modeling of the Hepatitis C virus (HCV), a major cause of liver disease worldwide.

It was also recently connected to a Science Gateway. Use of the Cornell MATLAB resource is completely transparent to users of the Science Gateway, a resource for nanoscience created by the Network for Computational Nanotechnology at Purdue University. Communication protocols have been implemented to enable secure, authenticated data transmission between the gateway and Cornell. A tool called "NanoNet," the first of many parallel applications on the gateway, has been converted to take advantage of the MATLAB resource.
Cornell is now allowing access to remote academic users from across the nation. New users interested in using the experimental resource or connecting a Science Gateway to it may request access at

By creating a seamlessly-accessible parallel MATLAB resource, Cornell, in partnership with Dell, Microsoft, and MathWorks, is providing an important tool for computational research and a transparent backend for Science Gateways. The project also demonstrates a working model for high-performance utility computing, which may encourage other software vendors to explore and develop similar research capabilities.
To learn more, visit

What are your thoughts? We'd love to hear your comments on this very important HPC subject.

Dr. David Lifka Biographical Sketch
David is the Director of the Cornell University Center for Advanced Computing. He is an HPC industry veteran with over twenty years of experience in management and technology leadership positions at Cornell and Argonne National Laboratory. His areas of expertise include parallel job scheduling and resource management systems, data management, high-throughput systems, and Web services. Lifka holds a Ph.D. in Computer Science from the Illinois Institute of Technology and serves on a number of academic and corporate advisory boards. His scheduling technologies have been commercially licensed by industry, and he has received a ComputerWorld/Smithsonian award for innovations in IT.