As high performance computing becomes more prevalent, the challenges it poses grow as well. Take, as just one example, the major challenges the biomedical field faces as researchers integrate and analyze an ever-growing volume of diverse data. Now consider the vast array of industries and academic departments increasingly reliant on big data and high performance computing.

Recently, Jay Etchings, director of operations for research computing and senior HPC architect at Arizona State University (ASU), began a series of thought pieces for HPCWire addressing how ASU is meeting those challenges by building a Next Generation Cyber Capability (NGCC). The university is forging a path quite different from the traditional academic HPC model.

The university's model is sustainable, collaborative, elastic, and distributed. Promising to overcome legacy barriers while opening new avenues into the research sciences, it rests on seven foundational components:

  • Collaboration-centric, NIST-compliant research centers (hybrid cloud).
  • Open big data frameworks supporting global metadata.
  • Dedicated bandwidth, free of policy or capacity restrictions, to support research collaboration.
  • OpenFlow-based software-defined networking (see the sketch after this list).
  • Programmable, pluggable architectures for the next generation of research computing (the NGCC architecture).
  • Efficient and effective touch-less network management.
  • Research as a Service.
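
The article does not detail ASU's SDN control plane, so the following is only a minimal sketch of the OpenFlow idea, written with the open-source Ryu controller framework as an assumption (the article does not name ASU's controller, and the research subnet 10.42.0.0/16 and port numbers are hypothetical). It installs a high-priority flow that steers research-subnet traffic onto a dedicated port, the kind of rule that gives collaborations bandwidth free of campus policy or capacity restrictions.

```python
# Minimal sketch of an OpenFlow 1.3 controller app using the Ryu framework.
# The controller, subnet, and port choices are illustrative assumptions,
# not ASU's actual configuration.

from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3


class ResearchPathController(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    # Hypothetical values: the research data subnet and the switch port
    # that leads to the dedicated high-speed research interconnect.
    RESEARCH_SUBNET = ("10.42.0.0", "255.255.0.0")
    RESEARCH_PORT = 2

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def on_switch_features(self, ev):
        dp = ev.msg.datapath
        ofp, parser = dp.ofproto, dp.ofproto_parser

        # Table-miss rule: send unmatched packets to the controller.
        self._add_flow(
            dp, priority=0, match=parser.OFPMatch(),
            actions=[parser.OFPActionOutput(ofp.OFPP_CONTROLLER,
                                            ofp.OFPCML_NO_BUFFER)])

        # High-priority rule: steer research-subnet traffic onto the
        # dedicated port, bypassing rate-limited campus paths.
        match = parser.OFPMatch(eth_type=0x0800,
                                ipv4_dst=self.RESEARCH_SUBNET)
        self._add_flow(dp, priority=100, match=match,
                       actions=[parser.OFPActionOutput(self.RESEARCH_PORT)])

    def _add_flow(self, dp, priority, match, actions):
        ofp, parser = dp.ofproto, dp.ofproto_parser
        inst = [parser.OFPInstructionActions(ofp.OFPIT_APPLY_ACTIONS, actions)]
        dp.send_msg(parser.OFPFlowMod(datapath=dp, priority=priority,
                                      match=match, instructions=inst))
```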

The NGCC brings these components together, integrating big data platforms and traditional supercomputing technologies with software-defined networking, high-speed intra- and inter-university interconnects, workload virtualization, coprocessors, and cloud-bursting capacity. The result heralds the next important wave of analytics innovation for life sciences research.
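
To make the cloud-bursting piece concrete, here is a short, hypothetical Python sketch (not ASU's implementation): jobs stay on the local HPC cluster until its queue saturates, then overflow to elastic cloud capacity. Every name and threshold here (local_queue_depth, LOCAL_QUEUE_LIMIT, the submit functions) is an assumption for illustration only.

```python
# Illustrative cloud-bursting decision logic; all names and thresholds
# are hypothetical stand-ins, not part of the NGCC described above.

from dataclasses import dataclass

LOCAL_QUEUE_LIMIT = 500  # assumed pending-job count that triggers bursting


@dataclass
class Job:
    name: str
    cores: int


def local_queue_depth() -> int:
    """Stand-in for a probe of the on-premise scheduler's pending jobs."""
    return 742  # fixed value purely for demonstration


def submit_local(job: Job) -> None:
    print(f"submitting {job.name} ({job.cores} cores) to the local cluster")


def submit_cloud(job: Job) -> None:
    print(f"bursting {job.name} ({job.cores} cores) to elastic cloud capacity")


def schedule(job: Job) -> None:
    # Burst to the cloud only when the local cluster is saturated,
    # keeping routine work close to the data.
    if local_queue_depth() > LOCAL_QUEUE_LIMIT:
        submit_cloud(job)
    else:
        submit_local(job)


if __name__ == "__main__":
    schedule(Job(name="genome-alignment", cores=128))
```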

You can read Jay's full article in HPCWire here.