The alignment between high performance computing (HPC) and big data has steadily gained traction over the past few years. As analytics and big data continue to be top of mind for organizations of all sizes and industries, traditional IT departments are considering HPC solutions to help provide rapid and reliable information to business owners so they can make more informed decisions.
This alignment is clearly seen in the increasing sales of hyper-converged systems. IDC predicts sales of these systems will increase 116% this year compared to 2014, reaching an impressive $807 million. This significant growth is expected to continue over the next few years. Indeed, the market is expected to experience a nearly 60% compound annual growth rate (CAGR) from 2014 to 2019, at which time it will generate more than $3.9 billion in total sales.
To meet this growing customer demand, more hyper-converged systems are being offered. For example, the latest offering in the 13th generation Dell PowerEdge server portfolio, the PowerEdge C6320, is now available. These types of solutions help organizations meet their increasingly demanding workloads by offering improved performance, greater power efficiency, and cost-efficient compute and storage. This allows customers to optimize application performance and productivity while reducing energy use and saving traditional datacenter space.
Among the top research organizations and enterprises utilizing the marriage between HPC and big data is the San Diego Supercomputer Center (SDSC). Comet, its new, recently deployed petascale supercomputer, leverages 27 racks of PowerEdge C6320 servers, totaling 1,944 nodes or 46,656 cores. This represents a five-fold increase in compute capacity over its previous system, which in turn allows SDSC to provide HPC resources to a much larger research community. You can read more about Comet and how it is being used in this Q&A with Rick Wagner, SDSC’s high-performance computing systems manager.
Learn more about the PowerEdge C6320 here.