A recent Biz Tech article, "Biotech Companies See ROI in High-Performance Computing Systems," highlights how several companies in the biotech and life sciences industries are leveraging high-performance computing (HPC) to achieve results far faster than previously possible.

According to the article, part of this stems from cost. In the past, HPC systems were simply cost prohibitive for many SMBs, but over the years they have become much more affordable. The article notes that whereas it once could run between $20 million and $30 million to purchase an HPC system, today systems are available for as little as $10,000.

But how are HPC systems being used? The article took a look at what the Translational Genomics Research Institute (TGen) is doing. Over the past ten years, TGen has deployed three Dell supercomputers to process genomics data. Ease of use was a deciding factor in TGen's adoption of an HPC system.

The article urges would-be HPC customers to keep in mind their unique situation and needs before purchasing a system. Among the considerations it suggests:

  • Does your processing workload justify the capital expense?
  • What are your data requirements?
  • Do you have the in-house expertise, or a trusted partner, to help build this type of system?
  • Is the data accessible?

Obviously, cost efficiency is paramount in any major expenditure. However, as the article mentioned, the entry price point for HPC has dropped dramatically.

Data requirements will vary among organizations. Choosing a high-performance computing system that can meet your organization's current and future needs is key. TGen, for example, keeps a petabyte of genomics data. It must keep that data in perpetuity because it can impact medical treatments many years after it was first collected. TGen's genomics data has doubled annually for the past three years, with no signs of slowing.

Accelerate Diagnostics, another company mentioned in the article, is adding a terabyte of microscopic-image data to its storage facility daily.

The ability to meet current and future data storage needs such as those of TGen and Accelerate is imperative.
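To put those growth rates in rough perspective, here's a quick back-of-the-envelope sketch. The figures come straight from the article (TGen's roughly one-petabyte archive doubling annually, Accelerate adding about a terabyte per day); the function names and starting values are just illustrative assumptions:

```python
# Back-of-the-envelope storage projections using the growth rates cited above.
# Assumptions: TGen starts at ~1 PB and doubles each year; Accelerate
# Diagnostics adds ~1 TB per day at a steady rate. Names are illustrative.

def tgen_projection_pb(current_pb: float, years: int) -> float:
    """Project an archive that doubles every year."""
    return current_pb * (2 ** years)

def accelerate_cumulative_tb(tb_per_day: float, years: int) -> float:
    """Cumulative new data added at a constant daily rate."""
    return tb_per_day * 365 * years

if __name__ == "__main__":
    for years in (1, 3, 5):
        print(f"After {years} yr: TGen ~{tgen_projection_pb(1.0, years):.0f} PB; "
              f"Accelerate has added ~{accelerate_cumulative_tb(1.0, years):,.0f} TB")
```

Even a simple projection like this makes the purchasing question concrete: annual doubling means a system sized only for today's archive is outgrown within a year or two.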

The article also notes that with the tremendous amount of data SMBs are often required to keep for long periods of time, having access to it in a variety of formats is paramount. SMBs need a system that can support a variety of mining and analytics options.

By the way, if you're attending the International Data Corporation's HPC User Forum (http://www.hpcuserforum.com/) next week, my colleague Glen Otero will be speaking about TGen's inspiring pediatric cancer program. Stop by to hear Glen's insights.