The Research Computing Services group (RCS) administers advanced, multiprocessor supercomputing systems for research computing. These systems are available to all University faculty, their students, and their collaborators for research and for educational use in courses related to computational science.
The use of high-performance computing systems can dramatically reduce the time to obtain solutions for computationally and data-intensive problems. High-performance, high-availability storage is shared across all of the systems, reducing both the time and complexity of data access.
Shared Computing Cluster (SCC)
Our current computing system, which went into production use in June 2013, is the Shared Computing Cluster (SCC) at the Massachusetts Green High-Performance Computing Center (MGHPCC) in Holyoke, MA. The SCC is a large Linux cluster with over 6000 processors and more than 2 petabytes of disk for research data.
The SCC comprises both shared compute nodes and storage (centrally funded by the University and available to all researchers at no cost) and buy-in components (researcher-owned, with priority access), accommodating a wide range of researchers’ requirements and resources. Currently, over 60% of the compute nodes are purchased through the buy-in program; the rest are a shared resource for the entire BU research computing community.
Over 2 petabytes of storage for research data are available in several configurations. All of them use hardware RAID to protect against data loss from disk drive failures, and snapshots to recover files that may have been accidentally deleted. A storage service called STASH provides an inexpensive means of maintaining a second copy of data off-site. Additionally, a limited amount of space that is automatically backed up to an off-site location is provided for disaster recovery. All of these configurations are offered with an option to conform to dbGaP requirements.
Access to the SCC is via the campus’s and the region’s high-performance networks. The campus core uses 10-Gigabit Ethernet, with multiple 10-Gigabit links to the Boston/Cambridge NoX node and two pairs of 10-Gigabit connections to the MGHPCC.
If you need resources beyond what BU can provide, you may also want to look into the XSEDE (eXtreme Science and Engineering Discovery Environment) initiative of the National Science Foundation.