High throughput computing: a solution for scientific analysis

Public land management agencies continually face resource management problems that are exacerbated by climate warming, land-use change, and other human activities. As the U.S. Geological Survey (USGS) Fort Collins Science Center (FORT) works with managers in U.S. Department of the Interior (DOI) agencies and other federal, state, and private entities, researchers are finding that the science needed to address these complex ecological questions across time and space produces substantial amounts of data.



The additional data, and the volume of computation needed to analyze them, require computing resources well beyond single- or even multiple-computer workstations. To meet this need for greater computational capacity, FORT investigated how to resolve the many computational shortfalls previously encountered when analyzing data for such projects. Our objectives included finding a solution that would:

- harness existing Central Processing Units (CPUs) while they are idle to run multiple jobs concurrently, reducing overall processing time without requiring additional hardware;
- offer an effective, centralized job-management system;
- handle job failures caused by hardware, software, or network interruptions, obviating the need to manually resubmit a job after each stoppage (see the sketch after this list);
- be affordable; and, most importantly,
- allow us to complete very large, complex analyses that would otherwise not be possible.
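
To make the first and third objectives concrete, the following is a minimal, hypothetical Python sketch of the behavior we wanted: run many independent jobs concurrently on otherwise idle CPU cores, and automatically resubmit any job that fails rather than restarting it by hand. It is not FORT's implementation (a real HTC system schedules jobs across an entire network of machines, not a single workstation), and the names run_analysis, run_with_retries, MAX_RETRIES, and the job list are purely illustrative.

```python
"""Illustrative sketch only: concurrent, independent jobs with automatic retry."""
import random
from concurrent.futures import ProcessPoolExecutor, as_completed

MAX_RETRIES = 3  # how many times a failed job is automatically resubmitted


def run_analysis(job_id: int) -> str:
    """Stand-in for one independent analysis task (e.g., one model run)."""
    if random.random() < 0.2:  # simulate an occasional hardware/network interruption
        raise RuntimeError(f"job {job_id} interrupted")
    return f"job {job_id} finished"


def run_with_retries(job_id: int) -> str:
    """Resubmit a job until it succeeds or the retry budget is exhausted."""
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            return run_analysis(job_id)
        except RuntimeError:
            if attempt == MAX_RETRIES:
                raise
    raise RuntimeError("unreachable")


if __name__ == "__main__":
    jobs = range(20)  # twenty independent jobs stand in for one large analysis
    # The executor plays the role of the centralized job manager,
    # farming jobs out to otherwise idle CPU cores.
    with ProcessPoolExecutor() as pool:
        futures = {pool.submit(run_with_retries, job): job for job in jobs}
        for future in as_completed(futures):
            job_id = futures[future]
            try:
                print(future.result())
            except RuntimeError:
                print(f"job {job_id} failed after {MAX_RETRIES} attempts")
```

In an actual HTC deployment, the scheduler plays the role of this executor: it dispatches each queued job to whichever networked machine has spare capacity and requeues the job if that machine drops out.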



In short, we envisioned a job-management system that would take advantage of unused FORT CPUs within a local area network (LAN) to effectively distribute and run highly complex analytical processes. What we found was a solution that uses High Throughput Computing (HTC) and High Performance Computing (HPC) systems to do exactly that (Figure 1).
