DiaGrid (distributed computing network)
DiaGrid is a large, multicampus distributed research computing network using the HTCondor system and centered at Purdue University in West Lafayette, Indiana. In 2012, it included nearly 43,000 processors representing 301 teraflops of computing power. DiaGrid received a Campus Technology Innovators Award from Campus Technology magazine[1] and an IDG InfoWorld 100 Award[2] in 2009, and was employed at the SC09 supercomputing conference in Portland, Oregon, where it captured nearly 150 days of compute time for science jobs.[3]
Partners
DiaGrid is a partnership among Purdue, Indiana University, Indiana State University, the University of Notre Dame, the University of Louisville, the University of Nebraska, the University of Wisconsin, Purdue's Calumet and North Central campuses, and Indiana University-Purdue University Fort Wayne. It is designed to accommodate computers at other campuses as new members join. The Purdue portion of the pool, named BoilerGrid, is the largest academic system of its kind.[citation needed]
Management
DiaGrid is managed by Information Technology at Purdue (ITaP), the central information technology organization at Purdue's West Lafayette campus, and by ITaP's research computing unit, the Rosen Center for Advanced Computing, which also operates the Steele, Coates, Rossmann, Hansen and Carter cluster supercomputers.[citation needed]
HTCondor
Through HTCondor, developed at the University of Wisconsin, DiaGrid harvests and manages computing cycles from idle or underused high-performance computing cluster nodes, servers, machines in campus computer labs and other labs, and office computers. Whenever a local user or a scheduled job needs a given machine, the HTCondor job running there is stopped and automatically sent to another HTCondor node as soon as possible. While this "opportunistic" model limits the ability to do parallel processing and communication, an HTCondor pool can provide vast numbers of cycles to smaller, serial jobs in a very short amount of time. HTCondor, and by extension DiaGrid, is designed for high-throughput computing and is well suited to parameter sweeps, Monte Carlo simulations, and nearly any serial application. Some classes of parallel jobs (master-worker) may be run effectively via HTCondor as well.[citation needed]
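The following is a minimal sketch of an HTCondor submit description file for such a parameter sweep; the executable name simulate, the resource request, and the job count are hypothetical placeholders and not part of DiaGrid's documented configuration.

```
# Minimal sketch of an HTCondor submit description file for a serial
# parameter sweep; "simulate" and the job count are hypothetical.
universe   = vanilla              # ordinary serial jobs
executable = simulate             # hypothetical serial program
arguments  = $(Process)           # each job receives its index as a parameter
output     = out.$(Process).txt   # per-job standard output
error      = err.$(Process).txt   # per-job standard error
log        = sweep.log            # HTCondor's event log for the batch
request_memory = 512M             # resources requested from idle machines
queue 100                         # submit 100 independent jobs
```

Submitting a file like this with condor_submit queues 100 independent serial jobs, which HTCondor's matchmaking then places on whatever idle machines in the pool satisfy the stated requirements.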
Networking
To pool computational resources spread around Indiana and the Midwest, DiaGrid takes advantage of I-Light, the high-speed fiber-optic state network connecting Indiana campuses to each other, the Internet, and national research networks such as Internet2 and National LambdaRail. DiaGrid provides computational resources to researchers on both the Open Science Grid and the U.S. National Science Foundation's Extreme Science and Engineering Discovery Environment (formerly TeraGrid).[citation needed]
Uses
DiaGrid and BoilerGrid have been used by researchers at Purdue and elsewhere for a variety of purposes,[1] such as imaging the structure of viruses at near-atomic resolution,[4][5] simulating the early stages of the Solar System's formation, projecting the reliability of Indiana's electrical supply, modeling the spread of water pollutants, discerning the structure of protein molecules, and identifying millions of potential new forms of zeolites, silicate minerals widely used to catalyze chemical reactions on an industrial scale.[6] DiaGrid is also being used to develop data processing techniques for the Large Synoptic Survey Telescope. Purdue added a web-based portal for BLAST processing with DiaGrid in 2011.
External links
- http://www.dia-grid.org Archived 2010-02-17 at the Wayback Machine
- https://www.youtube.com/watch?v=CH_YHGYQl2g
- http://campustechnology.com/articles/2009/07/22/campus-technology-innovators-awards-2009-high-performance-computing.aspx
- http://markets.hpcwire.com/taborcomm.hpcwire/?GUID=10770002&Page=MediaViewer&ChannelID=3198
- http://cloudcomputing.sys-con.com/node/1185883 Archived 2011-07-16 at the Wayback Machine
- http://news.uns.purdue.edu/x/2008b/081118McCartneyPool.html Archived 2009-02-16 at the Wayback Machine
- https://web.archive.org/web/20110302201542/http://www.itap.purdue.edu/newsroom/detail.cfm?newsId=2298
- http://www.cs.wisc.edu/htcondor/
- http://www.itap.purdue.edu/
- http://www.rcac.purdue.edu/
References
- ^ a b Grush, Mary; Villano, Matt (July 28, 2009). "Campus Technology Innovators Awards 2009: High-Performance Computing - Purdue University". Campus Technology.
- ^ "The top 100 IT projects of 2009". InfoWorld, November 23, 2009.
{{cite journal}}
: Cite journal requires|journal=
(help) - ^ "Cycle Computing and Purdue University to Power Dynamic Optimized Condor Pool at SuperComputing 2009" (Press release). Nov 13, 2009.
- ^ Jiang, Wen; et al. (Feb 28, 2008). "Backbone structure of the infectious ε15 virus capsid revealed by electron cryomicroscopy" (PDF). Nature. 451 (7182): 1130–1134. Bibcode:2008Natur.451.1130J. doi:10.1038/nature06665. PMID 18305544. S2CID 205212346.
- ^ Wu, Weimin; Jiang, Wen (Apr 30, 2008). "Condor in Cryo-EM image processing". Archived from the original on July 25, 2011. Retrieved January 21, 2010.
- ^ Pophale, Ramdas; Cheeseman, Phillip A.; Deem, Michael W. (2011). "A database of new zeolite-like materials". Physical Chemistry Chemical Physics. 13 (27): 12407–12412. Bibcode:2011PCCP...1312407P. doi:10.1039/C0CP02255A. PMID 21423937.