NCAR Supercomputing ‘Data Center’ Project
An NCAR-led computing ‘facility’ for the study of the Earth system
May 30, 2006
Project Review
• NCAR Mesa Lab computer facility: power, cooling, and floor space will be inadequate beyond the current procurement
• Science is being restricted by focusing on capacity ahead of capability
• Facility concept: 30,000 sq. ft. (initial, to 60,000 sq. ft.); 150,000 sq. ft. (to 300,000 sq. ft.); 4 MW (to 24 MW) of redundant power and cooling; ~20-year lifetime
• Phase 1 facility estimated construction cost: $50M to $70M
  – Such a facility would be a computational equivalent of the Hubble Telescope for geoscience simulation

[Figure: Average Electrical Cost by State (1990-2003), commercial and industrial rates in cents per kWh by state. Data from the Energy Information Administration, http://www.eia.doe.gov/cneaf/electricity/epa/average_price_state.xls]
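To put the rate chart and the 4 MW to 24 MW power range in perspective, a rough back-of-the-envelope estimate is useful; the loads and rates in the minimal Python sketch below are illustrative assumptions, not figures from the slides.

    # Rough annual electricity cost for a continuous load at a flat rate.
    # Loads and rates are illustrative assumptions, not NCAR figures.
    HOURS_PER_YEAR = 8760

    def annual_cost_usd(load_mw: float, rate_cents_per_kwh: float) -> float:
        """Annual cost (USD) of running load_mw continuously at a flat rate."""
        kwh_per_year = load_mw * 1000 * HOURS_PER_YEAR
        return kwh_per_year * rate_cents_per_kwh / 100

    for load_mw in (4, 24):               # initial and build-out power levels
        for rate in (4.0, 6.0, 10.0):     # hypothetical cents per kWh
            print(f"{load_mw:>2} MW @ {rate:4.1f} c/kWh -> "
                  f"${annual_cost_usd(load_mw, rate) / 1e6:.1f}M per year")

At 6 cents per kWh, a continuous 4 MW load comes to roughly $2.1M per year and about $12.6M per year at the 24 MW build-out, which is why local electricity rates matter for a facility of this size.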
Schedule
[Timeline: Discovery, Planning/Financing, Design, and Construction phases spanning 2006-2010]
• Project plan development - Sep-Dec
• Community engagement - Ongoing
• Partnership development - Nov-Mar, down-select in April and June
• Engage National Science Foundation (GEO, OCI) - Ongoing
• Forge international collaborations (UK/NERC, ENES)
• Initiate facility building project ~ Summer 2006
• Community workshop - September at NCAR
• Submit project prospectus to NSF - Jan 2007
• First electrons ~ Summer 2010

An Opportunity
From NSF’s Petascale Roadmap: “Overarching Recommendation: Establish a Petascale Collaboratory for the Geosciences with the mission to provide leadership-class computational resources that will make it possible to address, and minimize the time to solution of, the most challenging problems facing the geosciences.”
www.joss.ucar.edu/joss_psg/meetings/petascale/

NSF’s Cyberinfrastructure Vision 2006-2010
A petascale center linked with multiple 100+ teraflop centers…
[Diagram: one Track-1 center ($200M, 1 petaflop sustained) linked to multiple Track-2 centers ($30M, ~100 TF each)]

Geosciences HPC Research Consortium Concept
[Diagram: a Geosciences HPC Collaboratory Center (NCAR + facility partner) linking the NSF geosciences research community (ATM, OCE, EAR); other national labs and supercomputer centers; international alliances; hydrology, energy, and other research communities; and resource centers for atmospheric science, ocean science, earth interiors, energy research, computational science, and minority institutions]

Scientific Steering Committee
• Rick Anthes – Meteorology, UCAR
• Rafael Bras – Hydrology, MIT
• Guy Brasseur – Atmospheric Science, Max Planck Institute, Hamburg
• Kelvin Droegemeier – Atmospheric Science, OU
• Tamas Gombosi – Space Science, U Michigan
• Gregory Jenkins – Atmospheric Science, Howard
• Thomas Jordan – Geophysics, USC
• David Maidment – Hydrology, Univ. Texas
• Jean-Bernard Minster – Seismology, SIO
• John Orcutt – Oceanography, SIO
• Tim Palmer – Weather and Climate, ECMWF
• Annick Pouquet – Geophysical Turbulence, NCAR
• Jagadish Shukla – Climate, COLA
• Paola Rizzoli – Oceanography, MIT
• David Yuen – Geophysics, UMN

NCAR committee
• Tim Killeen, Larry Winter, Katy Schmoll, Al Kellie, Lawrence Buja, Peter Fox, Aaron Anderson, Peter Backlund, Frank Bryan, Krista Laursen, Rich Loft, Jeff Reaves, Henry Tufo, Olga Wilhelmi, Michael Wiltberger

Concluding remarks …

Contacts at NCAR
• Tim Killeen ([email protected]) - NCAR Director
• Lawrence Buja ([email protected]) and Peter Fox ([email protected]) are co-chairs of the NCAR project team
• Aaron Anderson ([email protected]) is the computing facilities contact
• Jeff Reaves ([email protected]) is the financial/contracts contact