SC08 Keynote Speaker
Michael Dell to Give Keynote Speech at SC08
Michael Dell, Dell, Inc.
Higher Performance: Supercomputing in the Connected Era
Abstract: 2008 marks the 20th anniversary of SC bringing together the world's leading high-performance computing researchers, scientists and engineers. From the environment to health and energy, these leaders have helped address many of the world's most pressing challenges. The next era of HPC will be enabled by super-scalable, increasingly simple technologies that will make possible even greater collaboration, productivity and scientific breakthroughs.
Biography: Michael Dell, chairman of the board of directors and chief executive officer of Dell Inc., the company he founded in 1984 with $1,000 and an unprecedented idea, to build relationships directly with customers, will give the keynote address at SC08 in Austin, Texas. In 1992, Mr. Dell became the youngest CEO ever to earn a ranking on the Fortune 500 list. Mr. Dell is the author of Direct From Dell: Strategies that Revolutionized an Industry, his story of the rise of the company and the strategies he has refined that apply to all businesses.
In 1998, Mr. Dell formed MSD Capital, and in 1999, he and his wife formed the Michael & Susan Dell Foundation, to manage the investments and philanthropic efforts, respectively, of the Dell family.
Born in February 1965, Mr. Dell serves on the Foundation Board of the World Economic Forum and the Executive Committee of the International Business Council, and is a member of the U.S. Business Council. Mr. Dell also serves on the U.S. President's Council of Advisors on Science and Technology, the Technology CEO Council and the governing board of the Indian School of Business in Hyderabad, India.
Invited Speakers
Kenneth H. Buetow, National Cancer Institute
Developing an Interoperable IT Framework to Enable Personalized Medicine
Abstract: 21st-century biomedical research is driven by massive amounts of data: automated technologies generate hundreds of gigabytes of DNA sequence information, terabytes of high-resolution medical images, and massive arrays of gene expression information on thousands of genes tested in hundreds of independent experiments. Clinical research data is no different; each clinical trial may generate hundreds of data points for thousands of patients over the course of the trial.
This influx of data has enabled a new understanding of disease on its fundamental, molecular basis. Many diseases are now understood as complex interactions between an individual's genes, environment and lifestyle. To harness this new understanding, research and clinical care capabilities (traditionally undertaken as isolated functions) must be bridged to seamlessly integrate laboratory data, biospecimens, medical images and other clinical data. This collaboration between researchers and clinicians will create a continuum between the bench and the bedside, speeding the delivery of new diagnostics and therapies, tailored to specific patients, and ultimately improving clinical outcomes.
To realize the promises of this new paradigm of personalized medicine, healthcare and drug discovery organizations must evolve their core processes and IT capabilities to enable broader interoperability among data resources, tools and infrastructure, both within and across institutions. Answers to these challenges are enabled by the cancer Biomedical Informatics Grid (caBIG) initiative, overseen by the National Cancer Institute Center for Biomedical Informatics and Information Technology (NCI-CBIIT). caBIG is a collection of interoperable software tools, standards, databases, and grid-enabled computing infrastructure founded on four central principles:
- Open access; anyone, with appropriate permission, may access the caBIG tools and data
- Open development; the entire research community participates in the development, testing, and validation of the tools
- Open source; all the tools are available for use and modification
- Federation; resources can be controlled locally, or integrated across multiple sites
caBIG is designed to connect researchers, clinicians, and patients across the continuum of biomedical research, allowing seamless data flow between electronic health records and data sources including genomic, proteomic, imaging, biospecimen, pathology and clinical information, and facilitating collaboration across the entire biomedical enterprise.
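The federation principle lends itself to a short illustration. The Python sketch below is hypothetical, not actual caBIG or caGrid code: the site names, record schema and query interface are all invented. It shows the shape of the idea, namely that each institution keeps control of its own data service while a single query fans out to all of them and the answers are merged.

```python
# Hypothetical sketch of caBIG-style federation; the sites, the record
# schema and the query interface are invented for illustration and are
# not the real caGrid API.
from concurrent.futures import ThreadPoolExecutor

# Each "site" stands in for an independently controlled data service.
SITE_DATA = {
    "center_a": [{"gene": "TP53", "expression": 4.2}],
    "center_b": [{"gene": "TP53", "expression": 3.7},
                 {"gene": "BRCA1", "expression": 1.1}],
}

def query_site(site, gene):
    """Query one locally controlled resource. A real deployment would
    make a secured, standards-based service call here, and the site
    would enforce its own access policy before returning anything."""
    return site, [r for r in SITE_DATA[site] if r["gene"] == gene]

def federated_query(gene):
    """Fan the same query out to every site in parallel and merge."""
    with ThreadPoolExecutor() as pool:
        results = pool.map(lambda s: query_site(s, gene), SITE_DATA)
        return dict(results)

print(federated_query("TP53"))
# {'center_a': [{'gene': 'TP53', 'expression': 4.2}],
#  'center_b': [{'gene': 'TP53', 'expression': 3.7}]}
```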
caBIG technologies are widely applicable beyond cancer and may be freely adopted, adapted or integrated with other standards-based tools and systems. Guidelines, tools and support infrastructure are in place to facilitate broad integration of caBIG tools, which are currently being deployed at more than 60 academic medical centers around the United States and are being integrated in the Nationwide Health Information Network as well. For more information on caBIG, visit http://cabig.cancer.gov/.
Biography: In his role as the Associate Director for Bioinformatics and Information Technology at the National Cancer Institute (NCI), Dr. Buetow is best known for initiating the cancer Biomedical Informatics Grid (caBIG) and currently oversees its activities. caBIG was conceived as the World Wide Web of cancer research, providing data standards, interoperable tools and grid-enabled computing infrastructure to address the needs of all constituencies in the cancer community. Guided by the principles of open source licensing, open software development, open access to the products of that development and federated data storage and integration, caBIG facilitates data and knowledge exchange and simplifies collaboration between biomedical researchers and clinicians, leading to better patient outcomes and the realization of personalized medicine in cancer care and beyond.
As Director of the NCI Center for Biomedical Informatics and Information Technology (NCI-CBIIT), Buetow works to advance the center's goal of maximizing interoperability and integration of NCI research. The center participates in the evaluation and prioritization of the NCI's bioinformatics research portfolio; facilitates and conducts research required to address the CBIIT's mission; serves as the locus for strategic planning to address the informatics needs of the NCI's expanding research initiatives; establishes information technology standards (both within and outside of NCI); and communicates, coordinates or establishes information exchange standards.
Buetow also serves as the Chief of the Laboratory of Population Genetics (LPG), which focuses on developing, extending and applying human genetic analysis methods and resources to better understand the genetics of complex phenotypes, specifically human cancer. He also spearheaded the efforts of the Genetic Annotation Initiative (GAI) to identify variant forms of the cancer genes detected through the NCI Cancer Genome Anatomy Project (CGAP). His laboratory combines computational tools with laboratory research to understand how genetic variations make individuals more susceptible to liver, lung, prostate, breast and ovarian cancer.
Buetow received a B.A. in biology from Indiana University in 1980 and a Ph.D. in human genetics from the University of Pittsburgh in 1985. From 1986 to 1998, he was at the Fox Chase Cancer Center in Philadelphia, where he worked with the Cooperative Human Linkage Center (CHLC) to produce a comprehensive collection of human genetic maps. Buetow has been in his role at NCI since 2000. He has published more than 160 scientific papers on a wide variety of topics in journals such as PNAS, Science, Cell, and Cancer Research.
His honors and awards include the Editors' Choice Award from Bio-IT World (2008), the Federal 100 Award (2005), the NIH Award of Merit (2004), the NCI Director's Gold Star Award (2004), the Partnership in Technology Award (1996), and the Computerworld Smithsonian Award for Information Technology (1995).
David Patterson, University of California, Berkeley
Parallel Computing Landscape: A View from Berkeley
Abstract: In December 2006 we published a broad survey of the issues the whole field faces in the multi-core/many-core sea change (see view.eecs.berkeley.edu). We view the ultimate goal as being able to productively create efficient, correct and portable software that smoothly scales as the number of cores per chip doubles biennially. This talk covers the specific research agenda that a large group of us at Berkeley will follow (see parlab.eecs.berkeley.edu) as part of a center funded for five years by Intel and Microsoft.
To take a fresh approach to the longstanding parallel computing problem, our research agenda will be driven by compelling applications developed by domain experts in personal health, image retrieval, music, speech understanding and browsers. The development of parallel software is divided into two layers: an efficiency layer, which aims at low overhead and is intended for the best 10 percent of programmers, and a productivity layer for the rest of the programming community, including domain experts, that reuses the parallel software developed at the efficiency layer.
Key to this approach is a layer of libraries and programming frameworks centered on the 13 design patterns that we identified in the Berkeley View report. We rely on autotuning to map the software efficiently to a particular parallel computer. The role of the operating system and the architecture in this project is to support software and applications in achieving the ultimate goal; examples include operating-system primitives and libraries such as thin hypervisors, and hardware support for partitioning and fast barrier synchronization. We will prototype the hardware of the future using field programmable gate arrays (FPGAs) on a common hardware platform being developed by a consortium of universities and companies (see http://ramp.eecs.berkeley.edu/).
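Autotuning in this sense means generating several implementation variants of a kernel, timing each on the target machine, and keeping the fastest. A minimal sketch of the idea follows; the blocked matrix-multiply kernel and the candidate block sizes are illustrative assumptions, not the Par Lab autotuners themselves.

```python
# Minimal autotuning sketch: time several blocked matrix-multiply
# variants and keep whichever block size is fastest on this machine.
# The kernel and candidate sizes are illustrative only.
import random
import time

N = 64
A = [[random.random() for _ in range(N)] for _ in range(N)]
B = [[random.random() for _ in range(N)] for _ in range(N)]

def blocked_matmul(A, B, bs):
    """Multiply NxN matrices using bs x bs blocks."""
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    for ii in range(0, n, bs):
        for kk in range(0, n, bs):
            for jj in range(0, n, bs):
                for i in range(ii, min(ii + bs, n)):
                    for k in range(kk, min(kk + bs, n)):
                        aik = A[i][k]
                        for j in range(jj, min(jj + bs, n)):
                            C[i][j] += aik * B[k][j]
    return C

def autotune(candidates):
    """Empirically pick the fastest block size for this machine."""
    timings = {}
    for bs in candidates:
        t0 = time.perf_counter()
        blocked_matmul(A, B, bs)
        timings[bs] = time.perf_counter() - t0
    return min(timings, key=timings.get), timings

best, timings = autotune([8, 16, 32, 64])
print("best block size:", best, timings)
```

Production autotuners such as those for dense linear algebra search much larger spaces (loop orders, vectorization, prefetch distances), but the search-then-select structure is the same.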
Biography: David Patterson was the first in his family to graduate from college, and he enjoyed it so much that he didn't stop until he received a Ph.D. from UCLA in 1976. He then moved north to UC Berkeley. He spent 1979 at DEC working on the VAX minicomputer, which inspired him and his colleagues to later develop the Reduced Instruction Set Computer (RISC). In 1984, Sun Microsystems recruited him to start the SPARC architecture. In 1987, Patterson and colleagues tried building dependable storage systems from the new PC disks. This led to the popular Redundant Array of Inexpensive Disks (RAID). He spent 1989 working on the CM-5 supercomputer. Patterson and colleagues later tried building a supercomputer using standard desktop computers and switches. The resulting Network of Workstations (NOW) project led to cluster technology used by many Internet services. He is currently director of both the Reliable Adaptive Distributed Systems Lab and the Parallel Computing Lab at UC Berkeley. In the past, he served as chair of Berkeley's Computer Science Division, chair of the Computing Research Association, and president of the ACM.
All this has resulted in 200 papers, five books, and about 30 honors, some shared with friends, including election to the National Academy of Engineering, the National Academy of Sciences, and the Silicon Valley Engineering Hall of Fame. He was named a Fellow of the Computer History Museum and of both AAAS organizations. He is a fellow of the ACM, from which he received the SIGARCH Eckert-Mauchly Award, the SIGMOD Test of Time Award, the Distinguished Service Award, and the Karlstrom Outstanding Educator Award. Patterson is also a fellow of the IEEE, from which he received the Johnson Information Storage Award, the Undergraduate Teaching Award and the Mulligan Education Medal. Finally, he shared the IEEE John von Neumann Medal and the NEC C&C Prize with John Hennessy of Stanford University.
Jeffrey Wadsworth, Battelle Memorial Institute
High Performance Computing and the Energy Challenge: Issues and Opportunities
Abstract: Energy issues are central to the most important strategic challenges facing the United States and the world. The energy problem can be broadly defined as providing enough energy to support higher standards of living for a growing fraction of the world's increasing population without creating intractable conflict over resources or causing irreparable harm to our environment. It is increasingly clear that even large-scale deployment of the best currently available energy technologies will not be adequate to successfully tackle this problem. Substantial advances in the state of the art in energy generation, distribution and end use are needed. It is also clear that a significant and sustained effort in basic and applied research and development (R&D) will be required to deliver these advances and ensure a desirable energy future.
It is in this context that high-performance computing takes on a significance co-equal with theory and experiment. The U.S. Department of Energy (DOE) and its national laboratories have been world leaders in the use of advanced high-performance computing to address critical problems in science and energy. Computing is now nearing the petascale, a capability that until recently was beyond imagination, and is poised to address these critical problems. Battelle Memorial Institute manages or co-manages six DOE national laboratories that together house some of the most powerful computers in the world. These capabilities have enabled remarkable scientific progress in the last decade. The world-leading petascale computers now being deployed will make it possible to solve R&D problems of importance to a secure energy future and contribute to the long-term interests of the United States.
Biography: Jeffrey Wadsworth is the senior executive responsible for Battelles laboratory management business. Battelle currently manages or co-manages six U.S. Department of Energy (DOE) national laboratories: Brookhaven, Idaho, Lawrence Livermore, the National Renewable Energy Laboratory, Oak Ridge and Pacific Northwest. A Battelle subsidiary, Battelle National Biodefense Institute, manages the National Biodefense Analysis and Countermeasures Center for the U.S. Department of Homeland Security. The laboratories have combined research revenues of $3.2 billion and employ 16,000 staff.
Wadsworth joined Battelle in August 2002 and was a member of the White House Transition Planning Office for the Department of Homeland Security before being named director of Oak Ridge National Laboratory (ORNL) and president and CEO of UT-Battelle, LLC, which manages the laboratory for DOE, in August 2003. As ORNL's director through June 2007, he was responsible for managing DOE's largest multi-purpose science and energy laboratory, with 4,100 staff and an annual budget of $1 billion. Under his leadership the laboratory commissioned the $1.4 billion Spallation Neutron Source, launched DOE's first nanoscience research center, developed the world's most powerful unclassified computer system, expanded its work in national security, and initiated an interdisciplinary bioenergy program.
Before joining Battelle, Wadsworth was Deputy Director for Science and Technology at Lawrence Livermore National Laboratory, where he oversaw science and technology across all programs and disciplines. His responsibilities included programmatic and discretionary funding, technology transfer, and workforce competencies. He joined the laboratory in 1992 and was Associate Director for Chemistry and Materials Science before becoming Deputy Director in 1995. From 1980 to 1992, Wadsworth worked for Lockheed Missiles and Space Company at the Palo Alto Research Laboratory, where as manager of the Metallurgy Department, he was responsible for direction of research activities and acquisition of research funds.
Wadsworth attended the University of Sheffield in England, graduating with a bachelor's degree and a Ph.D. in 1972 and 1975, respectively. He was awarded a D.Met. for published work in 1990 and an honorary D.Eng. in 2004. He joined Stanford University in 1976 and conducted research on the development of steels, superplasticity, materials behavior and Damascus steels. He lectured at Stanford after joining Lockheed, and remained a consulting professor until 2004. He is a Distinguished Research Professor in the department of materials science and engineering at the University of Tennessee.
He has authored and co-authored more than 280 papers in the open scientific literature on a wide range of materials science and metallurgical topics; one book, Superplasticity in Metals and Ceramics (Cambridge, 1997); and four U.S. patents. He has presented or co-authored 300 talks at conferences, scientific venues, and other public events, and has twice been selected as a NATO Lecturer. His work has been recognized with many awards, including Sheffield University's Metallurgica Aparecida Prize for Steel Research and Brunton Medal for Metallurgical Research. He was elected a Fellow of ASM International in 1987, of The Minerals, Metals & Materials Society in 2000, and of the American Association for the Advancement of Science (AAAS) in 2003. He is a member of the Materials Research Society (MRS) and the American Ceramic Society (ACerS). In January 2005 he was elected to membership in the National Academy of Engineering for research on high-temperature materials, superplasticity, and ancient steels, and for leadership in national defense and science programs.
Mary Wheeler, University of Texas at Austin
Computational Frameworks for Subsurface Energy and Environmental Modeling and Simulation
Abstract: Over the past 60 years, modeling and simulation have been essential to the success of the petroleum industry. This history dates back to 1948, when von Neumann was a consultant for Humble Research in Houston, Texas. Exploration and production in the deep Gulf of Mexico and on the North Slope of Alaska, and the design and construction of the Alyeska pipeline, could not have been achieved without modeling of coupled nonlinear partial differential equations.
Today, energy-related industries are facing new challenges: unprecedented demand for energy as well as growing environmental concerns over global warming and greenhouse gases. Resolving the complex scientific issues raised by next-generation energy sources requires multidisciplinary teams of geologists, biologists, chemical, mechanical and petroleum engineers, mathematicians and computational scientists working closely together. Simulation needs include: 1) the development of novel multiscale (molecular to field scale) and multiphysics discretizations for estimating physical characteristics and statistics of stochastic systems; 2) modeling of multiscale stochastic problems for quantifying uncertainty due to heterogeneity and small-scale uncertainty in subdomain system parameters; 3) verification and validation of models through experimentation and simulation; 4) robust optimization and optimal control for monitoring and controlling large systems; and 5) petascale computing on heterogeneous platforms that includes interactive visualization and seamless data management.
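Item 2 above, quantifying uncertainty due to heterogeneity, is commonly approached with sampling methods. The following is a minimal Monte Carlo sketch; the lognormal permeability model and the toy one-dimensional flow formula are assumptions standing in for a real subsurface simulator.

```python
# Minimal Monte Carlo uncertainty-quantification sketch: sample an
# uncertain permeability field, run a stand-in "simulator" on each
# realization, and report output statistics. The lognormal field and
# the toy flow formula are assumptions for illustration only.
import math
import random
import statistics

def sample_permeability(n_cells):
    """Draw one realization of a heterogeneous (lognormal) field."""
    return [math.exp(random.gauss(0.0, 1.0)) for _ in range(n_cells)]

def simulate_flow(perm, dp=1.0):
    """Toy 1-D Darcy flow: cells in series, so the effective
    permeability is the harmonic mean; flux = k_eff * dp."""
    k_eff = len(perm) / sum(1.0 / k for k in perm)
    return k_eff * dp

# Propagate the parameter uncertainty through the model.
samples = [simulate_flow(sample_permeability(100)) for _ in range(2000)]
print("mean flux:", statistics.mean(samples))
print("std dev:  ", statistics.stdev(samples))
```

Field-scale studies replace the toy model with a full simulator run per sample, which is exactly why this kind of uncertainty quantification drives the petascale computing need listed above.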
To address these challenges, a robust reservoir simulator is required: one composed of coupled programs that together account for multicomponent, multiscale, multiphase (full compositional) flow and transport through porous media and through wells, that incorporate uncertainty, and that include robust solvers. The coupled programs must be able to treat different physical processes occurring simultaneously in different parts of the domain and, for computational accuracy and efficiency, should also accommodate multiple numerical schemes. In addition, these problem-solving environments or frameworks must have parameter estimation and optimal control capabilities. We present a wish list for simulator capabilities and describe the methodology and parallel algorithms employed in the IPARS software being developed at the University of Texas at Austin.
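One common way to organize such coupling is an outer time loop that advances each physics solver on its own subdomain and exchanges interface data between steps. The sketch below is schematic only; the solver names, state variables and placeholder physics are invented for illustration and are not the IPARS API.

```python
# Schematic sketch of coupling two subdomain solvers through an outer
# time loop; solver names, state variables and the "physics" are
# invented placeholders, not the IPARS interface.

def advance_flow(state, dt):
    """Stand-in for a flow solver on subdomain 1."""
    state["pressure"] += dt * 0.1            # placeholder physics
    return state

def advance_transport(state, dt, pressure_bc):
    """Stand-in for a transport solver on subdomain 2, driven by the
    pressure the flow solver computed at the shared interface."""
    state["concentration"] += dt * 0.01 * pressure_bc
    return state

flow = {"pressure": 1.0}
transport = {"concentration": 0.0}
dt, n_steps = 0.1, 10
for step in range(n_steps):
    flow = advance_flow(flow, dt)            # advance each physics model,
    transport = advance_transport(           # passing interface data
        transport, dt, flow["pressure"])     # between subdomains
print(flow, transport)
```

This loose (sequential) coupling is the simplest scheme; tighter schemes iterate the two solves to convergence within each time step when the physics demands it.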
Biography: Mary Fanett Wheeler is a world-renowned expert in massively parallel processing. She has been on the faculty at The University of Texas at Austin since 1995 and holds the Ernest and Virginia Cockrell Chair in the departments of Aerospace Engineering and Engineering Mechanics and Petroleum and Geosystems Engineering. She is also director of the Center for Subsurface Modeling at the Institute for Computational Engineering and Sciences (ICES).
Wheeler's research group employs computer simulations to model the behavior of fluids in geological formations. Her particular research interests include the numerical solution of partial differential systems with application to the modeling of subsurface flows, and parallel computation. Applications of her research include multiphase flow and geomechanics in reservoir engineering, contaminant transport in groundwater, sequestration of carbon in geological formations, and angiogenesis in biomedical engineering. Wheeler has published more than 200 technical papers and edited seven books. She is currently an editor of nine technical journals.
Wheeler is a member of the Society for Industrial and Applied Mathematics, the Society of Petroleum Engineers, the Association for Women in Mathematics, the Mathematical Association of America and the American Geophysical Union. She is a fellow of the International Association for Computational Mechanics and a licensed professional engineer in the state of Texas.
Wheeler has served on numerous committees for the National Science Foundation and the Department of Energy. For the past seven years she has been the university lead in the Department of Defense User Productivity Enhancement and Technology Transfer Program (PET) in environmental quality modeling. Wheeler is on the Board of Governors for Argonne National Laboratory and on the Advisory Committee for Pacific Northwest National Laboratory. In 1998, Wheeler was elected to the National Academy of Engineering. In 2006, she received an honorary doctorate from Technische Universiteit Eindhoven in the Netherlands and in 2008 an honorary doctorate from the Colorado School of Mines.
Wheeler received her B.S., B.A., and M.A. degrees from the University of Texas at Austin and her Ph.D. in mathematics from Rice University.