Distinguished Seminar Series on Computational and Data-Enabled Science and Engineering (CDS&E)
Computational and Data-Enabled Science and Engineering (CDS&E) is an intellectual discipline that brings together core areas of science and engineering, computer science, and computational and applied mathematics in a concerted effort to use cyberinfrastructure (CI) for scientific discovery and engineering innovations. It is clear that CDS&E has become a central theme in the national research and education agenda, and it is critically important that Rutgers University respond by establishing a core competency in this emerging discipline.
The goal of this distinguished seminar series is to bring national CDS&E leaders to Rutgers to talk about their vision and research. The currently scheduled speakers are listed below. The talks will be held at the Fiber Optics Auditorium, 607 Taylor Road, on Busch Campus, and will begin at 10:00 AM. There will be an informal reception before the talks at 9:30 AM. Additional details about the talks will be posted at http://nsfcac.rutgers.edu/cdse/seminars/.
Announcement: The seminars of March 08 and March 28 will be held in the CAIT Auditorium on Busch Campus and will start at 10:30 AM.
Schedule of Seminars
October 10, 2011:
Sangtae Kim, Executive Director of the Morgridge Institute for Research, University of Wisconsin-Madison
Title: Pharmaceutical Informatics and Computer-Aided Drug Discovery
Abstract: The scientific community has witnessed the divergence of the field of "computational biology" into two disjoint communities: one group focused on the "wriggling of molecules" (e.g., molecular dynamics) as governed by the equations of biochemical physics and another revolving around pattern matching and informatics (e.g., sequence alignment). As we enter the petascale era of data-intensive computational science, exciting advances in computer-aided drug discovery can be envisioned in a new niche in the computational biology ecosystem - a niche formed by the reunion of the separated branches of computational biology. In this presentation, we consider some examples from pharmaceutical informatics of data-intensive approaches applied to "objects" that are not simply numbers or static molecular structures but instead are (compute-intensive) dynamic simulations. The relevance of this approach to the rational design of new anti-cancer drugs will also be described.
Short Biography: Sangtae "Sang" Kim was appointed executive director of the new, private not-for-profit Morgridge Institute for Research in September 2008. In this position, he is building a world-class interdisciplinary biomedical research organization from the ground up. Located on the University of Wisconsin-Madison campus, it is intended to become the Midwest's premier, private medical research institute.
October 26, 2011:
Edward Seidel, Assistant Director for Mathematical and Physical Sciences, US National Science Foundation
Title: The Data and Compute-Driven Transformation of Modern Science
Abstract: We all know that modern science is undergoing a profound transformation as it aims to tackle the complex problems of the 21st Century. It is becoming highly collaborative; problems as diverse as climate change, renewable energy, or the origin of gamma-ray bursts require understanding processes that no single group or community alone has the skills to address. At the same time, after centuries of little change, compute, data, and network environments have grown by 9-12 orders of magnitude in the last few decades. Moreover, science is not only compute-intensive but is now dominated by data-intensive methods. This dramatic change in the culture and methodology of science will require a much more integrated and comprehensive approach to the development and deployment of hardware, software, and algorithmic tools and environments supporting research, education, and, increasingly, collaboration across disciplines.
Short Biography: Edward Seidel is a physicist recognized for his work on numerical relativity and black holes, as well as in high-performance and grid computing. He earned his Ph.D. from Yale University in relativistic astrophysics. He was a professor at the Max Planck Institute for Gravitational Physics (Albert-Einstein-Institute, or AEI) in Germany from 1996 to 2003. There, Seidel founded and led AEI's numerical relativity and e-science groups, which became leaders in solving Einstein's equations using large-scale computers, and in distributed and grid computing. He also was a senior research scientist at the National Center for Supercomputing Applications and associate professor in the Physics Department at the University of Illinois, Urbana-Champaign.
November 28, 2011:
David Case, Professor of Chemistry & Chemical Biology, Rutgers University
Title: Molecular dynamics simulations of proteins and nucleic acids
Abstract: Molecular dynamics simulations of biomolecules have become an increasingly important tool in the fields of biochemistry and molecular biology. They also provide an interesting set of challenges for distributed and high-performance computing platforms. I will give an overview of ways in which such simulations are used in a biological realm, but devote most of the talk to a discussion of the computations themselves, examining challenges and opportunities in three main areas: tightly-coupled and highly parallel machines, more loosely coupled and very distributed networks, and the use of graphical processing units (GPUs) as an affordable alternative architecture.
Short Biography: David Case received a Bachelor's degree in Chemistry from Michigan State and a Ph.D. in chemical physics from Harvard. He joined the Department of Chemistry and Chemical Biology and the BioMaPS Institute at Rutgers in 2008, following work at U.C. Davis and The Scripps Research Institute. His research focuses on the theoretical chemistry of biomolecules. Particular areas of interest include molecular dynamics simulations of proteins and nucleic acids; electronic structure calculations of transition-metal complexes that model active sites in metalloenzymes; development and application of methods for NMR structure determination; and ligand-protein and ligand-nucleic acid docking and computational drug design. He is a Fellow of the Royal Society of Chemistry.
February 08, 2012:
Alan Blatecky, Director, Office of Cyberinfrastructure, US National Science Foundation
Title: Cyberinfrastructure: a catalyst that is transforming science
Abstract: Innovative information technologies are transforming the fabric of society, and data is the new currency for science, education, government, and commerce. Decades of investments in observing platforms, computational infrastructure, and model development have already led to important breakthroughs in the sciences and engineering. The need to interrogate and manage the ever-increasing, unprecedented volumes of data and simulations generated by these components of research-enabling infrastructure is transforming the very conduct of science and engineering. Commoditization of both hardware and software is creating an era of significant disruptions. One disruption is the changing nature and role of next generation computing and technologies. A second disruption is that the ubiquitous availability of a wide range of technologies will fundamentally change the types of algorithms and software that must be implemented for research and education. A third disruption is the emerging transformation of the institutions engaged in the higher education enterprise, as there will be much less connection between researchers and the physical place of their institutions. These factors are leading to new models for compute and data-intensive science that will be organized dynamically around research questions, domains, expertise, and resources, and will present new challenges to geographically-centered research efforts, including traditional departments and campuses.
Short Biography: Alan Blatecky has been appointed Director of the Office of Cyberinfrastructure (OCI) at the National Science Foundation (NSF). Blatecky had been the acting office head for OCI since June 2010.
February 20, 2012:
Craig Stewart, Executive Director, Pervasive Technology Institute; Associate Dean, Research Technologies; Associate Director, CREST, Indiana University
Title: Cyberinfrastructure begins at home
Abstract: Proclamations about the future of cyberinfrastructure in academia abound. Many such current proclamations can be summarized as "just do it all in the cloud." Clouds may be a good solution for many computing problems, but computational clouds are not magic fairy dust ... you can't just sprinkle the words "cloud computing" on any given problem and thereby solve it. IU's experience with cyberinfrastructure in academia demonstrates that the best order of operations is to set university goals and strategy first, and design cyberinfrastructure facilities and computational science research in ways that support the core university goals. IU's history in computing supporting academic research goes back to the 1950s. Over the past decade and a half, IU has developed an excellent local cyberinfrastructure and an IT R&D organization - the Pervasive Technology Institute - which is well regarded locally and very successful in securing federal and non-federal grants and contracts. This talk will describe how IU has implemented its IT strategy based on university goals, and discuss some of the common challenges facing academia as regards CI facilities and the need for more computational and data-intensive research and development centers. And IU does use clouds ... this talk will describe the circumstances in which cloud facilities seem to be the best solution to real problems.
Short Biography: Craig Stewart is the Executive Director of the Pervasive Technology Institute (PTI), IU's flagship initiative for advanced information technology research, development, and delivery in support of research, scholarship, and artistic performances. Stewart is Associate Dean for Research Technologies, and leads the Research Technologies Division of University Information Technology Services.
February 21, 2012:
Bob Grossman, Director of Informatics at the Institute for Genomics and Systems Biology, University of Chicago and the Open Cloud Consortium
Title: Science Clouds and Their Impact on Big Data Science
Abstract: Just as cloud computing has transformed commercial IT, science clouds are beginning to impact science, especially data-intensive science. In this talk, we give an introduction to cloud computing and to science clouds.
We also describe the Open Science Data Cloud (OSDC), a science cloud that is operated by the not-for-profit Open Cloud Consortium. The OSDC is a persistent cloud-based infrastructure that allows scientists to manage, analyze, integrate and share medium to large size scientific datasets. The OSDC contains data from a variety of scientific disciplines, from earth science to biology. The OSDC is currently one of the largest cloud-based infrastructures devoted to scientific data in the world.
Short Biography: Robert Grossman is a faculty member at the University of Chicago. He is the Director of Informatics at the Institute for Genomics and Systems Biology, a Senior Fellow at the Computation Institute, and a Professor of Medicine. His research group focuses on bioinformatics, cloud computing, data-intensive computing, predictive modeling, and related areas. He is also a Partner at Open Data Group, which builds predictive models over big data, and the Chair of the Open Cloud Consortium. More information about him can be found at rgrossman.com.
March 08, 2012:
Chris Johnson, Director, Scientific Computing and Imaging (SCI) Institute, University of Utah
Title: Visual Computing: Making Sense of a Complex World
Abstract: Computers are now extensively used throughout science, engineering, and medicine. Advances in computational geometric modeling, imaging, and simulation allow researchers to build and test models of increasingly complex phenomena and thus to generate unprecedented amounts of data. These advances have created the need to make corresponding progress in our ability to understand large amounts of data and information arising from multiple sources. In fact, effectively understanding and making use of the vast amounts of information being produced is one of the greatest scientific challenges of the 21st Century. Visual computing, which relies on, and takes advantage of, the interplay among techniques of visualization, large-scale computing, data management, and imaging, is fundamental to understanding models of complex phenomena, which are often multi-disciplinary in nature. In this talk, I will provide examples of interdisciplinary visual computing and imaging research at the Scientific Computing and Imaging (SCI) Institute as applied to important problems in science, engineering, and medicine.
Short Biography: Chris Johnson directs the Scientific Computing and Imaging (SCI) Institute at the University of Utah, where he is a Distinguished Professor of Computer Science and holds faculty appointments in the Departments of Physics and Bioengineering. His research interests are in the areas of scientific computing and scientific visualization. Dr. Johnson founded the SCI research group in 1992, which has since grown to become the SCI Institute, employing over 200 faculty, staff, and students. Professor Johnson serves on several international journal editorial boards, as well as on advisory boards to several national research centers. Professor Johnson has received several awards, including the NSF Presidential Faculty Fellow (PFF) award from President Clinton in 1995 and the Governor's Medal for Science and Technology from Governor Michael Leavitt in 1999. He is a Fellow of the American Institute for Medical and Biological Engineering, a Fellow of the American Association for the Advancement of Science, and in 2009 he was elected a Fellow of the Society for Industrial and Applied Mathematics (SIAM) and received the Utah Cyber Pioneer Award. In 2010 Professor Johnson received the Rosenblatt Award from the University of Utah and the IEEE Visualization Career Award.
March 28, 2012:
Dan Reed, Corporate Vice President, Technology Policy Group, Microsoft Corporation
Title: Technical Computing: Past, Present, Future
Abstract: Computational science and engineering are moving rapidly from a world of purely homogeneous and local resources to a much more complex world of distributed software and systems, virtual organizations and cloud services. In science, a tsunami of new experimental and computational data and a suite of increasingly ubiquitous sensors pose vexing problems in data analysis, transport, visualization and collaboration. In engineering, modeling tools and multidisciplinary manufacturing pose new challenges. In both cases, many of the most interesting problems increasingly lie at the intersection of individual disciplines, requiring teams from diverse backgrounds to work together across intellectual and cultural barriers.
Let's step back and think about the longer-term future. Where is the technology going and what are the implications? What are the lessons that can be gleaned from the history of high-performance computing, technically, organizationally, and politically? What are the educational and workforce implications? This talk will examine the scientific, technical, and social issues around high-performance computing, computational science, research empowerment, and new scientific domains.
Short Biography: As corporate vice president of the Technology Policy Group, Dr. Dan Reed helps shape Microsoft's long-term vision for technology innovations and the company's associated policy engagement with governments and institutions around the world. Given the centrality of information technology to communication and social interaction, research and development, education and learning, health and safety, the environment, and economic development, such strategic technology identification and policy coordination are critical to the company's future. In this capacity, Reed reports to and works closely with Craig Mundie, Microsoft's chief research and strategy officer.