From: chemistry-request at ccl.net
To: chemistry-request at ccl.net
Date: Fri Oct 12 21:44:29 2007
Subject: 08.11.03 IMA Development and Analysis of Multiscale Methods, Minneapolis, MN
IMA Workshop: Development and Analysis of Multiscale Methods
November 3-7, 2008
University of Minnesota, Minneapolis, MN
http://www.ima.umn.edu/2008-2009/W11.3-7.08/

Organizers:
Anne M. Chaka, Physical and Chemical Properties, National Institute of Standards and Technology
Gero Friesecke, Mathematics, University of Warwick
Kurt Kremer, Max-Planck-Institut für Polymerforschung
Yousef Saad, Computer Science and Engineering, University of Minnesota
Gregory A. Voth, Theoretical and Physical Chemistry, University of Utah

Description:
Theoretical, computational, and experimental approaches to problems in the natural sciences typically focus on particular aspects of the phenomena or systems under study. This reflects the need to frame questions with respect to the most relevant length and time scales, a need that arises from the limited range of applicability of specific experimental and theoretical tools. This focus has historically produced enormous progress and is the basis of our current understanding of physical, chemical, and biological systems. For example, in the area of phase transitions and critical phenomena, renormalization group theory has shown that many properties, such as critical exponents or ratios of critical amplitudes, do not depend on the microscopic details of the studied system. Within each universality class it is therefore sufficient, for many properties, to study highly idealized model systems; the details of the model determine only quantities such as the transition temperature or the absolute amplitudes. Similar examples can be given in many other areas: the mechanical response of bulk solids, thin films, or biological membranes is to a large extent governed by a small number of universal models, while the constitutive parameters depend crucially on the details of the underlying microstructure.

In a computer simulation it would in principle be possible to study systems on huge length scales and for long times (e.g., fracture mechanics based on an all-atom simulation, or the function of a membrane protein in a fully fluctuating membrane) if all interactions were treated fully and infinite CPU time were available. Neither is the case, and such an approach would probably also produce too much information, obscuring a more general understanding. For several years, therefore, scale-bridging or multiscale simulation methods have been under development at many places. These methods are still in their infancy, and the many different ideas have not yet converged into one or several generally accepted and validated schemes. This fairly young and critical area of computational science can therefore benefit greatly from advances in mathematics. Conversely, emerging computational experience with truly multiscale systems can serve as a great stimulus to mathematical understanding, which at present remains most thorough for two-scale systems (as treated, e.g., in classical and stochastic homogenization theory or Gamma-convergence). Examples of truly multiscale systems include biological ion channels, proteins, emulsions, functional materials, and quantum dots. They require new methods to address challenges such as hierarchies of structural organization, fluctuating (electrostatic) fields, the simultaneous treatment and interdependence of very short- and very long-range interactions, and approximate Hamiltonians to model the dynamics and reactivity of tens of thousands of atoms.
In order to achieve this, coupling schemes between different scales have to be developed. These include systematic coarse-graining strategies, appropriate interaction potentials and force fields, and methods to link studies on different scales and to tune the resolution of the computer model to coarser and finer levels as needed. Ultimately such schemes have to include classical as well as quantum methods, and all of this requires new approaches beyond conventional computer modeling.

The workshop aims to address a number of exemplary questions. How does one parameterize coarse-grained interaction potentials for bonded and nonbonded interactions? The latter is especially delicate for soft matter because of the huge size of the molecules. What is the best point or regime in parameter and phase space at which to hand over from one level of description to another? How do errors propagate from one level to the next, and what are the consequences when one wants to fine-grain again? How specific or transferable are models and methods, or are there general strategies to follow? Do we have strategies and general criteria for validation beyond trivial tests? Coarse graining means a mapping of scales, but how does this work for nonequilibrium systems and time scales, i.e., for studying dynamics? All these questions will be addressed and discussed in terms of basic concepts as well as specific applications.
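As a concrete illustration of the first question, one common entry point for parameterizing a nonbonded coarse-grained potential is (iterative) Boltzmann inversion, in which a target radial distribution function g(r) from an atomistic reference simulation is inverted to U(r) = -kT ln g(r) and then refined against coarse-grained runs. The Python sketch below is added purely for orientation and is not part of the announcement; the unit choice, grid, function names, and the synthetic g(r) are illustrative assumptions.

    import numpy as np

    kB = 0.0019872041  # Boltzmann constant in kcal/(mol K); unit choice is an assumption

    def boltzmann_invert(r, g_r, temperature=300.0, g_min=1e-6):
        """Direct Boltzmann inversion: turn a target radial distribution function
        g(r) from an atomistic reference run into a first guess for a
        coarse-grained pair potential, U(r) = -kB*T*ln g(r)."""
        g_safe = np.clip(g_r, g_min, None)   # avoid log(0) where g(r) vanishes
        u_r = -kB * temperature * np.log(g_safe)
        return u_r - u_r[-1]                 # shift so U(r) goes to zero at the cutoff

    def ibi_update(u_r, g_target, g_cg, temperature=300.0, g_min=1e-6):
        """One iterative Boltzmann inversion step: correct the current potential
        by the mismatch between the target g(r) and the g(r) measured in a
        coarse-grained simulation run with the current potential."""
        correction = kB * temperature * np.log(
            np.clip(g_cg, g_min, None) / np.clip(g_target, g_min, None))
        return u_r + correction

    # Hypothetical usage: r and g_target would come from an atomistic reference
    # simulation, g_cg from a coarse-grained run; here g_target is synthetic.
    r = np.linspace(0.3, 1.5, 121)                    # nm, illustrative grid
    g_target = 1.0 + np.exp(-(r - 0.9) ** 2 / 0.02)   # stand-in for a measured g(r)
    u0 = boltzmann_invert(r, g_target)

In practice the update step would be repeated, with g_cg re-measured from a fresh coarse-grained simulation after each correction, until the coarse-grained distribution matches the atomistic target within tolerance; how well such potentials transfer to other state points is exactly the kind of open question the workshop addresses.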