The time is now

When processor clock speeds flatlined in 2004, after more than 15 years of exponential increases, the computational science community lost the key to the automatic performance improvements its applications had traditionally enjoyed. Subsequent developments in processor and system design (hundreds of thousands of nodes, millions of cores, reduced bandwidth and memory available to cores, inclusion of special-purpose elements) have made it clear that a broad divide has now opened up between the software infrastructure that we have and the one we will certainly need in order to perform the kind of computationally intensive and data-intensive work that tomorrow's scientists and engineers will require. Given the daunting conceptual and technical problems that such a change in design paradigms brings with it, we believe that closing this software gap will require an unprecedented level of cooperation and coordination within the worldwide open source software community. In forming the International Exascale Software Project (IESP), we hope to plan for and catalyze the kind of community-wide effort that we believe is necessary to meet this historic challenge.

Our belief in the need for broad-based, coordinated action by the global scientific software community to address the looming crisis reflects, in part, the fact that computational methods are now universally accepted as indispensable to future progress in science and engineering.
The last time a disruption of comparable dimensions occurred, during the transition from vector to distributed memory supercomputers more than two decades ago, only a relatively small part of the scientific community felt the consequences of the struggle to replace, wholesale, the programming models, numerical and communication libraries, and all the other software components and tools on which application scientists were already building. Computational science was still relatively young, and computationally intensive methods were still largely the province of a relatively small scientific elite in a relatively small number of physical sciences. Today, aided by the success of the scientific software research and development community, researchers in nearly every field of science and engineering have been able to turn to computational modeling/simulation and high-throughput data analysis to open new areas of inquiry (e.g., the very small, the very large, the very hazardous, the very complex), to dramatically increase research productivity, and to amplify the social and economic impact of their work. Recent reports [7,10] make a compelling case, in terms of both scope and importance, for the profound expansion of our research horizons that will occur if we can rise to the challenge of peta/exascale computing. But in light of the radical changes in computing we are currently undergoing, it is clear that the software infrastructure necessary to make that ascent does not yet exist and that we are a long way from being in a position to create it. At the same time, the increasing use of computationally intensive m...