To avoid circumlocutions, we will often use the terminology of the operator product expansion when discussing factorization theorems. In particular, we will use the term "Wilson coefficient" to denote the short-distance coefficient in the standard factorization theorem for deep-inelastic scattering, etc.
During the last decade, Python (an interpreted, high-level programming language) has arguably become the de facto standard for exploratory, interactive, and computation-driven scientific research. This issue discusses the advantages of Python for scientific research and presents several of the core Python libraries and tools used in scientific research. While the articles in the present issue are self-contained, they nicely complement the articles in the May/June 2007 special issue of CiSE titled "Python: Batteries Included."
Existing calculations of heavy quark production in charged-current and neutral-current lepton-hadron scattering are formulated differently because of the artificial distinction of "light" and "heavy" quarks made in the traditional approach. A proper QCD formalism valid for a wide kinematic range, from near threshold to energies much higher than the quark mass, should treat these processes in a uniform way. We formulate a unified approach to both types of leptoproduction processes based on the conventional factorization theorem. In this paper, we present the general framework with complete kinematics appropriate for arbitrary masses, emphasizing the simplifications provided by the helicity formalism. We illustrate this approach with an explicit calculation of the leading order contribution to the quark structure functions with general masses. This provides the basis for a complete QCD analysis of charged-current and neutral-current leptoproduction of charm and bottom quarks to be presented in subsequent papers.
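The conventional factorization theorem invoked above can be sketched schematically as follows (a generic form with illustrative notation; the precise conventions, helicity labels, and mass arguments used in the paper may differ):

```latex
F_{\lambda}(x, Q^2)
  \;=\;
  \sum_{a} \int_{x}^{1} \frac{d\xi}{\xi}\,
  f_{a/N}(\xi, \mu)\,
  \omega_{a}^{\lambda}\!\left(\frac{x}{\xi}, \frac{Q}{\mu}, \frac{m_i}{\mu}\right)
  \;+\;
  \mathcal{O}\!\left(\frac{\Lambda^2}{Q^2}\right)
```

Here $f_{a/N}$ is the distribution of parton $a$ in nucleon $N$, and $\omega_{a}^{\lambda}$ is the short-distance (Wilson) coefficient for helicity $\lambda$, retaining full dependence on the quark masses $m_i$ so that "light" and "heavy" quarks are treated uniformly.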
Great earthquakes rarely occur within active accretionary prisms, despite the intense long-term deformation associated with the formation of these geologic structures. This paucity of earthquakes is often attributed to partitioning of deformation across multiple structures as well as aseismic deformation within and at the base of the prism (Davis et al., 1983). We use teleseismic data and satellite optical and radar imaging of the 2013 Mw 7.7 earthquake that occurred on the southeastern edge of the Makran plate boundary zone to study this unexpected earthquake. We first compute a multiple point-source solution from W-phase waveforms to estimate fault geometry and rupture duration and timing. We then derive the distribution of subsurface fault slip from geodetic coseismic offsets. We sample the posterior probability density function of slip using a Bayesian approach, including a full description of the data covariance and accounting for errors in the elastic structure of the crust. The rupture nucleated on a subvertical segment, branching out of the Chaman fault system, and grew into a major earthquake along a 50° north-dipping thrust fault with significant along-strike curvature. Fault slip propagated at an average speed of 3.0 km/s for about 180 km and is concentrated in the top 10 km with no displacement on the underlying décollement. This earthquake does not exhibit significant slip deficit near the surface, nor is there significant segmentation of the rupture. We propose that complex interaction between the subduction accommodating the Arabia-Eurasia convergence to the south and the Ornach Nal fault plate boundary between India and Eurasia resulted in the significant strain gradient observed prior to this earthquake. Convergence in this region is accommodated both along the subduction megathrust and as internal deformation of the accretionary wedge.
Solver coupling can extend the capability of existing modeling software and provide a new avenue to address previously intractable problems. A software package has been developed to couple geophysical solvers, demonstrating a method to accurately and efficiently solve multiscale geophysical problems with reengineered software using a computational framework (Pyre). Pyre is a modeling framework capable of handling all aspects of the specification and launching of numerical investigations. We restructured and ported CitcomS, a finite element code for mantle convection, into the Pyre framework. Two CitcomS solvers are coupled to investigate the interaction of a plume at high resolution with global mantle flow at low resolution. A comparison of the coupled models with parameterized models demonstrates the accuracy and efficiency of the coupled models and illustrates the limitations and utility of parameterized models.
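The coupling pattern described above, where a fine regional solver is embedded in a coarse global solver and receives its boundary conditions from it, can be illustrated in miniature. The following is a hypothetical sketch with a toy 1-D diffusion solver; the class names, grid mapping, and subcycling choice are illustrative assumptions, not the actual CitcomS/Pyre API.

```python
class Solver:
    """Toy 1-D explicit diffusion solver on a uniform grid (illustrative only)."""

    def __init__(self, n, dx, kappa=1.0):
        self.n, self.dx, self.kappa = n, dx, kappa
        self.T = [0.0] * n  # temperature field

    def step(self, dt):
        # Explicit finite-difference update; boundary values are held fixed,
        # which is how the coupler imposes conditions on the fine grid.
        T = self.T
        new = T[:]
        r = self.kappa * dt / self.dx ** 2
        for i in range(1, self.n - 1):
            new[i] = T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
        self.T = new


class Coupler:
    """Embed a fine regional solver between two nodes of a coarse solver."""

    def __init__(self, coarse, fine, i0):
        # i0: coarse-grid index whose cell [i0, i0+1] the fine grid spans
        # (a hypothetical one-dimensional stand-in for the real exchange).
        self.coarse, self.fine, self.i0 = coarse, fine, i0

    def step(self, dt):
        self.coarse.step(dt)
        # Impose updated coarse-grid values as fine-grid boundary conditions.
        self.fine.T[0] = self.coarse.T[self.i0]
        self.fine.T[-1] = self.coarse.T[self.i0 + 1]
        for _ in range(4):  # subcycle the fine solver at a smaller time step
            self.fine.step(dt / 4)


coarse = Solver(10, 1.0)
coarse.T[5], coarse.T[6] = 1.0, 2.0   # a localized "plume" on the coarse grid
fine = Solver(5, 0.25)                # fine grid spanning one coarse cell
coupled = Coupler(coarse, fine, 5)
coupled.step(0.1)
```

The essential design point mirrors the abstract: each solver remains a self-contained component, and the framework's job is only to orchestrate the exchange of boundary data between resolutions at each step.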
Key questions that scientists and engineers typically want to address can be formulated in terms of predictive science. Questions such as "How well does my computational model represent reality?", "What are the most important parameters in the problem?", and "What is the best next experiment to perform?" are fundamental in solving scientific problems. mystic is a framework for massively-parallel optimization and rigorous sensitivity analysis that enables these motivating questions to be addressed quantitatively as global optimization problems. Realistic physics, engineering, and materials models often have hundreds of input parameters and hundreds of constraints, and may require execution times of seconds or longer. In more extreme cases, realistic models may be multi-scale and require the use of high-performance computing clusters for their evaluation. Predictive calculations, formulated as a global optimization over a potential surface in design parameter space, may require an already prohibitively large simulation to be performed hundreds, if not thousands, of times. The need to prepare, schedule, and monitor thousands of model evaluations, and to dynamically explore and analyze results, is a challenging problem that requires a software infrastructure capable of distributing and managing computations on large-scale heterogeneous resources. In this paper, we present the design behind an optimization framework, and also a framework for heterogeneous computing, that when utilized together can make computationally intractable sensitivity and optimization problems much more tractable. The optimization framework provides global search algorithms that have been extended to parallel execution, where evaluations of the model can be distributed to appropriate large-scale resources while the optimizer centrally manages their interactions and navigates the objective function.
New methods have been developed for imposing and solving constraints that aid in reducing the size and complexity of the optimization problem. Additionally, new algorithms have been developed that launch multiple optimizers in parallel, thus allowing highly efficient local search algorithms to provide fast global optimization. In this way, parallelism in optimization allows us not only to find global minima, but also to simultaneously find all local minima and transition points, providing a much more efficient means of mapping out a potential energy surface.
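The multi-optimizer idea above can be sketched in a few lines: launch many independent local searches from random starting points and collect the distinct minima they converge to. This is a minimal stdlib-only sketch, not mystic's actual API; the objective, the naive pattern search, and the clustering-by-rounding are all illustrative assumptions. Because each restart is independent, the loop is trivially parallelizable.

```python
import random


def f(x):
    # Double-well objective with local minima at x = -1 and x = +1.
    return (x * x - 1.0) ** 2


def local_search(x, step=0.1, tol=1e-6):
    """Naive pattern search: move greedily, shrink the step when stuck."""
    while step > tol:
        for cand in (x - step, x + step):
            if f(cand) < f(x):
                x = cand
                break
        else:
            step *= 0.5  # no improving move: refine the step size
    return x


def multistart(n_starts=20, seed=0):
    """Run independent local searches; each iteration could run in parallel."""
    rng = random.Random(seed)
    minima = set()
    for _ in range(n_starts):
        x = local_search(rng.uniform(-2.0, 2.0))
        minima.add(round(x, 3))  # cluster nearby results into one minimum
    return sorted(minima)
```

Running `multistart()` recovers both wells of `f`, which is the point the abstract makes: parallel restarts of a fast local optimizer map out all local minima rather than just the single global one.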
MCViNE (Monte-Carlo VIrtual Neutron Experiment) is an open-source Monte Carlo (MC) neutron ray-tracing software for performing computer modeling and simulations that mirror real neutron scattering experiments. We exploited the close similarity between how instrument components are designed and operated and how such components can be modeled in software. For example, we used object-oriented programming concepts for representing neutron scatterers and detector systems, and recursive algorithms for implementing multiple scattering. Combining these features in MCViNE allows one to handle sophisticated neutron scattering problems in modern instruments, including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can use simulation components from linear-chain-based MC ray-tracing packages, which facilitates porting instrument models from those codes. Furthermore, it allows for components written solely in Python, which expedites prototyping of new components. These developments have enabled detailed simulations of neutron scattering experiments, with non-trivial samples, for time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. With simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.
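The recursive multiple-scattering idea mentioned above can be shown in miniature: a neutron carries a statistical weight, and each scattering event deposits intensity in a detector and recursively spawns an attenuated continuation until a maximum order or a negligible weight is reached. This is a toy deterministic sketch, not MCViNE's actual classes or API; all names and the fixed scattering probability are illustrative assumptions.

```python
class Neutron:
    """Minimal neutron state: position along the beam and statistical weight."""

    def __init__(self, x, weight=1.0):
        self.x = x
        self.weight = weight


class Sample:
    """Toy slab scatterer with a fixed per-event scattering probability."""

    def __init__(self, thickness, p_scatter=0.3):
        self.thickness = thickness
        self.p_scatter = p_scatter

    def scatter(self, neutron, detector, order=1, max_order=3):
        """Recursively accumulate each scattering order into the detector."""
        if order > max_order or neutron.weight < 1e-6:
            return
        scattered = neutron.weight * self.p_scatter
        detector[order] = detector.get(order, 0.0) + scattered
        # The scattered fraction may scatter again: recurse with reduced weight.
        self.scatter(Neutron(neutron.x, scattered), detector,
                     order + 1, max_order)


detector = {}
Sample(thickness=1.0).scatter(Neutron(0.0), detector)
# detector maps scattering order -> accumulated intensity:
# each successive order is attenuated by another factor of p_scatter
```

Turning a scattering mechanism "off", as described in the abstract, corresponds here to simply truncating the recursion or zeroing one contribution, which is what makes this structure convenient for attributing measured intensity to individual mechanisms.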