The European Space Agency's Planck satellite, dedicated to studying the early Universe and its subsequent evolution, was launched on 14 May 2009 and has been scanning the microwave and submillimetre sky continuously since 12 August 2009. In March 2013, ESA and the Planck Collaboration released the initial cosmology products based on the first 15.5 months of Planck data, along with a set of scientific and technical papers and a web-based explanatory supplement. This paper gives an overview of the mission and its performance, the processing, analysis, and characteristics of the data, the scientific results, and the science data products and papers in the release. The science products include maps of the cosmic microwave background (CMB) and diffuse extragalactic foregrounds, a catalogue of compact Galactic and extragalactic sources, and a list of sources detected through the Sunyaev-Zeldovich effect. The likelihood code used to assess cosmological models against the Planck data and a lensing likelihood are described. Scientific results include robust support for the standard six-parameter ΛCDM model of cosmology and improved measurements of its parameters, including a highly significant deviation from scale invariance of the primordial power spectrum. The Planck values for these parameters and others derived from them are significantly different from those previously determined. Several large-scale anomalies in the temperature distribution of the CMB, first detected by WMAP, are confirmed with higher confidence. Planck sets new limits on the number and mass of neutrinos, and has measured gravitational lensing of CMB anisotropies at greater than 25σ. Planck finds no evidence for non-Gaussianity in the CMB. Planck's results agree well with measurements of baryon acoustic oscillations. Planck finds a lower Hubble constant than is found in some more direct local measurements.
Some tension is also present between the amplitude of matter fluctuations (σ8) derived from CMB data and that derived from Sunyaev-Zeldovich data. The Planck and WMAP power spectra are offset from each other by an average level of about 2% around the first acoustic peak. Analysis of Planck polarization data is not yet mature; therefore, polarization results are not released, although the robust detection of E-mode polarization around CMB hot and cold spots is shown graphically.
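The tension with local Hubble-constant measurements can be made concrete with a standard significance estimate. The sketch below compares two independent measurements in units of their combined standard deviation; the specific numbers are illustrative round values from the literature of that period, not figures quoted in this abstract.

```python
import math

def tension_sigma(v1, s1, v2, s2):
    """Separation of two independent measurements in combined standard deviations."""
    return abs(v1 - v2) / math.hypot(s1, s2)

# Illustrative values (km/s/Mpc): a Planck-era CMB fit vs. a local
# distance-ladder measurement; round figures for this sketch only.
h0_cmb, err_cmb = 67.3, 1.2
h0_loc, err_loc = 73.8, 2.4

print(f"H0 tension: {tension_sigma(h0_cmb, err_cmb, h0_loc, err_loc):.1f} sigma")
```

The same formula applies to the σ8 comparison: any two independent constraints with Gaussian errors can be compared this way, though a full analysis would account for parameter degeneracies.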
Recent research in inductive category learning has demonstrated that interleaved study of category exemplars results in better performance than does studying each category in separate blocks. However, the questions of how the category structure influences this advantage and how simultaneous presentation interacts with the advantage are open issues. In this article, we present three experiments. The first experiment indicates that the advantage of interleaved over blocked study is modulated by the structure of the categories being studied. More specifically, interleaved study results in better generalization for categories with high within- and between-category similarity, whereas blocked presentation results in better generalization for categories with low within- and between-category similarity. In Experiment 2, we present evidence that when presented simultaneously, between-category comparisons (interleaved presentation) result in a performance advantage for high-similarity categories, but no differences were found for low-similarity categories. In Experiment 3, we directly compared simultaneous and successive presentation of low-similarity categories. We again found an overall benefit for blocked study with these categories. Overall, these results are consistent with the proposal that interleaving emphasizes differences between categories, whereas blocking emphasizes the discovery of commonalities among objects within the same category.
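The abstract describes the within- and between-category similarity manipulation verbally. A standard exemplar model such as Nosofsky's Generalized Context Model, chosen here purely for illustration and not a model the authors commit to, shows how summed similarity to stored exemplars drives generalization; the categories and parameter below are invented.

```python
import math

def gcm_probs(probe, categories, c=2.0):
    """Generalized Context Model sketch: P(category) is proportional to the
    summed exponential similarity between the probe and stored exemplars.
    Higher c sharpens the similarity gradient."""
    sims = {}
    for label, exemplars in categories.items():
        sims[label] = sum(math.exp(-c * math.dist(probe, ex)) for ex in exemplars)
    total = sum(sims.values())
    return {label: s / total for label, s in sims.items()}

# Two hypothetical categories on a 2-D stimulus space (invented coordinates).
cats = {
    "A": [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1)],
    "B": [(1.0, 1.0), (0.9, 0.8), (0.8, 0.9)],
}
probs = gcm_probs((0.15, 0.15), cats)
```

In such a model, raising between-category similarity (moving the clusters closer) reduces the probability margin, which is one way to formalize why discriminative, between-category comparison might matter more for high-similarity structures.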
The European Space Agency's Planck satellite, which is dedicated to studying the early Universe and its subsequent evolution, was launched on 14 May 2009. It scanned the microwave and submillimetre sky continuously between 12 August 2009 and 23 October 2013. In February 2015, ESA and the Planck Collaboration released the second set of cosmology products based on data from the entire Planck mission, including both temperature and polarization, along with a set of scientific and technical papers and a web-based explanatory supplement. This paper gives an overview of the main characteristics of the data and the data products in the release, as well as the associated cosmological and astrophysical science results and papers. The data products include maps of the cosmic microwave background (CMB), the thermal Sunyaev-Zeldovich effect, diffuse foregrounds in temperature and polarization, catalogues of compact Galactic and extragalactic sources (including separate catalogues of Sunyaev-Zeldovich clusters and Galactic cold clumps), and extensive simulations of signals and noise used in assessing uncertainties and the performance of the analysis methods. The likelihood code used to assess cosmological models against the Planck data is described, along with a CMB lensing likelihood. Scientific results include cosmological parameters derived from CMB power spectra, gravitational lensing, and cluster counts, as well as constraints on inflation, non-Gaussianity, primordial magnetic fields, dark energy, and modified gravity, and new results on low-frequency Galactic foregrounds.
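One concrete handle on the thermal Sunyaev-Zeldovich maps mentioned above is the well-known spectral signature of the effect, which vanishes near 217 GHz. The sketch below locates that null numerically from the standard non-relativistic spectral function; the constants are rounded and the bisection root-finder is a generic utility, not anything from the Planck pipeline.

```python
import math

def g(x):
    """Non-relativistic thermal SZ spectral function, x = h*nu/(k_B*T_CMB);
    the tSZ distortion changes sign where g(x) = 0."""
    return x * (math.exp(x) + 1.0) / (math.exp(x) - 1.0) - 4.0

def bisect(f, lo, hi, tol=1e-10):
    """Simple bisection root-finder on a sign-changing interval."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

x0 = bisect(g, 1.0, 10.0)               # dimensionless null
K_OVER_H = 2.0837e10                    # k_B/h in Hz per kelvin (rounded)
T_CMB = 2.7255                          # K
nu0_ghz = x0 * K_OVER_H * T_CMB / 1e9   # null frequency in GHz
```

Observing on both sides of this null is what lets clusters be separated from the CMB itself, which is why the effect supports a dedicated all-sky cluster catalogue.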
We present a multidimensional multiple‐attenuation method that does not require any subsurface information for either surface or internal multiples. To derive these algorithms, we start with a scattering theory description of seismic data. We then introduce and develop several new theoretical concepts concerning the fundamental nature of and the relationship between forward and inverse scattering. These include (1) the idea that the inversion process can be viewed as a series of steps, each with a specific task; (2) the realization that the inverse‐scattering series provides an opportunity for separating out subseries with specific and useful tasks; (3) the recognition that these task‐specific subseries can have different (and more favorable) data requirements, convergence, and stability conditions than does the original complete inverse series; and, most importantly, (4) the development of the first method for physically interpreting the contribution that individual terms (and pieces of terms) in the inverse series make toward these tasks in the inversion process, which realizes the selection of task‐specific subseries. To date, two task‐specific subseries have been identified: a series for eliminating free‐surface multiples and a series for attenuating internal multiples. These series result in distinct algorithms for free‐surface and internal multiples, and neither requires a model of the subsurface reflectors that generate the multiples. The method attenuates multiples while preserving primaries at all offsets; hence, these methods are equally well suited for subsequent poststack structural mapping or prestack amplitude analysis. The method has demonstrated its usefulness and added value for free‐surface multiples when (1) the overburden has significant lateral variation, (2) reflectors are curved or dipping, (3) events are interfering, (4) multiples are difficult to identify, and (5) the geology is complex. 
The internal-multiple algorithm has been tested with good results on band-limited synthetic data; field data tests are planned. This procedure provides an approach for attenuating a significant class of heretofore inaccessible and troublesome multiples. There has been a recent rejuvenation of interest in multiple-attenuation technology resulting from current exploration challenges, e.g., in deep water with a variable water bottom or in subsalt plays. These cases are representative of circumstances where 1-D assumptions are often violated and reliable, detailed subsurface information is typically not available. The inverse scattering multiple-attenuation methods are specifically designed to address these challenging problems. To date, this is the only multidimensional multiple-attenuation method that does not require 1-D assumptions, moveout differences, or ocean-bottom or other subsurface velocity or structural information for either free-surface or internal multiples. These algorithms require knowledge of the source signature and near-source traces. We describe several current approaches, e.g., energy minimization and trace extrapolation, for satisfying these prerequisites in a stable and reliable manner.
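The model-free character of the free-surface method can be illustrated in a drastically simplified setting. The sketch below is a 1-D, normal-incidence, impulsive-source toy model, not the multidimensional algorithm of the paper: with a free-surface reflection coefficient of -1, the primaries are recovered from the recorded data by a series of self-convolutions, using no subsurface information.

```python
import numpy as np

# Toy 1-D water layer: water-bottom reflection coefficient R, two-way time
# t_w samples, free-surface reflection coefficient -1.  The recorded trace
# contains the primary plus free-surface multiples (-1)**(k-1) * R**k.
n, t_w, R = 512, 100, 0.5
d = np.zeros(n)
for k in range(1, 4):                  # primary + first two surface multiples
    d[k * t_w] = (-1) ** (k - 1) * R ** k

# Free-surface demultiple series, 1-D sketch: primaries = d + d*d + d*d*d + ...
# Each term is a self-convolution of the recorded data itself.
term2 = np.convolve(d, d)[:n]
term3 = np.convolve(term2, d)[:n]
primaries_est = d + term2 + term3

# The multiples at 2*t_w and 3*t_w cancel; the primary at t_w is untouched.
# (Residuals beyond 3*t_w are artifacts of truncating the input to 3 events.)
```

The real algorithms work prestack in multiple dimensions and require the source signature and near-source traces, as the abstract notes, but the key property survives in this sketch: the data predict their own multiples.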
This paper presents an overview and a detailed description of the key logic steps and mathematical-physics framework behind the development of practical algorithms for seismic exploration derived from the inverse scattering series. There are both significant symmetries and critical subtle differences between the forward scattering series construction and the inverse scattering series processing of seismic events. These similarities and differences help explain the efficiency and effectiveness of different inversion objectives. The inverse series performs all of the tasks associated with inversion using the entire wavefield recorded on the measurement surface as input. However, certain terms in the series act as though only one specific task, and no other task, existed. When isolated, these terms constitute a task-specific subseries. We present both the rationale for seeking and methods of identifying uncoupled task-specific subseries that accomplish: (1) free-surface multiple removal; (2) internal multiple attenuation; (3) imaging primaries at depth; and (4) inverting for earth material properties. A combination of forward series analogues and physical intuition is employed to locate those subseries. We show that the sum of the four task-specific subseries does not correspond to the original inverse series since terms with coupled tasks are never considered or computed. Isolated tasks are accomplished sequentially and, after each is achieved, the problem is restarted as though that isolated task had never existed. This strategy avoids choosing portions of the series, at any stage, that correspond to a combination of tasks, i.e., coupled tasks.
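In the standard notation of the inverse scattering series (our rendering of the textbook expansion; the paper's own equations may differ in detail), the perturbation V is expanded in orders of the measured data D, and like orders are equated on the measurement surface (subscript m), with G_0 the reference Green's function:

```latex
% V = V_1 + V_2 + V_3 + \dots, with V_n of n-th order in the data D
\begin{align}
D &= \left( G_0 V_1 G_0 \right)_m \\
0 &= \left( G_0 V_2 G_0 \right)_m + \left( G_0 V_1 G_0 V_1 G_0 \right)_m \\
0 &= \left( G_0 V_3 G_0 \right)_m + \left( G_0 V_1 G_0 V_2 G_0 \right)_m
    + \left( G_0 V_2 G_0 V_1 G_0 \right)_m
    + \left( G_0 V_1 G_0 V_1 G_0 V_1 G_0 \right)_m
\end{align}
```

Each V_n is computed from D and G_0 alone, which is what allows pieces of these terms to be grouped into the task-specific subseries without any subsurface model.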
PatternLab for proteomics is an integrated computational environment that unifies several previously published modules for analyzing shotgun proteomic data. PatternLab contains modules for formatting sequence databases, performing peptide spectrum matching, statistically filtering and organizing shotgun proteomic data, extracting quantitative information from label-free and chemically labeled data, performing statistics for differential proteomics, displaying results in a variety of graphical formats, performing similarity-driven studies with de novo sequencing data, analyzing time-course experiments, and helping with the understanding of the biological significance of data in the light of the Gene Ontology. Here we describe PatternLab for proteomics 4.0, which closely knits together all of these modules in a self-contained environment, covering the principal aspects of proteomic data analysis as a freely available and easily installable software package. All updates to PatternLab, as well as all new features added to it, have been tested over the years on millions of mass spectra.
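Peptide-spectrum matching, one of the modules listed above, can be sketched in a few lines: generate the theoretical singly charged b- and y-fragment ions of a candidate peptide and count matches against observed peaks within a mass tolerance. This is purely illustrative of the concept; it is not PatternLab's actual search engine or scoring function, the residue-mass table is abbreviated, and the peak list is made up.

```python
# Monoisotopic residue masses in daltons (abbreviated table).
MONO = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "T": 101.04768, "L": 113.08406, "I": 113.08406,
    "N": 114.04293, "D": 115.02694, "K": 128.09496, "E": 129.04259,
}
PROTON, WATER = 1.007276, 18.010565

def fragment_mz(peptide):
    """Singly charged b- and y-ion m/z values for an unmodified peptide."""
    masses = [MONO[aa] for aa in peptide]
    b = [sum(masses[:i]) + PROTON for i in range(1, len(masses))]
    y = [sum(masses[-i:]) + WATER + PROTON for i in range(1, len(masses))]
    return b + y

def count_matches(theoretical, observed, tol=0.02):
    """Number of theoretical ions matched by any observed peak within tol Da."""
    return sum(any(abs(t - o) <= tol for o in observed) for t in theoretical)

ions = fragment_mz("PEPTIDE")
spectrum = [98.06, 227.10, 148.06, 263.09, 376.17]   # invented peak list
score = count_matches(ions, spectrum)
```

Real engines score far more carefully (peak intensities, charge states, modifications, decoy-based statistics, which is what PatternLab's filtering modules address), but the matched-ion count is the kernel of the idea.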
scite is a Brooklyn-based startup that helps researchers better discover and understand research articles through Smart Citations, citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.