The search for rocky exoplanets plays an important role in our quest for extra-terrestrial life. Here, we discuss the extreme physical properties possible for the first characterized rocky super-Earth, CoRoT-7b (R_pl = 1.58 \pm 0.10 R_Earth, M_pl = 6.9 \pm 1.2 M_Earth). It is extremely close to its star (a = 0.0171 AU = 4.48 R_st), with its spin and orbital rotation likely synchronized. The comparison of its location in the (M_pl, R_pl) plane with the predictions of planetary models for different compositions points to an Earth-like composition, even if the error bars of the measured quantities and the partial degeneracy of the models prevent a definitive conclusion. The proximity to its star provides an additional constraint on the models. It implies a high extreme-UV flux and particle wind, and the corresponding efficient erosion of the planetary atmosphere, especially for volatile species including water. Consequently, we make the working hypothesis that the planet is rocky with no volatiles in its atmosphere, and derive the physical properties that result. As a consequence, the atmosphere is made of rocky vapours with a very low pressure (P \leq 1.5 Pa), no cloud can be sustained, and no thermalisation of the planetary atmosphere is expected. The dayside is very hot (2474 \pm 71 K at the sub-stellar point) while the nightside is very cold (50 to 75 K). The sub-stellar point is as hot as the tungsten filament of an incandescent bulb, resulting in the melting and distillation of silicate rocks and the formation of a lava ocean. These possible features of CoRoT-7b could be common to many small and hot planets, including the recently discovered Kepler-10b. They define a new class of objects that we propose to name "Lava-ocean planets".
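As a back-of-the-envelope check (our own illustration, not part of the abstract; the stellar effective temperature T_eff ≈ 5275 K for CoRoT-7 and the Earth density ρ_⊕ ≈ 5.5 g cm⁻³ are assumed literature values), the quoted substellar temperature and an Earth-like bulk density follow directly from the measured quantities:

```latex
% Substellar equilibrium temperature of a tidally locked, zero-albedo
% planet with no heat redistribution:
T_{\mathrm{sub}} = T_{\mathrm{eff}}\sqrt{\frac{R_{\mathrm{st}}}{a}}
  \approx 5275\,\mathrm{K}\times\frac{1}{\sqrt{4.48}}
  \approx 2.5\times10^{3}\,\mathrm{K},
% consistent with the quoted 2474 \pm 71 K. Mean density from the
% measured mass and radius:
\bar{\rho} = \bar{\rho}_{\oplus}\,
  \frac{M_{\mathrm{pl}}/M_{\oplus}}{(R_{\mathrm{pl}}/R_{\oplus})^{3}}
  \approx 5.5\times\frac{6.9}{1.58^{3}}
  \approx 9.6\ \mathrm{g\,cm^{-3}}.
```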
We provide two approaches for explaining inconsistency in multi-context systems, where decentralized and heterogeneous system parts interact via nonmonotonic bridge rules. Inconsistencies arise easily in such scenarios, and nonmonotonicity calls for specific methods of inconsistency analysis. Both our approaches characterize inconsistency in terms of the involved bridge rules: either by pointing out rules which need to be altered for restoring consistency, or by finding combinations of rules which cause inconsistency. We show duality and modularity properties, give precise complexity characterizations, and provide algorithms for computation using HEX-programs. Our results form a basis for inconsistency management in heterogeneous knowledge integration systems.
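As a drastically simplified illustration of these two views of inconsistency (our sketch, not the paper's algorithm; the bridge rules, the consistency check, and all names below are invented, and real MCS semantics with equilibria and conditional bridge rules is far richer), one can enumerate minimal rule sets whose removal restores consistency, and minimal rule combinations that already cause inconsistency:

```python
from itertools import combinations

# Toy stand-in for a multi-context system: bridge rules are plain labels
# and `consistent` decides whether the system with a given set of active
# bridge rules has an acceptable state. Everything here is invented.
BRIDGE_RULES = {"r1", "r2", "r3"}

def consistent(active):
    # Hypothetical behaviour: r1 and r2 together make some context
    # contradictory; r3 is harmless.
    return not {"r1", "r2"} <= set(active)

def minimal_removals(rules):
    """Subset-minimal rule sets whose removal restores consistency."""
    found = []
    for k in range(len(rules) + 1):
        for removed in combinations(sorted(rules), k):
            if consistent(rules - set(removed)) and \
                    not any(set(d) <= set(removed) for d in found):
                found.append(removed)
    return found

def minimal_culprits(rules):
    """Subset-minimal rule combinations that alone cause inconsistency."""
    found = []
    for k in range(len(rules) + 1):
        for active in combinations(sorted(rules), k):
            if not consistent(set(active)) and \
                    not any(set(c) <= set(active) for c in found):
                found.append(active)
    return found

print(minimal_removals(BRIDGE_RULES))  # [('r1',), ('r2',)]
print(minimal_culprits(BRIDGE_RULES))  # [('r1', 'r2')]
```

The subset-minimality pruning works because candidates are enumerated by increasing size; the first query mirrors the repair-oriented characterization, the second the cause-oriented one.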
A complete census of planetary systems around a volume-limited sample of solar-type stars (FGK dwarfs) in the Solar neighborhood (d ≤ 15 pc), with uniform sensitivity down to Earth-mass planets within their Habitable Zones out to several AUs, would be a major milestone in extrasolar planet astrophysics. This fundamental goal can be achieved with a mission concept such as NEAT, the Nearby Earth Astrometric Telescope. NEAT is designed to carry out space-borne extremely-high-precision astrometric measurements at the 0.05 µas (1σ) accuracy level, sufficient to detect dynamical effects due to orbiting planets of mass even lower than Earth's around the nearest stars. Such a survey mission would provide the actual planetary masses and the full orbital geometry for all the components of the detected planetary systems down to the Earth-mass limit. The NEAT performance limits can be achieved by carrying out differential astrometry between the targets and a set of suitable reference stars in the field. The NEAT instrument design consists of an off-axis parabola single-mirror telescope (D = 1 m), a detector with a large field of view located 40 m away from the telescope and made of 8 small movable CCDs located around a fixed central CCD, and an interferometric calibration system monitoring dynamical Young's fringes originating from metrology fibers located at the primary mirror. The mission profile is driven by the fact that the two main modules of the payload, the telescope and the focal plane, must be located 40 m apart, leading to the choice of a formation-flying option as the reference mission and of a deployable-boom option as an alternative choice. The proposed mission architecture relies on the use of two satellites, of about 700 kg each, operating at L2 for 5 years, flying in formation and offering a capability of more than 20,000 reconfigurations. The two satellites will be launched in a stacked configuration using a Soyuz ST launch vehicle. The NEAT primary science program will encompass an astrometric survey of our 200 closest F-, G-, and K-type stellar neighbors, with an average of 50 visits each distributed over the nominal mission duration. The main survey operation will use approximately 70% of the mission lifetime. The remaining 30% of NEAT observing time might be allocated, for example, to improve the characterization of the architecture of selected planetary systems around nearby targets of specific interest (low-mass stars, young stars, etc.) discovered by Gaia, ground-based high-precision radial-velocity surveys, and other programs. With its exquisite astrometric precision, NEAT holds the promise to provide the first thorough census of Earth-mass planets around stars in the immediate vicinity of our Sun. (The full list of members of the NEAT proposal is available at http://neat.obs.ujf-grenoble.fr.)
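For scale (our own worked example, not a figure from the proposal), the astrometric signature α of a planet is the angular size of the star's reflex orbit:

```latex
% Reflex astrometric signature of a planet of mass M_pl at semi-major
% axis a around a star of mass M_*, observed from distance d:
\alpha = \frac{M_{\mathrm{pl}}}{M_{*}}\,
         \frac{a/\mathrm{AU}}{d/\mathrm{pc}}\ \mathrm{arcsec}.
% Assumed example: an Earth-Sun analogue (M_pl/M_* ~ 3x10^-6, a = 1 AU)
% at d = 10 pc yields
\alpha \approx 3\times10^{-6}\times\tfrac{1}{10}\,\mathrm{arcsec}
       \approx 0.3\,\mu\mathrm{as},
% about six times the quoted 0.05 uas single-measurement accuracy.
```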
HEX-programs extend logic programs under the answer set semantics with external computations through external atoms. As reasoning from ground Horn programs with nonmonotonic external atoms of polynomial complexity is already on the second level of the polynomial hierarchy, minimality checking of answer set candidates needs special attention. To this end, we present an approach based on unfounded sets as a generalization of related techniques for ASP programs. The unfounded set detection is expressed as a propositional SAT problem, for which we provide two different encodings along with optimizations of them. We then integrate our approach into a previously developed evaluation framework for HEX-programs, which is enriched by additional learning techniques that aim at avoiding the reconstruction of the same or related unfounded sets. Furthermore, we provide a syntactic criterion that allows one to skip the minimality check in many cases. An experimental evaluation shows that the new approach significantly decreases runtime.
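To make the unfounded-set idea concrete (a minimal sketch assuming ordinary rules only, with no external atoms; the program and interpretation are invented), the following brute-force check enumerates unfounded sets instead of encoding the test into SAT as the paper does. A candidate interpretation that admits a nonempty unfounded subset of itself is not a minimal model and hence not an answer set:

```python
from itertools import combinations

# Rules are (head, positive_body, negated_body) triples; the toy ground
# program and candidate interpretation I are invented for illustration.
rules = [
    ("a", {"b"}, set()),   # a :- b.
    ("b", {"a"}, set()),   # b :- a.
    ("c", set(), {"d"}),   # c :- not d.
]
I = {"a", "b", "c"}        # classically a model, but not minimal

def is_unfounded(U, I):
    """Every rule with head in U is blocked, so U has no outside support."""
    for head, pos, neg in rules:
        if head not in U:
            continue
        blocked = (not pos <= I     # some positive body atom false in I
                   or neg & I       # some negated body atom true in I
                   or pos & U)      # the body relies on U itself
        if not blocked:
            return False
    return True

def unfounded_sets(I):
    for k in range(1, len(I) + 1):
        for U in map(set, combinations(sorted(I), k)):
            if is_unfounded(U, I):
                yield U

print(list(unfounded_sets(I)))  # [{'a', 'b'}] -> I is not an answer set
```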
Pathfinding for a single agent is the problem of planning a route from an initial location to a goal location in an environment, going around obstacles. Pathfinding for multiple agents also aims to plan such routes for each agent, subject to different constraints, such as restrictions on the length of each path or on the total length of paths, no self-intersecting paths, no intersection of paths/plans, and no crossing or meeting of agents. It also has variations aimed at finding optimal solutions, e.g., with respect to the maximum path length or the sum of plan lengths. These problems are important for many real-life applications, such as motion planning, vehicle routing, environmental monitoring, patrolling, and computer games. Motivated by such applications, we introduce a formal framework that is general enough to address all these problems: we use the expressive high-level representation formalism and efficient solvers of the declarative programming paradigm Answer Set Programming. We also introduce heuristics to improve the computational efficiency and/or solution quality. We show the applicability and usefulness of our framework by experiments, with randomly generated problem instances on a grid, on a real-world road network, and on a real computer game terrain.
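For readers unfamiliar with the base problem, here is a plain imperative toy (our illustration; the framework above instead encodes such problems declaratively in ASP, where the multi-agent constraints become additional rules): breadth-first search for a single agent on a grid with obstacles:

```python
from collections import deque

# Toy single-agent grid pathfinder; grid size, obstacles, and endpoints
# are invented. BFS returns a shortest route as a list of cells.
def shortest_path(width, height, obstacles, start, goal):
    frontier = deque([start])
    parent = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:                       # reconstruct the route
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < width and 0 <= nxt[1] < height
                    and nxt not in obstacles and nxt not in parent):
                parent[nxt] = cell
                frontier.append(nxt)
    return None                                # goal unreachable

print(shortest_path(4, 4, {(1, 1), (1, 2)}, (0, 0), (3, 3)))
```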
Answer Set Programming (ASP) is a well-established declarative paradigm. One of the successes of ASP is the availability of efficient systems. State-of-the-art systems are based on the ground+solve approach. In some applications this approach is infeasible because the grounding of one or a few constraints is expensive. In this paper, we systematically compare alternative strategies for avoiding the instantiation of problematic constraints, based on custom extensions of the solver. Results on real and synthetic benchmarks highlight some strengths and weaknesses of the different strategies. (Under consideration for acceptance in TPLP, ICLP 2017 Special Issue.)
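The following schematic loop (our sketch; names and the toy instance are invented, and real systems implement this inside the solver's search rather than by restarting) conveys the common idea behind such strategies: produce candidates without grounding the expensive constraint, check it lazily on each candidate, and record violators as nogoods:

```python
from itertools import product

# Generate-and-test with a lazily checked constraint. `cheap_ok` plays the
# role of the grounded part of the program; `expensive_ok` is the
# constraint whose full instantiation we want to avoid.
def solve(domains, cheap_ok, expensive_ok):
    nogoods = set()
    while True:
        candidate = next(
            (c for c in product(*domains)
             if cheap_ok(c) and c not in nogoods), None)
        if candidate is None:
            return None                  # search space exhausted
        if expensive_ok(candidate):      # lazy check, only on candidates
            return candidate
        nogoods.add(candidate)           # forbid it and continue

# Toy instance: pick x, y in 0..9 with x + y == 9 (cheap) and the
# "expensive" requirement that x * y is a perfect square.
sol = solve([range(10)] * 2,
            lambda c: c[0] + c[1] == 9,
            lambda c: int((c[0] * c[1]) ** 0.5) ** 2 == c[0] * c[1])
print(sol)  # (0, 9)
```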
We provide a systematic analysis of levels of integration between discrete high-level reasoning and continuous low-level feasibility checks to address hybrid planning problems in robotic applications. We identify four distinct strategies for such an integration: (i) low-level checks are done for all possible cases in advance and the results are used during plan generation; (ii) low-level checks are done exactly when they are needed during the search for a plan; (iii) low-level checks are done after a plan is computed, and if the plan is found infeasible then a new plan is computed; (iv) similar to the previous strategy, but the results of previous low-level checks are used during computation of a new plan. We analyze the usefulness of these strategies and their combinations by experiments on hybrid planning problems in different robotic application domains, in terms of computational efficiency and plan quality (relative to feasibility).
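A schematic rendering of strategies (iii) and (iv) (our sketch; `make_plan` and `feasible` are hypothetical stand-ins for a task planner and a continuous motion-level check): plan, check feasibility, and replan, optionally feeding the causes of failure back to the planner:

```python
from itertools import permutations

# `make_plan` and `feasible` are hypothetical stand-ins for a high-level
# task planner and a continuous (e.g. motion-level) feasibility check.
def plan_with_replanning(make_plan, feasible, reuse_failures=True):
    failures = set()                     # remembered causes of infeasibility
    while True:
        plan = make_plan(failures if reuse_failures else set())
        if plan is None:
            return None                  # no plan left to try
        bad_action = feasible(plan)      # None means fully feasible
        if bad_action is None:
            return plan
        failures.add(bad_action)         # strategy (iv): feed failure back

# Toy instance: plans are pairs of abstract actions; actions recorded in
# `failures` are avoided by the planner, and "jump" is never feasible.
def make_plan(failures):
    for p in permutations(["walk", "jump", "push"], 2):
        if not set(p) & failures:
            return p
    return None

print(plan_with_replanning(
    make_plan, lambda p: "jump" if "jump" in p else None))  # ('walk', 'push')
```

With `reuse_failures=False` the loop mimics strategy (iii), which relies on the planner proposing a different plan on each attempt; strategy (iv) converges faster precisely because earlier failures prune the search.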