In this article we present an advanced version of Dual-PECCS, a cognitively inspired knowledge representation and reasoning system aimed at extending the capabilities of artificial systems in conceptual categorization tasks. It combines different sorts of common-sense categorization (prototype-based and exemplar-based categorization) with standard monotonic categorization procedures. These different types of inferential procedures are reconciled according to the tenets of the dual process theory of reasoning. From a representational perspective, the system relies on the hypothesis of conceptual structures represented as heterogeneous proxytypes. Dual-PECCS has been experimentally assessed in a conceptual categorization task in which a target concept, illustrated by a simple common-sense linguistic description, had to be identified by resorting to a mix of categorization strategies, and its output has been compared to human responses. The obtained results suggest that our approach can improve the representational and reasoning conceptual capabilities of standard cognitive artificial systems and, in addition, that it can plausibly be applied to different general computational models of cognition. The current version of the system extends our previous work in that Dual-PECCS is now integrated into, and tested with, two cognitive architectures, ACT-R and CLARION, which implement different assumptions about the underlying invariant structures governing human cognition. This integration allowed us to extend our previous evaluation.
In this work we present a knowledge-based system equipped with a hybrid, cognitively inspired architecture for the representation of conceptual information. The proposed system aims at extending the classical representational and reasoning capabilities of ontology-based frameworks towards the realm of prototype theory. It is based on a hybrid knowledge base that combines a classical symbolic component (grounded in a formal ontology) with a typicality-based one (grounded in the conceptual spaces framework). The resulting system attempts to reconcile the heterogeneous approach to concepts in Cognitive Science with the dual process theories of reasoning and rationality. The system has been experimentally assessed in a conceptual categorization task where common-sense linguistic descriptions were given as input and the corresponding target concepts had to be identified. The results show that the proposed solution substantially extends the representational and reasoning "conceptual" capabilities of standard ontology-based systems.
Tonal harmony analysis is a sophisticated task. It combines general knowledge with contextual cues, and it is concerned with faceted and evolving objects such as musical language, execution style, and taste. We present BREVE, a system for performing a particular kind of harmony analysis, chord recognition: music is encoded as a sequence of sounding events, and the system assigns the appropriate chord label to each event. The proposed solution relies on a conditional model in which domain knowledge is encoded in the form of Boolean features. BREVE exploits the recently proposed CarpeDiem algorithm to obtain significant computational gains in solving the optimization problem underlying the classification process. The implemented system has been validated on a corpus of chorales by J.S. Bach: we report and discuss the learned weights, point out the errors committed, and elaborate on the correlation between errors and increased classification times in passages where the harmony is less clearly asserted.
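The chord-recognition setting described above, sequence labeling with a conditional model over Boolean features, can be sketched as follows. This is a minimal illustration with invented features, labels, and weights, not BREVE's implementation (which relies on the CarpeDiem algorithm rather than plain Viterbi decoding):

```python
# Illustrative sketch (not BREVE's actual code): chord recognition as
# sequence labeling with a linear conditional model over Boolean
# features, decoded exactly with the Viterbi algorithm.

def viterbi(events, labels, features, weights):
    """Return the highest-scoring label sequence for `events`.

    features -- list of functions f(prev_label, label, event) -> bool
    weights  -- one float per feature
    """
    def score(prev, lab, ev):
        return sum(w for f, w in zip(features, weights) if f(prev, lab, ev))

    # Forward pass: trellis[i][lab] = (best score, backpointer label).
    trellis = [{lab: (score(None, lab, events[0]), None) for lab in labels}]
    for ev in events[1:]:
        prev_col = trellis[-1]
        col = {}
        for lab in labels:
            s, back = max((prev_col[p][0] + score(p, lab, ev), p)
                          for p in labels)
            col[lab] = (s, back)
        trellis.append(col)

    # Backward pass: follow backpointers from the best final label.
    path = []
    lab = max(trellis[-1], key=lambda l: trellis[-1][l][0])
    for col in reversed(trellis):
        path.append(lab)
        lab = col[lab][1]
    return list(reversed(path))

# Toy usage: events are sets of sounding pitch classes; the single
# (invented) Boolean feature fires when all of a chord's tones sound.
chord_tones = {"C": {"C", "E", "G"}, "G": {"G", "B", "D"}}
covers = lambda prev, lab, ev: chord_tones[lab] <= ev
print(viterbi([{"C", "E", "G"}, {"G", "B", "D"}],
              ["C", "G"], [covers], [1.0]))  # → ['C', 'G']
```

The point of the sketch is the shape of the optimization problem: with a long sequence and a large label set, the inner `max` dominates the cost, which is precisely where a faster decoding algorithm such as CarpeDiem pays off.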
Abstract. In this paper we introduce the TTCS system, so named after Terms To Conceptual Spaces, which exploits a resource-driven approach relying on BabelNet, NASARI, and ConceptNet. TTCS takes as input a term and its context of usage, and produces as output a specific type of vector-based semantic representation in which conceptual information is encoded through Conceptual Spaces (a geometric framework for common-sense knowledge representation and reasoning). The system has been evaluated through a twofold experimental assessment. In the first case we assessed the quality of the extracted common-sense conceptual information against human judgments collected through an online questionnaire. In the second one we compared the performance of a conceptual categorization system that was run twice, once fed with extracted annotations and once with hand-crafted annotations. In both cases the results are encouraging and provide valuable insights for substantial improvements.
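The kind of geometric representation that Conceptual Spaces provide can be illustrated with a toy sketch. All dimension names and values below are invented for illustration; this is not TTCS code, and real quality dimensions would be extracted from resources such as BabelNet, NASARI, and ConceptNet:

```python
# Toy illustration only: concepts as points in a conceptual space whose
# quality dimensions and values are invented, not taken from TTCS.
import math

def cs_distance(a, b):
    """Euclidean distance between two concept points, restricted to the
    quality dimensions both concepts define."""
    shared = a.keys() & b.keys()
    return math.sqrt(sum((a[d] - b[d]) ** 2 for d in shared))

# Hypothetical concepts over hypothetical dimensions in [0, 1].
cat   = {"size": 0.3, "ferocity": 0.4, "domesticity": 0.9}
lion  = {"size": 0.8, "ferocity": 0.9, "domesticity": 0.2}
tiger = {"size": 0.8, "ferocity": 0.9, "domesticity": 0.1}

# Geometric proximity stands in for conceptual similarity:
print(cs_distance(tiger, lion) < cs_distance(cat, lion))  # → True
```

The design choice this illustrates is the central one in the framework: once concepts are points (or regions) over quality dimensions, common-sense judgments such as typicality and similarity reduce to geometric computations.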
Abstract. Tonal harmony analysis is arguably one of the most sophisticated tasks that musicians deal with. It combines general knowledge with contextual cues, and it is intertwined with faceted and evolving objects such as musical language, execution style, or even taste. In the present work we introduce breve, a system for tonal analysis. breve automatically learns to analyse music using the recently developed framework of conditional models. The system is presented and assessed on a corpus of Western classical pieces from the 18th to the late 19th century repertoire. The results are discussed, and interesting issues in modeling this problem are highlighted.
In this paper we illustrate research based on NLP techniques aimed at automatically annotating modificatory provisions. We propose an approach that pairs deep syntactic parsing with rule-based shallow semantic analysis relying on a fine-grained taxonomy of modificatory provisions. The implemented system is evaluated on a large dataset hand-crafted by legal experts; the results are discussed and future directions of the research are outlined.
Lexical resources are fundamental to tackle many tasks that are central to present and prospective research in Text Mining, Information Retrieval, and Natural Language Processing. In this article we introduce COVER, a novel lexical resource, along with COVERAGE, the algorithm devised to build it. In order to describe concepts, COVER proposes a compact vectorial representation that combines the lexicographic precision characterizing BabelNet with the rich common-sense knowledge featured by ConceptNet. We propose COVER as a reliable and mature resource that has been employed in tasks as diverse as conceptual categorization, keyword extraction, and conceptual similarity. The experimental assessment focuses on the last task: we report and discuss the obtained results, pointing out future improvements. We conclude that COVER can be directly exploited to build applications, as well as coupled with existing resources.