In the context of complex granule computations within the Interactive Granular Computing (IGC) paradigm, we frame a cognitive task in which user perceptions of the suitability of a good are related to the parameters of the device producing it, all within a learning loop aimed at continuously improving those perceptions. We achieve this goal by extending the Fuzzy Inference System (FIS) paradigm to contexts where the variables capturing user perceptions live in a non-metric space, so that neither the users nor the learning algorithm have access to their true values. Namely, receiving as input a set of both crisp and fuzzy variables (from the hard_suit and the soft_suit of the c-granule, respectively, to account for user and device logs), the inference system is asked to compute via the link_suit a set of crisp parameters satisfying some fuzzy evaluations stated by the user. A further complication is that the outputs are evaluated exactly in terms of the true unknown values held by the fuzzy attributes, which in turn must be inferred by the system. The whole work arose from everyday-life problems faced by the European Project Social&Smart, with the aim of optimally regulating household appliance runs. It represents a special instance of Interactive Rough Granular Computing (IRGC), which we face with a two-phase procedure reminiscent of distal learning in neurocontrol. A web service is available where the reader may check the efficiency of the assessed procedure.

Keywords: Fuzzy inference systems · Interactive granular computing · Complex granules · Distal learning · Two-phase learning

This work has been supported by the European Project FP7 317947 Social&Smart.
Nowadays there is a high number of IoT applications that can seldom interact with each other, because they are developed within different vertical IoT platforms adopting different standards. Several efforts are devoted to the construction of cross-layered frameworks that facilitate interoperability among cross-domain IoT platforms for the development of horizontal applications. Although their realization poses different challenges across all layers of the network stack, in this paper we focus on the interoperability issues that arise at the data-management layer. Specifically, starting from a flexible multi-granular Spatio-Temporal-Thematic data model according to which events generated by different kinds of sensors can be represented, we propose a Semantic Virtualization approach whereby the sensors belonging to different IoT platforms, and the schemas of the event streams they produce, are described in a Domain Ontology obtained by extending the well-known Semantic Sensor Network ontology. These sensors can then be exploited for the creation of Data Acquisition Plans, by means of which the streams of events can be filtered, merged, and aggregated in a meaningful way. A notion of consistency is introduced to bind the output streams of the services contained in a Data Acquisition Plan to the Domain Ontology, in order to provide a semantic description of the plan's final output. When a plan meets the consistency constraints, the data it handles are well described at the ontological level, and the data acquisition process thus overcomes the interoperability barriers occurring in the original sources. The facilities of the StreamLoader prototype are finally presented, supporting the user in the Semantic Virtualization process and in the construction of meaningful Data Acquisition Plans.

INDEX TERMS Interoperability in IoT domain, ETL operations, semantic consistency.
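As a loose illustration of the idea (not the StreamLoader API; every name below is a hypothetical placeholder), a Data Acquisition Plan can be viewed as a composition of filter, merge, and aggregate operators over event streams, with a consistency check that the themes handled by the plan are described in the domain ontology:

```python
from collections import namedtuple
from itertools import chain

# Hypothetical event model: a stream is a list of timestamped,
# theme-annotated sensor readings.
Event = namedtuple("Event", ["sensor", "time", "theme", "value"])

def filter_stream(events, predicate):
    """Filter operator: keep only the events satisfying the predicate."""
    return [e for e in events if predicate(e)]

def merge_streams(*streams):
    """Merge operator: time-order the union of several event streams."""
    return sorted(chain(*streams), key=lambda e: e.time)

def aggregate(events, key, reducer):
    """Aggregate operator: group events by a key and reduce each group."""
    groups = {}
    for e in events:
        groups.setdefault(key(e), []).append(e)
    return {k: reducer(v) for k, v in groups.items()}

def consistent(plan_themes, ontology_themes):
    """Toy consistency check: every theme the plan handles must be
    described in the (hypothetical) domain ontology vocabulary."""
    return plan_themes <= ontology_themes
```

A plan then chains these operators, e.g. filtering a temperature stream, merging it with a humidity stream, and averaging per theme, accepting the result only if `consistent` holds for the involved themes.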
Abstract. The paper presents the studies carried out within the framework of two research agreements, stipulated with the respective Heritage Departments, for the census of abandoned listed assets owned by public bodies in the Provinces of Piacenza, Parma and Reggio Emilia. The purpose of this research is to enrich the cognitive framework of cultural assets in a state of abandonment and to transfer the results to a dedicated WebGIS platform, with the aim of identifying the assets most in need of a conservation and reuse intervention, in order to return them to their communities. As a result of the cataloguing carried out, it was then possible to draw some critical considerations and statistical reworkings from the collected data. The data analysis, performed through the design of a GIS database, reveals a higher frequency of abandonment for some specific building types and recurring locations. The research also highlights, once again, the importance of defining common ontologies, which are essential for statistical data processing and for interoperability between different existing databases.