Mobile applications are attracting considerable interest among researchers due to their proliferation and pervasiveness, especially in the context of digital libraries of educational institutions. However, their acceptance and usage remain low; hence, in-depth investigations are required to understand the factors behind the low acceptance of, and intention to use, mobile library applications (MLAs). Therefore, the aim of this work is to empirically explore the acceptance of MLAs with a proposed model derived from the technology acceptance model (TAM). The study aims to provide empirical support on the acceptance of MLAs. A self-administered, cross-sectional, survey-based study was conducted to gather data from 340 MLA users. Structural equation modeling (SEM) with the analysis of moment structures (AMOS) software was used to examine the quantitative data. Results revealed that perceived usefulness and perceived ease of use are direct, significant predictors of the intention to use an MLA, whereas system quality and habit are influencing factors toward the usage intention of an MLA. The findings serve as a guide for effective decisions in the design and development of MLAs. Further, the outcomes can be utilized in resource allocation to ensure the success of a library's vision and mission.
Identifying anomalous and malicious traffic in Internet of Things (IoT) networks is essential for IoT security, enabling the monitoring and blocking of unwanted traffic flows. For this purpose, numerous machine learning (ML) models have been presented by researchers to block malicious traffic flows in IoT networks. However, due to inappropriate feature selection, several ML models are prone to misclassifying mostly malicious traffic flows. A significant problem that still needs deeper study is how to select effective features for accurate malicious traffic detection in IoT networks. To address this problem, a new framework is proposed. First, a novel feature selection metric named CorrAUC is proposed; then, based on CorrAUC, a new feature selection algorithm is designed and developed using a wrapper technique to filter features accurately and select effective features for the chosen ML algorithm by using the AUC metric. Next, integrated TOPSIS and Shannon entropy based on a bijective soft set are applied to validate the selected features for malicious traffic identification in the IoT network. We evaluate the proposed approach using the Bot-IoT dataset and four different ML algorithms. Experimental results show that the proposed method is efficient and achieves >96% accuracy on average.
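The wrapper idea above can be sketched in a few lines: repeatedly train the chosen model on candidate feature subsets and keep whichever subset scores highest by AUC. This is a minimal sketch in the spirit of CorrAUC only; the paper's exact metric is not reproduced, and the greedy forward search, the sum-score "classifier", and the synthetic data are illustrative assumptions.

```python
import random

random.seed(0)

def auc(y_true, scores):
    # AUC as the Mann-Whitney statistic: P(a random positive outranks
    # a random negative), counting ties as half.
    pos = [s for s, y in zip(scores, y_true) if y == 1]
    neg = [s for s, y in zip(scores, y_true) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Synthetic traffic-like data: feature 0 is informative, 1 and 2 are noise.
y = [i % 2 for i in range(400)]
X = [[yi + random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)]
     for yi in y]

def auc_of(features):
    # "Wrapped model": score each sample by summing its selected features.
    return auc(y, [sum(row[f] for f in features) for row in X])

selected, remaining, best = [], [0, 1, 2], 0.5
while remaining:
    # Greedily add whichever feature most improves the wrapper's AUC.
    cand = {f: auc_of(selected + [f]) for f in remaining}
    f_best = max(cand, key=cand.get)
    if cand[f_best] <= best:
        break  # no remaining feature improves the AUC; stop
    best = cand[f_best]
    selected.append(f_best)
    remaining.remove(f_best)

print(selected, round(best, 3))
```

On this toy data the search keeps the informative feature and rejects the noise features, which is the behavior an AUC-driven wrapper is meant to exhibit.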
Edge computing provides a promising paradigm to support the implementation of the industrial Internet of Things (IIoT) by offloading computation-intensive tasks from resource-limited machine-type devices (MTDs) to powerful edge servers. However, the performance gain of edge computing may be severely compromised by limited spectrum resources, capacity-constrained batteries, and context unawareness. In this paper, we consider the optimization of channel selection, which is critical for efficient and reliable task delivery. We aim to maximize the long-term throughput subject to long-term constraints on energy budget and service reliability. We propose a learning-based channel selection framework with service reliability awareness, energy awareness, backlog awareness, and conflict awareness, leveraging the combined power of machine learning, Lyapunov optimization, and matching theory. We provide rigorous theoretical analysis and prove that the proposed framework can achieve guaranteed performance with a bounded deviation from the optimal performance under global state information (GSI), using only local and causal information. Finally, simulations are conducted under both single-MTD and multi-MTD scenarios to verify the effectiveness and reliability of the proposed framework.
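The Lyapunov part of such a framework typically reduces to a per-slot drift-plus-penalty rule: a virtual queue tracks cumulative over-use of the energy budget, and each slot the channel maximizing a throughput-minus-backlog score is picked. The sketch below shows only this standard construction, not the paper's full framework (no learning or matching); the channel statistics, budget, and trade-off weight V are assumed values.

```python
import random

random.seed(2)

CHANNELS = 3
E_BUDGET = 0.5   # allowed average energy per slot (assumption)
V = 10.0         # throughput-vs-constraint trade-off weight (assumption)

q = 0.0          # virtual queue: cumulative energy over-use so far
energies = []
T = 1000
for t in range(T):
    # Per-slot (rate, energy) offered by each channel; random for illustration.
    options = [(random.random(), random.uniform(0.2, 1.0))
               for _ in range(CHANNELS)]
    # Drift-plus-penalty rule: pick the channel maximizing V*rate - q*energy.
    rate, energy = max(options, key=lambda o: V * o[0] - q * o[1])
    q = max(q + energy - E_BUDGET, 0.0)  # virtual-queue update
    energies.append(energy)

avg_energy = sum(energies) / T
print(round(avg_energy, 3), round(q, 3))
```

Because the update satisfies q(t+1) >= q(t) + e(t) - B, the time-averaged energy is bounded by B + q(T)/T, so a bounded virtual queue certifies the long-term energy constraint.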
Recently, the Internet of Things (IoT) has been used in several fields such as smart cities, agriculture, weather forecasting, smart grids, and waste management. Even though IoT has huge potential in several applications, some areas remain open for improvement. In the current work, we concentrate on minimizing the energy consumption of sensors in the IoT network, which leads to an increase in network lifetime. In this work, to optimize energy consumption, the most appropriate Cluster Head (CH) is chosen in the IoT network. The proposed work makes use of a hybrid meta-heuristic algorithm, namely, the Whale Optimization Algorithm (WOA) combined with Simulated Annealing (SA). To select the optimal CH in the clusters of the IoT network, several performance metrics such as the number of alive nodes, load, temperature, residual energy, and a cost function have been used. The proposed approach is then compared with several state-of-the-art optimization algorithms, such as the Artificial Bee Colony (ABC) algorithm, the Genetic Algorithm (GA), the Adaptive Gravitational Search Algorithm (AGSA), and the Whale Optimization Algorithm (WOA). The results demonstrate the superiority of the proposed hybrid approach over existing approaches.
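The SA component of such a hybrid can be sketched compactly: starting from a candidate CH (e.g., one produced by the WOA phase), repeatedly propose an alternative and accept worse choices with a probability that decays as the temperature cools, escaping local optima. The cost weights and node fields below are illustrative assumptions, not the paper's exact formulation.

```python
import math
import random

random.seed(1)

# Hypothetical sensor nodes with residual energy and load in [0, 1).
nodes = [{"energy": random.random(), "load": random.random()}
         for _ in range(20)]

def cost(ch):
    # Prefer CHs with high residual energy and low load (assumed weights).
    n = nodes[ch]
    return 0.6 * (1.0 - n["energy"]) + 0.4 * n["load"]

def anneal(start, t0=1.0, cooling=0.95, steps=200):
    current = best = start
    t = t0
    for _ in range(steps):
        candidate = random.randrange(len(nodes))  # neighbouring CH choice
        delta = cost(candidate) - cost(current)
        # Always accept improvements; accept worse CHs with probability
        # exp(-delta/t), which shrinks as the temperature cools.
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = candidate
        if cost(current) < cost(best):
            best = current
        t *= cooling
    return best

ch = anneal(start=0)
print(ch, round(cost(ch), 3))
```

In the hybrid described by the abstract, this refinement step would be interleaved with WOA's encircling/spiral updates rather than run standalone.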
Today, internet and device ubiquity are paramount in individual, organizational, and societal considerations. Next-generation communication technologies, such as Blockchain (BC), the Internet of Things (IoT), and cloud computing, offer vast capabilities for different applications and scenarios, including industries, cities, and healthcare systems. Sustainable integration of healthcare nodes (i.e., devices, users, providers, etc.), resulting in the healthcare IoT (IoHT), provides a platform for efficient service delivery for the benefit of caregivers (doctors, nurses, etc.) and patients. Whereas confidentiality, accessibility, and reliability of medical data are accorded a high premium in IoHT, semantic gaps and the lack of appropriate assets or properties remain impediments to reliable information exchange in federated trust management frameworks. Consequently, we propose a Blockchain Decentralised Interoperable Trust (DIT) framework for IoT zones, where a smart contract guarantees authentication of budgets and an Indirect Trust Inference System (ITIS) reduces semantic gaps and enhances trustworthiness factor (TF) estimation via the network nodes and edges. Our DIT IoHT makes use of a private Ripple blockchain to establish trustworthy communication by validating nodes based on their interoperable structure, so that the controlled communication required to solve fusion and integration issues is facilitated across the different zones of the IoHT infrastructure. Further, implementations using the Ethereum and Ripple blockchains are introduced as frameworks to associate and aggregate requests over trusted zones.
Classification of imbalanced data has been a widely explored issue over the past two decades and retains its importance, because data are essential today and the problem becomes crucial when data are distributed across several classes. The term imbalance refers to an uneven distribution of data among classes, which severely affects the performance of traditional classifiers: classifiers become biased toward the class with the larger amount of data. The data generated by wireless sensor networks exhibit several such imbalances. This review article provides a thorough analysis of the imbalance issue for wireless sensor networks and other application domains, helping the community to understand the WHAT, WHY, and WHEN of imbalance in data and its remedies.
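The bias the review describes is easy to demonstrate: on a 95:5 class split (synthetic labels chosen here purely for illustration), a degenerate classifier that always predicts the majority class attains high accuracy while being useless on the minority class, which is why imbalance-aware metrics and remedies are needed.

```python
# Synthetic 95:5 imbalanced labels and a majority-class-only "classifier".
y_true = [0] * 95 + [1] * 5   # 95 majority samples, 5 minority samples
y_pred = [0] * 100            # biased model: always predicts the majority

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
# Recall on the minority class: fraction of true positives recovered.
minority_recall = sum(p == 1 for t, p in zip(y_true, y_pred) if t == 1) / 5

print(accuracy, minority_recall)  # 0.95 accuracy, 0.0 minority recall
```

Accuracy of 0.95 against zero minority recall is exactly the misleading picture that motivates resampling, cost-sensitive learning, and the other remedies the review surveys.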
Satellite communication systems are expected to play a vital role in realizing various remote Internet of Things (IoT) applications in the 6G vision. Due to the unique characteristics of the satellite environment, one of the main challenges in these systems is to accommodate massive random access (RA) requests from IoT devices while minimizing their energy consumption. In this paper, we focus on the reliable design and detection of the RA preamble to effectively enhance access efficiency in high-dynamic low-earth-orbit (LEO) scenarios. To avoid additional signaling overhead and detection processing, a long preamble sequence is constructed by concatenating conjugated and circularly shifted replicas of a single root Zadoff-Chu (ZC) sequence in the RA procedure. Moreover, we propose a novel impulse-like timing metric based on length-alterable differential cross-correlation (LDCC) that is immune to carrier frequency offset (CFO) and capable of mitigating the impact of noise on timing estimation. Statistical analysis of the proposed metric reveals that increasing the correlation length markedly improves the output signal-to-noise power ratio, and that the first-path detection threshold is independent of the noise statistics. Simulation results in different LEO scenarios validate the robustness of the proposed method against severe channel distortion and show that our method achieves significant performance gains in timing estimation accuracy, success probability of first access, and mean normalized access energy, compared with existing RA methods.
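The CFO immunity of a differential metric can be sketched directly: forming d[n] = r[n]·conj(r[n+L]) collapses a multiplicative CFO rotation into a single constant phase, which the metric's magnitude ignores. This is a minimal illustration only; the paper's actual LDCC metric is not reproduced, and the length N, lag L, and CFO value are assumed.

```python
import cmath
import math

def zadoff_chu(root, n_zc):
    # Root ZC sequence of odd length: z[n] = exp(-j*pi*u*n*(n+1)/N_zc).
    return [cmath.exp(-1j * math.pi * root * n * (n + 1) / n_zc)
            for n in range(n_zc)]

N, L, eps = 63, 4, 0.3          # length, differential lag, fractional CFO
z = zadoff_chu(1, N)
# Received copy rotated by the CFO: r[n] = z[n] * exp(j*2*pi*eps*n/N).
r = [zn * cmath.exp(1j * 2 * math.pi * eps * n / N) for n, zn in enumerate(z)]

def differential(seq, lag):
    # d[n] = seq[n] * conj(seq[n+lag]); any CFO term becomes a constant phase.
    return [seq[n] * seq[n + lag].conjugate() for n in range(len(seq) - lag)]

local = differential(z, L)
metric_clean = abs(sum(d * c.conjugate()
                       for d, c in zip(differential(z, L), local)))
metric_cfo = abs(sum(d * c.conjugate()
                     for d, c in zip(differential(r, L), local)))

# The two metrics coincide: the CFO only multiplies the correlation sum by a
# unit-magnitude phase factor, so the timing peak is unchanged.
print(round(metric_clean, 6), round(metric_cfo, 6))
```

Since ZC samples have unit modulus, the matched correlation peaks at exactly N - L terms of magnitude one, with or without CFO.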