This study presents a methodology for increasing the efficiency of dynamic process calculations in elastic elements of complex engineering structures. We studied the complex dynamic processes in a simple engineering construction: a mechanical system consisting of an elastic body and a continuous flow of a homogeneous medium. The developed methodology is based on the use of a priori information about some of the vibration forms, the construction of a "simplified" mathematical model of system dynamics, and the derivation of analytical relationships that describe the effect of the overall range of factors on the elastic vibrations of the system. The methodology applies to cases of complex vibrations of elastic bodies, and the obtained results can serve as a basis for choosing the main technological and operational parameters of elastic elements of mechanisms and machines that perform complex vibrations. The results of this work provide a basis for calculating the blast effect on elements of protective structures in order to increase their protective capacity by improving their method of attachment or by using additional reinforcement; for assessing buff load effects on elements of drilling strings; for analyzing the dynamic processes that occur during surface strengthening by work hardening in order to avoid resonance phenomena; and for designing technological processes of vibration displacement or vibration separation of granular media.
The COVID-19 lockdown caused a rapid transition to remote working and learning modes and created a need to develop and maintain e-commerce and web-education projects. However, the resulting increase in internet traffic has a direct impact on infrastructure and software performance. We study the problem of accurate and quick identification of web-project infrastructure issues, bottlenecks, and overloads. The research aims to achieve and ensure the reliability and availability of a commerce or educational web project by providing system observability and applying Site Reliability Engineering (SRE) methods. In this research, we propose methods for technical condition assessment that correlate a user-engagement score with Service Level Indicator (SLI)/Service Level Objective (SLO)/Service Level Agreement (SLA) measurements to identify user satisfaction types along with the infrastructure state. Our solution helps to improve content quality and, mainly, to detect abnormal system behavior and poor infrastructure conditions. A straightforward interpretation of potential performance bottlenecks and vulnerabilities is achieved with the contingency table and correlation matrix developed for that purpose. We identify big data, system logs, and metrics as the central sources for detecting performance issues during web-project usage. Through the analysis of an educational platform dataset, we found the main features of web-project content that have high user engagement and provide value to the services' customers. According to our study, correlating SLOs/SLAs with other critical metrics, such as user satisfaction or engagement, improves early indication of potential system issues and prevents users from facing them. These findings correspond to the concepts of SRE, which focus on maintaining high service availability.
Cohort analysis is a practical method for researching e-commerce customers, trends in their behavior, and their experience during the COVID-19 crisis. The purpose of the research is to validate the efficiency of this method on a dataset of e-commerce records and to identify the critical factors associated with customer awareness and loyalty levels. Cohort analysis, feature engineering, descriptive statistics, and exploratory data analysis are the main methods used to reach the study's purpose. The results showed that cohort analysis can answer various business questions and successfully solve real-world problems in e-commerce customer research. It can be extended to analyze user satisfaction with a platform's technical performance and used for infrastructure monitoring. The insights obtained on e-commerce customers' awareness and loyalty levels show the likelihood of a user making a purchase or interacting with the platform. Key e-business aspects are analyzed from the customer's point of view, augmenting the understanding of user experience in order to strengthen customer relationships in e-commerce.
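The core of the cohort method mentioned above can be sketched briefly: customers are grouped by the month of their first order, and each cohort's activity is tracked in later months. The sample records and field names are illustrative assumptions:

```python
# Hypothetical sketch of cohort analysis: group customers by the month of
# their first order and measure what share of each cohort returns later.
# The order records below are made-up sample data.

from collections import defaultdict

# (customer_id, order_month) pairs, month as "YYYY-MM"
orders = [
    ("c1", "2020-03"), ("c2", "2020-03"), ("c1", "2020-04"),
    ("c3", "2020-04"), ("c2", "2020-05"), ("c3", "2020-05"),
]

# Cohort = month of the customer's first order
first_month = {}
for cid, month in sorted(orders, key=lambda o: o[1]):
    first_month.setdefault(cid, month)

# Customers from each cohort active in each month
cohort_activity = defaultdict(set)
for cid, month in orders:
    cohort_activity[(first_month[cid], month)].add(cid)

# Total customers per cohort
cohort_size = defaultdict(set)
for cid, cohort in first_month.items():
    cohort_size[cohort].add(cid)

# Retention: share of a cohort that is active in a given month
for (cohort, month), active in sorted(cohort_activity.items()):
    retention = len(active) / len(cohort_size[cohort])
    print(f"cohort {cohort}, month {month}: retention {retention:.2f}")
```

The resulting retention table is the usual starting point for the loyalty and awareness questions the abstract raises: a cohort whose retention decays quickly signals weak loyalty.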
Fact-checking and journalists' professional standards are usually considered the best safeguards against manipulation in the media. However, we found that newsmakers are able to manipulate even the audience of so-called 'high-quality media' that practice all of the mentioned approaches. To prove this, we refined the concept of the 'pseudo-event' introduced by D.J. Boorstin by defining the term 'fake newsworthy event': an event created by newsmakers that is high-profile and attractive to the media, but whose only or primary aim is agenda-setting, and this aim is not obvious from the origin of the action. For example, a member of parliament may file a bill knowing that it cannot be adopted, merely to shape public opinion. Or a person may file a claim against a celebrity or businessman with no chance of winning at trial. Using the example of Ukrainian 'high-quality media', we showed that journalists usually do not take into account whether some topics are launched solely to manipulate the agenda. To prove this, we gathered data about publications focused on such topics in Ukrainian 'high-quality media', performed a discourse analysis of them, and compared the results with experts' evaluations of the 'media quality' and 'artificiality rate' of each topic. We found no correlation between the 'artificiality' of a topic and the number of publications. Recommendations were elaborated for media workers who want to avoid this type of manipulation.
Using Operational Intelligence together with mathematical modeling and Machine Learning to solve problems in industrial technology projects is crucial for today's IT (information technology) processes and operations, given the exponential growth of information and the growing trend toward Big Data-based projects. Monitoring and managing high-load data projects require new approaches to infrastructure, risk management, and data-driven decision support. Key difficulties that might arise when performing IT operations are high error rates, unplanned downtimes, and poor infrastructure KPIs and metrics. The methods used in the study include machine learning models, data preprocessing, missing data imputation, SRE (site reliability engineering) indicator computation, quantitative research, and a qualitative study of data project demands. A requirements analysis for the implementation of an Operational Intelligence solution with Machine Learning capabilities has been conducted and is presented in the study. A model based on machine learning algorithms for predicting transaction status codes and outputs is developed in order to execute system load testing, identify risks, and avoid downtimes. Metrics and indicators for determining infrastructure load are given in the paper to obtain Operational Intelligence and Site Reliability insights. It turned out that data mining across the set of Operational Big Data simplifies the task of understanding what is happening with requests within the data acquisition pipeline and helps identify errors before a user faces them. Transaction tracing in a distributed environment has been enhanced using machine learning and mathematical modeling. Additionally, a step-by-step algorithm for applying the application monitoring solution in a data-based project, especially one dealing with Big Data, is described and proposed within the study.
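The idea of predicting transaction status codes from request features can be sketched with a minimal classifier. The nearest-neighbour approach, the features (response time, payload size), and the sample data below are illustrative assumptions, not the authors' actual model:

```python
# Hypothetical sketch: predict a transaction's HTTP status code from simple
# request features with a k-nearest-neighbour vote, so risky transactions can
# be flagged during load testing. All data and features are assumptions.

import math

# Past transactions: (response_time_ms, payload_kb) -> observed status code
history = [
    ((120, 4), 200), ((150, 6), 200), ((110, 3), 200),
    ((900, 80), 500), ((1200, 95), 500), ((850, 70), 500),
]


def predict_status(features, k=3):
    """Return the majority status code among the k nearest past transactions."""
    nearest = sorted(history, key=lambda s: math.dist(s[0], features))[:k]
    codes = [code for _, code in nearest]
    return max(set(codes), key=codes.count)


# A slow, heavy transaction resembles past failures; a light one past successes
print(predict_status((1000, 85)))
print(predict_status((130, 5)))
```

In production such a model would be trained on traced transactions from the data acquisition pipeline, and its predicted failures would feed the risk-identification step before a real user encounters the error.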
In this article, we have explored methods for the strategic management of web projects. By introducing a long-term development strategy into the operation of a web project, mechanisms can be developed to improve the project's efficiency and effectiveness. An important factor is developing a strategy that takes into account all possible crisis situations and the ways out of them. The authors analyzed and simulated the web project structure, working out methods for realizing and implementing a web project strategy in a crisis situation. Additionally, the authors present a model of the strategic map of a web project's balanced scorecard. The authors tested the developed methods on six web projects of university departments. The results obtained confirmed the appropriateness and necessity of developing and implementing methods for the strategic management of web projects.
This article discusses the relevant task of analyzing user data in the process of managing various web projects. The results of this analysis will help improve the management of diverse web projects during crises. The authors explore the concept of data heterogeneity in web projects, classify web projects by function and purpose, and analyze the search and data display models used in web projects. The proposed algorithms for analyzing user data in the process of managing diverse web projects will improve the structuring and presentation of data on the web project platform. The complex of user data analysis models developed by the authors will simplify the process of managing various web projects during crises.