Reliable information about wildlife is essential for making informed management decisions. The effectiveness of control and monitoring of both large and small wild animals is relevant to the assessment and protection of the world's biodiversity. Monitoring has become one of the core methods in wildlife ecology for the observation, assessment, and forecasting of the environment. World practice reveals the potential of combining proven traditional approaches with modern technologies based on specialized equipment to organize environmental control and management. Monitoring large terrestrial animals requires an individual approach because of their low population density and large habitats; elk and moose are typical examples. This work aims to evaluate methods for monitoring large wild animals that are suitable for controlling elk/moose numbers within nature conservation activities. Using different models makes it possible to estimate population size without disturbing the animals and without significant financial costs. However, the accuracy of each model depends on how well its postulates are satisfied and on the initial conditions, which require statistical data. Depending on the geographical, climatic, and economic conditions of each territory, different tools and equipment can be used (e.g., camera traps, GPS sensors, and unmanned aerial vehicles); varying them flexibly allows researchers to strike a balance between their ambitions and their capabilities.
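The abstract refers to models that estimate population size without disturbing the animals. As one illustration of such a model (not the one evaluated in the paper), a capture-resight estimate like the Chapman-corrected Lincoln-Petersen index can be sketched; all figures below are hypothetical.

```python
def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen population estimate.

    n1: animals identified in the first survey (e.g., on camera traps),
    n2: animals identified in the second survey,
    m2: animals recognized in both surveys.
    """
    # Chapman's correction avoids division by zero and reduces
    # the small-sample bias of the plain n1 * n2 / m2 estimator.
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
```

For example, 50 animals photographed in a first camera-trap pass, 60 in a second, with 10 seen in both, give an estimate of roughly 282 animals.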
The structure of the bacterial film formed on the inner surface of a recirculation reactor tube is studied. The surface relief of the biofilm was visualized by scanning electron microscopy. The effect of an electrochemically activated water solution on films formed from planktonic lactobacteria or E. coli was studied. Treatment with the electrochemically activated water solution destroys both the cells and the polymeric matrix of the biofilm.
Network planning, or network analysis, is a class of applied project-management methods that provides planning, analysis of deadlines (both early and late), and assessment of the risk of failure of the project or its individual parts. These methods make it possible to link the performance of different works and processes over time, to draw up an operational schedule of the project, and to forecast its total duration. In the modern practice of designing, building, and managing seaports, network planning is the most sought-after toolkit for decision makers. Network planning methods are conventionally divided into deterministic (Gantt charts, rigid or with additional time lags; the critical path method; etc.) and probabilistic. The latter, in turn, are divided into non-alternative (the method of statistical trials, i.e., the Monte Carlo method, and the program evaluation and review technique, PERT) and alternative (the graphical evaluation and review technique, GERT). In many applications, the underlying operation of the method used is finding a path on a graph. The multiple repetition of experiments characteristic of the most effective probabilistic methods places high demands on reducing the computational cost of the algorithms used. In addition, the different nature of the cause-and-effect relations between objects of network models leads to graph structures that rule out most known algorithms. This paper describes a matrix algorithm for finding paths on weighted directed graphs that has low computational cost, is simple and clear, and admits different types of cause-and-effect relations between events. The proposed algorithm is effective for the tasks set, and its implementation is almost indistinguishable from the pseudocode used to describe it. This makes the algorithm easy to implement, easy to debug and verify, and easy to embed in various network planning applications.
One of these tasks is to find critical paths given the time parameters of all the works (operations) connecting the event vertices.
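The paper's own matrix algorithm is not reproduced in the abstract; as a reference point, the classical critical-path computation that such algorithms generalize can be sketched as a longest-path search over an acyclic activity graph (task names and durations below are hypothetical):

```python
def critical_path(tasks):
    """tasks: dict mapping task name -> (duration, [predecessor names]).
    Returns (project length, list of tasks on the critical path).
    Assumes the precedence graph is acyclic."""
    ef = {}          # earliest finish time per task
    best_pred = {}   # predecessor realizing the earliest finish

    def finish(t):
        if t in ef:
            return ef[t]
        duration, preds = tasks[t]
        start, bp = 0, None
        for p in preds:
            f = finish(p)
            if f > start:
                start, bp = f, p
        ef[t] = start + duration
        best_pred[t] = bp
        return ef[t]

    # the project ends with the task that finishes last
    end = max(tasks, key=finish)
    # backtrack along the binding predecessors to recover the critical path
    path, t = [], end
    while t is not None:
        path.append(t)
        t = best_pred[t]
    return ef[end], path[::-1]
```

A four-task example with A preceding B and C, both preceding D, yields the critical path A -> C -> D when C is the longer branch.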
A reliable estimation of a seaport's cargo-handling throughput remains a pressing problem, since many different tasks stem from this decision. The preliminary technological design of a future port, project studies of its reconstruction and modernization, changes of its functional profile, variation of ship sizes and call patterns (liner and tramp, i.e., scheduled and random calls, respectively), the reasonability of calls, and scheduled maintenance and random failures of the berth cargo-handling equipment all form a wide range of consumers for this forecast information, at both the strategic and the operational level of decision-making. Obviously, this level, as well as the time and labor resources allocated to the task, sets different features of the procedure, while the demands on the principal quality of the resulting decision remain nearly the same and reflect a certain level of uncertainty due to the stochastic nature of the values involved. As the conducted study shows, there are alternative ways to obtain a relatively reliable forecast: development of unique simulation models, which requires essential labor and time inputs and bears a narrow, local character, or usage of simplified analogues that are nearly as general as analytical techniques. The paper presents a new approach to the port project and design procedure based on a stochastic model of the sea cargo front combined with elements of discrete-event simulation. The model refines the estimates produced by the analytical methods used to assess the number of berths, which constitutes the direct task of technological design. In the reverse design procedure, the model makes it possible to assess the cargo flow that could be handled by a given group of berths. The adequacy of the results is confirmed by simulation experiments.
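The analytical berth-number estimate that the stochastic model refines is not spelled out in the abstract; a common queueing-theory baseline treats the berth group as an M/M/c system and sizes it by the probability that an arriving ship must wait. A minimal sketch, with hypothetical traffic figures:

```python
from math import factorial

def erlang_c(c, load):
    """Probability that an arriving ship must wait in an M/M/c berth model.
    c: number of berths; load: offered traffic = arrival rate / service rate."""
    if load >= c:
        return 1.0  # unstable regime: the queue grows without bound
    idle_terms = sum(load**k / factorial(k) for k in range(c))
    wait_term = load**c / factorial(c) * (c / (c - load))
    return wait_term / (idle_terms + wait_term)

def berths_needed(load, max_wait_prob):
    """Smallest berth count keeping the waiting probability under the target."""
    c = max(1, int(load) + 1)  # start just above the stability threshold
    while erlang_c(c, load) > max_wait_prob:
        c += 1
    return c
```

For instance, an offered load of 2.0 berth-equivalents with a 20% tolerable waiting probability calls for 4 berths under these assumptions; a discrete-event model would then refine this figure for non-Poisson call patterns.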
At the design stages of any seaport development project, one of the key tasks is to estimate the volume of cargo to be stored in the port warehouse. A shortage of warehouse facilities would disrupt port operations and damage the port's marketing position, while surplus capacity would raise the cost of the services rendered by the port. Many port development projects and long years of operational practice have resulted in certain commonly accepted mathematical techniques that make it possible to assess all the main parameters sufficiently accurately. With ever-growing competition between ports worldwide, finding a delicate balance between cost and quality has become a core task behind nearly every aspect of port design. The tools that have been used for centuries in port design and development have started to lose their adequacy in the modern economic and logistic environment. In response to this challenge, port designers are increasingly moving to simulation models. At the same time, an adequate simulation model needs not only accurate and reliable data but also considerable time to build. Moreover, models of this kind are usually created ad hoc, reflecting particular features of the object under development and forfeiting the generality and universality of analytical models. At the early stages of port development, one needs simple and easy tools for the preliminary assessment of project parameters, since there are usually several variants and full-scale simulation of all of them is out of the question. Still, these tools should be more sophisticated than common analytical formulae. The main drawback of formula-based calculation (streaming computing, in current IT terminology) is that it deals in principle with deterministic values, while the real world is inhabited by stochastic ones. The study presented here is an attempt to narrow this gap.
The area selected to demonstrate the approach is port warehouse size, regardless of the cargo type handled. At the same time, the technique can be extended to many other port project parameters that need to be assessed. (TransNav, the International Journal on Marine Navigation and Safety of Sea Transportation, Vol. 14, No. 4; http://www.transnav.eu)
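In the spirit of replacing deterministic formulae with stochastic estimates, warehouse capacity can be sized by a Monte Carlo run over random daily inflows and dwell times, reading off a high occupancy quantile instead of a single average. This is a sketch under assumed distributions (Gaussian inflow, exponential dwell), not the authors' model, and all parameters are hypothetical:

```python
import random

def warehouse_size(mean_daily_tons, cv, mean_dwell_days,
                   days=1000, quantile=0.95, seed=42):
    """Monte Carlo sketch of required warehouse capacity.
    cv: coefficient of variation of the daily inflow."""
    rng = random.Random(seed)
    leaving = {}      # day -> tonnage scheduled to leave that day
    stock = 0.0
    occupancy = []
    for day in range(days):
        stock -= leaving.pop(day, 0.0)
        # stochastic daily inflow, clipped at zero
        tons = max(0.0, rng.gauss(mean_daily_tons, cv * mean_daily_tons))
        # stochastic dwell time, at least one day
        dwell = max(1, round(rng.expovariate(1.0 / mean_dwell_days)))
        stock += tons
        leaving[day + dwell] = leaving.get(day + dwell, 0.0) + tons
        occupancy.append(stock)
    occupancy.sort()
    return occupancy[int(quantile * (len(occupancy) - 1))]
```

By Little's law the deterministic average stock is mean_daily_tons times mean_dwell_days; the quantile read from the simulated occupancy distribution adds a stochastic safety margin on top of that average, which is exactly the information a formula-only calculation discards.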
Making decisions based on intuitive analytical methods is becoming a dangerous practice in modern conditions of competition and the high capital intensity of cargo terminal operation. Such methods allow, at best, evaluation of the average performance indicators of cargo terminals, while terminals' market stability increasingly depends on the pattern of distribution around those averages. This research proposes a method of cargo terminal performance analysis based on simulation modelling. The importance of focusing on the cargo flows through the terminal, instead of modelling only the operational processes of a certain cargo terminal, is emphasized. The paper describes an approach to creating such models of distribution around the averages. The proposed model structure targets a wide range of "dry port" type container terminals. All possible traffic flows that require different capacities and technological resources for handling at the terminal have been analyzed. A standard description of the freight routes passing through the terminal as a simulation model keeps the labor intensity of planning experiments low, which makes it easy to change the cargo-handling flow chart of the terminal being analyzed. The efficiency of the simulation modelling method for calculating the technological parameters of dry cargo terminals has been confirmed in the course of implementing several large projects.
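The flow-oriented view described above can be illustrated with a minimal sketch (not the paper's model; all figures hypothetical): each cargo flow is described by its mean daily volume, its variability, and the handling moves it consumes, and the distribution of the resulting handling backlog, rather than its average alone, is examined.

```python
import random

def terminal_backlog(daily_flows, crane_moves_per_day, days=365, seed=7):
    """daily_flows: list of (mean TEU/day, coefficient of variation,
    handling moves per TEU) tuples, one per traffic flow through the terminal.
    Returns the mean and 95th percentile of the daily handling backlog."""
    rng = random.Random(seed)
    backlog, samples = 0.0, []
    for _ in range(days):
        # total moves demanded today across all flows, sampled stochastically
        demand = sum(max(0.0, rng.gauss(m, cv * m)) * moves
                     for m, cv, moves in daily_flows)
        # unserved moves carry over to the next day
        backlog = max(0.0, backlog + demand - crane_moves_per_day)
        samples.append(backlog)
    samples.sort()
    return {"mean": sum(samples) / len(samples),
            "p95": samples[int(0.95 * (len(samples) - 1))]}
```

Two flows averaging 250 moves per day against 300 moves of capacity leave only an occasional backlog, while cutting capacity to 200 moves makes the backlog grow without bound; an average-only calculation would miss how wide the distribution around these figures is.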
Simulation as a tool for the design of ports and terminals has emerged in answer to the demand for higher quality and reliability of project results. The very high cost of implementing project solutions and the practically total illiquidity of transport infrastructure objects have always entailed immense commercial risks in the terminal business. Lately these risks have multiplied significantly due to rapid changes in the global and regional markets for transport services. Today, many experts see this volatility as an indicator of the next phase in the development of the global trade system and the derivative cargo transportation system, specifically a state of temporary saturation. The shift of global goods volumes from quick, steady growth to relatively small fluctuations around constant values causes quick oscillations in the redistribution of demand over the oversized supply. This new business and economic environment has seriously affected the paradigm of transport terminal design and development techniques. The new operational environment of terminals requires designers to present their results not as points but as functions. Eventually this resulted in the development of the modern object-oriented model approach. The wide adoption of this approach attests to the objective demand for the discipline, while in many respects it remains in an intuitive (pre-paradigmatic) phase of development. The main reason lies in the problem definition itself, which is usually formulated as the simulation of a given terminal. In fact, the task is to assess the operational characteristics of a terminal engaged in processing a given combination of cargo flows. Consequently, it is not the terminal that should be simulated, but the processes of cargo-flow handling performed by the terminal under investigation. Another problem restricting the practical spread of simulation is model adequacy.
A model whose adequacy has not been proven has no epistemological value at all.