2019
DOI: 10.18048/2019.57.01.

Artificial neural network for predicting values of residuary resistance per unit weight of displacement

Abstract: This paper proposes the use of an Artificial neural network (ANN) to predict the values of the residuary resistance per unit weight of displacement from variables describing the ship’s dimensions. For this purpose, a Multilayer perceptron (MLP) regressor ANN is used, with the grid search technique applied to determine the appropriate properties of the model. After the model training, its quality is determined using the R2 value and a Bland-Altman (BA) graph, which shows a majority of values predicted fallin…
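The Bland-Altman agreement check mentioned in the abstract can be sketched as a minimal helper: it computes the mean difference (bias) between predicted and reference values and the 95% limits of agreement. The function name and example values below are illustrative, not taken from the paper.

```python
import numpy as np

def bland_altman_stats(y_true, y_pred):
    """Bland-Altman agreement between predicted and reference values:
    returns the mean difference (bias) and the 95% limits of agreement."""
    diff = np.asarray(y_pred, dtype=float) - np.asarray(y_true, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample standard deviation of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative values only, not the paper's data.
bias, lo, hi = bland_altman_stats([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.1, 3.9])
print(round(bias, 3))  # → 0.0
```

A prediction is considered to agree well with the reference when most differences fall between the two limits, which is what the BA graph in the paper visualizes.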

Cited by 10 publications (10 citation statements). References 48 publications.
“…A higher learning rate will cause the ANN to converge to a solution faster, as the weights are changed more quickly, but setting it too high can cause the ANN to diverge instead of converge, as it will skip over the weights that lead to convergence. Setting the learning rate too low will cause the neural network to converge extremely slowly, or, more problematically, not to converge at all, because the runtime (number of iterations) becomes too high for realistic applications [48]. α adjustment: the adjustment of the learning rate during training.…”
Section: Hyperparameter Adjustment
confidence: 99%
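The learning-rate trade-off described in the statement above can be demonstrated with a minimal gradient-descent loop on a simple quadratic objective. This is a plain-NumPy-free sketch; the objective, step counts, and learning rates are illustrative, not values from the cited work.

```python
def gradient_descent(lr, steps=50, w0=0.0):
    """Minimise f(w) = (w - 3)^2 with a fixed learning rate lr."""
    w = w0
    for _ in range(steps):
        grad = 2 * (w - 3)  # f'(w)
        w -= lr * grad
    return w

# A moderate learning rate converges to the minimum at w = 3.
print(abs(gradient_descent(lr=0.1) - 3) < 1e-3)   # True

# Too large a step overshoots the minimum on every iteration and diverges.
print(abs(gradient_descent(lr=1.1)) > 1e3)        # True

# Too small a step barely moves within the same iteration budget.
print(abs(gradient_descent(lr=1e-4) - 3) > 2.9)   # True
```

The same behaviour carries over to ANN training: the update `w -= lr * grad` is applied per weight, so an overly large `lr` makes the loss oscillate and grow rather than shrink.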
“…Hyperparameters that are adjusted and their possible values can be seen in Table 2. The hidden-layer configurations tested are: (16), (32), (50), (100), (16, 16), (32, 32), (16, 32, 16), (32, 32, 32), (16, 16, 16, 16), (16, 32, 32, 16), (32, 32, 32, 32), (32, 50, 50, 32), (100, 100, 100) and (100, 100, 100, 100). Hyperparameter choices, including the solver, learning rate and others, are usually randomized, but since this adds complexity by increasing the search space, the possible values are selected based on the authors’ previous experience with similar problems, in which similar hyperparameter choices and combinations provided good results [8,11,48].…”
Section: Hyperparameter Adjustment
confidence: 99%
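A grid search over hidden-layer configurations of the kind quoted above can be sketched with scikit-learn's `GridSearchCV` and `MLPRegressor`. The synthetic data, the reduced layer grid, and the `lbfgs` solver choice below are assumptions for a self-contained example, not the paper's actual setup.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for the yacht-hydrodynamics data: 6 hull-form
# features and one resistance-like target (the real dataset and the
# full grid of Table 2 are not reproduced here).
rng = np.random.default_rng(42)
X = rng.uniform(size=(200, 6))
y = X @ rng.uniform(size=6) + 0.01 * rng.standard_normal(200)

# A reduced subset of the quoted layer grid; solver and learning rate
# would be added to param_grid in the same way.
param_grid = {"hidden_layer_sizes": [(16,), (32,), (16, 16), (32, 32)]}

search = GridSearchCV(
    MLPRegressor(solver="lbfgs", max_iter=500, random_state=42),
    param_grid,
    cv=3,
    scoring="r2",  # the quality metric used in the paper
)
search.fit(X, y)
print(search.best_params_["hidden_layer_sizes"], round(search.best_score_, 3))
```

Each grid point is cross-validated and the configuration with the best mean R2 is kept, which is why enlarging the grid (more layer shapes, solvers, learning rates) multiplies the search cost.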
“…The final input variable is the Froude number, which defines the ratio of the flow inertia to the external field. The output, residuary resistance per unit weight of displacement, describes the amount of resistance the vessel experiences relative to its displacement expressed in unit weights [15].…”
Section: Dataset
confidence: 99%
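The Froude number mentioned in the statement above is the standard dimensionless ratio Fr = v / sqrt(g · L), where v is the vessel speed, g the gravitational acceleration, and L a characteristic length such as the waterline length. A minimal helper (the function name and example values are illustrative):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def froude_number(speed_ms: float, waterline_length_m: float) -> float:
    """Froude number: ratio of flow inertia to the external (gravity) field."""
    return speed_ms / math.sqrt(G * waterline_length_m)

# e.g. a hull with a 10 m waterline length moving at 5 m/s
fr = froude_number(5.0, 10.0)
print(round(fr, 3))  # → 0.505
```

Because Fr is dimensionless, it lets hulls of different sizes be compared on the same resistance curve, which is why it serves as an input feature alongside the dimensionless hull-form ratios.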
“…The longitudinal position of the center of buoyancy or uniform distributions, considering the small amount of data. The second reason is to allow a more direct comparison with previous research applying machine learning methods to the dataset used in this research, as that research did not use cross-validation but instead used the standard train-test data split [15,16], which is also used in the presented research.…”
Section: Residuary Resistance Per Unit Weight Of Displacement
confidence: 99%