2016
DOI: 10.1016/j.swevo.2016.05.002

Maintaining regularity and generalization in data using the minimum description length principle and genetic algorithm: Case of grammatical inference

Swarm and Evolutionary Computation, 31, pp. 11–23.

Cited by 21 publications (12 citation statements)
References 52 publications
“…Multiple simulations have been performed, and the average results are reported. Extensive control-parameter tuning is done with the Taguchi signal-to-noise ratio (SNR) method together with an orthogonal array, as in [58] [59] [60]. The Taguchi SNR is a log function of the desired output that serves as an objective function, as shown in eq.…”
Section: Simulation Settings
confidence: 99%
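The Taguchi SNR mentioned above has a standard closed form for maximization problems; as a minimal sketch (the function name and the larger-is-better variant are assumptions, since the quoted text does not say which Taguchi form was used):

```python
import math

def taguchi_snr_larger_is_better(values):
    """Taguchi signal-to-noise ratio, larger-is-better form:
    SNR = -10 * log10( mean(1 / y_i^2) ).
    Higher SNR means responses are both large and consistent."""
    n = len(values)
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in values) / n)
```

For example, identical responses of 10.0 give an SNR of 20 dB, while identical responses of 1.0 give 0 dB; each row of the orthogonal array would be scored this way and the parameter levels with the highest mean SNR selected.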
“…Identifying suitable parameter values is important, as it increases the probability of reaching a global solution. If these parameter values are not tuned correctly, the search suffers premature convergence, a situation in which diversity decreases over generations [19][20][21]. Exploration and exploitation are the keys to the success of any meta-heuristic search algorithm.…”
Section: Proposed Technique
confidence: 99%
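The diversity decrease that signals premature convergence can be tracked with a simple population statistic; a sketch, assuming mean pairwise Euclidean distance as the measure (the quoted text does not specify one):

```python
import itertools
import math

def population_diversity(population):
    """Mean pairwise Euclidean distance between individuals.
    A value falling toward zero across generations indicates the
    population is collapsing onto one region: premature convergence."""
    pairs = list(itertools.combinations(population, 2))
    if not pairs:
        return 0.0
    return sum(math.dist(a, b) for a, b in pairs) / len(pairs)
```

Logging this per generation makes it easy to see whether poorly tuned parameters are driving diversity down too early.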
“…Researchers often struggle to find suitable parameter values for metaheuristic search algorithms, and the same holds for HS. In our approach, to tune HMCR and HMS we incorporated an orthogonal-array-based approach and the Taguchi method, which determines the signal-to-noise ratio and helps find the right parameter combinations [19][20][21]. The PAR and BW parameters play an important role in the convergence of the algorithm to the optimal solution [14].…”
Section: Proposed Technique
confidence: 99%
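To show where HMCR, PAR, and BW act in Harmony Search, here is a minimal sketch of one improvisation step for a continuous problem (function names, bounds, and defaults are illustrative, not taken from the cited work):

```python
import random

def improvise(harmony_memory, hmcr=0.9, par=0.3, bw=0.05,
              lower=-1.0, upper=1.0):
    """One Harmony Search improvisation.
    Per dimension: with probability HMCR reuse a value from memory,
    then with probability PAR pitch-adjust it by at most BW;
    otherwise draw a fresh random value from the search range.
    HMS is simply len(harmony_memory)."""
    dim = len(harmony_memory[0])
    new = []
    for d in range(dim):
        if random.random() < hmcr:
            x = random.choice(harmony_memory)[d]   # memory consideration
            if random.random() < par:
                x += random.uniform(-bw, bw)       # pitch adjustment
        else:
            x = random.uniform(lower, upper)       # random selection
        new.append(min(max(x, lower), upper))      # clamp to bounds
    return new
```

The sketch makes the tuning problem concrete: HMCR and HMS control how much the search exploits memory, while PAR and BW govern the size of local refinements, which is why the quoted authors tune the first pair with Taguchi/orthogonal arrays and highlight the second pair for convergence.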
“…A comprehensive work on parameter calibration was presented in [12], though the authors suggest that better results can be achieved through Evolutionary Algorithms (EAs). GAs have been found very effective in several areas, including grammar inference [14] [18] and timetabling [19]. In this view, we propose a hybrid deep learning mechanism that uses the merits of GAs to enhance gradient descent in backpropagation learning.…”
Section: Introduction
confidence: 99%
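The GA-plus-gradient-descent hybrid described in that statement typically uses the GA for global exploration of the weight space and gradient steps for local refinement. A toy sketch under that assumption (the function names, operators, and toy loss are illustrative; this is not the cited paper's actual method):

```python
import random

def hybrid_ga_gd(loss, grad, dim=2, pop=20, gens=30, lr=0.1, steps=50):
    """Toy hybrid: a GA (truncation selection, averaging crossover,
    Gaussian mutation) explores weight vectors globally, then plain
    gradient descent refines the best individual found."""
    random.seed(0)
    population = [[random.uniform(-5, 5) for _ in range(dim)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=loss)
        parents = population[: pop // 2]          # keep the fitter half
        children = []
        for _ in range(pop - len(parents)):
            a, b = random.sample(parents, 2)
            children.append([(x + y) / 2 + random.gauss(0, 0.1)
                             for x, y in zip(a, b)])
        population = parents + children
    w = min(population, key=loss)
    for _ in range(steps):                        # gradient-descent refinement
        g = grad(w)
        w = [wi - lr * gi for wi, gi in zip(w, g)]
    return w
```

On a quadratic loss the GA lands near the basin of the minimum and the gradient phase finishes the job; in the cited setting the gradient phase would be backpropagation over a network's weights instead.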