2001
DOI: 10.1109/72.914519
STRIP - a strip-based neural-network growth algorithm for learning multiple-valued functions

Abstract: We consider the problem of synthesizing multiple-valued logic functions by neural networks. A genetic algorithm (GA) which finds the longest strip in V ⊆ K^n is described. A strip contains points located between two parallel hyperplanes. Repeated application of the GA partitions the space V into a certain number of strips, each of them corresponding to a hidden unit. We construct two neural networks based on these hidden units and show that they correctly compute the given but arbitrary multiple-valued…
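The abstract's central notion can be illustrated with a minimal sketch: a strip is the set of points whose projection onto a shared normal vector w falls between two thresholds t1 and t2 (i.e., between the parallel hyperplanes w·x = t1 and w·x = t2). This is not the paper's implementation; the names (w, t1, t2) and the toy data are hypothetical.

```python
# Sketch only: a "strip" is the set of points between two parallel
# hyperplanes w.x = t1 and w.x = t2 sharing the normal vector w.

def in_strip(x, w, t1, t2):
    """Return True if point x lies between the parallel hyperplanes
    w.x = t1 and w.x = t2 (inclusive)."""
    activation = sum(wi * xi for wi, xi in zip(w, x))
    lo, hi = min(t1, t2), max(t1, t2)
    return lo <= activation <= hi

# Partitioning idea from the abstract: repeatedly find a strip covering
# many points, assign them to one hidden unit, remove them, and repeat.
points = [(0, 0), (1, 1), (2, 2), (3, 0)]
w = (1.0, 1.0)  # shared normal vector of the two hyperplanes
strip = [p for p in points if in_strip(p, w, t1=1.5, t2=4.5)]
print(strip)  # points with 1.5 <= x + y <= 4.5
```

Each such strip maps naturally onto one hidden unit: the unit fires when the weighted sum of its inputs lands between the two thresholds.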


Cited by 24 publications (9 citation statements)
References 33 publications
“…If algorithms need acceleration, due to Big Data, possibilities are: (a) The existing computing architecture could be enhanced [24], (b) A new computing architecture could be introduced [25], (c) The number of iterations of the algorithm could be minimized using machine intelligence [26], or (d) Each iteration could be cut shorter if some kind of suboptimal computing is utilized [27].…”
Section: The Fourth Casementioning
confidence: 99%
“…Model selection depends on algorithm selection, hyperparameter optimization and data preprocessing, like reduction and integration, which are often heuristically found by the systematic grid, a random search, or by applying machine learning [42], and this paper addresses algorithm selection only. Using logical domain-specific meta-features would make it possible to efficiently provide results for a given task and budget by minimizing computational resources and amount of data required for model selection in AutoML frameworks, which would speed up the learning process and improve overall performance.…”
Section: B Problem Analysismentioning
confidence: 99%
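The excerpt above contrasts systematic grid search with random search for model selection. As a hedged, generic sketch (not from the cited work; all names and the toy objective are hypothetical), random search samples configurations from the hyperparameter space and keeps the best:

```python
# Minimal random-search sketch for model selection. Hypothetical names;
# `evaluate` stands in for any validation-score function.
import random

def random_search(evaluate, space, n_trials=20, seed=0):
    """Sample n_trials configurations from `space` (dict of name -> list
    of candidate values) and return the best-scoring one."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = {name: rng.choice(values) for name, values in space.items()}
        score = evaluate(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# Toy objective: prefers lr near 0.01 and depth near 3.
space = {"lr": [0.1, 0.01, 0.001], "depth": [1, 2, 3, 4]}
evaluate = lambda c: -abs(c["lr"] - 0.01) - abs(c["depth"] - 3)
best, score = random_search(evaluate, space, n_trials=50)
print(best)
```

Grid search would instead enumerate all combinations exhaustively; random search trades that guarantee for a fixed, tunable budget of trials.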
“…There are also other works, see Maniezzo (1994), Mitra and Hayashi (2000) and Pattichis and Schizas (1996), for their attempt in providing an exhaustive survey of neuro‐fuzzy rule generation algorithms. Moreover, Ngom et al (2001) considered the problem of synthesizing multiple‐valued logic functions by NNs, and the authors described a GA which finds the longest strip in. Most recently, Palaniappan et al (2002) applied NNs to classify alcoholics and non‐alcoholics by features extracted from visual evoked potential (VEP).…”
Section: Related Literature Reviewmentioning
confidence: 99%