2003
DOI: 10.1109/tnn.2003.810598

On the number of multilinear partitions and the computing capacity of multiple-valued multiple-threshold perceptrons

Abstract: We introduce the concept of a multilinear partition of a point set V ⊂ R^n and the concept of multilinear separability of a function f: V ↠ K = {0, ..., k-1}. Based on well-known relationships between linear partitions and minimal pairs, we derive formulae for the number of multilinear partitions of a point set in general position and of the set K(2). The (n,k,s)-perceptrons partition the input space V into s+1 regions with s parallel hyperplanes. We obtain results on the capacity of a sin…
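As a concrete illustration of the partition mentioned in the abstract, the sketch below (function names and toy parameters are my own, not from the paper) identifies which of the s+1 regions cut out by s parallel hyperplanes w·x = t_j contains a given point, by counting the thresholds below the projection w·x:

```python
import numpy as np

def region_index(x, w, thresholds):
    """Index (0..s) of the slab, between parallel hyperplanes w.x = t_j,
    that contains point x; `thresholds` must be sorted ascending."""
    projection = float(np.dot(w, x))
    # number of thresholds at or below the projection
    return int(np.searchsorted(thresholds, projection, side="right"))

w = np.array([1.0, 1.0])        # common normal of the parallel hyperplanes
thresholds = [0.0, 1.0, 2.0]    # s = 3 hyperplanes -> s + 1 = 4 regions
print(region_index(np.array([0.3, 0.3]), w, thresholds))  # projection 0.6 -> region 1
```

Each region corresponds to one value of the perceptron's multiple-valued output, which is how the s parallel hyperplanes yield a k-valued computation.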

Cited by 8 publications (6 citation statements) · References 20 publications
“…A k-valued s-threshold function of one variable is defined as

f(x) = o_0 if x < t_1; o_j if t_j ≤ x < t_{j+1}, for 1 ≤ j ≤ s−1; o_s if t_s ≤ x, (1)

where o = (o_0, ..., o_s) is the output vector; t = (t_1, ..., t_s) is the threshold vector with t_1 < t_2 < ... < t_s; and s is the number of threshold values. An n-input k-valued s-threshold perceptron [23], [40], [24], [41], abbreviated as (n,k,s)-perceptron, computes a weighted n-input k-valued s-threshold function given by

y = f(w_1 x_1 + ... + w_n x_n), (2)

where x = (x_1, ..., x_n) is the input vector; …”
Section: A Multiple-valued Logic Neural Network (mentioning, confidence: 99%)
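The definition quoted above can be sketched in a few lines, assuming the standard reading in which o = (o_0, ..., o_s) is the output vector and t = (t_1, ..., t_s) the sorted threshold vector; the helper names here are mine, not the paper's:

```python
import bisect

def s_threshold(x, outputs, thresholds):
    """k-valued s-threshold function of one variable:
    returns outputs[j] when thresholds[j-1] <= x < thresholds[j]
    (`outputs` has s+1 entries; `thresholds` has s sorted entries)."""
    return outputs[bisect.bisect_right(thresholds, x)]

def nks_perceptron(x_vec, weights, outputs, thresholds):
    """(n,k,s)-perceptron: the s-threshold function applied to the weighted sum."""
    total = sum(w * x for w, x in zip(weights, x_vec))
    return s_threshold(total, outputs, thresholds)

# toy (n=2, k=3, s=2) example: thresholds 0.5 and 1.5 split the weighted
# sum into three bands, mapped to the outputs 0, 1, 2
out = nks_perceptron([1, 1], weights=[0.5, 0.5], outputs=[0, 1, 2], thresholds=[0.5, 1.5])
print(out)  # weighted sum 1.0 falls in [0.5, 1.5) -> output 1
```

Note the left-inclusive boundaries (t_j ≤ x < t_{j+1}) match equation (1): an input exactly equal to a threshold t_j produces output o_j.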
“…Obradović [25] described algorithms for learning multiple-valued logic functions on either a single homogeneous (n,k,s)-perceptron or a depth-two network composed of (n,k,s)-perceptrons in the hidden layer and one homogeneous (n,k,s)-perceptron in the output layer. Ngom [24], [41] introduced learning algorithms for permutably homogeneous (n,k,s)-perceptrons and proved them to be more powerful than the algorithms described in [25]. These are but a few of the methods for learning multiple-valued logic functions; the reader can refer to [22] for a survey of multiple-valued logic neural networks.…”
Section: Background and Motivations (mentioning, confidence: 99%)