2020
DOI: 10.1007/s00500-020-05185-z
The use of grossone in elastic net regularization and sparse support vector machines

Abstract: New algorithms for the numerical solution of optimization problems involving the ℓ0 pseudo-norm are proposed. They are designed to use a recently proposed computational methodology that is able to deal numerically with finite, infinite and infinitesimal numbers. This new methodology introduces an infinite unit of measure, expressed by the numeral ① (grossone) and indicating the number of elements of the set ℕ of natural numbers. We show how the numerical system built upon ① and the proposed approximation of …
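The numerical system mentioned in the abstract can be illustrated concretely: a grossone number is a finite sum of terms c·①^p, where ①^1 is the infinite unit, ①^0 the finite part, and negative powers are infinitesimals; arithmetic and ordering act term by term. The following is a minimal sketch under that assumption; the class name `Gross` and the dictionary representation are illustrative choices, not the paper's implementation.

```python
# Sketch of grossone-based arithmetic (hypothetical class "Gross",
# not the paper's code). A number is a finite sum sum_i c_i * ①^p_i,
# stored as {power: coefficient}.

class Gross:
    def __init__(self, terms):
        # drop zero coefficients so comparisons only see real terms
        self.terms = {p: c for p, c in terms.items() if c != 0}

    def __add__(self, other):
        # add coefficients of matching powers of ①
        out = dict(self.terms)
        for p, c in other.terms.items():
            out[p] = out.get(p, 0) + c
        return Gross(out)

    def __mul__(self, other):
        # multiply term by term: ①^p1 * ①^p2 = ①^(p1+p2)
        out = {}
        for p1, c1 in self.terms.items():
            for p2, c2 in other.terms.items():
                out[p1 + p2] = out.get(p1 + p2, 0) + c1 * c2
        return Gross(out)

    def __lt__(self, other):
        # the highest power of ① dominates: any infinite term beats
        # any finite one, any finite term beats any infinitesimal
        diff = (other + Gross({p: -c for p, c in self.terms.items()})).terms
        if not diff:
            return False
        top = max(diff)
        return diff[top] > 0
```

For example, `Gross({0: 5}) < Gross({1: 1})` holds (5 is smaller than ①), and `Gross({1: 1}) * Gross({-1: 1})` gives the finite number 1, since ① · ①⁻¹ = ①⁰.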

Cited by 7 publications (4 citation statements). References 29 publications.
“…It has been proposed by Sergeyev, and [31] contains an exhaustive discussion of the topic. GM has found several applications in optimization theory, such as regularization [14], conjugate gradient methods [15], and especially in lexicographic multi-objective LP. In [8], indeed, a Grossone-version of the Simplex algorithm (the G-Simplex) has been implemented and theoretically studied.…”
Section: Grossone Methodology
Citation type: mentioning
confidence: 99%
“…① is larger than expected. This is due to the fact that here we use the same strategy used in the design of the I-Big-M method [7], i.e., we have added to the problem a set of artificial variables, which have been infinitely penalized (by the term ①, as done in the pioneering works [9,14] by De Leone et al). This allowed us to have an initial basis to start from (the one made of only artificial variables).…”
Section: Experiments 1: Infinitesimally Perturbed Rock-paper-scissors
Citation type: mentioning
confidence: 99%
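The strategy quoted above (artificial variables infinitely penalized by ①) can be sketched numerically by keeping the infinite and finite parts of the objective separate and comparing them lexicographically. The function and variable names below are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the infinitely-penalized objective behind the I-Big-M idea:
# costs are pairs (infinite part, finite part), i.e. a*① + b, and each
# artificial variable contributes cost ① = (1, 0). Hypothetical names.

def penalized_cost(c, x, artificial_idx):
    """Objective value c·x + ① * (sum of artificial variables)."""
    finite = sum(ci * xi for ci, xi in zip(c, x))
    infinite = sum(x[j] for j in artificial_idx)
    return (infinite, finite)  # tuple comparison is lexicographic: ① dominates

# A solution still using an artificial variable (index 2) is worse than
# any genuine feasible solution, regardless of its finite cost:
bad  = penalized_cost([1.0, 1.0, 0.0], [0.0, 0.0, 2.0], artificial_idx=[2])
good = penalized_cost([1.0, 1.0, 0.0], [50.0, 50.0, 0.0], artificial_idx=[2])
assert bad > good
```

Unlike the classical Big-M method, no finite penalty constant has to be tuned: the infinite part of the cost is driven to zero before the finite part is ever compared.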
“…Sergeyev, who has introduced the grossone methodology (Sergeyev, 2017). Since its appearance in 2003, a number of applications have emerged in extremely disparate fields: optimization (De Leone et al., 2020b; Lai et al., 2021a; 2021b; De Leone, 2018; Cococcioni and Fiaschi, 2020), ordinary differential equations (Sergeyev et al., 2016; Amodio et al., 2017; Iavernaro et al., 2020) and machine learning (De Leone et al., 2020a; Astorino and Fuduli, 2020), among others. In addition, an implementation of the grossone methodology within Simulink® has been recently introduced (Falcone et al., 2020a).…”
Section: Related Work
Citation type: mentioning
confidence: 99%
“…The first group discusses optimization problems and algorithms including local, global, and multi-criteria optimization (see Franchini et al. 2020; Žilinskas and Litvinas 2020; Nesterov 2020; Posypkin et al. 2020; Shao et al. 2020; De Leone et al. 2020; Capuano et al. 2020; Lančinskas et al. 2020; Sergeyev et al. 2020; Crisci et al. 2020; Cavallaro et al. 2020; Astorino and Fuduli 2020; Candelieri et al. 2020). The second group of papers (see D'Alotto 2020; Pepelyshev and Zhigljavsky 2020; Falcone et al. 2020; Amodio et al. 2020; Gangle et al. 2020; De Leone et al. 2020; Astorino and Fuduli 2020) deals with problems and algorithms using the already mentioned recent computational framework allowing one to work with different infinities and infinitesimals numerically. It can be seen from these articles that, in addition to its already known applications, the new computational methodology can be successfully used in such fields as optimization, ordinary differential equations, game theory, classification, logic, and fractals.…”
Citation type: mentioning
confidence: 99%