2017
DOI: 10.22436/jnsa.010.05.04

Weak θ-φ-contraction and discontinuity

Abstract: In this paper, we introduce the notion of weak θ-φ-contraction, which ensures the convergence of successive approximations but does not force the mapping to be continuous at the fixed point. Thus, we provide one more solution to the open question raised by Rhoades.
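As a concrete illustration of the phenomenon the abstract describes, here is a minimal Python sketch (a toy example of our own, not the contraction from the paper): the map T fixes x* = 1 but is discontinuous there, yet every orbit of successive approximations still converges to the fixed point.

```python
# Toy self-map of [0, 2]: T(x) = 1 for x <= 1 and T(x) = 0 for x > 1.
# T has the unique fixed point x* = 1 but is discontinuous there,
# since the right-hand limit of T at 1 is 0, not 1.

def T(x: float) -> float:
    return 1.0 if x <= 1.0 else 0.0

def successive_approximations(x0: float, steps: int = 5) -> list[float]:
    """Return the Picard iterates x0, T(x0), T(T(x0)), ..."""
    orbit = [x0]
    for _ in range(steps):
        orbit.append(T(orbit[-1]))
    return orbit

if __name__ == "__main__":
    for x0 in (0.3, 1.7, 2.0):
        print(x0, "->", successive_approximations(x0))
    # Every orbit lands on the fixed point 1.0 within two steps,
    # even though T is not continuous at 1.0.
```

The point of the sketch is only that convergence of successive approximations and continuity at the fixed point are independent properties; the paper's contribution is a contractive condition that guarantees the former without forcing the latter.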


Cited by 15 publications (7 citation statements)
References 13 publications (17 reference statements)
“…Observe that condition Θ1 can be withdrawn, and still Theorem 3.1 (also, most of the existence results in the literature, e.g., the results of [25–29]) survives (in view of Proposition 3.1). Now, Proposition 3.1 and Remark 3.2 led us to define a weaker contraction under the name of weak θ-contraction as follows.…”
mentioning
confidence: 97%
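For orientation, the θ-contraction framework this excerpt works in is commonly stated as follows (a sketch in the style of Jleli and Samet; the precise conditions used in the cited paper may differ):

```latex
% theta-contraction, as commonly stated (Jleli--Samet style).
% (Theta_1) is the monotonicity condition the excerpt says can be withdrawn.
Let $(X,d)$ be a metric space. A self-map $T\colon X \to X$ is a
$\theta$-contraction if there exist $\theta\colon (0,\infty) \to (1,\infty)$
and $k \in (0,1)$ such that
\[
  d(Tx,Ty) > 0 \;\Longrightarrow\;
  \theta\bigl(d(Tx,Ty)\bigr) \le \bigl[\theta\bigl(d(x,y)\bigr)\bigr]^{k},
\]
where $\theta$ is usually assumed to satisfy:
\begin{itemize}
  \item[$(\Theta_1)$] $\theta$ is non-decreasing;
  \item[$(\Theta_2)$] for every sequence $(t_n) \subset (0,\infty)$,
        $\lim_{n\to\infty} \theta(t_n) = 1$ if and only if
        $\lim_{n\to\infty} t_n = 0^{+}$;
  \item[$(\Theta_3)$] there exist $r \in (0,1)$ and $\ell \in (0,\infty]$
        such that $\lim_{t\to 0^{+}} \frac{\theta(t)-1}{t^{r}} = \ell$.
\end{itemize}
```

The excerpt's observation is that the monotonicity condition (Θ1) can be dropped without losing the existence conclusions, which is what motivates the weaker "weak θ-contraction" notion.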
“…In 1999, Pant [27] proved two fixed point theorems in which the considered mappings were discontinuous at their fixed points, thereby giving affirmative solutions to the Rhoades problem for both a single self-mapping and a pair of self-mappings. Some new solutions to this problem, with applications to neural networks, have been reported in [2,3,4,5,6,12,23,24,25,26,29,30,31,32,37,39]. Fixed point theorems for discontinuous mappings have found a variety of applications; e.g., neural networks are commonly used in character recognition, image compression, stock market prediction, and in solving non-negative sparse approximation problems ([10,11,20,21,22,38]).…”
Section: Introduction
mentioning
confidence: 99%
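To make the neural-network connection concrete, here is a minimal Python sketch of a forward pass through a single unit with a discontinuous Heaviside (step) activation, the kind of discontinuous activation function these applications involve; the weights, bias, and inputs are hypothetical, chosen only for illustration.

```python
import numpy as np

def heaviside(z: np.ndarray) -> np.ndarray:
    """Discontinuous step activation: 0 for z < 0, 1 for z >= 0."""
    return (z >= 0.0).astype(float)

def forward(x: np.ndarray, w: np.ndarray, b: float) -> np.ndarray:
    """Forward pass of one threshold unit: activation(x @ w + b)."""
    return heaviside(x @ w + b)

if __name__ == "__main__":
    # Hypothetical weights, bias, and inputs for illustration only.
    w = np.array([0.8, -0.5])
    b = -0.1
    x = np.array([[1.0, 0.2],
                  [0.1, 0.9]])
    print(forward(x, w, b))  # -> [1. 0.]
```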
“…Fixed point theorems for contractive mappings which admit discontinuity at the fixed point, together with their applications to neural networks with discontinuous activation functions, have emerged as a very active area of research (e.g., Bisht and Rakocevic [4,5], Ozgur and Tas [27,28], Rashid et al. [33], Tas and Ozgur [38], Tas et al. [39], Zheng and Wang [44]). The question of the existence of contractive mappings which admit discontinuity at the fixed point arose with the publication of two papers by Kannan [18,19].…”
Section: Introduction
mentioning
confidence: 99%
“…Recently, some more solutions to the problem of continuity at the fixed point, and applications of such results to neural networks with discontinuous activation functions, have been reported (e.g., Bisht and Pant [2,3], Bisht and Rakocevic [4,5], Ozgur and Tas [27,28], Rashid et al. [33], Tas and Ozgur [38], Tas et al. [39], Zheng and Wang [44]). All the known solutions of Rhoades' problem (e.g.…”
Section: Introduction
mentioning
confidence: 99%