Let H be a real Hilbert space with inner product ⟨·,·⟩ and norm ‖·‖, respectively, let C be a nonempty closed convex subset of H, and let T be a self-mapping of C. We use F(T) to denote the set of fixed points of T (i.e., F(T) = {x ∈ C : Tx = x}).
Recall that T is said to be a κ-strict pseudo-contraction if there exists a constant κ ∈ [0, 1) such that
‖Tx − Ty‖² ≤ ‖x − y‖² + κ‖(I − T)x − (I − T)y‖² for all x, y ∈ C.
Note that the class of κ-strict pseudo-contractions strictly includes the class of nonexpansive mappings, that is, self-mappings T on C such that ‖Tx − Ty‖ ≤ ‖x − y‖ for all x, y ∈ C. In particular, T is a nonexpansive mapping if and only if T is a 0-strict pseudo-contraction.
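The defining inequality can be checked numerically. The following sketch uses the hypothetical example T(x) = −2x on H = ℝ (illustrative data, not taken from the paper) to verify that T is a 1/3-strict pseudo-contraction even though it is not nonexpansive:

```python
import numpy as np

# Illustrative example (not from the paper): on H = R, the map T(x) = -2x
# satisfies |Tx - Ty| = 2|x - y|, so it is NOT nonexpansive, yet it is a
# kappa-strict pseudo-contraction with kappa = 1/3, since
# |Tx - Ty|^2 = 4|x - y|^2 and |(I - T)x - (I - T)y|^2 = 9|x - y|^2.
T = lambda x: -2.0 * x
kappa = 1.0 / 3.0

rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.uniform(-10.0, 10.0, size=2)
    lhs = (T(x) - T(y)) ** 2
    rhs = (x - y) ** 2 + kappa * ((x - T(x)) - (y - T(y))) ** 2
    assert lhs <= rhs + 1e-9  # the strict pseudo-contraction inequality
print("inequality verified on 1000 random pairs")
```

For this particular map the inequality is in fact an equality, which shows that κ = 1/3 is the smallest admissible constant.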
Iterative methods for finding fixed points of nonexpansive mappings are an important topic in the theory of weak and strong convergence; see, for example, [1,2,3] and the references therein.
Over recent decades, many authors have constructed various types of iterative methods to approximate fixed points. The first one is the Mann iteration introduced by Mann [4] in 1953, which is defined as follows:
x_{n+1} = (1 − α_n)x_n + α_n T x_n,  n ≥ 1,  (3)
where x_1 ∈ C is chosen arbitrarily, {α_n} is a sequence in (0, 1), and T is a mapping. If T is a nonexpansive mapping, the sequence {x_n} generated by (3) converges weakly to an element of F(T). It is well known that in an infinite-dimensional Hilbert space, the normal Mann iterative algorithm [4] is only weakly convergent.
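As a quick illustration (a sketch with assumed data, not the paper's setting), the Mann scheme can be run on the nonexpansive map T(x) = cos x on ℝ, whose unique fixed point is the Dottie number ≈ 0.739085:

```python
import math

# Mann iteration x_{n+1} = (1 - a_n) x_n + a_n T x_n with a_n = 1/(n+1),
# applied to the nonexpansive map T(x) = cos(x) on R (|cos'| <= 1).
# Illustrative choice of map and step sizes; not taken from the paper.
T = math.cos
x = 2.0  # x_1 chosen arbitrarily
for n in range(1, 5001):
    a_n = 1.0 / (n + 1)
    x = (1 - a_n) * x + a_n * T(x)
print(x)  # close to the unique fixed point 0.739085...
```

In one dimension weak and strong convergence coincide, so the iterates visibly approach F(T); the infinite-dimensional weak-only behavior mentioned above cannot be seen in such a toy example.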
It is clear that strict pseudo-contractions are more general than nonexpansive mappings, and therefore they have a wider range of applications; hence it is important to develop the theory of iterative methods for strict pseudo-contractions. Indeed, Browder and Petryshyn [5] proved that if the sequence {x_n} is generated by
x_{n+1} = αx_n + (1 − α)T x_n,  n ≥ 1,
with a constant control parameter α ∈ (κ, 1), then the sequence {x_n} converges weakly to a fixed point of the strict pseudo-contraction T.
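For instance (an illustrative sketch with assumed data), applying this constant-parameter scheme with α = 1/2 to the 1/3-strict pseudo-contraction T(x) = −2x on ℝ drives the iterates to its fixed point 0:

```python
# Browder-Petryshyn scheme x_{n+1} = a x_n + (1 - a) T x_n with constant
# a in (kappa, 1). Here T(x) = -2x is a 1/3-strict pseudo-contraction
# (illustrative choice, not from the paper), and a = 0.5 > 1/3, so each
# step gives x_{n+1} = 0.5 x_n + 0.5 (-2 x_n) = -0.5 x_n -> 0 = F(T).
T = lambda x: -2.0 * x
a = 0.5
x = 5.0
for _ in range(60):
    x = a * x + (1 - a) * T(x)
print(abs(x))  # essentially 0, the unique fixed point of T
```

Note that the unaveraged Picard iteration x_{n+1} = T x_n would diverge here (|x_n| doubles at each step), which is precisely why the averaging with α > κ is needed.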
Moreover, many mathematicians have proposed iterative algorithms and proved strong convergence theorems for a nonexpansive mapping and a κ-strictly pseudo-contractive mapping in Hilbert spaces in order to find their fixed points; see, for example, [6,7,8,9].
To obtain the strong convergence of iterations determined by a nonexpansive mapping, Moudafi [1] established a theorem for finding fixed points of nonexpansive mappings. More precisely, he established the following result, known as the viscosity approximation method.
Theorem 1. Let C be a nonempty closed convex subset of a real Hilbert space H and let S be a nonexpansive mapping of C into itself such that F(S) is nonempty. Let f be a contraction of C into itself and let {x_n} be a sequence defined as follows:
x_{n+1} = α_n f(x_n) + (1 − α_n) S x_n,  n ≥ 1,  (4)
where {α_n} is a sequence of positive real numbers tending to zero. Then the sequence {x_n} converges strongly to q, where q = P_{F(S)} f(q) and P_{F(S)} is the metric projection of H onto F(S).
The Moudafi viscosity approximation method can be applied to elliptic differential equations, linear programming, convex optimization and monotone inclusions; it has been widely studied in the literature (see [10,11,12]).
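A minimal sketch of the viscosity scheme on assumed, illustrative data (S(x) = cos x as the nonexpansive map, f(x) = x/2 as the contraction, α_n = 1/(n+1); none of this is from the paper):

```python
import math

# Viscosity approximation x_{n+1} = a_n f(x_n) + (1 - a_n) S x_n with
# a_n -> 0. S(x) = cos(x) is nonexpansive with unique fixed point
# q ~ 0.739085, and f(x) = x/2 is a 1/2-contraction (illustrative data).
# Since F(S) = {q}, the strong limit P_{F(S)} f(q) is q itself.
S, f = math.cos, lambda x: 0.5 * x
x = 3.0
for n in range(1, 20001):
    a_n = 1.0 / (n + 1)
    x = a_n * f(x) + (1 - a_n) * S(x)
print(x)  # approaches q, about 0.739085
```

Because F(S) is a singleton here, the contraction f only perturbs the trajectory; the limit is determined by S, exactly as the fixed-point characterization q = P_{F(S)} f(q) predicts.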
To construct an iterative algorithm that converges strongly to the fixed points of a finite family of strict pseudo-contractions by combining the viscosity approximation method (4) with Mann's iteration (3), Yao et al. [13] proposed the intermixed algorithm for two strict pseudo-contractions as follows:
Algorithm 1. For arbitrarily given initial points in C, let the sequences {x_n} and {y_n} be generated iteratively by the intermixed scheme (5), where {α_n} and {β_n} are two sequences of real numbers in (0, 1), the two underlying mappings are strict pseudo-contractions, f and g are contractions, and k is a constant. They then proved the strong convergence of the iterative sequences {x_n} and {y_n} defined by (5) as follows.
Theorem 2. Suppose that … and … . Assume the following conditions are satisfied:
- (i) … and …,
- (ii) … for all … .
Then the sequences {x_n} and {y_n} generated by (5) converge strongly to … and …, respectively.
Putting … and … in (5), we obtain …, which is a modified version of the viscosity approximation method. Observe that the sequences {x_n} and {y_n} are mutually dependent on each other.
Let A : C → H be a mapping. The variational inequality problem is to find a point u ∈ C such that
⟨Au, v − u⟩ ≥ 0 for all v ∈ C.  (7)
The set of solutions of (7) is denoted by VI(C, A). It is known that the variational inequality, as a strong and important tool, has been studied for a wide class of optimization problems in economics and for equilibrium problems arising in physics and several other branches of pure and applied sciences; see, for example, [14,15,16,17].
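The classical projection method for (7) iterates x_{n+1} = P_C(x_n − λ A x_n). A minimal sketch on hypothetical data (C = [1, 2] and A(x) = x − 1/2, chosen for illustration only, so that the solution of (7) is u = 1 since A(1) = 1/2 > 0):

```python
# Projection method x_{n+1} = P_C(x_n - lam * A x_n) for the variational
# inequality (7). Illustrative data, not from the paper:
# C = [1, 2] in R, A(x) = x - 0.5 (strongly monotone), solution u = 1.
P_C = lambda x: min(max(x, 1.0), 2.0)  # metric projection onto [1, 2]
A = lambda x: x - 0.5
lam = 0.5
x = 2.0
for _ in range(100):
    x = P_C(x - lam * A(x))
print(x)  # converges to 1.0, i.e. VI(C, A) = {1}
```

The limit lies on the boundary of C: A is positive there, so the unconstrained root of A (namely 1/2) is infeasible, and the variational inequality picks out the feasible point closest to it.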
Recently, in 2018, Siriyan and Kangtunyakarn [18] introduced the following modified generalized system of variational inequalities (MGSV), which involves finding a point solving the system (8), where the underlying mappings and parameters are as specified in [18].
Putting … in (8), we obtain …, which is the generalized system of variational inequalities modified by Ceng et al. [19].
To find an element of the set of solutions of the modified generalized system of variational inequalities problem (8), Siriyan and Kangtunyakarn [18] introduced an iterative scheme (10), in which the underlying mappings are inverse strongly monotone and an auxiliary mapping is defined in terms of the metric projection P_C. Under some suitable conditions (see [18] for more details), they proved that the generated sequence converges strongly to a solution of (10).
Moreover, they proved Lemma 3 in the next section, which involves the MGSV and the set of solutions of a fixed-point equation related to the metric projection onto C. This lemma is very important for proving our main result in Section 2.
By using the concept of (5), we introduce a new iterative method for solving the modified generalized system of variational inequalities as follows:
Algorithm 2. Starting with arbitrary initial points, let the sequences {x_n} and {y_n} be defined by … .
By putting …, we get …, which is a modified version of (5).
Under the conditions imposed in Theorem 3, we prove a strong convergence theorem for solving fixed-point problems of nonlinear mappings and two variational inequality problems by using Algorithm 2, which approximates a solution of the MGSV. Moreover, using our main result, we obtain additional results involving the split feasibility problem (SFP) and the constrained convex minimization problem. Finally, we give a numerical example for the main theorem.
1. Preliminaries
We denote weak convergence and strong convergence by ⇀ and →, respectively. For every x ∈ H, there exists a unique nearest point P_C x in C such that ‖x − P_C x‖ ≤ ‖x − y‖ for all y ∈ C; the mapping P_C is called the metric projection of H onto C.
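For concreteness, a small sketch of the metric projection in a case where it has a closed form (illustrative set, not from the paper: C is the closed unit ball of ℝ²):

```python
import numpy as np

# Metric projection onto the closed unit ball C of R^2 (illustrative set,
# not from the paper): P_C(x) = x if ||x|| <= 1, else x / ||x||.
def project_ball(x):
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

x = np.array([3.0, 4.0])      # ||x|| = 5, so x lies outside C
p = project_ball(x)
print(p)                      # [0.6 0.8], the unique nearest point of C
print(np.linalg.norm(x - p))  # distance 4.0 = ||x|| - 1
```

For a general closed convex C no closed form exists, but the nearest point is still unique, which is what makes P_C well defined.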
Definition 1. A mapping f : C → C is called contractive if there exists a constant θ ∈ (0, 1) such that ‖f(x) − f(y)‖ ≤ θ‖x − y‖ for all x, y ∈ C. A mapping A : C → H is called α-inverse strongly monotone if there exists a positive real number α such that
⟨Ax − Ay, x − y⟩ ≥ α‖Ax − Ay‖² for all x, y ∈ C.
The following lemmas are needed to prove the main theorem.
Lemma 1 ([20]). Each Hilbert space satisfies Opial's condition, i.e., for any sequence {x_n} with x_n ⇀ x, the inequality
liminf_{n→∞} ‖x_n − x‖ < liminf_{n→∞} ‖x_n − y‖
holds for every y ∈ H with y ≠ x.
Lemma 2. Let H be a real Hilbert space. Then, for all x, y ∈ H and α ∈ [0, 1], we have
- (i) ‖x + y‖² ≤ ‖x‖² + 2⟨y, x + y⟩,
- (ii) ‖αx + (1 − α)y‖² = α‖x‖² + (1 − α)‖y‖² − α(1 − α)‖x − y‖².
Lemma 3 ([18]). Let C be a nonempty closed convex subset of a real Hilbert space H and let the three mappings appearing in (8) be given. For the parameters specified in [18], the following statements are equivalent:
- (i) a point is a solution of problem (8);
- (ii) it is a fixed point of the mapping G, i.e., an element of F(G), where G is defined in terms of the metric projection P_C and the data of (8).
Lemma 4 ([21]). Let {a_n} be a sequence of nonnegative real numbers satisfying
a_{n+1} ≤ (1 − γ_n)a_n + δ_n,  n ≥ 1,
where {γ_n} is a sequence in (0, 1) and {δ_n} is a sequence such that
- (i) Σ_{n=1}^∞ γ_n = ∞,
- (ii) limsup_{n→∞} δ_n/γ_n ≤ 0 or Σ_{n=1}^∞ |δ_n| < ∞.
Then lim_{n→∞} a_n = 0.
Lemma 5 ([22]). For a given z ∈ H and u ∈ C, u = P_C z if and only if ⟨u − z, v − u⟩ ≥ 0 for all v ∈ C. Furthermore, P_C is a firmly nonexpansive mapping of H onto C, i.e.,
‖P_C x − P_C y‖² ≤ ⟨P_C x − P_C y, x − y⟩ for all x, y ∈ H.
23])
. Let C be a nonempty closed convex subset of a real Hilbert space and let be a κ-strictly pseudo-contractive mapping with . Then, the following statements hold:- (i)
- (ii)
For everyand,forandand
2. Main Result
In this section, we establish a strong convergence theorem for solving fixed-point problems of nonlinear mappings and two variational inequality problems by using Algorithm 2.
Theorem 3. Let C be a nonempty closed convex subset of a real Hilbert space H. For …, let … be …-inverse strongly monotone mappings with …, and let f and g be …- and …-contraction mappings with … . For … and …, let … be …-inverse strongly monotone, where … with … . For …, define G : C → C by … . Let the sequences {x_n} and {y_n} be generated by … and by …, where … with … and … with γ = min{…}. Assume the following conditions hold:
- (i) … for …,
- (ii) … and …,
- (iii) … for all … and for some …,
- (iv) … .
Then {x_n} converges strongly to …, where … and …, and {y_n} converges strongly to …, where … and … .
Proof. The proof of this theorem will be divided into five steps.
Step 1. We will show that {x_n} is bounded.
First, we prove that … is nonexpansive: for …, we get … . Thus, … is a nonexpansive mapping for … and … . Next, let … and … . Then we have … . Combining … and …, we have … . By induction, we can derive that … for every n ∈ ℕ. This implies that {x_n} and {y_n} are bounded.
Step 2. We claim that … .
First, let … and … . Then, observe that … . By the definitions of … and …, we obtain … . Using the same method as in …, we have … . From … and …, we get … . Applying Lemma 4 and conditions (ii), (iii) and (iv), we can conclude that … .
Step 3. We prove that … .
To show this, take … . Then we derive …, which implies … . From … and …, we obtain … . Observe that …; by … and …, we obtain … . Applying the same arguments as for deriving …, we also obtain … . From … and …, we have … . From …, … and condition (ii), we get … . From … and …, we have … . Applying the same method as in …, we also have … .
Step 4. We claim that …, where … .
First, take a subsequence {x_{n_k}} of {x_n} such that … . Since {x_n} is bounded, there exists a subsequence of {x_{n_k}} converging weakly to some point as k → ∞. From …, we obtain … .
Next, we show that … . Assume … . Then, by Opial's condition, we obtain …, which is a contradiction. Similarly, assume …; then, from Opial's condition and …, we have …, which is again a contradiction. By … and …, this yields … . Since …, … and Lemma 5 hold, we can derive … . Following the same method as for …, we obtain … .
Step 5. Finally, we prove that the sequences {x_n} and {y_n} converge strongly to … and …, respectively.
By the firm nonexpansiveness of P_C (Lemma 5), we derive …, which yields … . From the definitions of … and …, we get … . Similarly, as derived above, we also have … . From … and …, we deduce … . Applying conditions …, Step 4 and Lemma 4, we can conclude that the sequences {x_n} and {y_n} converge strongly to … and …, respectively. This completes the proof. □
Corollary 1. Let C be a nonempty closed convex subset of a real Hilbert space H. For …, let … be a …-strictly pseudo-contractive mapping with …, and let f and g be …- and …-contraction mappings with … . For … and …, let … be …-inverse strongly monotone, where … with … . For …, define G by … . Let the sequences {x_n} and {y_n} be generated by … and by …, where … with …, … with α = … and … . Assume the following conditions hold:
- (i) … for …,
- (ii) … and …,
- (iii) … for … and for some …,
- (iv) … .
Then {x_n} converges strongly to …, where … and …, and {y_n} converges strongly to …, where … and … .
Proof. From Theorem 3 and Lemma 6, we obtain the desired conclusion. □
3. Application
In this section, we obtain Theorems 4 and 5, which solve the split feasibility problem and the constrained convex minimization problem, respectively. To prove these theorems, the following definition and lemmas are needed.
Let H₁ and H₂ be real Hilbert spaces and let C and Q be nonempty closed convex subsets of H₁ and H₂, respectively. Let the bounded linear operators under consideration, one of them denoted A, be given together with their respective adjoints.
3.1. The Split Feasibility Problem
The split feasibility problem (SFP) is to find a point
x* ∈ C such that Ax* ∈ Q.
This problem was introduced by Censor and Elfving [24]. The set of all solutions of the SFP is denoted by Γ. The split feasibility problem has been studied extensively as an extremely powerful tool in various fields such as medical image reconstruction, signal processing, intensity-modulated radiation therapy problems and computed tomography; see [25,26,27] and the references therein.
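A standard solver for the SFP is Byrne's CQ algorithm, x_{n+1} = P_C(x_n − γ A*(I − P_Q)Ax_n) with 0 < γ < 2/‖A‖². The sketch below runs it on toy data (C = [0, 1]², Q the closed unit ball of ℝ², A = 2I; all illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Byrne's CQ algorithm for the SFP "find x in C with Ax in Q":
#   x_{n+1} = P_C(x_n - g * A^T (I - P_Q) A x_n),  0 < g < 2 / ||A||^2.
# Toy data (illustrative, not from the paper):
# C = [0, 1]^2, Q = closed unit ball of R^2, A = 2 I.
A = 2.0 * np.eye(2)
P_C = lambda x: np.clip(x, 0.0, 1.0)
def P_Q(y):
    n = np.linalg.norm(y)
    return y if n <= 1.0 else y / n

g = 0.1  # ||A||^2 = 4, so any g in (0, 0.5) is admissible
x = np.array([1.0, 1.0])
for _ in range(2000):
    y = A @ x
    x = P_C(x - g * A.T @ (y - P_Q(y)))
print(x, np.linalg.norm(A @ x))  # x stays in C and Ax ends up in Q
```

The iterate is simply projected gradient descent on the proximity function ½‖(I − P_Q)Ax‖² over C, which is why the admissible step range is dictated by ‖A‖².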
In 2012, Ceng [28] introduced the following lemma to solve the SFP.
Lemma 7. Given x* ∈ C and λ > 0, the following statements are equivalent:
- (i) x* solves the SFP;
- (ii) x* solves the fixed-point equation x* = P_C(I − λA*(I − P_Q)A)x*, where A* is the adjoint of A;
- (iii) x* solves the variational inequality problem (VIP) of finding x̂ ∈ C such that ⟨A*(I − P_Q)Ax̂, x − x̂⟩ ≥ 0 for all x ∈ C.
By using these results, we obtain the following theorem.
Theorem 4. Let H₁ and H₂ be real Hilbert spaces and let C and Q be nonempty closed convex subsets of H₁ and H₂, respectively. Let … be bounded linear operators with adjoints … and …, respectively, and let … and … be the spectral radii of … and …, respectively, with … . Let f and g be …- and …-contraction mappings with … . For … and …, let … be …-inverse strongly monotone, where … with … . For …, define G by … . Let the sequences {x_n} and {y_n} be generated by … and by …, where …, … for all …, … with … and … with … . Assume the following conditions hold:
- (i) … where … for …,
- (ii) … and …,
- (iii) … for all … and for some …,
- (iv) … .
Then {x_n} converges strongly to …, where … and …, and {y_n} converges strongly to …, where … and … .
Proof. Let x, y ∈ C.
First, we will show that A*(I − P_Q)A is …-inverse strongly monotone. From the property of P_Q, we have … . Since …, we get … . Then A*(I − P_Q)A is …-inverse strongly monotone. Using the same method as in …, we have …; then the corresponding operator built from the second bounded linear operator is …-inverse strongly monotone.
By using Theorem 3 and Lemma 7, we obtain the conclusion. □
3.2. The Constrained Convex Minimization Problem
Let C be a closed convex subset of H. The constrained convex minimization problem is to find x* ∈ C such that
ℑ(x*) = min_{x ∈ C} ℑ(x),
where ℑ : C → ℝ is a continuously differentiable function. The set of all solutions of this minimization problem is denoted by Ω.
It is known that the gradient-projection algorithm is one of the powerful methods for solving the minimization problem; see [29,30,31].
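A minimal sketch of the gradient-projection algorithm x_{n+1} = P_C(x_n − λ∇ℑ(x_n)) on a toy problem (ℑ(x) = (x − 3)² on C = [0, 1]; illustrative data, not from the paper), whose constrained minimizer is x* = 1:

```python
# Gradient-projection algorithm for min_{x in C} f(x):
#   x_{n+1} = P_C(x_n - lam * grad_f(x_n)),  0 < lam < 2 / L.
# Toy problem (illustrative): f(x) = (x - 3)^2 on C = [0, 1],
# grad_f(x) = 2(x - 3) with Lipschitz constant L = 2; minimizer x* = 1.
grad_f = lambda x: 2.0 * (x - 3.0)
P_C = lambda x: min(max(x, 0.0), 1.0)
lam = 0.5
x = 0.0
for _ in range(50):
    x = P_C(x - lam * grad_f(x))
print(x)  # 1.0, the solution of the constrained minimization problem
```

Note that the limit satisfies the fixed-point equation x* = P_C(x* − λ∇ℑ(x*)), which is exactly the optimality characterization recalled in the next lemma.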
Before we prove the theorem, we need the following lemma.
Lemma 8 ([32]). A necessary condition of optimality for a point x* ∈ C to be a solution of the minimization problem is that x* solves the variational inequality
⟨∇ℑ(x*), x − x*⟩ ≥ 0 for all x ∈ C.
Equivalently, x* ∈ C solves the fixed-point equation
x* = P_C(x* − λ∇ℑ(x*))
for every constant λ > 0. If, in addition, ℑ is convex, then the optimality condition is also sufficient.
By using these results, we obtain the following theorem.
Theorem 5. Let C be a nonempty closed convex subset of a real Hilbert space H. For … and …, let ℑ be a continuously differentiable function such that ∇ℑ is …-inverse strongly monotone with … . Let f and g be …- and …-contraction mappings with … . For … and …, let … be …-inverse strongly monotone, where … with … . For …, define G by … . Let the sequences {x_n} and {y_n} be recursively defined by … and by …, where … with …, … with α = … and γ = min{…}. Assume that the following conditions are satisfied:
- (i) … for …,
- (ii) … and …,
- (iii) … for all … and for some …,
- (iv) … .
Then {x_n} converges strongly to …, where … and …, and {y_n} converges strongly to …, where … and … .
Proof. By using Theorem 3 and Lemma 8, we obtain the conclusion. □
4. A Numerical Example
In this section, we give an example to support our main theorem.
Example 1. Let ℝ be the set of real numbers. Let … be defined by … and … for every …; for every …, let … be defined by … for every …; let … be defined by … and … for all …; for every …, let … be defined by …; and define G by … .
Let the sequences {x_n} and {y_n} be generated by … and by … for all n ∈ ℕ. Then the sequence {x_n} converges strongly to … and {y_n} converges strongly to … .
Solution. By the definition of …, for every … we have … . From Theorem 3, we can conclude that the sequences {x_n} and {y_n} converge strongly to … .
Table 1 and Figure 1 show the numerical results of the sequences {x_n} and {y_n}, where …, … and … .
5. Conclusions
From the above numerical results, we can conclude that Table 1 and Figure 1 show that the sequences {x_n} and {y_n} converge to the limits asserted in Theorem 3, and that the convergence of {x_n} and {y_n} is guaranteed by Theorem 3.