Stability Analysis for Stochastic Markovian Jumping Neural Networks with Leakage Delay



Yajun Li1, *, Xisheng Dai2, Wenping Xiao1, Like Jiao1
1 Institute of Electronics and Information Engineering, Shunde Polytechnic, Foshan, 528300, China
2 School of Electrical and Information Engineering, Guangxi University of Science and Technology, Guangxi, Liuzhou, 545006, China





© Li et al.; Licensee Bentham Open.

open-access license: This is an open access article licensed under the terms of the Creative Commons Attribution-Non-Commercial 4.0 International Public License (CC BY-NC 4.0) (https://creativecommons.org/licenses/by-nc/4.0/legalcode), which permits unrestricted, non-commercial use, distribution and reproduction in any medium, provided the work is properly cited.

* Address correspondence to this author at Shunde Polytechnic, Desheng East Road, Shunde District, Foshan City, Guangdong province, 528300, China; Tel: 13420841579; Fax: 0757-22322669; E-mail: lyjfirst@163.com, 5380901@qq.com


Abstract

The stability problem for a class of stochastic neural networks with Markovian jump parameters and leakage delay is addressed in this study. A sufficient condition ensuring that the stochastic neural network system is exponentially stable is presented and proven by means of Lyapunov functional theory, the stochastic stability technique and the linear matrix inequality method. The effect of leakage delay on the stability of the neural network system is discussed, and numerical examples are provided to show the correctness and effectiveness of the results.

Keywords: Markovian jumping, Exponentially stable, Linear matrix inequality (LMI), Neural networks, Time-varying delay, Leakage delay.



1. INTRODUCTION

In the past decades, neural network systems have attracted much attention because of their massive potential applications in many fields, such as pattern classification, reconstruction of moving images, and combinatorial optimization. In real applications, ensuring the stability of the equilibrium point of the designed neural network is an important task and has been a popular topic. Time delay and stochastic disturbance are well known as the two main factors that affect the stability of neural networks. Many new results have been obtained from stability analyses of stochastic neural networks with different types of time delays [1-8].

Recently, a special delay called leakage (or forgetting) delay has been investigated widely since its existence in real systems was discovered, as it affects the stability of time-delay systems; many interesting results have been reported [9-13]. As pointed out in [9, 10], neural networks with leakage delay form an important class of networks. In [10], the global exponential stability of complex-valued neural networks with leakage delay was investigated with a complex-valued linear matrix inequality technique by establishing a novel stability lemma. Using stochastic analysis theory and matrix inequality techniques, the exponential stability of a class of stochastic neural networks with leakage terms was studied in [11]. Global μ-stability results for complex-valued neural networks with leakage time delay and unbounded time-varying delays were obtained in [12] through the free-weighting-matrix method and stability theory. By using the properties of M-matrices, the fuzzy logic operator, the spectral radius of nonnegative matrices and a delay differential inequality, a class of fuzzy cellular neural networks with time delay in the leakage term and impulsive perturbations was investigated in [13]. The leakage delays studied above, however, were constant. Subsequently, the research on leakage delay was extended to the time-varying case [14, 15]. In [14], a triple Lyapunov-Krasovskii functional term was employed within the Lyapunov method to study the robust stability of discrete-time uncertain neural networks with leakage time-varying delay.

Stochastic models play an important role in many branches of economics, industry and science. A particular field of interest is stochastic systems with Markovian jumping parameters. Markovian jump systems are advantageous in modeling the dynamic systems mentioned above, and much progress has been made in the stability analysis, impulsive response and state estimation of stochastic neural networks with Markovian jumping parameters; see [16-23] and the references therein.

Motivated by the above discussion, this study investigates the exponential stability of a class of stochastic neural networks with time-varying and leakage delays. By employing a suitable Lyapunov functional and introducing a new inequality technique, a sufficient condition rendering the system exponentially stable is obtained by solving a set of strict linear matrix inequalities. Examples and simulations are presented to show the effectiveness of the proposed methods. The mutual effect between the discrete and leakage delays, as well as the influence of the delay derivative, is discussed. The experimental analysis reveals that the effect of leakage delay on the stability of neural networks cannot be disregarded.

2. PROBLEM FORMULATION

Let {rt, t ≥ 0} be a right-continuous Markov chain defined on a complete probability space (Ω, F, P), taking values in a finite state space S = {1, 2, ..., N}, with generator Π = (πij)N×N given by:

(1)

where Δ > 0 and πij ≥ 0 (i ≠ j) is the transition rate from mode i to mode j, while πii = −Σj≠i πij.
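For intuition, a chain governed by such a generator can be sampled directly: holding times in mode i are exponential with rate −πii, and jumps follow the normalized off-diagonal rates. The sketch below uses a hypothetical two-mode generator (the values are illustrative only, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-mode generator: off-diagonal entries are the
# transition rates pi_ij >= 0, and each row sums to zero.
Pi = np.array([[-0.5, 0.5],
               [0.3, -0.3]])

def simulate_chain(Pi, T, i0=0):
    """Sample one right-continuous path of the Markov chain on [0, T]."""
    t, i = 0.0, i0
    path = [(0.0, i0)]                            # (jump time, new mode)
    while True:
        hold = rng.exponential(1.0 / -Pi[i, i])   # exponential holding time
        t += hold
        if t >= T:
            break
        probs = np.where(np.arange(len(Pi)) == i, 0.0, Pi[i])
        probs /= probs.sum()                      # embedded jump distribution
        i = int(rng.choice(len(Pi), p=probs))
        path.append((t, i))
    return path

path = simulate_chain(Pi, T=10.0)
```

Paths produced this way play the role of the switching signal rt in the simulations of Section 4.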

Consider the following stochastic neural network with time-varying delay:

(2)

where x(t) ∈ ℝn is the state vector of the neural network associated with n neurons and ω(t) is an m-dimensional Brownian motion defined on a probability space (Ω, F, P), which is assumed to satisfy E{dω(t)} = 0 and E{dω2(t)} = dt. A(rt) = diag{a1, ..., an} is a diagonal matrix with positive entries ai > 0 (i = 1, 2, ..., n), f(x(t)) = [f(x1(t)), ..., f(xn(t))]T denotes the neuron activation function, and W0(rt), W1(rt), W2(rt), W3(rt) are the connection weight matrices and the delayed connection weight matrices, respectively. C(rt) and D(rt) are known real constant matrices with compatible dimensions. δ(t) and τ(t) denote the leakage delay and the transmission delay, respectively, and satisfy:

(3)

ρ = max{δ, τ} and µ is a positive scalar. ζ(t) is a real-valued continuous initial condition on [-ρ, 0]. Throughout the paper, we assume that ω(t) and r(t) are independent.

For simplicity, in the sequel, for each rt = i ∈ S, A(rt), W0(rt), W1(rt) are denoted by Ai, W0i, W1i, and so on. System (2) can therefore be rewritten as:

(4)
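To illustrate how a system of this form evolves, the Euler-Maruyama sketch below simulates a single-mode, delay-free special case, dx = (−Ax + W0 f(x)) dt + Cx dω, with hypothetical parameter matrices and a scalar Brownian motion; the leakage and transmission delay terms are omitted to keep the sketch short:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative 2-neuron, single-mode parameters (hypothetical values,
# not taken from the paper's examples).
A  = np.diag([1.0, 1.2])
W0 = np.array([[0.2, -0.1], [0.1, 0.3]])
C  = np.array([[0.1, 0.0],  [0.0, 0.1]])
f  = np.tanh                              # a sector-bounded activation

def euler_maruyama(x0, T=5.0, dt=1e-3):
    """Euler-Maruyama discretisation of dx = (-A x + W0 f(x)) dt + C x dw."""
    x = np.array(x0, dtype=float)
    for _ in range(int(T / dt)):
        dw = rng.normal(scale=np.sqrt(dt))          # Brownian increment
        x = x + (-A @ x + W0 @ f(x)) * dt + (C @ x) * dw
    return x

x_final = euler_maruyama([1.5, -1.0])
```

With these (stable) illustrative matrices the state decays toward the origin, which is the kind of behaviour the theorems below certify for the full delayed, mode-switched system.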

Assumption 1. For each i ∈ {1, 2, ..., n} and any x, y ∈ ℝ with x ≠ y, the neuron activation function fi(·) is continuous, bounded and satisfies:

(5)

where Σ1 and Σ2 are some constant matrices.

Remark 1. In this study, Assumption 1 is based on neuron activation function, which is called sector-bounded neuron activation function [12]. As pointed out in [12], when Σ1 = Σ2 = -Σ, the condition (5) becomes:

(6)

This condition is less restrictive than the descriptions of both the sigmoid activation functions [9, 10] and the Lipschitz-type activation functions.

We first present the following lemmas, which will be used frequently in the proof of our main results.

Lemma 1 [4]. For any constant symmetric positive definite matrix J ∈ ℝm×m, scalar η > 0 and vector function ν : [0, η] → ℝm such that the integrals below are well defined, the following inequality holds:

η ∫0η νT(s) J ν(s) ds ≥ (∫0η ν(s) ds)T J (∫0η ν(s) ds).
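Lemma 1 is the familiar Jensen-type integral inequality, η ∫0η νT(s)Jν(s) ds ≥ (∫0η ν(s) ds)T J (∫0η ν(s) ds). A quick numerical spot check with a hypothetical ν and J (integrals approximated by Riemann sums):

```python
import numpy as np

eta = 2.0
n = 4000
s = np.linspace(0.0, eta, n, endpoint=False)
ds = eta / n

v = np.stack([np.sin(s), np.cos(3.0 * s)])   # hypothetical v : [0, eta] -> R^2
J = np.array([[2.0, 0.5],
              [0.5, 1.0]])                   # symmetric positive definite

quad = np.einsum('is,ij,js->s', v, J, v)     # v(s)^T J v(s) at each grid point
lhs = eta * quad.sum() * ds                  # eta * int v^T J v ds
int_v = v.sum(axis=1) * ds                   # int v ds
rhs = int_v @ J @ int_v

assert lhs >= rhs                            # the Jensen-type bound holds
```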

Lemma 2 [24]. Assume that x(t), φ(t), g(t) satisfy the stochastic differential equation

(7)

where ω(t) is a Brownian motion. For any constant matrix Z ≥ 0 and scalar h > 0, the following inequality holds:

(8)

Definition 1. The stochastic neural network system (2) is said to be exponentially stable in the mean-square sense if there exist positive scalars α and β such that:

(9) E||x(t)||2 ≤ α e−βt sup−ρ≤s≤0 E||ζ(s)||2, t ≥ 0.

3. MAIN RESULTS

In this section, the exponential stability of system (4) is developed by Theorem 1.

Theorem 1. For given positive scalars δ, µ, ρσ and τ, the Markovian jumping stochastic neural network system (4) is exponentially mean-square stable if there exist symmetric positive definite matrices Pi (i ∈ S), Rk (k = 1, 2, ..., 5) and two diagonal matrices F1 > 0, F2 > 0 such that the following linear matrix inequality (LMI) holds:

(10)

where

Proof: For simplicity, we let:

(11)
(12)

Then the system (4) can be rewritten as:

(13)

We choose the following Lyapunov-Krasovskii functional candidate as:

(14)

where

By Itô's formula, the stochastic differential dV(x(t), t, i) with respect to t along the trajectories of system (4) is obtained as:

(15)

where

(16)

and

(17)
(18)
(19)
(20)
(21)

Applying Lemma 1 to (19) and Lemma 2 to (21), we can obtain:

(22)
(23)

On the other hand, it can be deduced from Assumption 1 that for i = 1,2,..., n,

(24)

Then there exist scalars λ1i > 0, λ2i > 0 and diagonal matrices F1 ≥ 0, F2 ≥ 0 such that, with Σi (i = 1, 2) as in Assumption 1, the following inequalities hold:

(25)
(26)

where

(27)

By substituting (16)-(23) into (15), and adding the right sides of (25)-(26) to the right side of (15), we can obtain:

(28)

where Π = Π11 + Π12 Pi−1 (Π12)T + Π13 R5−1 (Π13)T,


By taking expectations on both sides of (28), we can obtain:

(29)

where

.

Then, by taking expectations on both sides of (16) and integrating from 0 to T, we can obtain:

(30)

At the same time, it follows from (14) that:

(31)

where

.

Therefore, from (30)-(31), the following inequality can be obtained:

(32)

Applying the Gronwall-Bellman lemma to inequality (32), we can obtain:

.

Noting that there exists a scalar α > 0 such that:

(33)

Then, by (9), we can conclude that system (4) is exponentially mean-square stable, which completes the proof.
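In practice, a condition such as (10) is checked with an LMI solver. As a minimal delay-free analogue of this test, the sketch below (with a hypothetical linearised drift matrix M ≈ −A + W0, assuming unit activation slope at the origin) solves the Lyapunov equation MTP + PM = −Q and checks that P is positive definite, which certifies stability of the linearised mode:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical single-mode, delay-free drift matrix (illustrative values).
A  = np.diag([1.0, 1.2])
W0 = np.array([[0.2, -0.1], [0.1, 0.3]])
M  = -A + W0

# Solve M^T P + P M = -Q with Q = I; M is Hurwitz iff P > 0.
Q = np.eye(2)
P = solve_continuous_lyapunov(M.T, -Q)

eigs = np.linalg.eigvalsh((P + P.T) / 2.0)   # symmetrise before eig-check
stable = bool(np.all(eigs > 0))
```

The full theorem replaces this single Lyapunov equation by the mode-coupled LMI (10), which additionally accounts for the delays, the leakage term and the jump rates πij.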

When the leakage delay is constant, that is, δ(t) = δ, system (4) reduces to:

(34)

We then obtain the following Theorem 2:

Theorem 2. For given positive scalars δ, µ and τ, the Markovian jumping stochastic neural network system (34) is exponentially mean-square stable if there exist symmetric positive definite matrices Pi (i ∈ S), Rj (j = 1, 2, ..., 5) and two diagonal matrices F1 > 0, F2 > 0 such that the following linear matrix inequality (LMI) holds:

,

where

,

4. NUMERICAL EXAMPLES

In this section, two numerical examples with simulation results are provided to demonstrate the effectiveness of the proposed approaches.

Example 1. Consider a two-neuron stochastic neural networks system (4) with the following parameters:

Mode 1

Mode 2

Assume that the Markov process governing the mode switching has the generator:

Take the neuron activation function as follows: f(x) = 0.5(|x + 1| − |x − 1|).
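This is a saturation-type nonlinearity (f(x) = x for |x| ≤ 1, clipped to ±1 outside), so its difference quotients lie in the sector [0, 1], as required by Assumption 1. A quick numerical check:

```python
import numpy as np

# The saturation-type activation from Example 1.
def f(x):
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

# Check the sector condition: (f(x) - f(y)) / (x - y) in [0, 1] for x != y.
xs = np.linspace(-3.0, 3.0, 601)
ys = xs + 0.37                      # arbitrary offset so that x != y
q = (f(xs) - f(ys)) / (xs - ys)

assert np.all(q >= -1e-12) and np.all(q <= 1.0 + 1e-12)
```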

Following Assumption 1, we obtain F1 = diag{0, 0} and F2 = diag{0.05, 0.05}. In this example, we set µ = 0.3, τ = 0.8 and δ = 0.26. By solving LMI (10) with the MATLAB LMI Control Toolbox, a feasible solution can be obtained as follows:

Fig. (1). Response of the state x(t).

Fig. (2). Markovian jumping mode of x(t).

The simulation results for the state response of system (4) with the initial condition x(0) = [1.5; -1] are plotted in Figs. (1) and (2): Fig. (1) shows the state response of system (4), and Fig. (2) depicts the switching modes. The figures illustrate that, under the Markovian mode switching, the stochastic neural network system with leakage delay is stable.

The upper bounds of the delays ρσ, δ and τ guaranteeing the stability of system (4) are listed in Tables 1 to 3, where "-" signifies that LMI (10) has no feasible solution. Table 1 shows the maximum allowable upper bound δ for different values of ρσ, which indicates that the bound on the derivative of the leakage time-varying delay plays an important role in obtaining feasible results.

Table 1. Allowable upper bounds of δ with different value of ρσ, µ = 0.1, and τ = 0.5.
ρσ 0 0.02 0.06 0.1 0.2 0.3 0.4
δ 0.3619 0.2970 0.2408 0.1991 0.1176 0.0519 -
Table 2. Allowable upper bounds of τ for different value of δ, γ = 0.1, and τδ = 0.01, µ = 0.5.
δ 0 0.05 0.1 0.15 0.2
τ 0.3867 0.2887 0.2415 0.1292 -

Table 2 indicates that, when the remaining parameters are fixed, the allowable upper bound of τ is affected by δ; in particular, when δ = 0.2, no feasible solution can be obtained.

When ρσ is a non-zero constant, the allowable upper bounds of τ for different values of µ are listed in Table 3.

Table 3. Minimum allowable bounds of τ for different values of ρσ and δ = 0.2.
µ 0.1 0.3 0.5 0.9 0.95
τ 0.4778 0.4174 0.3254 0.2727 0.2727

Example 2. Consider a three-neuron two-mode stochastic neural networks with Markovian jump parameters and mixed time delays (2) with the following parameters:

Mode 1

Mode 2

Let the Markov process governing the mode switching have the following generator:

By setting µ = 0.1, ρσ = 0.001 and τ = 1.2, the state and switching-mode simulation curves of system (4) are shown in Figs. (3) and (4), which confirm the effectiveness of our results.

Fig. (3). Response of the state x(t).

Fig. (4). Markovian jumping mode of x(t).

Furthermore, by setting δ = 0.8 and τ = 1.6, we obtain the state simulation curve in Fig. (5), which indicates that as the leakage delay increases, system (4) tends to become unstable.

Fig. (5). Response of the state x(t).

SUMMARY

We have studied the stability problem for a class of stochastic neural networks with Markovian jump parameters and leakage delay. By employing a proper Lyapunov functional combined with the stochastic stability analysis method, a stability criterion based on LMIs was derived to ensure that the developed system is exponentially mean-square stable. Finally, we discussed the effect of leakage delay on the stability of the neural network system. Numerical examples were provided to verify the correctness and effectiveness of the results.

CONFLICT OF INTEREST

The authors confirm that this article content has no conflict of interest.

ACKNOWLEDGEMENTS

The authors are very thankful to the editor and anonymous reviewers for their careful reading, constructive comments and fruitful suggestions to improve the quality of this manuscript. This work was supported by the Natural Science Foundation of Guangdong Province under grant 2015A030310336 and the National Natural Science Foundation of China under grant 61364006.

REFERENCES

[1] Arik S. Global asymptotic stability of a larger class of neural networks with constant time delay. Phys Lett A 2003; 311(6): 504-11.
[2] He Y, Liu G, Rees D, Wu M. Stability analysis for neural networks with time-varying interval delay. IEEE Trans Neural Netw 2007; 18(6): 1850-4.
[3] Song C, Gao H, Zheng W. A new approach to stability analysis of discrete-time recurrent neural networks with time-varying delay. Neurocomputing 2009; 72(10-12): 2563-8.
[4] Li T, Ye XL. Improved stability criteria of neural networks with time-varying delays: An augmented LKF approach. Neurocomputing 2010; 73(4-6): 1038-47.
[5] Chen WH, Lu X, Guan ZH, Zheng WX. Delay-dependent exponential stability of neural networks with variable delay: An LMI approach. IEEE Trans Circuits Sys II 2006; 53(9): 837-42.
[6] Zeng HB, He Y, Wu M, Zhang CF. Complete delay-decomposing approach to asymptotic stability for neural networks with time-varying delays. IEEE Trans Neural Netw 2011; 22(5): 806-12.
[7] Hu L, Gao H, Shi P. New stability criteria for Cohen-Grossberg neural networks with time delays. IET Control Theory Appl 2009; 3(9): 1275-82.
[8] Yang R, Gao H, Shi P. Novel robust stability criteria for stochastic Hopfield neural networks with time delays. IEEE Trans Syst Man Cybern B Cybern 2009; 39(2): 467-74.
[9] Gopalsamy K. Leakage delays in BAM. J Math Anal Appl 2007; 325(2): 1117-32.
[10] Song Q, Zhao Z. Stability criterion of complex-valued neural networks with both leakage delay and time-varying delays on time scales. Neurocomputing 2016; 171(3): 179-84.
[11] Xie W, Zhu Q, Jiang F. Exponential stability of stochastic neural networks with leakage delays and expectations in the coefficients. Neurocomputing 2016; 173(3): 1268-75.
[12] Balasubramaniam P, Kalpana M, Rakkiyappan R. Global asymptotic stability of BAM fuzzy cellular neural networks with time delay in the leakage term. Math Comput Model 2011; 53(5-6): 839-53.
[13] Park M, Kwon O, Park J, Lee S, Cha E. Synchronization criteria for coupled stochastic neural networks with time-varying delays and leakage delay. J Franklin Inst 2012; 349(5): 1699-720.
[14] Balasubramaniam P, Vembarasan V, Rakkiyappan R. Leakage delays in T-S fuzzy cellular neural networks. Neural Process Lett 2011; 33(2): 111-36.
[15] Song Q, Cao J. Passivity of uncertain neural networks with both leakage delay and time-varying delay. Nonlinear Dyn 2012; 67(2): 1695-707.
[16] Wang Z, Liu Y, Li M, Liu X. Stability analysis for stochastic Cohen-Grossberg neural networks with mixed time delays. IEEE Trans Neural Netw 2006; 17(3): 814-20.
[17] Hu J, Zhong S, Liang L. Exponential stability analysis of stochastic delayed cellular neural network. Chaos Solitons Fractals 2006; 27(4): 1006-10.
[18] Rakkiyappan R, Balasubramaniam P, Lakshmanan S. Robust stability results for uncertain stochastic neural networks with discrete interval and distributed time-varying delays. Phys Lett A 2008; 372(32): 5290-8.
[19] Wang Z, Fang J, Liu X. Global stability of stochastic high-order neural networks with discrete and distributed delays. Chaos Solitons Fractals 2008; 36(2): 388-96.
[20] Lou X, Cui B. Stochastic exponential stability for Markovian jumping BAM neural networks with time-varying delays. IEEE Trans Syst Man Cybern B Cybern 2007; 37(3): 713-9.
[21] Ma Q, Xu S, Zou Y, Lu J. Stability of stochastic Markovian jump neural networks with mode-dependent delays. Neurocomputing 2011; 74(12–13): 2157-63.
[22] Chen Y, Zheng WX. Stochastic state estimation for neural networks with distributed delays and Markovian jump. Neural Netw 2012; 25(1): 14-20.
[23] Zhu Q, Cao J. Robust exponential stability of Markovian jump impulsive stochastic Cohen-Grossberg neural networks with mixed time delays. IEEE Trans Neural Netw 2010; 21(8): 1314-25.
[24] Wu H, Wang J, Shi P. A delay decomposition approach to L2-L∞ filter design for stochastic systems with time-varying delay. Automatica 2011; 47(7): 1482-8.