# Novel Conditions for Robust Stability of Bidirectional Associative Memory Neural Networks with Multiple Time Delays

Abstract: This paper deals with the problem of robust stability of the class of bidirectional associative memory (BAM) neural networks with multiple time delays. Several new sufficient conditions that imply the existence, uniqueness and global robust stability of the equilibrium point for this class of BAM neural networks are obtained by using suitable Lyapunov functionals and exploiting the norm properties of interval matrices. The derived results depend basically on the system parameters of the neural network model and are independent of the time delays. We also give some numerical examples to show the applicability and novelty of the results, and compare them with the corresponding robust stability results derived in the previous literature.

Keywords: Robust Stability, BAM Neural Networks, Lyapunov Theorems, Interval Matrices.

1. Introduction

In recent years, dynamical neural networks have been extensively studied due to their potential applications in image processing, control theory, pattern recognition, associative memories and optimization problems. In these types of applications, the stability properties of the equilibrium point of neural networks are of great importance. In particular, when a neural network is electronically implemented, time delays become important parameters for the stability properties. On the other hand, in hardware implementations of neural networks, the network parameters of the system may change because of deviations in the values of the electronic components. In this case, we need to study the robust stability of neural networks. In the past literature, many different stability results for various neural network models have been reported in [1]-[17]. Bidirectional associative memory (BAM) neural networks were first introduced in [18]. The stability of BAM neural networks has been extensively studied in the past years, and a great number of sufficient conditions for the stability of BAM neural networks have been presented in [18]-[35]. However, most of these stability results are applicable only when the neural network model has a single delay. In this paper, we consider bidirectional associative memory neural networks with multiple time delays. By using suitable Lyapunov-Krasovskii functionals and the properties of the intervalized interconnection matrices of the neural system, some new delay-independent sufficient conditions for the existence, uniqueness and global robust asymptotic stability of the equilibrium point of BAM neural networks with multiple time delays are derived. Some numerical examples are presented to show the advantages of our results over the previous stability results derived in the literature.

2. BAM Neural Networks

The dynamics of a BAM neural network with constant multiple time delays are described by differential equations of the form:

$$\dot{u}_i(t) = -a_i u_i(t) + \sum_{j=1}^{m} w_{ji}\, g_j(z_j(t)) + \sum_{j=1}^{m} w^{\tau}_{ji}\, g_j(z_j(t-\tau_{ji})) + I_i, \quad i = 1,2,\dots,n,$$
$$\dot{z}_j(t) = -b_j z_j(t) + \sum_{i=1}^{n} v_{ij}\, g_i(u_i(t)) + \sum_{i=1}^{n} v^{\tau}_{ij}\, g_i(u_i(t-\sigma_{ij})) + J_j, \quad j = 1,2,\dots,m. \qquad (1)$$

The BAM neural network model (1) consists of two layers: $n$ denotes the number of neurons in the first layer and $m$ denotes the number of neurons in the second layer. $u_i(t)$ is the state of the $i$th neuron in the first layer and $z_j(t)$ is the state of the $j$th neuron in the second layer. $a_i$ and $b_j$ denote the neuron charging time constants and passive decay rates, respectively; $w_{ji}$, $w^{\tau}_{ji}$, $v_{ij}$ and $v^{\tau}_{ij}$ are synaptic connection strengths; $g_i$ and $g_j$ represent the activation functions of the neurons and the propagational signal functions, respectively; and $I_i$ and $J_j$ are the exogenous inputs.
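The dynamics in (1) can be explored numerically. The sketch below integrates a small BAM network with a forward-Euler scheme; every parameter value (weights, inputs, delay, initial history) is an illustrative assumption, not a value from the paper:

```python
import numpy as np

# Forward-Euler simulation of a small BAM network of the form (1).
# All parameter values here are illustrative assumptions.
n, m = 2, 2
a = np.array([1.0, 1.0])                  # charging time constants a_i
b = np.array([1.0, 1.0])                  # passive decay rates b_j
W = np.array([[0.2, -0.1], [0.1, 0.3]])   # w_ji, shape (m, n)
Wt = np.array([[0.1, 0.0], [0.0, 0.1]])   # delayed weights w^tau_ji
V = np.array([[0.3, 0.1], [-0.2, 0.2]])   # v_ij, shape (n, m)
Vt = np.array([[0.1, 0.0], [0.0, 0.1]])   # delayed weights v^tau_ij
I = np.array([0.1, -0.1])                 # exogenous inputs I_i
J = np.array([0.0, 0.2])                  # exogenous inputs J_j
g = np.tanh                               # activation satisfying (H1), (H2)
tau = 0.5                                 # one common delay for simplicity

h, T = 0.01, 20.0                         # Euler step and time horizon
steps, d = int(T / h), int(tau / h)
u = np.zeros((steps + 1, n))
z = np.zeros((steps + 1, m))
u[0], z[0] = [0.5, -0.5], [0.3, 0.1]      # constant initial history

for k in range(steps):
    ud, zd = u[max(k - d, 0)], z[max(k - d, 0)]       # delayed states
    du = -a * u[k] + W.T @ g(z[k]) + Wt.T @ g(zd) + I
    dz = -b * z[k] + V.T @ g(u[k]) + Vt.T @ g(ud) + J
    u[k + 1] = u[k] + h * du
    z[k + 1] = z[k] + h * dz

print(u[-1], z[-1])   # with these weak interconnections, states settle
```

With these small interconnection weights the trajectories converge to a single equilibrium regardless of the delay, which is the qualitative behavior the delay-independent conditions of this paper guarantee.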

We assume that $a_i$, $b_j$, $w_{ji}$, $w^{\tau}_{ji}$, $v_{ij}$, $v^{\tau}_{ij}$, $\tau_{ji}$ and $\sigma_{ij}$ in system (1) lie in the following intervals:

$$\underline{a}_i \le a_i \le \bar{a}_i, \quad \underline{b}_j \le b_j \le \bar{b}_j, \quad \underline{w}_{ji} \le w_{ji} \le \bar{w}_{ji}, \quad \underline{w}^{\tau}_{ji} \le w^{\tau}_{ji} \le \bar{w}^{\tau}_{ji}, \quad \underline{v}_{ij} \le v_{ij} \le \bar{v}_{ij}, \quad \underline{v}^{\tau}_{ij} \le v^{\tau}_{ij} \le \bar{v}^{\tau}_{ij}, \quad \underline{\tau}_{ji} \le \tau_{ji} \le \bar{\tau}_{ji}, \quad \underline{\sigma}_{ij} \le \sigma_{ij} \le \bar{\sigma}_{ij}. \qquad (2)$$

The activation functions are assumed to satisfy the following conditions:

(H1) There exist some positive constants $l_i$, $i = 1,2,\dots,n$, and $k_j$, $j = 1,2,\dots,m$, such that

$$|g_i(\bar{x}) - g_i(\bar{y})| \le l_i |\bar{x} - \bar{y}|, \qquad |g_j(\hat{x}) - g_j(\hat{y})| \le k_j |\hat{x} - \hat{y}|$$

for all $\bar{x}, \bar{y}, \hat{x}, \hat{y} \in \mathbb{R}$. This class of functions is denoted by $g \in \mathcal{K}$.

(H2) There exist positive constants $M_i$, $i = 1,2,\dots,n$, and $L_j$, $j = 1,2,\dots,m$, such that $|g_i(u)| \le M_i$ and $|g_j(z)| \le L_j$ for all $u, z \in \mathbb{R}$. This class of functions is denoted by $g \in \mathcal{B}$.
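As a concrete instance, $g(x) = \tanh(x)$ satisfies (H1) with Lipschitz constant $l_i = 1$ and (H2) with bound $M_i = 1$; a quick numerical spot-check on random sample points (the sample itself is an assumption of this sketch):

```python
import numpy as np

# tanh satisfies (H1) with l_i = 1 (global Lipschitz constant) and (H2)
# with M_i = 1 (uniform bound). Spot-check both on random pairs of points.
rng = np.random.default_rng(0)
x = rng.normal(scale=3.0, size=1000)
y = rng.normal(scale=3.0, size=1000)

lipschitz_ok = np.all(np.abs(np.tanh(x) - np.tanh(y)) <= np.abs(x - y) + 1e-12)
bounded_ok = np.all(np.abs(np.tanh(x)) <= 1.0)
print(lipschitz_ok, bounded_ok)   # True True
```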

3. Preliminaries

Let $v = (v_1, v_2, \dots, v_n)^T \in \mathbb{R}^n$ be a column vector and $Q = (q_{ij})_{n \times n}$ be a real matrix. The three commonly used vector norms $\|v\|_1$, $\|v\|_2$, $\|v\|_\infty$ are defined as:

$$\|v\|_1 = \sum_{i=1}^{n} |v_i|, \qquad \|v\|_2 = \sqrt{\sum_{i=1}^{n} v_i^2}, \qquad \|v\|_\infty = \max_{1 \le i \le n} |v_i|.$$

The three corresponding matrix norms $\|Q\|_1$, $\|Q\|_2$, $\|Q\|_\infty$ are defined as follows:

$$\|Q\|_1 = \max_{j} \sum_{i=1}^{n} |q_{ij}|, \qquad \|Q\|_2 = \sqrt{\lambda_M(Q^T Q)}, \qquad \|Q\|_\infty = \max_{i} \sum_{j=1}^{n} |q_{ij}|.$$

If $v = (v_1, v_2, \dots, v_n)^T$, then $|v|$ will denote $|v| = (|v_1|, |v_2|, \dots, |v_n|)^T$. If $Q = (q_{ij})_{n \times n}$, then $|Q|$ will denote $|Q| = (|q_{ij}|)_{n \times n}$, and $\lambda_m(Q)$ and $\lambda_M(Q)$ will denote the minimum and maximum eigenvalues of $Q$, respectively.
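These definitions translate directly into NumPy; the vector and matrix below are illustrative:

```python
import numpy as np

# Vector norms ||v||_1, ||v||_2, ||v||_inf and the induced matrix norms:
# ||Q||_1 is the maximum absolute column sum, ||Q||_inf the maximum
# absolute row sum, and ||Q||_2 = sqrt(lambda_M(Q^T Q)) the spectral norm.
v = np.array([1.0, -2.0, 3.0])
Q = np.array([[1.0, -2.0],
              [3.0,  4.0]])

norm1_v = np.sum(np.abs(v))              # 6.0
norm2_v = np.sqrt(np.sum(v ** 2))        # sqrt(14)
norminf_v = np.max(np.abs(v))            # 3.0

norm1_Q = np.max(np.sum(np.abs(Q), axis=0))             # 6.0 (column sums)
norminf_Q = np.max(np.sum(np.abs(Q), axis=1))           # 7.0 (row sums)
norm2_Q = np.sqrt(np.max(np.linalg.eigvalsh(Q.T @ Q)))  # lambda_M route
```

Here `norm2_Q` agrees with `np.linalg.norm(Q, 2)`, which computes the same largest singular value.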

Lemma 1 [36]: Let $A$ be any real matrix defined by $A \in A_1 := \{A = (a_{ij}) : \underline{A} \le A \le \bar{A}, \text{ i.e., } \underline{a}_{ij} \le a_{ij} \le \bar{a}_{ij},\ i,j = 1,2,\dots,n\}$. Define $A^* = \frac{1}{2}(\bar{A} + \underline{A})$ and $A_* = \frac{1}{2}(\bar{A} - \underline{A})$. Let

$$\sigma_1(A) = \sqrt{\big\| \,|A^{*T} A^*| + 2|A^{*T}| A_* + A_*^T A_* \,\big\|_2}.$$

Then, the following inequality holds: $\|A\|_2 \le \sigma_1(A)$.

Lemma 2 [37]: Let $A$ be any real matrix defined by $A \in A_1 := \{A = (a_{ij}) : \underline{A} \le A \le \bar{A}, \text{ i.e., } \underline{a}_{ij} \le a_{ij} \le \bar{a}_{ij},\ i,j = 1,2,\dots,n\}$. Define $A^* = \frac{1}{2}(\bar{A} + \underline{A})$ and $A_* = \frac{1}{2}(\bar{A} - \underline{A})$. Let

$$\sigma_2(A) = \|A^*\|_2 + \|A_*\|_2.$$

Then, the following inequality holds: $\|A\|_2 \le \sigma_2(A)$.

Lemma 3 [38]: Let $A$ be any real matrix defined by $A \in A_1 := \{A = (a_{ij}) : \underline{A} \le A \le \bar{A}, \text{ i.e., } \underline{a}_{ij} \le a_{ij} \le \bar{a}_{ij},\ i,j = 1,2,\dots,n\}$. Define $A^* = \frac{1}{2}(\bar{A} + \underline{A})$ and $A_* = \frac{1}{2}(\bar{A} - \underline{A})$. Let

$$\sigma_3(A) = \sqrt{\|A^*\|_2^2 + \|A_*\|_2^2 + 2\big\|A_*^T |A^*|\big\|_2}.$$

Then, the following inequality holds: $\|A\|_2 \le \sigma_3(A)$.

Lemma 4 [39]: Let $A$ be any real matrix defined by $A \in A_1 := \{A = (a_{ij}) : \underline{A} \le A \le \bar{A}, \text{ i.e., } \underline{a}_{ij} \le a_{ij} \le \bar{a}_{ij},\ i,j = 1,2,\dots,n\}$. Define $\hat{A} = (\hat{a}_{ij})_{n \times n}$ with $\hat{a}_{ij} = \max\{|\underline{a}_{ij}|, |\bar{a}_{ij}|\}$. Let

$$\sigma_4(A) = \|\hat{A}\|_2.$$

Then, the following inequality holds: $\|A\|_2 \le \sigma_4(A)$.
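Since none of the four bounds dominates the others for every interval matrix, the sharpest usable bound is $\sigma_m(A) = \min\{\sigma_1(A), \sigma_2(A), \sigma_3(A), \sigma_4(A)\}$. The following sketch evaluates all four bounds for one illustrative interval matrix (the endpoints are assumptions) and spot-checks the inequality $\|A\|_2 \le \sigma_m(A)$ on random members of the interval:

```python
import numpy as np

def norm2(M):
    # spectral norm (largest singular value)
    return np.linalg.norm(M, 2)

# Illustrative interval endpoints A_lo <= A <= A_hi (assumptions).
A_lo = np.array([[-1.0, 0.2], [0.1, -0.5]])
A_hi = np.array([[ 0.4, 0.8], [0.9,  0.3]])
A_star = 0.5 * (A_hi + A_lo)   # A*  (midpoint matrix)
A_sub = 0.5 * (A_hi - A_lo)    # A_* (radius matrix, elementwise nonnegative)

# sigma_1 .. sigma_4 of Lemmas 1-4
sigma1 = np.sqrt(norm2(np.abs(A_star.T @ A_star)
                       + 2.0 * np.abs(A_star.T) @ A_sub + A_sub.T @ A_sub))
sigma2 = norm2(A_star) + norm2(A_sub)
sigma3 = np.sqrt(norm2(A_star) ** 2 + norm2(A_sub) ** 2
                 + 2.0 * norm2(A_sub.T @ np.abs(A_star)))
sigma4 = norm2(np.maximum(np.abs(A_lo), np.abs(A_hi)))

bound = min(sigma1, sigma2, sigma3, sigma4)   # sigma_m(A)

# Spot-check ||A||_2 <= sigma_m(A) on random members of the interval.
rng = np.random.default_rng(1)
samples_ok = all(
    norm2(A_lo + (A_hi - A_lo) * rng.random((2, 2))) <= bound + 1e-9
    for _ in range(200)
)
print(bound, samples_ok)
```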

4. Global Robust Stability Results

In this section, we present some sufficient conditions for the global robust asymptotic stability of the equilibrium point of neural network model (1). First, the equilibrium point of system (1) will be shifted to the origin. By the transformation

$$x_i(\cdot) = u_i(\cdot) - u_i^*, \quad i = 1,2,\dots,n, \qquad y_j(\cdot) = z_j(\cdot) - z_j^*, \quad j = 1,2,\dots,m,$$

system (1) can be transformed into a new system of the following form:

[mathematical expression not reproducible] (3)

where [mathematical expression not reproducible]

The functions $f_i(x_i)$, $f_j(y_j)$ are of the form:

$$f_i(x_i) = g_i(x_i + u_i^*) - g_i(u_i^*), \qquad f_j(y_j) = g_j(y_j + z_j^*) - g_j(z_j^*).$$

It can be noted that the functions $f_i$ and $f_j$ satisfy the assumptions on $g_i$ and $g_j$; i.e., $g_i \in \mathcal{K}$ and $g_j \in \mathcal{B}$ imply that $f_i \in \mathcal{K}$ and $f_j \in \mathcal{B}$, respectively. It is also easy to see that $f_i(0) = 0$, $i = 1,2,\dots,n$, and $f_j(0) = 0$, $j = 1,2,\dots,m$.

Note that the equilibrium point of system (1) is globally asymptotically stable if the origin of system (3) is globally asymptotically stable. Therefore, proving the global asymptotic stability of the equilibrium point of system (1) is equivalent to proving the global asymptotic stability of the origin of system (3). We now state the following result:

Theorem 1 : Let the assumptions (H1) and (H2) hold. Then, neural system (1) with (2) has a unique equilibrium point which is globally asymptotically robustly stable if there exist positive constants [alpha], [gamma] and [beta] such that

[mathematical expression not reproducible]

where [mathematical expression not reproducible]

Proof: Define the following positive definite Lyapunov functional:

[mathematical expression not reproducible]

The derivative of V(x(t), y(t)) along the trajectories of the system is obtained as:

[mathematical expression not reproducible] (4)

We note the following inequalities :

[mathematical expression not reproducible] (5)

[mathematical expression not reproducible] (6)

[mathematical expression not reproducible] (7)

[mathematical expression not reproducible] (8)

Using (5)-(8) in (4) results in

[mathematical expression not reproducible]

Since [mathematical expression not reproducible]

[mathematical expression not reproducible]

Since $\delta_i > 0$ for $i = 1,2,\dots,n$ and $\Omega_j > 0$ for $j = 1,2,\dots,m$, it follows that $\dot{V}(x(t), y(t)) < 0$ for $x(t) \ne 0$ or $y(t) \ne 0$. Hence, by the standard Lyapunov-type theorem for functional differential equations, we can conclude that the origin of system (3) is globally asymptotically stable.

Theorem 2 : Let the assumptions (H1) and (H2) hold. Then, neural system (1) with (2) has a unique equilibrium point which is globally asymptotically robustly stable if there exist positive constants [alpha] and [beta] such that

[mathematical expression not reproducible]

where [mathematical expression not reproducible]

Proof: Define the following positive definite Lyapunov functional:

[mathematical expression not reproducible]

The derivative of V(x(t), y(t)) along the trajectories of the system is obtained as:

[mathematical expression not reproducible] (9)

We also note that

[mathematical expression not reproducible] (10)

[mathematical expression not reproducible] (11)

Using (5), (6), (10) and (11) in (9) leads to:

[mathematical expression not reproducible]

Since [mathematical expression not reproducible]

[mathematical expression not reproducible]

where $\dot{V}(x(t), y(t)) < 0$ for all $x(t) \ne 0$ or $y(t) \ne 0$. Hence, the origin of system (3) is globally asymptotically stable.

The following corollaries are the direct results of Theorems 1 and 2 :

Corollary 1: Let $\underline{a}_m = \min_i\{\underline{a}_i\}$, $\underline{b}_m = \min_j\{\underline{b}_j\}$, $l_M = \max_i\{l_i\}$, $k_M = \max_j\{k_j\}$.

[mathematical expression not reproducible]

where [mathematical expression not reproducible].

Corollary 2: Let [mathematical expression not reproducible]

[mathematical expression not reproducible]

where [mathematical expression not reproducible].

Corollary 3: Let $\gamma = l_M \sigma_m(V)$, $\beta = k_M \sigma_m(W)$.

[mathematical expression not reproducible]

where [mathematical expression not reproducible]

Corollary 4: Let $\gamma = l_M \sigma_m(V)$, $\beta = k_M \sigma_m(W)$.

[mathematical expression not reproducible]

where [mathematical expression not reproducible].

5. Comparisons and Examples

In this section, the results obtained in this paper will be compared with the previous global robust stability results for BAM neural networks derived in the literature. In order to make the comparison precise, the previous results will first be restated:

Corollary 5 [40] : Let the activation functions satisfy assumptions (H1) and (H2). Then, neural system (1) with (2) has a unique equilibrium point which is globally asymptotically robustly stable if there exist positive constants [alpha], [beta] and [gamma] such that the network parameters of the system satisfy the following conditions

[mathematical expression not reproducible]

where [mathematical expression not reproducible].

Corollary 6 [40]: Let the activation functions satisfy assumptions (H1) and (H2). Then, neural system (1) with (2) has a unique equilibrium point which is globally asymptotically robustly stable if there exist positive constants [alpha], [beta] and [gamma] such that the network parameters of the system satisfy the following conditions

[mathematical expression not reproducible]

where $W = (w_{ji})$, $V = (v_{ij})$, $W^* = \frac{1}{2}(\bar{W} + \underline{W})$, $W_* = \frac{1}{2}(\bar{W} - \underline{W})$, $V^* = \frac{1}{2}(\bar{V} + \underline{V})$, $V_* = \frac{1}{2}(\bar{V} - \underline{V})$, $v^{\tau*}_{ij} = \max\{|\underline{v}^{\tau}_{ij}|, |\bar{v}^{\tau}_{ij}|\}$ and $w^{\tau*}_{ji} = \max\{|\underline{w}^{\tau}_{ji}|, |\bar{w}^{\tau}_{ji}|\}$.

We can write the following results for Corollary 5 and Corollary 6 :

Corollary 7: Let $l_M = \max_i\{l_i\}$, $k_M = \max_j\{k_j\}$, $\gamma = l_M \sqrt{\|V^*\|_2^2 + \|V_*\|_2^2 + 2\|V_*^T |V^*|\|_2}$, $\beta = k_M \sqrt{\|W^*\|_2^2 + \|W_*\|_2^2 + 2\|W_*^T |W^*|\|_2}$.

[mathematical expression not reproducible]

Corollary 8: Let $l_M = \max_i\{l_i\}$, $k_M = \max_j\{k_j\}$, $\gamma = l_M \sqrt{\|V^*\|_2^2 + \|V_*\|_2^2 + 2\|V_*^T |V^*|\|_2}$, $\beta = k_M \sqrt{\|W^*\|_2^2 + \|W_*\|_2^2 + 2\|W_*^T |W^*|\|_2}$.

[mathematical expression not reproducible]

Example 1: Assume that the network parameters of neural system (1) are given as follows:

[mathematical expression not reproducible]

$\underline{A} = A = \bar{A} = \underline{B} = B = \bar{B} = I$, $l_1 = l_2 = l_3 = l_4 = k_1 = k_2 = k_3 = k_4 = 1$, where $a > 0$ is a real number. The matrices $W^*$, $W_*$, $V^*$, $V_*$, $W_*^T|W^*|$, $V_*^T|V^*|$, $\hat{W}$ and $\hat{V}$ are obtained as follows:

[mathematical expression not reproducible]

We calculate

[mathematical expression not reproducible]

$\sigma_1(W) = \sigma_1(V)$, $\sigma_2(W) = \sigma_2(V)$, $\sigma_3(W) = \sigma_3(V)$, $\sigma_4(W) = \sigma_4(V)$. Hence

$$\sigma_m(V) = \min\{\sigma_1(V), \sigma_2(V), \sigma_3(V), \sigma_4(V)\} = 5.0364\,a, \qquad \sigma_m(W) = \min\{\sigma_1(W), \sigma_2(W), \sigma_3(W), \sigma_4(W)\} = 5.0364\,a.$$

For the network parameters of this example, the conditions of Corollary 3 and Corollary 4 are obtained as follows:

$$\phi_1 = \phi_2 = \phi_3 = \phi_4 = \psi_1 = \psi_2 = \psi_3 = \psi_4 = \zeta_1 = \zeta_2 = \zeta_3 = \zeta_4 = \xi_1 = \xi_2 = \xi_3 = \xi_4 = 8 - 4\alpha - 8(5.0364\,a) - \frac{256 a^2}{\alpha}$$

Let $\alpha = 8a$. Hence, if $a < 8/104.2912$ holds, then the conditions of Corollaries 3 and 4 are satisfied.
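The threshold for $a$ follows by direct substitution of $\alpha = 8a$ into the common expression above:

```latex
8 - 4\alpha - 8(5.0364\,a) - \frac{256 a^2}{\alpha}\bigg|_{\alpha = 8a}
  = 8 - 32a - 40.2912\,a - 32a
  = 8 - 104.2912\,a ,
```

which is positive exactly when $a < 8/104.2912$. The identical substitution with the constant $5.6245$ appearing in Corollaries 7 and 8 yields $8 - 108.996\,a$, and hence the bound $a < 8/108.996$.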

We will now check the results of Corollary 7 and Corollary 8 for the same network parameters. The conditions of Corollary 7 and Corollary 8 are obtained as follows:

$$\delta_1 = \delta_2 = \delta_3 = \delta_4 = \Omega_1 = \Omega_2 = \Omega_3 = \Omega_4 = \phi_1 = \phi_2 = \phi_3 = \phi_4 = \theta_1 = \theta_2 = \theta_3 = \theta_4 = 8 - 4\alpha - 8(5.6245\,a) - \frac{256 a^2}{\alpha}$$

Let $\alpha = 8a$. Hence, if $a < 8/108.996$ holds, then the conditions of Corollaries 7 and 8 are satisfied.

Remark: For the parameters in this example, our results require that $a < 8/104.2912$, whereas the results of Corollaries 7 and 8 require that $a < 8/108.996$. Therefore, for $8/108.996 \le a < 8/104.2912$, the conditions obtained in Corollary 3 and Corollary 4 are satisfied but the results of Corollary 7 and Corollary 8 do not hold.

6. Conclusions

In this paper, by using Lyapunov stability theorems and the norm properties of the interconnection matrices of the neural system, some novel sufficient conditions for the existence, uniqueness and global robust asymptotic stability of the equilibrium point have been obtained for the class of bidirectional associative memory (BAM) neural networks with multiple time delays. We have also compared our results with the most recent corresponding stability results, showing that our results establish a new set of global robust asymptotic stability criteria for BAM neural networks with multiple time delays.

7. References

[1] C.-D. Zheng, H. Zhang and Z. Wang, "Novel Exponential Stability Criteria of High-Order Neural Networks With Time-Varying Delays", IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 41, no. 2, pp. 486-496, 2011.

[2] Q. Song and Z. Wang, "Neural networks with discrete and distributed time-varying delays: A general stability analysis", Chaos, Solitons and Fractals, vol. 37, no. 5, pp. 1538-1547, 2008.

[3] Y. He, G.P. Liu, D. Rees and M. Wu, "Stability analysis for neural networks with time-varying interval delay", IEEE Transactions on Neural Networks, vol. 18, no. 6, pp. 1850-1854, 2007.

[4] X. Meng, M. Tian and S. Hu, "Stability analysis of stochastic recurrent neural networks with unbounded time-varying delays", Neurocomputing, vol. 74, no. 6, pp. 949-953, 2011.

[5] H. Zhang, Z. Wang, and D. Liu, "Global asymptotic stability of recurrent neural networks with multiple time varying delays", IEEE Transactions on Neural Networks, vol. 19, no. 5, pp. 855-873, 2008.

[6] R. Yang, Z. Zhang and P. Shi, "Exponential Stability on Stochastic Neural Networks With Discrete Interval and Distributed Delays", IEEE Transactions on Neural Networks, vol. 21, no. 1, pp. 169-175, 2010.

[7] R.-S. Gau, C.-H. Lien and J.-G. Hsieh, "Novel Stability Conditions For Interval Delayed Neural Networks With Multiple Time-Varying Delays", International Journal Of Innovative Computing Information And Control, vol. 7, no. 1, pp. 433-444, 2011.

[8] Z. Liu, H. Zhang and Q. Zhang, "Novel Stability Analysis for Recurrent Neural Networks with Multiple Delays via Line Integral-Type L-K Functional", IEEE Transactions on Neural Networks, vol. 21, no. 11, pp. 1710-1718, 2010.

[9] Z. Zuo, C. Yang and Y. Wang, "A New Method for Stability Analysis of Recurrent Neural Networks With Interval Time-Varying Delay", IEEE Transactions on Neural Networks, vol. 21, no. 2, pp. 339-344, 2010.

[10] Z.-G. Wu, Ju H. Park, H. Su and J. Chu, "New results on exponential passivity of neural networks with time-varying delays", Nonlinear Analysis: Real World Applications, vol. 13, no. 4, pp. 1593-1599, 2012.

[11] D.H. Ji, J.H. Koo, S.C. Won, S.M. Lee and Ju H. Park, "Passivity-based control for Hopfield neural networks using convex representation", Applied Mathematics and Computation, vol. 217, no. 13, pp. 6168-6175, 2011.

[12] P. Balasubramaniam and S. Lakshmanan, "Delay-range dependent stability criteria for neural networks with Markovian jumping parameters", Nonlinear Analysis: Hybrid Systems, vol. 3, no. 4, pp. 749-756, 2009.

[13] H. Zhang, Z. Wang and D. Liu, "Global Asymptotic Stability and Robust Stability of a Class of Cohen-Grossberg Neural Networks With Mixed Delays", IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 56, no. 3, pp. 616-629, 2009.

[14] P. Balasubramaniam and M.S. Ali, "Robust stability of uncertain fuzzy cellular neural networks with time-varying delays and reaction diffusion terms", Neurocomputing, vol. 74, no. 1-3; pp. 439-446, 2010.

[15] W.-H. Chen and W.X. Zheng, "Robust Stability Analysis for Stochastic Neural Networks With Time-Varying Delay", IEEE Transactions on Neural Networks, vol. 21, no. 3, pp. 508-514, 2010.

[16] Y. Zhao, L. Zhang, S. Shen and H. Gao, "Robust Stability Criterion for Discrete-Time Uncertain Markovian Jumping Neural Networks with Defective Statistics of Modes Transitions", IEEE Transactions on Neural Networks, vol. 22, no. 1, pp. 164-170, 2011.

[17] P. Balasubramaniam, S. Lakshmanan and R. Rakkiyappan, "Delay-interval dependent robust stability criteria for stochastic neural networks with linear fractional uncertainties", Neurocomputing, vol. 72, no. 16-18, pp. 3675-3682, 2009.

[18] B. Kosko, "Adaptive bi-directional associative memories", Appl. Opt., vol. 26, pp. 4947-4960, 1987.

[19] G. Mathai and B.R. Upadhyaya, "Performance analysis and application of the bidirectional associative memory to industrial spectral signatures", Proc. IJCNN, vol. 89, no. 1, pp. 33-37, 1989.

[20] S. Arik, "Global Asymptotic Stability Analysis of Bidirectional Associative Memory Neural Networks with Time Delays", IEEE Transactions on Neural Networks, vol. 16, no. 3, pp. 580-586, 2005.

[21] J. Cao, J. Liang and J. Lam, "Exponential stability of high-order bidirectional associative memory neural networks with time delays", Physica D: Nonlinear Phenomena, vol. 199, no. 3-4, pp. 425-436, 2004.

[22] Z.-T. Huang, X.-S Luo and Q.-G. Yang, "Global asymptotic stability analysis of bidirectional associative memory neural networks with distributed delays and impulse", Chaos, Solitons and Fractals, vol. 34, no. 3, pp. 878-885, 2007.

[23] X. Lou, B. Cui and W. Wu, "On global exponential stability and existence of periodic solutions for BAM neural networks with distributed delays and reaction-diffusion terms", Chaos, Solitons and Fractals, vol. 36, no. 4, pp. 1044-1054, 2008.

[24] J.H. Park, S.M. Lee and O.M. Kwon, "On exponential stability of bidirectional associative memory neural networks with time-varying delays", Chaos, Solitons and Fractals, vol. 39, pp. 1083-1091, 2009.

[25] Y. Wang, "Global exponential stability analysis of bidirectional associative memory neural networks with time-varying delays", Nonlinear Analysis: Real World Applications, vol. 10, pp. 1527-1539, 2009.

[26] Y. Yuan and X. Li, "New results for global robust asymptotic stability of BAM neural networks with time-varying delays", Neurocomputing, vol. 74, no. 1-3, pp. 337-342, 2010.

[27] B. Chen, L. Yu and W.-A. Zhang, "Exponential convergence rate estimation for neutral BAM neural networks with mixed time-delays", Neural Computing and Applications, vol. 20, no. 3, pp. 451-460, 2011.

[28] P. Balasubramaniam and C. Vidhya, "Global asymptotic stability of stochastic BAM neural networks with distributed delays and reaction-diffusion terms", Journal of Computational and Applied Mathematics, vol. 234, no. 12, pp. 3458-3466, 2010.

[29] Ju H. Park, C.H. Park, O.M. Kwon and S.M. Lee, "A new stability criterion for bidirectional associative memory neural networks of neutral-type", Applied Mathematics and Computation, vol. 199, no. 2, pp. 716-722, 2008.

[30] Ju H. Park, "Robust stability of bidirectional associative memory neural networks with time delays", Physics Letters A, vol. 349, no. 6, pp. 494-499, 2006.

[31] X. F. Liao and K. Wong, "Global exponential stability of hybrid bidirectional associative memory neural networks with discrete delays", Physical Review E, vol. 67, no. 4, (0402901), 2003.

[32] X. F. Liao and K. Wong, "Robust stability of interval bidirectional associative memory neural network with time delays", IEEE Trans. Systems, Man and Cybernetics-Part C, vol. 34, pp. 1142-1154, 2004.

[33] S. Senan and S. Arik, "New results for global robust stability of bidirectional associative memory neural networks with multiple time delays", Chaos, Solitons and Fractals, vol. 41, no. 4, pp. 2106-2114, 2009.

[34] S. Senan and S. Arik, "Global robust stability of bidirectional associative memory neural networks with multiple time delays", IEEE Trans. Systems, Man and Cybernetics-Part B, vol. 37, no. 5, pp. 1375-1381, 2007.

[35] N. Ozcan and S. Arik, "A new sufficient condition for global robust stability of bidirectional associative memory neural networks with multiple time delays", Nonlinear Analysis: Real World Applications, vol. 10, pp. 3312-3320, 2009.

[36] O. Faydasicok and S. Arik, "A new upper bound for the norm of interval matrices with application to robust stability analysis of delayed neural networks", Neural Networks, vol. 44, pp. 64-71, 2013.

[37] A. Chen, J. Cao and L. Huang, "Global robust stability of interval cellular neural networks with time-varying delays", Chaos, Solitons and Fractals, vol. 23, no. 3, pp. 787-799, 2005.

[38] T. Ensari and S. Arik, "New results for robust stability of dynamical neural networks with discrete time delays", Expert Systems with Applications, vol. 37, no. 8, pp. 5925-5930, 2010.

[39] V. Singh, "Global robust stability of delayed neural networks: estimating upper limit of norm of delayed connection weight matrix", Chaos, Solitons and Fractals, vol. 32, no. 1, pp. 259-263, 2007.

[40] S. Senan, S. Arik and D. Liu, "New robust stability results for bidirectional associative memory neural networks with multiple time delays", Applied Mathematics and Computation, vol. 218, no. 23, pp. 11472-11482, 2013.

Eylem Yucel received the B.Sc., M.Sc. and Ph.D. degrees from Istanbul University, Istanbul, Turkey, in 2001, 2005 and 2010, respectively. She has been working as an Assistant Professor at the Department of Computer Engineering, Istanbul University, since 2010. Her research interests are neural networks and nonlinear systems.

Eylem Yucel

Istanbul University, Department of Computer Engineering, Istanbul, Turkey, eylem@istanbul.edu.tr

Received on: 24.05.2016

Accepted on: 31.01.2017

Publication: Istanbul University - Journal of Electrical & Electronics Engineering, January 2017.