AI&NN Notes

Chapter 9  Feedback Neural Networks

9.1 Basic Concepts

Attractor: a state toward which the system evolves in time, starting from certain initial conditions.

Basin of attraction: the set of initial conditions that initiate an evolution terminating in the attractor.

Fixed point: an attractor consisting of a unique point in state space.

Limit cycle: an attractor consisting of a periodic sequence of states.

Hopfield Network and its Basic Assumptions

[Figure: single-layer feedback network of n neurons with outputs v_1, ..., v_n, thresholds T_1, ..., T_n, external inputs i_1, ..., i_n, and feedback weights w_{ij} connecting the output of neuron j to the input of neuron i.]

1. One layer of n neurons.
2. T_i - threshold of neuron i.
3. w_{ij} - weight from neuron j to neuron i.
4. v_j - output of neuron j.
5. i_i - external input to the i-th neuron.

The total input of the i-th neuron is

    net_i = \sum_{j=1, j \neq i}^{n} w_{ij} v_j + i_i - T_i = W_i^t V + i_i - T_i,    for i = 1, 2, ..., n

where

    W_i = [w_{i1}, w_{i2}, ..., w_{in}]^t,    V = [v_1, v_2, ..., v_n]^t.

The complete matrix description of the linear portion of the system shown in the figure is

    net = W V + i - t

where

    net = [net_1, net_2, ..., net_n]^t,    i = [i_1, i_2, ..., i_n]^t,    t = [T_1, T_2, ..., T_n]^t

are the vectors containing the activations, the external inputs, and the thresholds of the neurons, respectively, and W is the n x n matrix of network weights:

    W = \begin{bmatrix} W_1^t \\ W_2^t \\ \vdots \\ W_n^t \end{bmatrix}
      = \begin{bmatrix}
          0      & w_{12} & w_{13} & \cdots & w_{1n} \\
          w_{21} & 0      & w_{23} & \cdots & w_{2n} \\
          w_{31} & w_{32} & 0      & \cdots & w_{3n} \\
          \vdots & \vdots & \vdots & \ddots & \vdots \\
          w_{n1} & w_{n2} & w_{n3} & \cdots & 0
        \end{bmatrix}

with w_{ij} = w_{ji} and w_{ii} = 0.

9.2 Discrete-Time Hopfield Network

Assuming that the neuron's activation function is sgn, the transition rule of the i-th neuron is

    v_i = +1  if net_i > 0  (excitatory state)
    v_i = -1  if net_i < 0  (inhibitory state)                                        (*)

If, at a given time, only a single neuron is allowed to update its output, so that only one entry of the vector v changes, this is an asynchronous operation, under which each element of the output vector is updated separately while taking into account the most recent values of the elements that have already been updated.

Based on (*), the update rule of a discrete-time recurrent network, for one value of i at a time, becomes

    v_i^{k+1} = sgn(W_i^t v^k + i_i - T_i)    for a randomly chosen i, i = 1, 2, ..., n, and k = 0, 1, 2, ...

where k denotes the index of the recursive update. This is referred to as the asynchronous stochastic recursion of the Hopfield network. The update process continues until all n entries of v have been updated, and the recursive computation continues until the output vector remains unchanged under further iterations.

Similarly, for synchronous operation we have

    v^{k+1} = \Gamma[W v^k + i - t],    k = 0, 1, 2, ...

where \Gamma[\cdot] applies sgn componentwise and all neurons change their outputs simultaneously.

Geometrical Explanation

The output vector v is one of the vertices of the n-dimensional cube [-1, 1]^n in E^n space. During the recursion the vector moves from vertex to vertex until it stabilizes in one of the 2^n available vertices.

The movement is always from a vertex to an adjacent vertex, since the asynchronous update mode allows only a single component of the n-tuple output vector to change at a time.

The final position of v as k -> \infty is determined by the weights, thresholds, and inputs, by the initial vector v^0, and by the order of transitions.

To evaluate the stability property of this dynamical system, a computational energy function is defined on the n-dimensional output space. If the increments of a certain bounded, positive-valued computational energy function under the transition rule are found to be non-positive, then that function is a Lyapunov function and the system is asymptotically stable.

The scalar-valued energy function for the discussed system is the quadratic form

    E = -\frac{1}{2} V^t W V - i^t V + t^t V

or, written out,

    E = -\frac{1}{2} \sum_{i=1}^{n} \sum_{j=1, j \neq i}^{n} w_{ij} v_i v_j - \sum_{i=1}^{n} i_i v_i + \sum_{i=1}^{n} T_i v_i.
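The formulas above translate directly into a few lines of code. The following is a minimal NumPy sketch of the energy function and of the two update modes; the function names (energy, async_step, sync_step) are illustrative, and leaving a neuron's output unchanged when net_i = 0 is an assumed convention, since the notes only define the net_i > 0 and net_i < 0 cases.

```python
import numpy as np

def energy(W, v, i_ext, t):
    """Quadratic energy E = -1/2 v^t W v - i^t v + t^t v, as defined above."""
    return -0.5 * v @ W @ v - i_ext @ v + t @ v

def async_step(W, v, i_ext, t, idx):
    """Asynchronous update of the single neuron idx:
    v_idx <- sgn(W_idx^t v + i_idx - T_idx).
    Keeping the old output when net = 0 is an assumed (common) convention."""
    net = W[idx] @ v + i_ext[idx] - t[idx]
    v = v.copy()
    if net > 0:
        v[idx] = 1.0
    elif net < 0:
        v[idx] = -1.0
    return v

def sync_step(W, v, i_ext, t):
    """Synchronous update v^{k+1} = Gamma[W v^k + i - t]: all neurons at once."""
    net = W @ v + i_ext - t
    return np.where(net > 0, 1.0, np.where(net < 0, -1.0, v))
```

Repeating v = async_step(W, v, i_ext, t, idx) for randomly chosen idx until a full sweep leaves v unchanged implements the asynchronous stochastic recursion described above, while sync_step applies the synchronous rule to all neurons at once.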
The energy function in asynchronous mode

Assume that output node i has been updated at the k-th instant, so that v_i^{k+1} - v_i^k = \Delta v_i. Computing the energy gradient vector gives

    \nabla_v E = -\frac{1}{2} (W^t + W) v - i + t = -W v - i + t,    since W^t = W.

The energy increment becomes

    \Delta E = (\nabla_v E)^t \Delta v = (-W_i^t v - i_i + T_i) \Delta v_i

because only the i-th output is updated: (\Delta v)_j = 0 for j \neq i and (\Delta v)_i = \Delta v_i \neq 0. (Since w_{ii} = 0, E is linear in each individual v_i, so this expression is exact rather than a first-order approximation.) It can be rewritten as

    \Delta E = -\Big( \sum_{j=1, j \neq i}^{n} w_{ij} v_j + i_i - T_i \Big) \Delta v_i

or, briefly,

    \Delta E = -net_i \Delta v_i.

Note that when net_i < 0 the update gives \Delta v_i <= 0, and when net_i > 0 it gives \Delta v_i >= 0, so the product net_i \Delta v_i is always non-negative. In other words, every energy change \Delta E is non-positive, provided that w_{ij} = w_{ji}.

Further, we can show that this non-increasing energy function reaches a minimum. Since W is indefinite because of its zero diagonal, E has neither a minimum nor a maximum in unconstrained output space. However, E is obviously bounded on the set of the 2^n vertices of the n-dimensional cube, so under the update algorithm E eventually reaches its minimum there.

Example of a recursive asynchronous update of a corrupted digit 4:

[Figure: five bitmaps (a)-(e) of the digit 4, where (a) k = 0, (b) k = 1, (c) k = 2, (d) k = 3, (e) k = 4.]

The initial map is a destroyed digit 4 with 20% of the pixels randomly reversed. For k >= 4 no changes are produced at the network output, since the system has arrived at one of its stable states.
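The non-positivity of \Delta E can also be checked numerically. The sketch below is an illustration, not taken from the notes: it builds a random symmetric weight matrix with zero diagonal (the excerpt does not give the weights used for the digit-4 example), runs the asynchronous stochastic recursion, and asserts after every single-neuron update that \Delta E = -net_i \Delta v_i <= 0, stopping once a full sweep produces no change, i.e. once the state has settled at one of the 2^n vertices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16

# Symmetric weights with zero diagonal (w_ij = w_ji, w_ii = 0), as required by
# the Lyapunov argument; random values are used purely for illustration.
A = rng.standard_normal((n, n))
W = (A + A.T) / 2.0
np.fill_diagonal(W, 0.0)

i_ext = rng.standard_normal(n)           # external inputs i
t = rng.standard_normal(n)               # thresholds T
v = rng.choice([-1.0, 1.0], size=n)      # initial state: a vertex of [-1, 1]^n

def energy(W, v, i_ext, t):
    return -0.5 * v @ W @ v - i_ext @ v + t @ v

E_prev = energy(W, v, i_ext, t)
changed = True
while changed:                           # stop after a full sweep with no change
    changed = False
    for idx in rng.permutation(n):       # asynchronous stochastic recursion
        net = W[idx] @ v + i_ext[idx] - t[idx]
        new_vi = 1.0 if net > 0 else (-1.0 if net < 0 else v[idx])
        dv = new_vi - v[idx]
        if dv != 0.0:
            v[idx] = new_vi
            changed = True
        E_now = energy(W, v, i_ext, t)
        # Each single-neuron update obeys Delta E = -net_i * Delta v_i <= 0.
        assert abs((E_now - E_prev) + net * dv) < 1e-9
        assert E_now <= E_prev + 1e-12
        E_prev = E_now

print("stable state reached, E =", E_prev)
```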
9.3 Gradient-Type Hopfield Network

Consider the continuous-time single-layer feedback networks. One of ...
