Gradient Descent Algorithm

王朝百科 · author unknown · 2010-06-06

The following code is MATLAB (the original page labels it "VB," but the syntax is MATLAB). grad_ascent performs gradient ascent on the surface z = func(x,y), estimating the gradient by forward finite differences. func is a user-supplied objective function that the original article does not define.

function grad_ascent(x,y,z,px,py,N,mu,xstart,ystart)
% Gradient ascent on z = func(x,y).
%   N  - number of iterations,  mu - step size
%   (xstart, ystart) - starting point
h = 1e-6;   % finite-difference step; the original used eps (~2.2e-16), far too small for a usable difference quotient
xga(1) = xstart;
yga(1) = ystart;
zga(1) = func(xga(1),yga(1));
for i = 1:N
    % forward-difference estimate of the gradient at the current point
    gradx = ( func(xga(i)+h,yga(i)) - func(xga(i),yga(i)) )/h;
    grady = ( func(xga(i),yga(i)+h) - func(xga(i),yga(i)) )/h;
    % step uphill along the estimated gradient
    xga(i+1) = xga(i) + mu*gradx;
    yga(i+1) = yga(i) + mu*grady;
    zga(i+1) = func(xga(i+1),yga(i+1));
end
% plot the contours, the gradient field, and the ascent path
hold off
contour(x,y,z,10)
hold on
quiver(x,y,px,py)
plot(xga,yga)
S = sprintf('Gradient Ascent: N = %d, Step Size = %f',N,mu);
title(S)
xlabel('x axis')
ylabel('y axis')
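The same procedure can be sketched in runnable Python. The objective f below is a hypothetical example (not from the original article), chosen so that the ascent has a known peak at (1, -0.5):

```python
# Gradient ascent with forward-difference gradients: a Python sketch of the
# same loop as the MATLAB grad_ascent above.

def grad_ascent(f, x0, y0, n_iters, mu, h=1e-6):
    """Climb the surface z = f(x, y) from (x0, y0); return the path of iterates."""
    path = [(x0, y0)]
    x, y = x0, y0
    for _ in range(n_iters):
        f0 = f(x, y)
        gradx = (f(x + h, y) - f0) / h   # forward finite difference in x
        grady = (f(x, y + h) - f0) / h   # forward finite difference in y
        x += mu * gradx                  # step uphill along the gradient
        y += mu * grady
        path.append((x, y))
    return path

# Example objective (an assumption for illustration): peak at (1, -0.5).
f = lambda x, y: -(x - 1) ** 2 - (y + 0.5) ** 2
path = grad_ascent(f, 0.0, 0.0, n_iters=500, mu=0.1)
x_final, y_final = path[-1]
```

With this step size the iterates contract toward the peak geometrically; the forward difference introduces only an O(h) bias in the fixed point.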

DEMO

clear
print_flag = 1;    % 1: send each figure to the printer; 0: pause between figures
width = 1.5;
xord = -width:.15:width;
yord = -width:.15:width;
[x,y] = meshgrid(xord,yord);
z = func(x,y);     % func is the same user-supplied objective used by grad_ascent
hold off
surfl(x,y,z)       % shaded surface plot of the objective
xlabel('x axis')
ylabel('y axis')
if print_flag, print
else, input('Continue?'), end
[px,py] = gradient(z,.15,.15);   % numerical gradient field; spacing matches the .15 grid (the original passed .2)
xstart = 0.9*width;
ystart = -0.3*width;
% run gradient ascent three times with increasing step sizes
N = 100;
mu = 0.02;
grad_ascent(x,y,z,px,py,N,mu,xstart,ystart)
if print_flag, print
else, input('Continue?'), end
N = 100;
mu = 0.06;
grad_ascent(x,y,z,px,py,N,mu,xstart,ystart)
if print_flag, print
else, input('Continue?'), end
N = 100;
mu = 0.18;
grad_ascent(x,y,z,px,py,N,mu,xstart,ystart)
if print_flag, print
else, input('Continue?'), end
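The demo reruns gradient ascent with step sizes 0.02, 0.06, and 0.18 to show how mu shapes the path. The effect can be checked numerically on a one-dimensional quadratic, where the update factor is known exactly: for f(x) = -x^2 the iteration is x <- (1 - 2*mu)*x, so it converges for 0 < mu < 1 and diverges beyond. A small Python check (an illustration, not from the article):

```python
# Step-size sensitivity of gradient ascent on f(x) = -x^2 (peak at x = 0).
# The exact gradient -2x is used here to isolate the effect of mu.

def ascend(mu, n_iters=50, x0=1.0):
    x = x0
    for _ in range(n_iters):
        x += mu * (-2.0 * x)   # gradient of -x^2 is -2x
    return x

small = ascend(mu=0.1)   # update factor 1 - 2*0.1 = 0.8  -> contracts toward 0
large = ascend(mu=1.2)   # update factor 1 - 2*1.2 = -1.4 -> oscillates and diverges
```

The same overshoot-and-diverge behavior is what the largest step size in the demo (mu = 0.18) is meant to exhibit on the 2-D surface.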

 