
Views: 344  |  Replies: 2

sero


[Help] [MATLAB] Has anyone used MATLAB neural networks to optimize a fermentation medium?

Has anyone used MATLAB neural networks to optimize a fermentation medium? I urgently need help!
In particular, a BP network combined with a GA (genetic algorithm)!

yqx1985



[Answer] Helpful reply


function x=ga_bp  % returns the best chromosome found by the GA (save as ga_bp.m)
% 1) Main program
% Use a genetic algorithm (GAOT toolbox) to optimize the BP network's weights and biases
global P
global T
global R
global S2
global S1
global S
global p
global t
% p and t (the normalized training inputs/targets) must be declared global and
% assigned before ga_bp is called; see the data-preparation script further down.
P=p;
T=t;
R=size(P,1);
S2=size(T,1);
S1=25;  % number of hidden-layer nodes
S=R*S1+S1*S2+S1+S2;  % GA chromosome length: all weights plus all biases
aa=ones(S,1)*[-1,1];  % search range [-1,1] for every gene
popu=50;  % population size
initPpp=initializega(popu,aa,'gabpEval');  % initialize the population (GAOT)
gen=100;  % number of generations
% Call the GAOT ga function; the fitness function is gabpEval
[x,endPop,bPop,trace]=ga(aa,'gabpEval',[],initPpp,[1e-6 1 1],'maxGenTerm',gen,...
'normGeomSelect',[0.09],['arithXover'],[2],'nonUnifMutation',[2 gen 3]);
% Plot the GA convergence curves
figure(1)
plot(trace(:,1),1./trace(:,3),'r-');
hold on
plot(trace(:,1),1./trace(:,2),'b-');
xlabel('Generation');
ylabel('Sum-Squared Error');
figure(2)
plot(trace(:,1),trace(:,3),'r-');
hold on
plot(trace(:,1),trace(:,2),'b-');
xlabel('Generation');
ylabel('Fitness');

% Decode a GA chromosome into the BP network's weights and biases (save as gadecod.m)
function [W1,B1,W2,B2,P,T,A1,A2,SE,val]=gadecod(x)
% [P,T,R,Sl,S2,S]=nninit;
% The first R*S1 genes are W1 (input-to-hidden weights)
global p
global t
global R
global S2
global S1
global S
global P
global T
for i=1:S1
for k=1:R
W1(i,k)=x(R*(i-1)+k);
end
end
% The next S1*S2 genes (after the first R*S1) are W2 (hidden-to-output weights)
for i=1:S2
for k=1:S1
W2(i,k)=x(S1*(i-1)+k+R*S1);
end
end
% The next S1 genes (after the first R*S1+S1*S2) are B1 (hidden-layer biases)
for i=1:S1
B1(i,1)=x((R*S1+S1*S2)+i);
end
% The next S2 genes (after the first R*S1+S1*S2+S1) are B2 (output-layer biases)
for i=1:S2
B2(i,1)=x((R*S1+S1*S2+S1)+i);
end
% Compute the hidden-layer and output-layer outputs
% (bias added explicitly so this works in both old and newer toolbox versions)
A1=tansig(W1*P+B1*ones(1,size(P,2)));
A2=purelin(W2*A1+B2*ones(1,size(P,2)));
% Sum-squared error over the training set
SE=sumsqr(T-A2);
val=1/SE;  % GA fitness: the smaller the error, the larger the fitness

% Fitness function gabpEval (save as gabpEval.m; GAOT calls it by name)
function [sol,val]=gabpEval(sol,options)
% val     - the fitness of this individual
% sol     - the individual, returned to allow for Lamarckian evolution
% options - [current_generation]
% P, T, R, S1, S2 and S are shared with ga_bp and gadecod through global
% variables, so the training data does not need to be rebuilt here.
global S
x=sol(1:S);  % keep only the first S genes (GAOT individuals may carry the fitness as a trailing element)
[W1,B1,W2,B2,P,T,A1,A2,SE,val]=gadecod(x);  % decode the chromosome; fitness val=1/SSE

% ---- Data preparation and training script ----
day=[0.9363 -0.9698 -0.9907 -0.9562 -0.9507 0.9363 -0.9164 0.9045 0.8918;
-0.9358 -0.9751 0.9821 -0.9544 -0.9469 0.9426 0.9182 0.8967 -0.8841;
0.9516 -0.9781 -0.9744 -0.9525 0.9509 0.9368 0.9082 -0.8903 -0.8665;
-0.9480 -0.9795 -0.9796 -0.9507 0.9509 0.9300 -0.9075 -0.8902 -0.8671;
-0.9433 -0.9923 -0.9812 -0.9596 -0.9406 -0.9230 0.9071 -0.8864 -0.8547;
-0.9424 1.0000 -0.9800 -0.9514 0.9349 -0.9089 0.9206 -0.8780 -0.8414;
0.9355 -0.9878 -0.9737 -0.9499 0.9337 0.9084 -0.9072 -0.8745 -0.8332];
% Data preprocessing: normalize each row to [-1,1] with premnmx
dayh=day;
[dayhn,mindayh,maxdayh]=premnmx(dayh);
m=dayhn;
% Input and target samples (each input column is paired with the next column as target)
global p t   % make the training data visible to ga_bp via globals
p=m(:,1:8);
t=dayhn(:,2:9);
% Test samples
k=[0.9435 0.9796 -0.9706 -0.9552 -0.9298 -0.9130 -0.9003 0.8708 0.8234;
-0.9358 -0.9751 0.9821 -0.9544 -0.9469 0.9426 0.9182 0.8967 -0.8841;
0.9516 -0.9781 -0.9744 -0.9525 0.9509 0.9368 0.9082 -0.8903 -0.8665;
-0.9480 -0.9795 -0.9796 -0.9507 0.9509 0.9300 -0.9075 -0.8902 -0.8671;
-0.9433 -0.9923 -0.9812 -0.9596 -0.9406 -0.9230 0.9071 -0.8864 -0.8547;
-0.9424 1.0000 -0.9800 -0.9514 0.9349 -0.9089 0.9206 -0.8780 -0.8414;
-0.9496 -0.9778 -0.9693 -0.9536 -0.9352 -0.9111 -0.9076 0.8797 -0.8227];
kn=tramnmx(k,mindayh,maxdayh);  % normalize the test samples with the training-set scaling

% Decode the best chromosome returned by the GA (e.g. x=ga_bp; run the
% data-preparation lines above first so that the globals p and t are set)
[W1,B1,W2,B2,P,T,A1,A2,SE,val]=gadecod(x);
% Build a two-layer BP network matching the GA encoding and load the GA weights into it
S1=25;  % same hidden-layer size as in ga_bp
net=newff(minmax(p),[S1,size(t,1)],{'tansig','purelin'},'trainlm');
net.IW{1,1}=W1;
net.LW{2,1}=W2;
net.b{1}=B1;
net.b{2}=B2;
% Further training, starting from the GA-optimized weights
net.trainParam.show=50;
net.trainParam.lr=0.1;
net.trainParam.epochs=1000;
net.trainParam.goal=1e-28;
net=train(net,P,T);

% Create a new BP network (note: this overwrites the GA-initialized net above)
S1=25;
net=newff(minmax(p),[S1,7,7],{'logsig','logsig','purelin'},'trainlm');
% BP network training parameters
net.trainParam.show=1;
net.trainParam.epochs=2000;
net.trainParam.goal=1.0e-28;
net.trainParam.max_fail=5;
net.trainParam.lr=0.3;
% First training run of this BP network
net=init(net);
[net,tr,Y,E]=train(net,p,t);
% Prediction on the test samples after this training run
s=sim(net,kn);
plot(s,'r+-');   % network output (red)
hold on;
plot(kn,'b*-')   % normalized test data (blue)
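
Note that the code above only uses the GA to pick good starting weights for the BP model and then predicts on the test set; it does not yet search for the best medium composition, which is what the original question asks for. Below is a rough sketch of the usual follow-up step (not part of the code above, and assuming the network has been retrained with medium components as the inputs and yield as the single output, with minp/maxp being the premnmx scaling of those inputs):

% Hypothetical sketch: random search over normalized medium compositions,
% scoring every candidate with the trained BP network
nTrial=20000;
nComp=size(p,1);                              % number of medium components (network inputs)
cand=2*rand(nComp,nTrial)-1;                  % random candidates in the normalized range [-1,1]
pred=sim(net,cand);                           % BP-predicted (normalized) yield for each candidate
[bestPred,idx]=max(pred);                     % candidate with the highest predicted yield
bestMedium=postmnmx(cand(:,idx),minp,maxp)    % map back to the original concentration units

Instead of a plain random search you could also run GAOT's ga a second time with a fitness file that simply wraps sim(net, ...), but the random-search version keeps the sketch short and avoids writing another M-file.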
Post #2 | 2011-10-09 10:48:05

yqx1985

Quoted reply:
Post #2: Originally posted by yqx1985 at 2011-10-09 10:48:05:
function x=ga_bp  % returns the best chromosome found by the GA (save as ga_bp.m)
% 1) Main program
% Use a genetic algorithm (GAOT toolbox) to optimize the BP network's weights and biases
...

When you use this code, replace the data in it with your own data.
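
For instance, here is a small made-up example of how fermentation data could be laid out for the code above (the component names and all numbers are purely illustrative, not from any real experiment):

% Hypothetical data layout: 4 medium components (g/L) as inputs, one column per experiment
medium=[10  20  30  40  15  25;     % e.g. glucose
         2   4   6   8   3   5;     % e.g. peptone
         1   1   2   2   1   2;     % e.g. KH2PO4
         0.5 0.5 1   1   0.5 1];    % e.g. MgSO4
yield=[3.1 4.2 5.0 4.6 3.8 4.9];    % measured product titre for each experiment
[p,minp,maxp]=premnmx(medium);      % normalize inputs to [-1,1]
[t,mint,maxt]=premnmx(yield);       % normalize the target the same way

With this layout, R=size(p,1)=4 inputs and S2=size(t,1)=1 output in ga_bp, and the chromosome length S adjusts automatically. Remember to declare global p t as in the script above so ga_bp can see the data, and build the test matrix k from held-out experiments, normalizing it with tramnmx(k,minp,maxp).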
Post #3 | 2011-10-12 22:07:58