Views: 2020  |  Replies: 13

zhangxiaodao

Silver Worm (Somewhat Well-Known)

[Help] Finding the parameters of an equation from data and the equation's form

As the title says: the data are listed below, and the equation has the form y = a*exp[-p/(x+q)]. Please help me find p and q in the equation. Thanks, everyone; this is urgent!
x       y
1.1     10363972.36
1.2     5697599.795
1.3     3269017.372
1.4     1948866.775
1.5     1202604.284
1.6     765578.5464
1.7     501320.0508
1.8     336809.4864
1.9     231640.3371
2.0     162754.7914
2.1     116618.904
2.2     85080.3473
2.3     63109.81473
2.4     47535.43773

dbb627

Honorary Moderator (Renowned Writer)

[Answer] Helpful reply


General model:
     f(x) = a*exp(-p/(x+q))
Coefficients (with 95% confidence bounds):
       a =           1  (1, 1)
       p =         -42  (-42, -42)
       q =         1.5  (1.5, 1.5)

Goodness of fit:
  SSE: 1.767e-007
  R-square: 1
  Adjusted R-square: 1
  RMSE: 0.0001267

Initial values: a = 10000, p = -2, q = -1
The more you learn, the more you know, the more you know, and the more you forget. The more you forget, the less you know. So why bother to learn.
#6 | 2011-12-20 11:22:06

dbb627

Honorary Moderator (Renowned Writer)


Quoted reply:
#5: Originally posted by 715211229 at 2011-12-20 10:48:24:
Take a look at this thread; mine may be wrong:
http://muchong.com/bbs/viewthread.php?tid=3866180

[Attached: plot of the fit (image not preserved)]

#7 | 2011-12-20 11:23:32

dbb627

Honorary Moderator (Renowned Writer)

CODE:
x=[1.1
1.2
1.3
1.4
1.5
1.6
1.7
1.8
1.9
2
2.1
2.2
2.3
2.4];
y=[10363972.36
5697599.795
3269017.372
1948866.775
1202604.284
765578.5464
501320.0508
336809.4864
231640.3371
162754.7914
116618.904
85080.3473
63109.81473
47535.43773];
st_ = [1000 -2 -1];
ft_ = fittype('a*exp(-p/(x+q))','dependent',{'y'},'independent',{'x'},'coefficients',{'a', 'p','q'});
[cf_,good]= fit(x,y,ft_ ,'Startpoint',st_)
h_ = plot(cf_,'fit',0.95);
legend off;  % turn off legend from plot method call
set(h_(1),'Color',[1 0 0],...
     'LineStyle','-', 'LineWidth',2,...
     'Marker','none', 'MarkerSize',6);
hold on,plot(x,y,'*')

cf_ =

     General model:
     cf_(x) = a*exp(-p/(x+q))
     Coefficients (with 95% confidence bounds):
       a =           1  (1, 1)
       p =         -42  (-42, -42)
       q =         1.5  (1.5, 1.5)

good =

           sse: 1.7672e-007
       rsquare: 1
           dfe: 11
    adjrsquare: 1
          rmse: 1.2675e-004
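For readers without MATLAB's Curve Fitting Toolbox, the same fit can be cross-checked with SciPy. This is a minimal sketch, not part of the original thread; the starting point is deliberately chosen near the solution reported above, since this model is sensitive to the initial guess:

```python
# Cross-check of y = a*exp(-p/(x+q)) with SciPy instead of MATLAB's
# Curve Fitting Toolbox. The starting point is chosen near the reported
# solution (a=1, p=-42, q=1.5) because the model is sensitive to it.
import numpy as np
from scipy.optimize import curve_fit

x = np.array([1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7,
              1.8, 1.9, 2.0, 2.1, 2.2, 2.3, 2.4])
y = np.array([10363972.36, 5697599.795, 3269017.372, 1948866.775,
              1202604.284, 765578.5464, 501320.0508, 336809.4864,
              231640.3371, 162754.7914, 116618.904, 85080.3473,
              63109.81473, 47535.43773])

def model(x, a, p, q):
    return a * np.exp(-p / (x + q))

# p0 is the initial guess [a, p, q]
popt, pcov = curve_fit(model, x, y, p0=[1.0, -40.0, 1.4])
a_hat, p_hat, q_hat = popt
print(a_hat, p_hat, q_hat)  # should land close to a=1, p=-42, q=1.5
```

The fitted values should agree with the toolbox output above to within rounding.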

#9 | 2011-12-20 14:38:54

dbb627

Honorary Moderator (Renowned Writer)

Quoted reply:
#10: Originally posted by zhangxiaodao at 2011-12-20 15:41:40:
Many thanks, moderator dbb627. You can probably guess why I asked for the source code: I have more data that follows the same equation, and I want the a, p, and q values for it. Can I apply this program directly, or what adjustments are needed? Thank you.

There is only one thing to watch out for: the choice of initial values,
st_ = [1000 -2 -1];
You need to roughly estimate the ranges of a, p, and q. If your data's parameters are not too different from these, you can simply use the fitted values above,
       a = 1,  p = -42,  q = 1.5,
as the initial values.
#11 | 2011-12-20 15:48:12

dbb627

Honorary Moderator (Renowned Writer)

Base your estimates on the physically realistic parameter ranges, and otherwise keep trying different starting points. You can also estimate the parameters with a global algorithm, such as a genetic algorithm or simulated annealing.
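One simple way to automate this trial-and-error over starting points, short of a full genetic or annealing algorithm, is a multi-start loop: draw random initial guesses from rough parameter ranges and keep the fit with the smallest SSE. A minimal sketch in Python; the search ranges below are illustrative assumptions, not values from the thread:

```python
# Multi-start nonlinear fitting: draw random initial guesses from rough
# parameter ranges and keep the fit with the smallest SSE.
# The search ranges are illustrative assumptions for this data set.
import numpy as np
from scipy.optimize import curve_fit

x = np.array([1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7,
              1.8, 1.9, 2.0, 2.1, 2.2, 2.3, 2.4])
y = np.array([10363972.36, 5697599.795, 3269017.372, 1948866.775,
              1202604.284, 765578.5464, 501320.0508, 336809.4864,
              231640.3371, 162754.7914, 116618.904, 85080.3473,
              63109.81473, 47535.43773])

def model(x, a, p, q):
    return a * np.exp(-p / (x + q))

rng = np.random.default_rng(0)
best_sse, best_params = np.inf, None
for _ in range(50):
    # Rough, hand-chosen ranges for a, p, q (assumption); q > 0.5
    # keeps the exponent's denominator x+q away from zero.
    p0 = [rng.uniform(0.1, 10.0), rng.uniform(-60.0, -10.0),
          rng.uniform(0.5, 3.0)]
    try:
        popt, _ = curve_fit(model, x, y, p0=p0, maxfev=5000)
    except (RuntimeError, ValueError):
        continue  # this start did not converge; try the next one
    sse = float(np.sum((model(x, *popt) - y) ** 2))
    if sse < best_sse:
        best_sse, best_params = sse, popt

print(best_params, best_sse)
```

With enough restarts this usually recovers the same solution as a hand-tuned starting point, at the cost of extra fit calls.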

#13 | 2011-12-20 20:32:41