
glazio

[Help] How do I fit a simple seepage equation (also called the percolation equation) in Origin? Anyone working on composite materials has probably run into it.

I have tried many times, but Origin keeps reporting that it cannot fit the data, so I am asking the experts here for advice.

1. The equation
The percolation equation is y = A*(x-xc)^p,
where x is the independent variable, y is the dependent variable, and A, xc, and p are all constants.

2. The data
To test the fitting, I set A = 18.5, xc = 0.095, p = -2.3 and generated the data below:

x                y
------------------------------
0.1001        3.5E+06
0.1002        3.3E+06
0.11        2.9E+05
0.12        9.0E+04
0.15        1.5E+04
0.2        3.3E+03
0.3        7.1E+02
0.4        2.8E+02
0.5        1.5E+02
0.6        8.9E+01
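As a cross-check, the table can be regenerated from the stated parameters. Here is a small Python sketch (my own addition; the thread itself uses Origin and MATLAB):

```python
import numpy as np

# Regenerate the test data from the parameters stated above:
# A = 18.5, xc = 0.095, p = -2.3.
A, xc, p = 18.5, 0.095, -2.3
x = np.array([0.1001, 0.1002, 0.11, 0.12, 0.15, 0.2, 0.3, 0.4, 0.5, 0.6])
y = A * (x - xc) ** p

for xi, yi in zip(x, y):
    print(f"{xi:<8g} {yi:.1e}")   # e.g. final line: 0.6 -> 8.9e+01
```

Rounded to two significant figures, the printed values match the table, which confirms the data were generated exactly as described.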

3. My fitting procedure in Origin (Pro V8.5)
I chose Nonlinear Curve Fit, Category: Power, Function: Power1, whose form is y = A|x-xc|^p.
In the parameter bounds I set p < 0, 0
Since the y values span several orders of magnitude, I also set the y weighting to "Variance~y^2" under NLCF - Settings - Data Selection; Chi-Square then read 1.42513. But when I run "1 Iteration", the system still reports "Fit did not converge - reason unknown."



That is my problem and the steps I took. Where is my fit going wrong, and how can I fix it?

[ Last edited by glazio on 2011-11-25 at 09:33 ]


Pinned reply (1 in total)

dbb627

[Answer]

glazio (coins +10): Moderator dbb627 truly is a MATLAB wizard; all my questions were answered. Thank you very much! 2011-11-30 08:57:21
cenwanglai (computation badge +1): Your replies are all quite good. I'm not familiar with this topic, though, and didn't follow all the details. Here's an EPI for this one~ 2011-12-22 20:09:16
ÒýÓûØÌû:
13Â¥: Originally posted by glazio at 2011-11-29 22:38:41:
Äã¸øµÄÌû×ÓÀïµÄÇé¿öºÃÏñ²¢²»ÊÇÎÒ˵µÄ¹²Ïí²ÎÊýÄâºÏ¡£ÏÂÃæ¾Ù¸öÀý×Ó˵Ã÷һϡ£

¼ÙÈç˵ÔÚÊý¾ÝͼÖÐÓÐÁ½×éʵÑéÊý¾Ý¼¯Y1ºÍY2£¬¿ÉÒÔÓÃij¸öº¬ÓÐÈý¸ö²ÎÊýa£¬b£¬cµÄ·½³Ìy=f(x)ÃèÊö¡£ÆÕͨµÄÄâºÏÒ»°ãÊÇÓÃF(x)¶ÔÊý¾Ý¼¯Y1ºÍY2 ...

Õâ¸öûÓÐÎÊÌâ
matlab¿ÉÒÔ×öµÄ
¼ûÏÂÃæµÄÀý×Ó
CODE:
x=rand(1,7);
y1=2*x+3*sin(x)+6*x.^2;
y2=2*x+3*cos(x)+2*x.^2; % parameters are 2 3 6 2; the leading 2 and 3 are shared
F=@(p,x)[p(1)*x+p(2)*sin(x)+p(3)*x.^2;p(1)*x+p(2)*cos(x)+p(4)*x.^2];
p = lsqcurvefit(F, [1 1 1 1], x,[y1;y2])

Local minimum found.

Optimization completed because the size of the gradient is less than
the default value of the function tolerance.




p =

    2.0000    3.0000    6.0000    2.0000
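For readers without a MATLAB license: the same stacked-residual trick works in Python with SciPy (my own equivalent sketch, not code from the thread). The shared parameters appear in both residual blocks, so they are fitted jointly across the two data sets:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
x = rng.random(7)
y1 = 2 * x + 3 * np.sin(x) + 6 * x**2   # true parameters: 2, 3, 6
y2 = 2 * x + 3 * np.cos(x) + 2 * x**2   # the 2 and 3 are shared with y1

def residuals(p, x, y1, y2):
    # One stacked residual vector: p[0] and p[1] are shared between the
    # two models, p[2] belongs to y1 only, p[3] to y2 only.
    r1 = p[0] * x + p[1] * np.sin(x) + p[2] * x**2 - y1
    r2 = p[0] * x + p[1] * np.cos(x) + p[3] * x**2 - y2
    return np.concatenate([r1, r2])

sol = least_squares(residuals, x0=[1, 1, 1, 1], args=(x, y1, y2))
print(sol.x)  # recovers approximately [2, 3, 6, 2]
```

Stacking the residuals is exactly what passing `[y1;y2]` to `lsqcurvefit` does: one least-squares problem over all data sets at once.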
The more you learn, the more you know, the more you know, and the more you forget. The more you forget, the less you know. So why bother to learn.
Post #14 | 2011-11-30 00:26:50

Regular replies

seaharrier

[Answer]

I don't know how to do it either.
I gave it a try: choosing Power2 does fit, but the coefficients differ from the ones you specified.
Take a look and see whether that works for you.
Patience is bitter, but its fruit is sweet.
Post #2 | 2011-11-25 13:02:45

dbb627

[Answer]

cenwanglai (coins +3): ~~ 2011-12-22 20:07:18
I don't use Origin much, but MATLAB can produce a fairly good fit.
The code is as follows:
CODE:
A=[0.1001       3.5e06
0.1002       3.3e06
0.11         2.9e05
0.12        9.0e04
0.15        1.5e04
0.2        3.3e03
0.3        7.1e02
0.4        2.8e02
0.5        1.5e02
0.6        8.9e01];
x=A(:,1);y=A(:,2);
st_ = [18.5 0.095 -2.3];
ft_ = fittype('A*(x-xc).^p','dependent',{'y'},'independent',{'x'},'coefficients',{'A', 'xc','p'});
[cf_,good]= fit(x,y,ft_ ,'Startpoint',st_)
h_ = plot(cf_,'fit',0.95);
legend off;  % turn off legend from plot method call
set(h_(1),'Color',[1 0 0],...
     'LineStyle','-', 'LineWidth',2,...
     'Marker','none', 'MarkerSize',6);
hold on,plot(x,y,'*')

cf_ =

     General model:
     cf_(x) = A*(x-xc).^p
     Coefficients (with 95% confidence bounds):
       A =       111.5  (11.4, 211.6)
       xc =     0.09683  (0.09604, 0.09762)
       p =      -1.809  (-2.04, -1.578)

good =

           sse: 3.8499e+008
       rsquare: 1.0000
           dfe: 7
    adjrsquare: 1.0000
          rmse: 7.4161e+003
Post #3 | 2011-11-25 20:14:26

dbb627

[Answer]

The plot is below.

(attached image: MATLAB fit vs. original data)

Post #4 | 2011-11-25 20:18:15

glazio

Quoted reply:
Post #2: Originally posted by seaharrier at 2011-11-25 13:02:45:
I don't know how to do it either.
I gave it a try: choosing Power2 does fit, but the coefficients differ from the ones you specified.
Take a look and see whether that works for you.

OK, I'll go try fitting with the other functions in the Power category and see whether that solves it. Thanks for the suggestion.
Post #5 | 2011-11-25 20:40:31

glazio

Quoted reply:
Post #3: Originally posted by dbb627 at 2011-11-25 20:14:26:
I don't use Origin much, but MATLAB can produce a fairly good fit.
The code is as follows:
[code]


A=[0.1001       3.5e06
0.1002       3.3e06
0.11         2.9e05
0.12        9.0e04
0.15        1.5e04
0.2        3.3e03
0.3     ...

Moderator dbb627's MATLAB fitting code is very detailed. I substituted your fitted parameters [A, xc, p] = [111.5, 0.09683, -1.809] into y = A*(x-xc)^p and plotted the computed values (Fitted) together with the original data (Data) on the semilog plot below. Comparing Fitted with Data suggests that the fit did not use the weights w_i = 1/y_i^2, so the control of the SSE is concentrated on the points with large y_i, while the points with small y_i drift off the curve. I used to handle this by taking the logarithm of both sides of the equation before fitting in MATLAB, which indeed works noticeably better. I also suspect the weighting w_i = 1/y_i^2 might help, but I haven't tried it and am not sure.
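The log-transform route mentioned above can be sketched in Python/SciPy (my own illustration, not the poster's code): fit ln y = ln A + p*ln(x - xc), so every decade of y carries equal weight. Bounding xc below min(x) keeps the logarithm defined throughout the search:

```python
import numpy as np
from scipy.optimize import curve_fit

x = np.array([0.1001, 0.1002, 0.11, 0.12, 0.15, 0.2, 0.3, 0.4, 0.5, 0.6])
y = np.array([3.5e6, 3.3e6, 2.9e5, 9.0e4, 1.5e4, 3.3e3,
              7.1e2, 2.8e2, 1.5e2, 8.9e1])

def log_model(x, lnA, xc, p):
    # ln y = ln A + p * ln(x - xc): equal weight for every decade of y
    return lnA + p * np.log(x - xc)

# The upper bound xc < 0.1 keeps (x - xc) positive for all data points.
popt, _ = curve_fit(log_model, x, np.log(y), p0=[3.0, 0.09, -2.0],
                    bounds=([-np.inf, 0.0, -np.inf], [np.inf, 0.1, 0.0]))
A, xc, p = np.exp(popt[0]), popt[1], popt[2]
print(A, xc, p)  # close to the generating values 18.5, 0.095, -2.3
```

Because the residuals are now relative rather than absolute, the small-y tail of the data constrains the fit just as strongly as the divergent points near xc.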

Previously I did my fitting with MATLAB's cftool (I only picked up a little MATLAB for the sake of fitting). Since I now need to fit several data sets with a single equation, I need shared fitting parameters, i.e. what Origin advertises as Global Nonlinear Curve Fitting.

So I would like to ask moderator dbb627:
1. I only turned to Origin in the first place to solve the shared-parameter fitting problem. Does MATLAB currently offer a way to do this? If so, please give me a hint or two.
2. The equations I will use next will not be this simple, so the direct log-transform route cannot be followed forever. If I fit the original equation y = A*(x-xc)^p directly in MATLAB, can the weighting w_i = 1/y_i^2 effectively improve the fitting accuracy? Does MATLAB support such weights (either in code or in cftool)?

dbb627's MATLAB fit result (attached plot)

Post #6 | 2011-11-25 21:39:03

dbb627

[Answer]

Yes, that can be improved:
CODE:
A=[0.1001   3.5e06
0.1002      3.3e06
0.11        2.9e05
0.12        9.0e04
0.15        1.5e04
0.2        3.3e03
0.3        7.1e02
0.4        2.8e02
0.5        1.5e02
0.6        8.9e01];
x=A(:,1);y=A(:,2);
w=1./y;
opts=fitoptions('method','NonlinearLeastSquares','Weights',w.^2,'Lower',[-Inf 0  -Inf],'Upper',[Inf 1 0]);
opts.StartPoint= [10 0.095 -2.3];
ft_ = fittype('A*(x-xc).^p','dependent',{'y'},'independent',{'x'},'coefficients',{'A', 'xc','p'},'options',opts);
[cf_,good]= fit(x,y,ft_)
plot(x,log(cf_(x)),'or-',x, log(y), '* ');
legend('Fit','Original data')

cf_ =

     General model:
     cf_(x) = A*(x-xc).^p
     Coefficients (with 95% confidence bounds):
       A =       18.46  (18.03, 18.88)
       xc =     0.09496  (0.09485, 0.09508)
       p =      -2.304  (-2.315, -2.292)

good =

           sse: 0.0010
       rsquare: 0.9999
           dfe: 7
    adjrsquare: 0.9998
          rmse: 0.0121
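The same w_i = 1/y_i^2 weighting is available in Python through SciPy's curve_fit, whose sigma argument takes per-point standard deviations (sigma = y yields weights 1/y^2). This is my own equivalent sketch of the weighted MATLAB fit above, not code from the thread:

```python
import numpy as np
from scipy.optimize import curve_fit

x = np.array([0.1001, 0.1002, 0.11, 0.12, 0.15, 0.2, 0.3, 0.4, 0.5, 0.6])
y = np.array([3.5e6, 3.3e6, 2.9e5, 9.0e4, 1.5e4, 3.3e3,
              7.1e2, 2.8e2, 1.5e2, 8.9e1])

def model(x, A, xc, p):
    return A * (x - xc) ** p

# sigma = y  =>  weights 1/y^2, i.e. minimize the relative error,
# mirroring the 'Weights', w.^2 option in the MATLAB code above.
# Bounds match the MATLAB Lower/Upper settings (0 < xc < 0.1, p < 0).
popt, pcov = curve_fit(model, x, y, p0=[10.0, 0.09, -2.0], sigma=y,
                       bounds=([0.0, 0.0, -np.inf], [np.inf, 0.1, 0.0]))
print(popt)  # approximately [18.5, 0.095, -2.3]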
Post #7 | 2011-11-25 23:22:39

dbb627

[Answer]

cenwanglai: ~~ 2011-12-22 20:08:01
The fit result is shown in the plot below.

(attached image: weighted fit vs. original data)

Post #8 | 2011-11-25 23:24:34

wypjyq

Quoted reply:
Post #3: Originally posted by dbb627 at 2011-11-25 20:14:26:
I don't use Origin much, but MATLAB can produce a fairly good fit.
The code is as follows:
[code]


A=[0.1001       3.5e06
0.1002       3.3e06
0.11         2.9e05
0.12        9.0e04
0.15        1.5e04
0.2        3.3e03
0.3     ...

Moderator, how do you usually choose the initial parameter values (st) when you do these calculations? I always have to tweak them for quite a while before the fit converges.
Post #9 | 2011-11-26 22:08:31

dbb627

[Answer]

Generally, if the expression has a clear physical meaning, set the parameter ranges according to that meaning. If it does not, see whether the model can be transformed into a linear form so the parameter ranges can be estimated. Another approach is to use a stochastic global optimization algorithm to determine the initial values, for example a genetic algorithm or simulated annealing.
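The stochastic global search suggested here can be sketched with SciPy's differential_evolution, a genetic-style optimizer (my own illustration using this thread's data; the objective works in log space so all decades of y count equally). Its result can then seed an ordinary local fit:

```python
import numpy as np
from scipy.optimize import differential_evolution

x = np.array([0.1001, 0.1002, 0.11, 0.12, 0.15, 0.2, 0.3, 0.4, 0.5, 0.6])
y = np.array([3.5e6, 3.3e6, 2.9e5, 9.0e4, 1.5e4, 3.3e3,
              7.1e2, 2.8e2, 1.5e2, 8.9e1])

def log_sse(params):
    # Sum of squared log-residuals for y = A*(x - xc)^p
    A, xc, p = params
    return np.sum((np.log(A * (x - xc) ** p) - np.log(y)) ** 2)

# Broad search ranges; xc must stay below min(x) = 0.1001 so that
# (x - xc) remains positive inside the objective.
bounds = [(1.0, 100.0), (0.0, 0.1), (-5.0, -0.5)]
result = differential_evolution(log_sse, bounds, seed=1, tol=1e-12)
print(result.x)  # a good starting point near [18.5, 0.095, -2.3]
```

No initial guess is needed, only plausible ranges, which is exactly the situation the question describes.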
Post #10 | 2011-11-26 23:35:39