Views: 1334  |  Replies: 3

tmy1977

Gold Bug (Official Writer)

[Help] Could someone please check whether the following article is indexed by SCI? Thanks!

1. Title: A new descent memory gradient method and its global convergence
2. Authors: Min Sun and Qingguo Bai
3. Journal: Journal of Systems Science and Complexity
4. Publisher: Springer
5. Year: 2011
6. Volume, issue, and pages: Volume 24, Number 4, 784-794.


tllijp

Gold Bug (Official Writer)

[Answer] Assisted reply

Thanks for participating; assistance index +1
tmy1977 (coins +50): ★★★★★ Best answer. Thanks, it looks like it has been indexed! 2012-02-14 19:37:35
1.



Title: A new descent memory gradient method and its global convergence
Authors: Sun Min; Bai Qingguo
Journal: JOURNAL OF SYSTEMS SCIENCE & COMPLEXITY   Volume: 24   Issue: 4   Pages: 784-794   DOI: 10.1007/s11424-011-8150-0   Published: AUG 2011
Times Cited: 0 (from Web of Science)

Abstract: In this article, a new descent memory gradient method without restarts is proposed for solving large scale unconstrained optimization problems. The method has the following attractive properties: 1) The search direction is always a sufficiently descent direction at every iteration without the line search used; 2) The search direction always satisfies the angle property, which is independent of the convexity of the objective function. Under mild conditions, the authors prove that the proposed method has global convergence, and its convergence rate is also investigated. The numerical results show that the new descent memory method is efficient for the given test problems.
Floor 2 | 2012-02-14 19:29:00

tllijp

Gold Bug (Official Writer)

[Answer] Assisted reply

Title: A new descent memory gradient method and its global convergence
Authors: Sun Min; Bai Qingguo
Source: JOURNAL OF SYSTEMS SCIENCE & COMPLEXITY   Volume: 24   Issue: 4   Pages: 784-794   DOI: 10.1007/s11424-011-8150-0   Published: AUG 2011
Times Cited: 0 (from Web of Science)
Floor 3 | 2012-02-14 19:31:51

slimmy

New Bug (Official Writer)


A new descent memory gradient method and its global convergence  
Authors: Sun, M (Sun, Min)1; Bai, QG (Bai, Qingguo)2
Source: JOURNAL OF SYSTEMS SCIENCE & COMPLEXITY   Volume: 24   Issue: 4   Pages: 784-794   DOI: 10.1007/s11424-011-8150-0   Published: AUG 2011
Times Cited: 0 (from Web of Science)
Cited References: 18
Abstract: In this article, a new descent memory gradient method without restarts is proposed for solving large scale unconstrained optimization problems. The method has the following attractive properties: 1) The search direction is always a sufficiently descent direction at every iteration without the line search used; 2) The search direction always satisfies the angle property, which is independent of the convexity of the objective function. Under mild conditions, the authors prove that the proposed method has global convergence, and its convergence rate is also investigated. The numerical results show that the new descent memory method is efficient for the given test problems.
Accession Number: WOS:000293858300014
Document Type: Article
Language: English
Author Keywords: Global convergence; memory gradient method; sufficiently descent
KeyWords Plus: LINE SEARCH; CONJUGATE
Reprint Address: Sun, M (corresponding author), Zaozhuang Univ, Dept Math & Informat Sci, Zaozhuang 277160, Peoples R China
Addresses:
1. Zaozhuang Univ, Dept Math & Informat Sci, Zaozhuang 277160, Peoples R China
2. Qufu Normal Univ, Sch Management, Rizhao 276826, Peoples R China  
E-mail Addresses: sunmin_2008@yahoo.com.cn, qfnubaiqg@163.com
Funding Acknowledgment:
Funding Agency                                            Grant Number
National Science Foundation of China  70971076  
Foundation of Shandong Provincial Education Department  J10LA59  


This research is supported by the National Science Foundation of China under Grant No. 70971076 and the Foundation of Shandong Provincial Education Department under Grant No. J10LA59.

Publisher: SPRINGER HEIDELBERG, TIERGARTENSTRASSE 17, D-69121 HEIDELBERG, GERMANY
Web of Science Categories: Mathematics, Interdisciplinary Applications
Research Areas: Mathematics
IDS Number: 806ZW
ISSN: 1009-6124
Floor 4 | 2012-02-14 20:33:48
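
Editorial note on the abstract quoted above: for readers unfamiliar with the method class, the Python sketch below shows the general shape of a memory-gradient-type iteration, in which the search direction combines the current negative gradient with the previous direction, and the memory weight is capped so the direction stays sufficiently descent. This is only a minimal illustrative sketch under those assumptions; the function name memory_gradient_minimize, the safeguard constants c and beta0, and the Armijo step rule are placeholders chosen here, not the exact update or convergence conditions analysed by Sun and Bai (2011).

import numpy as np

def memory_gradient_minimize(f, grad, x0, c=0.5, beta0=0.5, tol=1e-6, max_iter=1000):
    # Generic memory-gradient-type iteration (illustrative sketch only, not
    # the exact method of the paper). Direction: d_k = -g_k + beta_k * d_{k-1},
    # with beta_k capped so that g_k^T d_k <= -c * ||g_k||^2 at every
    # iteration (a "sufficient descent" safeguard).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first step: plain steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search along d.
        t, fx, gd = 1.0, f(x), g @ d
        for _ in range(50):
            if f(x + t * d) <= fx + 1e-4 * t * gd:
                break
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        # Memory term: reuse the previous direction, but cap its weight so
        # the new direction still satisfies the sufficient-descent condition.
        cross = abs(g_new @ d)
        beta = min(beta0, (1.0 - c) * (g_new @ g_new) / (cross + 1e-12))
        d = -g_new + beta * d
        g = g_new
    return x

# Usage example on a simple ill-conditioned quadratic f(x) = 0.5 * x^T A x.
A = np.diag([1.0, 10.0, 100.0])
x_min = memory_gradient_minimize(lambda x: 0.5 * x @ A @ x, lambda x: A @ x, np.ones(3))
print(x_min)   # should be close to the origin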