| Views: 1334 | Replies: 3 |
tmy1977 | Gold Bug (Regular Writer)
[Help Request]
ÇëÄÄλÅóÓѰïæ²éÒ»ÏÂÏÂÃæÕâÆªÎÄÕÂÊÇ·ñ±»Sci¼ìË÷?лл!
1. Title: A new descent memory gradient method and its global convergence
2. Authors: Min Sun and Qingguo Bai
3. Journal: Journal of Systems Science and Complexity
4. Publisher: Springer
5. Year: 2011
6. Volume/issue and pages: Volume 24, Number 4, 784-794.
tllijp
Gold Bug (Regular Writer)
- Help points: 42 (Elementary Student)
- Gold coins: 1247.5
- Scattered gold: 554
- Red flowers: 6
- Posts: 705
- Online: 252.7 hours
- Member ID: 1585822
- Registered: 2012-01-23
- Gender: male
- Major: Applied Optics
[Answer] Accepted help reply
tmy1977 (gold coins +50): ★★★★★ Best answer. Thanks, looks like it did get in! 2012-02-14 19:37:35
1. Title: A new descent memory gradient method and its global convergence
Authors: Sun Min; Bai Qingguo
Journal: JOURNAL OF SYSTEMS SCIENCE & COMPLEXITY
Volume: 24  Issue: 4  Pages: 784-794
DOI: 10.1007/s11424-011-8150-0
Published: AUG 2011
Times cited: 0 (from Web of Science)
Abstract: In this article, a new descent memory gradient method without restarts is proposed for solving large scale unconstrained optimization problems. The method has the following attractive properties: 1) The search direction is always a sufficiently descent direction at every iteration without the line search used; 2) The search direction always satisfies the angle property, which is independent of the convexity of the objective function. Under mild conditions, the authors prove that the proposed method has global convergence, and its convergence rate is also investigated. The numerical results show that the new descent memory method is efficient for the given test problems.
#2 | 2012-02-14 19:29:00
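For anyone skimming the abstract quoted above: the two properties it claims are usually written in the standard textbook form below, with $g_k = \nabla f(x_k)$ the gradient and $d_k$ the search direction (generic constants $c, \tau > 0$; this is the conventional statement, not the paper's exact conditions):

\[
  g_k^{\top} d_k \le -c\,\|g_k\|^{2} \quad\text{(sufficient descent)},
  \qquad
  \cos\theta_k = \frac{-\,g_k^{\top} d_k}{\|g_k\|\,\|d_k\|} \ge \tau > 0 \quad\text{(angle property)}.
\]

Directions satisfying both conditions are the usual ingredient in global convergence proofs for gradient-type methods, which is consistent with the convergence claim in the abstract.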
tllijp
Gold Bug (Regular Writer)
#3 | 2012-02-14 19:31:51
slimmy
New Bug (Regular Writer)
- Help points: 108 (High School Student)
- Gold coins: 1632.6
- Scattered gold: 235
- Red flowers: 10
- Posts: 732
- Online: 512.6 hours
- Member ID: 1355545
- Registered: 2011-07-25
- Major: Power Systems
A new descent memory gradient method and its global convergence
Authors: Sun, M (Sun, Min) [1]; Bai, QG (Bai, Qingguo) [2]
Source: JOURNAL OF SYSTEMS SCIENCE & COMPLEXITY
Volume: 24  Issue: 4  Pages: 784-794
DOI: 10.1007/s11424-011-8150-0
Published: AUG 2011
Times cited: 0 (from Web of Science)
Cited references: 18
Abstract: In this article, a new descent memory gradient method without restarts is proposed for solving large scale unconstrained optimization problems. The method has the following attractive properties: 1) The search direction is always a sufficiently descent direction at every iteration without the line search used; 2) The search direction always satisfies the angle property, which is independent of the convexity of the objective function. Under mild conditions, the authors prove that the proposed method has global convergence, and its convergence rate is also investigated. The numerical results show that the new descent memory method is efficient for the given test problems.
Accession number: WOS:000293858300014
Document type: Article
Language: English
Author keywords: Global convergence; memory gradient method; sufficiently descent
KeyWords Plus: LINE SEARCH; CONJUGATE
Reprint address: Sun, M (corresponding author), Zaozhuang Univ, Dept Math & Informat Sci, Zaozhuang 277160, Peoples R China
Addresses: [1] Zaozhuang Univ, Dept Math & Informat Sci, Zaozhuang 277160, Peoples R China; [2] Qufu Normal Univ, Sch Management, Rizhao 276826, Peoples R China
E-mail addresses: sunmin_2008@yahoo.com.cn, qfnubaiqg@163.com
Funding acknowledgement: National Science Foundation of China (Grant No. 70971076); Foundation of Shandong Provincial Education Department (Grant No. J10LA59). This research is supported by the National Science Foundation of China under Grant No. 70971076 and the Foundation of Shandong Provincial Education Department under Grant No. J10LA59.
Publisher: SPRINGER HEIDELBERG, TIERGARTENSTRASSE 17, D-69121 HEIDELBERG, GERMANY
Web of Science categories: Mathematics, Interdisciplinary Applications
Subject category: Mathematics
IDS number: 806ZW
ISSN: 1009-6124
#4 | 2012-02-14 20:33:48
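The accession number in the record above (WOS:000293858300014) is what actually confirms SCI(E) coverage, and only Web of Science itself can show it. As a rough, unofficial cross-check of the bibliographic details, the DOI quoted in this thread can also be looked up in Crossref's public REST API. The sketch below only illustrates that idea: crossref_record is a made-up helper name, and Crossref data says nothing about SCI indexing.

import json
import urllib.request

# Illustrative helper (hypothetical name): fetch public Crossref metadata
# for a DOI. Crossref is NOT Web of Science, so a successful lookup only
# confirms the citation details, not SCI indexing.
def crossref_record(doi: str) -> dict:
    url = "https://api.crossref.org/works/" + doi
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.load(resp)["message"]

if __name__ == "__main__":
    rec = crossref_record("10.1007/s11424-011-8150-0")  # DOI quoted in this thread
    print(rec["title"][0])                # article title
    print(rec["container-title"][0])      # journal name
    print(rec.get("volume"), rec.get("issue"), rec.get("page"))
    print("; ".join(a.get("given", "") + " " + a.get("family", "")
                    for a in rec.get("author", [])))

If the title, journal, volume, and pages printed here match the Web of Science record above, the citation details in the original question are consistent.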