Views: 1223  |  Replies: 3

tmy1977

Gold Bug (Official Writer)

[Help] Could someone please check whether the following article is indexed by SCI? Thanks!

1. Title: A new descent memory gradient method and its global convergence
2. Authors: Min Sun and Qingguo Bai
3. Journal: Journal of Systems Science and Complexity
4. Publisher: Springer
5. Year: 2011
6. Volume, issue, and pages: Volume 24, Number 4, 784-794


tllijp

Gold Bug (Official Writer)

[Answer] Assist reply

Title: A new descent memory gradient method and its global convergence
Authors: Sun Min; Bai Qingguo
Source: JOURNAL OF SYSTEMS SCIENCE & COMPLEXITY  Volume: 24  Issue: 4  Pages: 784-794  DOI: 10.1007/s11424-011-8150-0  Published: AUG 2011
Times Cited: 0 (from Web of Science)
Post #3, 2012-02-14 19:31:51

tllijp

Gold Bug (Official Writer)

[Answer] Assist reply

Thanks for participating, assist score +1
tmy1977 (coins +50): ★★★★★ Best answer. Thanks, it looks like it has been indexed! 2012-02-14 19:37:35
Title: A new descent memory gradient method and its global convergence
Authors: Sun Min; Bai Qingguo
Journal: JOURNAL OF SYSTEMS SCIENCE & COMPLEXITY  Volume: 24  Issue: 4  Pages: 784-794  DOI: 10.1007/s11424-011-8150-0  Published: AUG 2011
Times Cited: 0 (from Web of Science)


Abstract: In this article, a new descent memory gradient method without restarts is proposed for solving large scale unconstrained optimization problems. The method has the following attractive properties: 1) The search direction is always a sufficiently descent direction at every iteration without the line search used; 2) The search direction always satisfies the angle property, which is independent of the convexity of the objective function. Under mild conditions, the authors prove that the proposed method has global convergence, and its convergence rate is also investigated. The numerical results show that the new descent memory method is efficient for the given test problems.
Post #2, 2012-02-14 19:29:00

slimmy

New Bug (Official Writer)


A new descent memory gradient method and its global convergence
Authors: Sun, M (Sun, Min)[1]; Bai, QG (Bai, Qingguo)[2]
Source: JOURNAL OF SYSTEMS SCIENCE & COMPLEXITY  Volume: 24  Issue: 4  Pages: 784-794  DOI: 10.1007/s11424-011-8150-0  Published: AUG 2011
Times Cited: 0 (from Web of Science)
Cited References: 18
Abstract: In this article, a new descent memory gradient method without restarts is proposed for solving large scale unconstrained optimization problems. The method has the following attractive properties: 1) The search direction is always a sufficiently descent direction at every iteration without the line search used; 2) The search direction always satisfies the angle property, which is independent of the convexity of the objective function. Under mild conditions, the authors prove that the proposed method has global convergence, and its convergence rate is also investigated. The numerical results show that the new descent memory method is efficient for the given test problems.
Accession Number: WOS:000293858300014
Document Type: Article
Language: English
Author Keywords: Global convergence; memory gradient method; sufficiently descent
KeyWords Plus: LINE SEARCH; CONJUGATE
Reprint Address: Sun, M (reprint author), Zaozhuang Univ, Dept Math & Informat Sci, Zaozhuang 277160, Peoples R China
Addresses:
1. Zaozhuang Univ, Dept Math & Informat Sci, Zaozhuang 277160, Peoples R China
2. Qufu Normal Univ, Sch Management, Rizhao 276826, Peoples R China
E-mail Addresses: sunmin_2008@yahoo.com.cn, qfnubaiqg@163.com
Funding Acknowledgement:
Funding Agency | Grant Number
National Science Foundation of China | 70971076
Foundation of Shandong Provincial Education Department | J10LA59

This research is supported by the National Science Foundation of China under Grant No. 70971076 and the Foundation of Shandong Provincial Education Department under Grant No. J10LA59.

Publisher: SPRINGER HEIDELBERG, TIERGARTENSTRASSE 17, D-69121 HEIDELBERG, GERMANY
Web of Science Categories: Mathematics, Interdisciplinary Applications
Research Areas: Mathematics
IDS Number: 806ZW
ISSN: 1009-6124
Post #4, 2012-02-14 20:33:48
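For readers curious about the method itself: the abstract describes a memory gradient direction, i.e. the current negative gradient plus a multiple of the previous search direction, built so that it always stays a sufficient descent direction and satisfies the angle property. Below is a minimal illustrative sketch of that idea; the coefficient `beta = rho * ||g|| / ||d_prev||` is a common safeguarded choice for this class of methods, not necessarily the exact formula from the paper.

```python
import numpy as np

def memory_gradient_direction(g, d_prev, rho=0.5):
    """Generic memory-gradient direction: d = -g + beta * d_prev.

    With beta = rho * ||g|| / ||d_prev|| (0 < rho < 1), one gets
        g @ d <= -(1 - rho) * ||g||**2     (sufficient descent)
    and ||d|| <= (1 + rho) * ||g||, so the angle between d and -g
    stays bounded away from 90 degrees (the angle property).
    """
    if d_prev is None or np.linalg.norm(d_prev) < 1e-12:
        return -g  # first iteration: plain steepest descent
    beta = rho * np.linalg.norm(g) / np.linalg.norm(d_prev)
    return -g + beta * d_prev

# Demo: minimize the quadratic f(x) = 0.5 * x @ A @ x, using the
# exact minimizing step along each direction (valid for quadratics).
A = np.diag([1.0, 10.0])
x = np.array([3.0, -2.0])
d = None
for _ in range(1000):
    g = A @ x
    if np.linalg.norm(g) < 1e-10:
        break
    d = memory_gradient_direction(g, d)
    alpha = -(g @ d) / (d @ A @ d)  # exact line search for a quadratic
    x = x + alpha * d
```

Because each direction is a guaranteed descent direction, the exact line search decreases f at every step, and the iterates drive the gradient norm toward zero on this strongly convex test problem.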