
Views: 1186  |  Replies: 3

xuh1991


[Help] A question about BP neural networks and gradient descent

In a BP (backpropagation) neural network, the default training function is gradient descent; that is, the weights are updated according to the gradient descent rule. So what exactly is the difference between such a BP neural network and gradient descent?

syddesk


[Answer]
A neural network is a modelling idea: it fits the data, and what has to be solved for is a set of optimal parameters.
Gradient descent is the method used to solve for that set of parameters.
The two are not the same thing.
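To make the separation concrete, here is a minimal sketch (my own illustration, not code from the thread): the network is the parameterised model, backpropagation computes the gradients of the error, and gradient descent is only the rule that updates the parameters. The toy data, layer sizes and learning rate are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy regression data: learn y = sin(x)
x = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(x)

# the model: a small 1-hidden-layer network, defined by its parameters
W1 = rng.normal(scale=0.5, size=(1, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros((1, 1))

lr = 0.1                                  # learning rate
for step in range(2000):
    # forward pass (evaluating the model)
    h = np.tanh(x @ W1 + b1)
    y_hat = h @ W2 + b2
    err = y_hat - y
    mse = np.mean(err ** 2)

    # backward pass: backpropagation computes the gradients of the error
    d_out = 2 * err / len(x)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0, keepdims=True)
    d_h = (d_out @ W2.T) * (1 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
    dW1 = x.T @ d_h
    db1 = d_h.sum(axis=0, keepdims=True)

    # gradient descent: this update rule, and only this, is "gradient descent"
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final training MSE:", mse)
```

Swapping the four update lines for a different optimiser (momentum, Adam, etc.) would leave the network itself untouched, which is exactly the distinction being made here.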
#2 · 2016-09-02 17:56:02

FMStation


[Answer]
An analogy for understanding gradient descent

The basic intuition behind gradient descent can be illustrated by a hypothetical scenario. A person is stuck in the mountains and is trying to get down (i.e. trying to find the minimum). There is heavy fog such that visibility is extremely low. Therefore, the path down the mountain is not visible, so he must use local information to find the minimum. He can use the method of gradient descent, which involves looking at the steepness of the hill at his current position, then proceeding in the direction with the steepest descent (i.e. downhill). If he were trying to find the top of the mountain (i.e. the maximum), then he would proceed in the direction of steepest ascent (i.e. uphill). Using this method, he would eventually find his way down the mountain. However, assume also that the steepness of the hill is not immediately obvious with simple observation, but rather requires a sophisticated instrument to measure, which the person happens to have at the moment. It takes quite some time to measure the steepness of the hill with the instrument, so he should minimize his use of the instrument if he wants to get down the mountain before sunset. The difficulty then is choosing the frequency at which he should measure the steepness of the hill so as not to go off track.

https://en.wikipedia.org/wiki/Fi ... o_input_weights.png

In this analogy,
the person represents the backpropagation algorithm, and
the path taken down the mountain represents the sequence of parameter settings that the algorithm will explore.
The steepness of the hill represents the slope of the error surface at that point.
The instrument used to measure steepness is differentiation (the slope of the error surface can be calculated by taking the derivative of the squared error function at that point).
The direction he chooses to travel in is the negative of the gradient of the error surface at that point (the direction of steepest descent).
The amount of time he travels before taking another measurement is the learning rate of the algorithm.

https://en.wikipedia.org/wiki/Ba ... ng_gradient_descent
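A one-dimensional sketch of the analogy (my own addition, not part of the quoted Wikipedia text): f plays the role of the error surface, df is the "instrument" that measures the slope, and lr is how far the walker moves between slope measurements.

```python
def f(w):             # the "mountain": a toy error surface
    return (w - 3.0) ** 2

def df(w):            # the "instrument": differentiation gives the slope
    return 2.0 * (w - 3.0)

w = -5.0              # starting position on the mountainside
lr = 0.1              # how far to walk between slope measurements
for step in range(50):
    w = w - lr * df(w)    # always step in the downhill direction

print(w)              # approaches the minimum at w = 3
```

A very small lr means many slow measurements; a very large lr risks overshooting the valley, which is the walker "going off track".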
#3 · 2016-09-03 19:51:07

picklas


They are not methods on the same level. The learning rule can be gradient descent, but it does not have to be; it depends on the mathematical model of the specific problem.
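To illustrate that the learning rule need not be gradient descent, here is a hedged sketch that fits the same kind of small network with plain random search, a hypothetical gradient-free alternative chosen only for illustration, not something recommended in the thread.

```python
import numpy as np

rng = np.random.default_rng(1)

x = np.linspace(-3, 3, 100).reshape(-1, 1)
y = np.sin(x)

def mse(params):
    # same small 1-hidden-layer model as before, evaluated on the toy data
    W1, b1, W2, b2 = params
    h = np.tanh(x @ W1 + b1)
    return np.mean((h @ W2 + b2 - y) ** 2)

best = [rng.normal(size=(1, 8)), np.zeros((1, 8)),
        rng.normal(size=(8, 1)), np.zeros((1, 1))]
best_err = mse(best)

for step in range(3000):
    # propose a small random perturbation of every parameter: no gradients anywhere
    trial = [p + 0.05 * rng.normal(size=p.shape) for p in best]
    err = mse(trial)
    if err < best_err:            # keep the change only if the error improves
        best, best_err = trial, err

print("MSE reached without gradient descent:", best_err)
```

The model being fitted is unchanged; only the rule for updating its parameters differs, which is the sense in which the two ideas sit on different levels.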

#4 · 2016-09-04 04:57:27