PyTorch Deep Learning | PyTorch Deep Learning Practice (刘二大人 on Bilibili), Lecture 03: Gradient Descent

Mr. Liu's explanations are very detailed and easy to follow, and the course on Bilibili is well worth studying. Without further ado, straight to the code.
Classroom code for the gradient descent algorithm:

# Gradient descent algorithm
import matplotlib.pyplot as plt

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]

w = 1.0

def forward(x):
    return x * w

# Compute the cost (MSE) over the whole training set
def cost(xs, ys):
    cost = 0
    for x, y in zip(xs, ys):
        y_pred = forward(x)
        cost += (y_pred - y) ** 2
    return cost / len(xs)

# Compute the gradient of the cost with respect to w
def gradient(xs, ys):
    grad = 0
    for x, y in zip(xs, ys):
        grad += 2 * x * (x * w - y)
    return grad / len(xs)

print('Predict (before training)', 4, forward(4))

epoch_list = []
cost_list = []
# Train for 100 epochs
for epoch in range(100):
    cost_val = cost(x_data, y_data)
    grad_val = gradient(x_data, y_data)
    w -= 0.01 * grad_val
    print('Epoch:', epoch, 'w =', w, 'loss =', cost_val)
    epoch_list.append(epoch)
    cost_list.append(cost_val)

print('Predict (after training)', 4, forward(4))
plt.plot(epoch_list, cost_list)
plt.xlabel('Epoch')
plt.ylabel('Cost')
plt.grid()
plt.show()
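For reference, these are the formulas the code implements: the linear model is $\hat{y} = x\omega$, the cost is the MSE over the $N$ training samples, its analytic gradient with respect to $\omega$ follows by differentiation, and the update step uses learning rate $\alpha = 0.01$:

$$\mathrm{cost}(\omega) = \frac{1}{N}\sum_{n=1}^{N}\left(x_n \omega - y_n\right)^2, \qquad \frac{\partial\, \mathrm{cost}}{\partial \omega} = \frac{1}{N}\sum_{n=1}^{N} 2 x_n \left(x_n \omega - y_n\right), \qquad \omega \leftarrow \omega - \alpha \frac{\partial\, \mathrm{cost}}{\partial \omega}$$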

The resulting plots are shown below:
[Figures: per-epoch training output and the Cost vs. Epoch curve]

Stochastic gradient descent:
Basic idea: estimate the "gradient" from a single randomly chosen sample $(x_n, y_n)$ and use it to update $\omega$; this optimization method is called stochastic gradient descent.
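Written out, the update rule that the code below implements is (again with learning rate $\alpha = 0.01$):

$$\omega \leftarrow \omega - \alpha \frac{\partial\, \mathrm{loss}(x_n, y_n)}{\partial \omega} = \omega - 2\alpha\, x_n \left(x_n \omega - y_n\right)$$

Unlike the batch version above, which averages the gradient over all samples before each update, here $\omega$ is updated once per sample, i.e. three times per epoch on this dataset.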
# Stochastic gradient descent
import matplotlib.pyplot as plt

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]

w = 1.0

def forward(x):
    return x * w

# Loss (squared error) for a single sample
def loss(x, y):
    y_pred = forward(x)
    return (y_pred - y) ** 2

# Gradient of the single-sample loss with respect to w
def gradient(x, y):
    return 2 * x * (x * w - y)

print('Predict (before training)', 4, forward(4))

epoch_list = []
loss_list = []
# Train for 100 epochs; w is updated once per sample
for epoch in range(100):
    for x, y in zip(x_data, y_data):
        grad = gradient(x, y)
        w -= 0.01 * grad
        print('\tgrad:', x, y, grad)
        l = loss(x, y)
    print('progress:', epoch, 'w =', w, 'loss =', l)
    epoch_list.append(epoch)
    loss_list.append(l)

print('Predict (after training)', 4, forward(4))
plt.plot(epoch_list, loss_list)
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.grid()
plt.show()

The resulting plots:
[Figures: per-sample training output and the Loss vs. Epoch curve]
